2 votes

I have an ASP.NET web application which gets data from a SQL Server 2008 R2 database and displays it to users. The tables in the database are repopulated with the latest data by another service every 30 minutes, and that update run takes about 5 minutes.

The problem I am facing is that whenever a user accesses the application while the tables are being updated, the following error is thrown:

System.Web.HttpUnhandledException (0x80004005): Exception of type 'System.Web.HttpUnhandledException' was thrown. ---> System.Data.EntityCommandExecutionException: An error occurred while reading from the store provider's data reader. See the inner exception for details. ---> System.Data.SqlClient.SqlException: Transaction (Process ID 209) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

How can I overcome this situation?

The steps I follow are:

    using (SqlConnection con = new SqlConnection(db.Database.Connection.ConnectionString))
    {
        con.Open();
        using (SqlTransaction tran = con.BeginTransaction())
        {
            // Remove the existing rows before re-inserting the fresh extract.
            string queryText = "DELETE FROM TABLE_NAME";
            SqlCommand cmd = new SqlCommand(queryText, con, tran);
            cmd.ExecuteNonQuery();

            using (SqlBulkCopy bc = new SqlBulkCopy(con, SqlBulkCopyOptions.KeepNulls, tran))
            {
                bc.BatchSize = 5000;
                bc.DestinationTableName = "TABLE_NAME";
                DataTable data = newIncidentExtractList.ToDataTable();
                bc.WriteToServer(data);
            }

            string updateText = @"UPDATE TABLE_NAME WITH CONDITION";
            SqlCommand cmdActive = new SqlCommand(updateText, con, tran);
            cmdActive.ExecuteNonQuery();

            string updateTextInactive = @"UPDATE TABLE_NAME WITH CONDITION";
            SqlCommand cmdInactive = new SqlCommand(updateTextInactive, con, tran);
            cmdInactive.ExecuteNonQuery();

            string updateUnassigned = @"UPDATE TABLE_NAME WITH CONDITION";
            SqlCommand cmdUnassigned = new SqlCommand(updateUnassigned, con, tran);
            cmdUnassigned.ExecuteNonQuery();

            tran.Commit();
            insertResult = true;
        }
    }
Are you dropping and recreating tables, deleting all records and inserting new ones, or something else? – ste-fu

I am not dropping the tables, but deleting all records and inserting new ones. – Vim

How does the application work currently while you are deleting all entries and re-adding them? Do you do this in one transaction/commit? – phil soady

Yes, I am doing this in one transaction. I just added the code structure. Please verify. – Vim

And the app can be updating TABLE_NAME at the same time you are logically dropping and re-adding it with a different set of entries? Wow. – phil soady

3 Answers

2 votes

The only thing you can do is review your data migration service. Keep transactions as short as possible: do not open the transaction until you are actually ready to write data, and commit it right after writing. This may be helpful.
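One way to shorten the transaction, sketched below under the assumption that a staging table `TABLE_NAME_Staging` with the same schema exists (the staging table is an assumption, not part of the question): bulk-load the fresh extract into the staging table outside of any transaction, then do the delete and copy inside a single short transaction.

```sql
-- Bulk-load the new extract into TABLE_NAME_Staging first (no transaction
-- needed there), then swap the data over in one short transaction.
BEGIN TRANSACTION;

DELETE FROM TABLE_NAME;

INSERT INTO TABLE_NAME
SELECT * FROM TABLE_NAME_Staging;

COMMIT TRANSACTION;

-- Clear the staging table for the next run.
TRUNCATE TABLE TABLE_NAME_Staging;
```

This keeps the window in which TABLE_NAME is locked down to the delete-plus-insert, instead of holding locks for the whole bulk-copy and all the follow-up updates.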

1 vote

Deadlocks can occur when two connections update tables in opposite orders. For example, within a transaction one connection inserts into table "example1" first and then into "example2", while another connection inserts into "example2" first and then into "example1". There are a few things you can do to avoid deadlocks:

  • Change the order in which the conflicting connections update the tables.
  • Enable IRL (insert row lock) on your table (check this out).
  • Re-create the clustered index on this table with a larger fill factor (check this out).
  • Eliminate the non-clustered secondary index, if possible.

Check out this link from MS for more detailed info
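For the index-related suggestions above, a sketch of the corresponding DDL; the index name `PK_IncidentExtract` is a placeholder, not from the question:

```sql
-- Allow row-level locks on the clustered index (row locks are the
-- SQL Server mechanism behind "insert row lock" here).
ALTER INDEX PK_IncidentExtract ON TABLE_NAME
    SET (ALLOW_ROW_LOCKS = ON);

-- Rebuild the clustered index with a larger fill factor, leaving free
-- space on each page to reduce page splits during the bulk insert.
ALTER INDEX PK_IncidentExtract ON TABLE_NAME
    REBUILD WITH (FILLFACTOR = 70);
```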

1 vote

Given the nature of the problem, I would consider using a shadow DB concept.

You prepare DB Y; when it is ready, all new connections for data go to DB Y. Then you update DB X while Y is live. When X is ready, redirect all new connections back to DB X. EF supports dynamic connection strings.

If the table isn't involved in constraints and foreign keys, you could use shadow tables and dynamically access table A or table B based on a current "live table" flag.
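A minimal sketch of the shadow-table variant, using a synonym as the "live table" pointer (the table and synonym names are assumptions, not from the question): the application always queries the synonym, and the loader repoints it after refreshing the inactive copy.

```sql
-- One-time setup: two identical tables plus a synonym the app queries.
-- CREATE SYNONYM IncidentExtract FOR IncidentExtract_A;

-- Refresh cycle: load the inactive copy while the other stays live.
TRUNCATE TABLE IncidentExtract_B;

INSERT INTO IncidentExtract_B
SELECT * FROM SourceExtract;   -- or SqlBulkCopy into IncidentExtract_B

-- Swap: SQL Server has no ALTER SYNONYM, so drop and recreate it.
BEGIN TRANSACTION;
DROP SYNONYM IncidentExtract;
CREATE SYNONYM IncidentExtract FOR IncidentExtract_B;
COMMIT TRANSACTION;
```

Readers only ever block for the instant the synonym is swapped, rather than for the whole load.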