3 votes

I have a legacy application that needs to be refactored and made testable. It consists of C# code separated into model and view code. The C# model code essentially exposes a public API with high-level CRUD methods.

By itself, the C# code does rather little; it delegates the actual work to SQL Server stored procedures. The C# code mostly concerns itself with configuring a large array of parameters that direct what the SQL database should process - there is very little input/output as such.

The bulk of the processing occurs in these stored procedures. Because hundreds of thousands of database rows are involved, it does not seem practical to convert all of this to an ORM approach (where the data would essentially be loaded into memory in C#) - the processing really should stay on the SQL Server.

My question is how to unit-test something like that. The obvious choice seems to be:

  • write unit tests for SQL Server procedures
  • write unit tests for the C# code (mocking SQL procedures)

Then one will have to write integration tests which test the overall functionality. Since there is so little functionality in the C# code (some methods are mere stubs calling the SQL procedure), does testing it even have any value? It sounds somewhat redundant to me.

It sounds like I should do only integration testing, which allows me to use better testing frameworks than what I can use within SQL Server (I could pretend they really are unit tests by ignoring my knowledge of the SQL backend) - but that does violate the "unit testing" methodology.

What would be the most workable solution for this scenario?

Edit: Most of the C# methods look similar to the sample given below - some do a few more lines of processing, but in essence it boils down to ADO.NET code:

public void LogImport(int PeriodID, String orgCode, int userID)
{
    TryOpenSQLConnection("ImportSQL.LogImport");

    try
    {
        SqlCommand cmd = new SqlCommand("dbo.LogImport", sqlCn);
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@periodID", PeriodID);
        cmd.Parameters.AddWithValue("@orgCode", orgCode);
        cmd.Parameters.AddWithValue("@userID", userID);
        cmd.ExecuteNonQuery();
    }
    catch (Exception ex)
    {
        throw new DataException(" ... ");
    }
}
I question the validity of this crucial statement: "[does] not seem practical to convert this all to an ORM approach (where essentially the data would be loaded into memory in C#) - the processing really should stay on the SQL Server." That is highly dependent on the type of "processing" actually being done. If you really do read each and every column and each and every row and process each one in .NET, then yes. But I highly doubt that is the case. – Casper Leon Nielsen
Unfortunately, we can't make blanket statements about your project. We don't know what you know; we don't know all the considerations you have, or how testable your system is (we're not seeing any actual code, just your word for what the code does). Since we're looking to provide a library of useful answers, anecdotes don't work well, especially in situations where there is no concrete, objective code to work from. And "What would be the most workable solution for this scenario?" assumes a level of familiarity with your project that we can't have. – George Stocker
I have updated to give example code. Pretty much all of the C# code looks this way. – nepdev
FYI, that try/catch is HORRIBLE. Just delete it and let the exception bubble up on its own. You are doing nothing but shooting yourself in the foot when you try to debug with a catch like that. If you have some reason that truly requires a custom error message for any type of failure, then at least pass the original exception as an inner exception so you can still debug it later. – Marie
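
For reference, a minimal sketch of the change Marie describes - the catch block from the question, altered only to pass the original exception as the inner exception:

catch (Exception ex)
{
    // Passing ex as the inner exception preserves the original
    // stack trace for later debugging
    throw new DataException(" ... ", ex);
}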

2 Answers

0 votes

I agree you should do only integration tests. If your C# code isn't doing much more than dealing with ADO.NET, simply test the calls to those classes, doing the round-trip to the DB. Ideally, wrap it all in a TransactionScope so you can roll it back at the end:

using (new TransactionScope()) {
    // integration test goes here; Complete() is never called on
    // the scope, so all DB changes roll back when it is disposed
}
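
A rough sketch of what such a test could look like (assuming NUnit, and assuming the LogImport method from the question lives on a class called ImportSQL - the class name and the assertion step are illustrative):

using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class ImportSQLIntegrationTests
{
    [Test]
    public void LogImport_WritesLogEntry()
    {
        // Complete() is never called on the scope, so everything
        // done inside it is rolled back when the scope is disposed
        using (new TransactionScope())
        {
            var import = new ImportSQL(); // hypothetical class under test

            import.LogImport(PeriodID: 1, orgCode: "ORG1", userID: 42);

            // Assert by querying the affected table directly,
            // e.g. with a small ADO.NET helper (not shown here)
        }
    }
}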
0 votes

It's obviously a matter of opinion at this point, but here's my 2 cents:

I'd unit test the C# portion to make sure you are able to catch potential regressions in the future. There are still things to test there: parameter names can't change, mandatory vs. optional parameters, maybe you want to add more checks for invalid values, etc. All of that can be done by mocking the DB portion, as sketched below.
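
A minimal sketch of that idea, assuming the ADO.NET plumbing is extracted behind a hypothetical IStoredProcExecutor interface that ImportSQL takes via its constructor (Moq and NUnit used here; all names are illustrative, not from the original code):

using System.Collections.Generic;
using Moq;
using NUnit.Framework;

// Hypothetical seam: the ADO.NET call hides behind this interface
public interface IStoredProcExecutor
{
    void ExecuteNonQuery(string procName, IDictionary<string, object> parameters);
}

[TestFixture]
public class ImportSQLUnitTests
{
    [Test]
    public void LogImport_PassesExpectedParametersToProc()
    {
        var executor = new Mock<IStoredProcExecutor>();
        var import = new ImportSQL(executor.Object); // assumes constructor injection

        import.LogImport(PeriodID: 1, orgCode: "ORG1", userID: 42);

        // The proc name and parameter names are pinned down here,
        // so a regression in either one breaks this test
        executor.Verify(e => e.ExecuteNonQuery("dbo.LogImport",
            It.Is<IDictionary<string, object>>(p =>
                (int)p["@periodID"] == 1 &&
                (string)p["@orgCode"] == "ORG1" &&
                (int)p["@userID"] == 42)));
    }
}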

You should also be unit testing the SQL stored procs, for sure, since that's where the bulk of the logic is happening. I've never actually unit tested SQL before, but there are tools out there to help with that (tSQLt is a well-known unit testing framework for SQL Server).

Then you move on to integration testing, which at this point will merely prove that your C# code is properly talking to the DB. That should only happen after you've proven that both the C# portion and the SQL portion behave as expected. (Integration tests are always a bit more "expensive" in the sense that you'll need an actual environment set up to run them, whereas unit tests are usually run at build time or right after.)

Hope this helps!