0 votes

I am using SpecFlow to power an API integration test harness that will provide living documentation and test coverage of a new UI API. I have a few feature files written and have finally gotten to the point that I'm trying to run around 60 tests in parallel. However, while I can run the features individually with no issue, I run into intermittent failures when trying to run them all in parallel using the Visual Studio 2019 test runner and the xUnit runner plugin.

Any given SpecFlow scenario will utilize steps from two different step binding classes, and each of those step binding classes may call for the injection of up to three context objects meant to capture state during the scenario and clean up the environment after the scenario completes. For example, a feature might look like this:

Scenario: Retrieving Message records returns data
Given I have created the following ClientAccounts:
    | Index | SiteID | IsActive |
    | 1     | 1      | 1        |
And I have created the following Logins:
    | Index | IsActive |
    | 1     | 1        |
And I have created the following Messages:
    | MessageID | MessageText |
    | 1         | Asdfasdf    |
When I send an authentication request using the first Login and the IP Address 127.0.0.1
And I send a read request to the v1 Message endpoint for the first Message record created
Then the first Message response should be equivalent to the following data for the first Message record created:
    | MessageText |
    | Asdfasdf    |

The first three steps belong to a class called DatabaseSteps whose constructor accepts an instance of a class DataUtility that facilitates CRUD operations to/from the database and keeps track of what records have been created as a part of the test execution. There are also some [StepArgumentTransformation] bindings that transform those tables into database objects that can be inserted into the db.
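For reference, a cut-down sketch of what such a binding class might look like (class and method names here are illustrative, not the actual implementation):

```csharp
using TechTalk.SpecFlow;
using TechTalk.SpecFlow.Assist;

[Binding]
public class DatabaseSteps
{
    private readonly DataUtility _dataUtility;

    // SpecFlow's built-in DI container supplies the DataUtility instance
    public DatabaseSteps(DataUtility dataUtility)
    {
        _dataUtility = dataUtility;
    }

    [Given(@"I have created the following ClientAccounts:")]
    public void GivenIHaveCreatedClientAccounts(Table table)
    {
        // CreateSet comes from SpecFlow.Assist; each table row becomes a POCO
        foreach (var account in table.CreateSet<ClientAccount>())
            _dataUtility.Insert(account);
    }
}
```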

The fourth, fifth, and sixth steps belong to additional step classes that have constructors taking as dependencies both a DataUtility for db access and an ApiClientContext which stores session information as well as info about API responses that have previously been saved, in order to assert against the actual responses obtained during the "Then" stage. The DataUtility class implements IDisposable to simplify post-test cleanup.
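As a rough sketch of the shape of DataUtility (member names are hypothetical; the real class wraps the ProductDatabase and STSDatabase providers):

```csharp
using System;
using System.Collections.Generic;

public class DataUtility : IDisposable
{
    // Records inserted during the scenario, tracked for post-test cleanup
    private readonly List<object> _createdRecords = new List<object>();

    public void Insert<T>(T record)
    {
        // ...execute the INSERT via the appropriate database provider...
        _createdRecords.Add(record);
    }

    public void Dispose()
    {
        // ...delete every tracked record from the database...
        _createdRecords.Clear();
    }
}
```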

Based on the documentation I expected that the context classes injected through the built-in DI container would be thread-safe. However, I found that whether DataUtility implements IDisposable or I skip the interface and simply call Dispose() directly from an [AfterScenario] hook, tests that assert on returned data fail during most test runs. It's hard to say for sure because troubleshooting concurrency issues is awful, but it seems as though the DataUtility instance is being shared among scenarios: when any given scenario calls Dispose() on the data utility, all of the test data I've scaffolded is purged, even data belonging to other, unrelated scenarios. When DataUtility neither implements IDisposable nor has Dispose() called from a hook, the tests execute without incident. Is there a particular way I need to set up an injected context class so that each scenario gets its own instance of that class?
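The hook-based variant mentioned above looks roughly like this (the hook class name is hypothetical):

```csharp
using TechTalk.SpecFlow;

[Binding]
public class CleanupHooks
{
    private readonly DataUtility _dataUtility;

    // Expectation: context injection hands each scenario its own instance
    public CleanupHooks(DataUtility dataUtility)
    {
        _dataUtility = dataUtility;
    }

    [AfterScenario]
    public void CleanUpScenarioData()
    {
        // Intended to dispose only this scenario's DataUtility
        _dataUtility.Dispose();
    }
}
```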

Other Details: VS2019, SpecFlow 3, XUnit Test Runner

Is DataUtility a class you created? – Greg Burghardt
Yes – DataUtility is a class I wrote to facilitate database interactions. It takes as a constructor parameter an instance of IConfiguration (which is registered with the ConfigurationBuilder in a BeforeScenario hook) and is used to get the database connection strings. It contains as members instances of two classes, ProductDatabase and STSDatabase, which themselves contain database provider classes (executing CRUD operations in their respective databases) and various List<T> collections, where T is a database POCO, tracking records that have been created so they can be deleted when a scenario finishes. – Thomas Parikka
Context injection is supposed to be thread safe. Without seeing the step definitions and code in the DataUtility class, along with an error message and stack trace, I don't think we can help you. – Greg Burghardt
I think a cut down version would be enough. – Greg Burghardt
I spent a bunch of time digging into this yesterday and today, adding logging in my DAL to figure out what was being done with object creation and deletion in the database, and ultimately found a very sneaky bug in the assist library I am using with Dapper; it was picking up a field whose name ended with GUID and using it in an auto-built delete query instead of using the actual ID field. Since the object library I'm using for the database is shared with the product team I can't decorate it with my own attributes, and fell foul of that issue. Thank you @GregBurghardt for the help! – Thomas Parikka

1 Answer

0 votes

I spent a bunch of time digging into this yesterday and today, adding logging in my DAL to figure out what was being done with object creation and deletion in the database, and ultimately found a very sneaky bug in the assist library I am using with Dapper; it was picking up a field whose name ended with GUID and using it in an auto-built delete query instead of using the actual ID field. Since the object library I'm using for the database is shared with the product team I can't decorate it with my own attributes, and fell foul of that issue. Thank you @GregBurghardt for the help!
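Since the shared POCOs can't be decorated with key attributes, one possible workaround (a sketch, with hypothetical table and column names) is to hand-write the delete against the real ID column using plain Dapper instead of relying on the assist library's auto-built query:

```csharp
using Dapper;
using System.Data;

public static class MessageCleanup
{
    // Explicit SQL avoids the convention-based key lookup that
    // mistakenly matched the *GUID column instead of MessageID.
    public static void DeleteMessage(IDbConnection connection, int messageId)
    {
        connection.Execute(
            "DELETE FROM Messages WHERE MessageID = @MessageID",
            new { MessageID = messageId });
    }
}
```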