I'm going to post what I did to get SpecFlow test results to show in the Azure DevOps Builds->Tests area, but be forewarned that this is a giant hack.
I believe SpecFlow/NUnit tests are supposed to be supported just like regular NUnit tests, and I do think the answer lies in test adapter configuration, as Andreas Willich stated, but I was neither able to get that working myself nor able to find examples of anyone getting it to work through normal pipeline configuration. SpecFlow+ may also offer a way, but I don't use the plus version. If/when I learn the proper way to do this, I'll stop using the approach below.
Create a simple process to write the SpecFlow test names and result values to disk in CSV format. I did this because I wanted to keep the feature-testing solution separate from the rest of the logic in this hack. The process can be as simple as an [AfterScenario] hook that pulls the scenario title and result from the ScenarioContext (or wherever you can get them) and writes them to a text file in comma-separated value format.
Integrate that CSV result logging process into the "SpecFlow" project that contains tests for which you want results to show in Azure DevOps.
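A minimal sketch of such a hook, assuming SpecFlow's context injection and a hypothetical output path of C:\Temp\SpecFlowResults.csv (the path and exact line layout are just placeholders to match the CSV shown below):

using System;
using System.IO;
using TechTalk.SpecFlow;

namespace DemoSpecFlow.Hooks
{
    [Binding]
    public class ResultCsvHooks
    {
        // Hypothetical output location; point this wherever your "convert" step will look.
        private const string CsvPath = @"C:\Temp\SpecFlowResults.csv";

        private readonly ScenarioContext _scenarioContext;

        public ResultCsvHooks(ScenarioContext scenarioContext)
        {
            _scenarioContext = scenarioContext;
        }

        [AfterScenario]
        public void WriteResultToCsv()
        {
            // Anything other than a clean pass is treated as a failure.
            var result = _scenarioContext.ScenarioExecutionStatus == ScenarioExecutionStatus.OK
                ? "Pass"
                : "Fail";

            File.AppendAllText(
                CsvPath,
                $"{_scenarioContext.ScenarioInfo.Title}, {result}" + Environment.NewLine);
        }
    }
}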
The CSV should look something like this:
DemoScenario_01 Lorem ipsum dolor sit amet consectetur adipiscing elit, Pass
DemoScenario_02 Sed do eiusmod tempor incididunt ut labore et dolore, Pass
DemoScenario_03 Magna aliqua Ut enim ad minim veniam quis, Pass
DemoScenario_04 Nostrud exercitation ullamco laboris nisi ut aliquip, Pass
DemoScenario_05 Ex ea commodo consequat Duis aute irure dolor in, Pass
DemoScenario_06 Reprehenderit in voluptate velit esse cillum dolore eu, Pass
DemoScenario_07 Fugiat nulla pariatur Excepteur sint occaecat cupidatat, Pass
DemoScenario_08 Non proident sunt in culpa qui officia semper, Pass
DemoScenario_09 Deserunt mollit anim id est laborum arcu semper, Pass
DemoScenario_10 Orci a scelerisque purus semper eget Ornare arcu dui vivamus, Pass
Create a separate simple "dummy" project containing only a single class having one working NUnit "test" method that does nothing but Assert.Pass(). This project will need the NUnit and NUnit3TestAdapter NuGet packages installed.
The class should look something like this:
using NUnit.Framework;

namespace DemoNunit
{
    public class Tests
    {
        [Test]
        public static void DemoTest005()
        {
            Assert.Pass();
        }
    }
}
Create an Azure DevOps Git Code Repository for this "dummy" project's solution and push it to the repo.
Create a new Azure Pipeline configured for CI builds, so it auto-triggers when commits to the "dummy" NUnit project are pushed to its repo. Configure the pipeline with a Visual Studio Test step called "SpecFlow Tests" that looks for the DLL containing that single NUnit test. Name this pipeline after your "SpecFlow" project or the functionality it tests, because this is the pipeline that will actually end up showing those results in the Builds->Tests area.
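If you use a YAML pipeline rather than the classic editor, the relevant steps might look roughly like this (the branch name, build steps, and DLL pattern are assumptions; the classic Visual Studio Test task works just as well):

trigger:
  - main   # auto-trigger on pushes to the dummy project's repo (branch name assumed)

pool:
  vmImage: 'windows-latest'

steps:
  - task: VSBuild@1
    inputs:
      solution: '**/*.sln'
      configuration: 'Release'

  - task: VSTest@2
    displayName: 'SpecFlow Tests'   # this is the name that shows up under Builds->Tests
    inputs:
      testSelector: 'testAssemblies'
      testAssemblyVer2: |
        **\DemoNunit.dll
        !**\obj\**
      searchFolder: '$(System.DefaultWorkingDirectory)'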
In a new, separate "convert" project, create a process that reads the CSV result file produced by your "SpecFlow" project and writes a new .cs file over the existing one in the "dummy" project that houses the NUnit test.
The single NUnit test method in the "dummy" project is now replaced with multiple methods, one for each result logged in the CSV. The methods are named with GUIDs (dashes removed) and prefixed with a letter so they are valid, non-repeating C# method names. The actual SpecFlow scenario names are stored in the NUnit TestCase TestName attributes, and Assert.Pass() or Assert.Fail() is emitted according to the result value read from the CSV. Compile this "convert" project into an exe.
I've left out the code for reading the CSV results (a rough sketch of those helpers follows the Log class below)...
using System;
using System.Collections.Generic;

namespace CreateCsFile
{
    public static class CsFile
    {
        // Text that opens the generated class...
        public static string OpenClass =
            "using NUnit.Framework;" +
            "namespace DemoNunit" +
            "{" +
            " public class Tests" +
            " {";

        // ...and text that closes it.
        public static string CloseClass =
            " }" +
            "}";

        // Template for each generated test; the placeholders are swapped out per CSV row.
        public static string TestMethod =
            " [Test, TestCase(TestName = \"UniqueNameAttribute\")]" +
            " public static void MethodName()" +
            " {" +
            " Assert.Result();" +
            " }";

        public static void LogListOfResults(List<Test> resultsList)
        {
            Log.CsharpFile(OpenClass);

            foreach (var result in resultsList)
            {
                Outcome.IsValid(result.Result);

                // Swap the scenario name into TestName, the CSV outcome into Assert.Pass/Fail,
                // and a letter-prefixed GUID in as the method name so names never repeat.
                var testMethod =
                    TestMethod
                        .Replace("UniqueNameAttribute", result.Name)
                        .Replace("Result", result.Result)
                        .Replace("MethodName", "a" + Guid.NewGuid().ToString().Replace("-", ""));

                Log.CsharpFile(testMethod);
            }

            Log.CsharpFile(CloseClass);
        }

        public static void ConvertCsvResultsToNunitResults()
        {
            LogListOfResults(ParseCsv.ResultsSheet());
        }
    }
}
Log class...
using System;
using System.IO;

namespace CreateCsFile
{
    public class Log
    {
        // Simple string "flags": WasFileRemoved tracks whether the old Tests.cs has been
        // deleted yet; CsFileToggle turns the file writing on/off.
        public static string WasFileRemoved = "";
        public static string CsFileToggle = "True";

        // Path to the dummy project's test class that gets regenerated on every run.
        public static string CsFile = "C:\\Projects\\DemoNunit\\Tests.cs";

        public static void CsharpFile(string nunitData)
        {
            if (CsFileToggle.ToUpper() == "TRUE")
            {
                // Delete the previously generated file the first time we're called.
                if (string.IsNullOrEmpty(WasFileRemoved))
                {
                    RemoveExistingCsFile();
                }

                var log = !File.Exists(CsFile)
                    ? new StreamWriter(CsFile)
                    : File.AppendText(CsFile);
                log.WriteLine(nunitData);
                log.Close();
            }
        }

        public static void RemoveExistingCsFile()
        {
            if (CsFileToggle.ToUpper() == "TRUE")
            {
                WasFileRemoved = "True";

                // Nothing to remove on a clean run.
                if (!File.Exists(CsFile))
                {
                    return;
                }

                try
                {
                    var fileInfo = new FileInfo(CsFile);
                    fileInfo.Attributes = FileAttributes.Normal;
                    File.Delete(fileInfo.FullName);
                }
                catch
                {
                    throw new Exception(
                        "Unable to delete existing csharp file...");
                }
            }
        }
    }
}
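The Test, ParseCsv, and Outcome helpers referenced above are the pieces I left out; a minimal sketch of what they might look like, assuming the two-column CSV format shown earlier (the names and parsing are just one way to fill that gap):

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

namespace CreateCsFile
{
    // One row of the CSV: the scenario title and its outcome ("Pass" or "Fail").
    public class Test
    {
        public string Name { get; set; }
        public string Result { get; set; }
    }

    public static class ParseCsv
    {
        // Where the SpecFlow run dropped its results; adjust to match your hook.
        public static string CsvPath = "C:\\Temp\\SpecFlowResults.csv";

        public static List<Test> ResultsSheet()
        {
            // The last comma separates the scenario title (which may itself contain commas)
            // from the Pass/Fail value.
            return File.ReadAllLines(CsvPath)
                .Where(line => !string.IsNullOrWhiteSpace(line))
                .Select(line =>
                {
                    var split = line.LastIndexOf(',');
                    return new Test
                    {
                        Name = line.Substring(0, split).Trim(),
                        Result = line.Substring(split + 1).Trim()
                    };
                })
                .ToList();
        }
    }

    public static class Outcome
    {
        // Guard against values that wouldn't compile into Assert.Pass()/Assert.Fail().
        public static void IsValid(string result)
        {
            if (result != "Pass" && result != "Fail")
            {
                throw new Exception($"Unexpected result value: {result}");
            }
        }
    }
}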
Ugly Output:
namespace DemoNunit{ public class Tests {
[Test, TestCase(TestName = "DemoScenario_01 Lorem ipsum dolor sit amet consectetur adipiscing elit")] public static void aa5f7fd239d6a40878780bc6c81f3a18b() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_02 Sed do eiusmod tempor incididunt ut labore et dolore")] public static void aa9882fa95b17499eb9386b20a7ff303d() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_03 Magna aliqua Ut enim ad minim veniam quis")] public static void a440c25f8c3c24e92ad90224da56bafda() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_04 Nostrud exercitation ullamco laboris nisi ut aliquip")] public static void ab2c3cc6997df4a42b0992128f63358f7() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_05 Ex ea commodo consequat Duis aute irure dolor in")] public static void a2c9f744dcd2c42c99cb6e288cf09fc78() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_06 Reprehenderit in voluptate velit esse cillum dolore eu")] public static void a294422ae029049f9ac4be6f9bb4529cc() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_07 Fugiat nulla pariatur Excepteur sint occaecat cupidatat")] public static void aa2dcdf889ffe4e46b57a73882d6f1a68() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_08 Non proident sunt in culpa qui officia semper")] public static void a891c43376a5049f89ad75b70fa0a543f() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_09 Deserunt mollit anim id est laborum arcu semper")] public static void aa8da317895214abc966c229c832c162f() { Assert. Pass(); }
[Test, TestCase(TestName = "DemoScenario_10 Orci a scelerisque purus semper eget Ornare arcu dui vivamus")] public static void aa9accb0c9c1b4918b76bc75ff2f2e835() { Assert. Pass(); }
}}
Now perform these steps (a command-line sketch follows the list):
- execute your "SpecFlow" project's tests via command line using the
NUnit console TestRunner (it should be configured to drop the CSV
output to some place like c:\Temp)
- execute your "convert" project which should be configured to read the
CSV in c:\Temp and update the .cs file in the "dummy" project
wherever its cloned repo exists on disc
- via the command line, have git commit and push to the repo the change
that the "convert" exe made to the "dummy" project's .cs file
The Azure Pipeline will now build and show the results of the "SpecFlow" project's test runs as "SpecFlow Tests", using the names and results that were stored in the CSV (mine include some demo IDs as well).
There are different ways this can be set up; one way is:
- put your "SpecFlow" project into a git repo and create a CI pipeline
for it
- add a build step set to execute the tests via command line
- add a step to call the "convert" exe, which has been placed in a
fixed location on the build box and which updates the "dummy" project's .cs file where its cloned repo is housed (in my case, a shared location on a separate VM)
- add a step to push the latest update to the "dummy" project to the git repo
This pipeline would now auto-trigger upon commits pushed to its repo, and the process would automatically convert the SpecFlow results to NUnit results, trigger the secondary pipeline, and show the SpecFlow results in the Builds->Tests area of that second pipeline. The first pipeline shows no results; you always look at the second pipeline to see them.
To support more/other "SpecFlow" projects in this way, they could probably all share the "convert" piece, but it would need to be improved so that the location of the .cs file it updates can be passed in (a sketch of that change follows). Then you could create separate "dummy" projects (and of course associated repos/pipelines) for each "SpecFlow" project whose results you want to see this way. I haven't gotten that far yet, as I'm currently only doing this for a single project.
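A minimal sketch of that improvement, assuming the CsFile, Log, and ParseCsv pieces above (the argument handling is just a placeholder):

using System;

namespace CreateCsFile
{
    public static class Program
    {
        // Usage: CreateCsFile.exe <path-to-results-csv> <path-to-dummy-Tests.cs>
        public static void Main(string[] args)
        {
            if (args.Length != 2)
            {
                Console.WriteLine("Usage: CreateCsFile.exe <results.csv> <Tests.cs>");
                Environment.Exit(1);
            }

            // Point the existing static fields at the paths supplied by the calling pipeline.
            ParseCsv.CsvPath = args[0];
            Log.CsFile = args[1];

            CsFile.ConvertCsvResultsToNunitResults();
        }
    }
}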