Provides helper functions for setting up and tearing down SQL Server database fakes for use in integration/acceptance testing.
Available on NuGet at https://www.nuget.org/packages/IntegrationTestingLibraryForSqlServer
Ideal for use with SQL Server LocalDB, which is deployed as part of Visual Studio but can also be installed on integration test servers. Specflow is fully supported and is our preferred method for creating integration tests; see further down this document for Specflow integration best practices.
The examples in this document assume the following using directives:
using System.Data;
using IntegrationTestingLibraryForSqlServer;
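The examples also assume a `connectionString` variable pointing at the test database; as a sketch, a SQL Server LocalDB connection string (matching the one used by the C# code generator example at the end of this document) could look like:

```C#
// Illustrative only: targets the default Visual Studio LocalDB instance with a throwaway test database.
string connectionString = @"server=(localdb)\MSSQLLocalDB;database=Test;integrated security=True";
```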
There are some breaking changes in v2; see Migrating from v1 below.
### Setting up and tearing down databases
SQL Server databases can be created and dropped. Windows Authentication access can be given to the user that the system under test will be running as (a website or web service, for example).
var database = new DatabaseActions(connectionString);
database.CreateOrReplace();
database.GrantUserAccess(new DomainAccount(username));
database.Drop();
- `DomainAccount` can include a domain; if none is specified the domain of the account running the test is assumed (see the sketch after this list)
- `SqlAccount` can be used for SQL Authentication
- Schemas are supported, see Schemas for usage
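As a sketch of how this is typically wired into a test fixture, with a domain-qualified account passed to `DomainAccount` (MSTest attributes are used for illustration; the connection string and account name are placeholders):

```C#
using Microsoft.VisualStudio.TestTools.UnitTesting;
using IntegrationTestingLibraryForSqlServer;

[TestClass]
public class MyIntegrationTests
{
    private static DatabaseActions database;

    [ClassInitialize]
    public static void SetUp(TestContext context)
    {
        // Placeholder connection string; adjust for your environment.
        database = new DatabaseActions(@"server=(localdb)\MSSQLLocalDB;database=IntegrationTests;integrated security=True");
        database.CreateOrReplace();

        // Placeholder domain-qualified account for the system under test.
        database.GrantUserAccess(new DomainAccount(@"MYDOMAIN\svc_website"));
    }

    [ClassCleanup]
    public static void TearDown()
    {
        database.Drop();
    }
}
```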
### Creating tables
Tables can be created with the same structure as the 'real' table.
var definition = new TableDefinition(tableName);
definition.Columns.AddInteger("c1", SqlDbType.Int);
definition.CreateOrReplace(database);
- A convenient way to create columns of the correct type is to use the `ColumnDefinitionFactory` factory, although the resultant object will need to be cast to the specific type so that the extended properties can be changed (see the sketch after this list)
- The C# code to create a `TableDefinition` can be generated from a 'real' table using the C# code generator described at the end of this document
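A minimal sketch of that cast; here the base-typed `ColumnDefinition` is created directly for illustration, but the same cast applies however the base-typed instance is obtained (for example from `ColumnDefinitionFactory`):

```C#
// Illustrative: the cast to the concrete type is what exposes extended properties such as Size.
ColumnDefinition baseColumn = new StringColumnDefinition("c1", SqlDbType.NVarChar);
var stringColumn = (StringColumnDefinition)baseColumn;
stringColumn.Size = 100;
tableDefinition.Columns.Add(stringColumn);
```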
Most columns have no special properties.
var column = new StandardColumnDefinition("c1", SqlDbType.DateTime);
or
var column = tableDefinition.Columns.AddStandard("c1", SqlDbType.DateTime);
Columns that can be used as identity columns have a `SqlDbType` of `Int`, `BigInt`, `SmallInt` or `TinyInt` and provide an `IdentitySeed` property.
var column = new IntegerColumnDefinition("c1", SqlDbType.Int)
{
IdentitySeed = 1
};
tableDefinition.Columns.Add(column);
or
var column = tableDefinition.Columns.AddInteger("c1", SqlDbType.Int);
column.IdentitySeed = 1;
Columns with a `SqlDbType` of `Decimal` (also shown in SQL Server as Numeric) can include `Precision` and `Scale`. See decimal and numeric (Transact-SQL) for more details on usage.
var column = new DecimalColumnDefinition("c1")
{
Precision = 18,
Scale = 0
};
or
var column = tableDefinition.Columns.AddDecimal("c1");
String-like columns, that is those with a `SqlDbType` of `Char`, `VarChar`, `NChar` or `NVarChar`, can include `Size`.
The property `IsMaximumSize` is a convenient way to set the column to the maximum size.
var column = new StringColumnDefinition("c1", SqlDbType.NVarChar)
{
Size = 100
};
or
var column = tableDefinition.Columns.AddString("c1", SqlDbType.NVarChar);
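A minimal sketch of using `IsMaximumSize` instead of an explicit `Size`, assuming it is a settable boolean (the same property is described for binary columns and for string/binary parameters below):

```C#
// Assumption: IsMaximumSize is a settable boolean, equivalent to e.g. NVARCHAR(MAX).
var column = new StringColumnDefinition("c1", SqlDbType.NVarChar)
{
    IsMaximumSize = true
};
tableDefinition.Columns.Add(column);
```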
Variable length binary columns (`SqlDbType` of `Binary` or `VarBinary`) can include `Size`.
The property `IsMaximumSize` is a convenient way to set the column to the maximum size.
var column = new BinaryColumnDefinition("c1", SqlDbType.Binary)
{
Size = 1000
};
or
var column = tableDefinition.Columns.AddBinary("c1", SqlDbType.Binary);
### Populating tables with data - lists
Tables can be loaded with initial data using a `List<List<object>>` of rows.
var tableActions = new TableActions(database.ConnectionString);
var rows = new List<List<object>>();
rows.Add(new List<object>() { 5, "name1" });
rows.Add(new List<object>() { 6, "name2" });
var tableData = new TableData
{
ColumnNames = new[] { "c1", "c2" },
Rows = rows
};
tableActions.Insert(tableName, tableData);
or, if you have a `TableDefinition` object:
tableDefinition.Insert(database, tableData);
### Populating tables with data - object properties
Tables can be loaded with initial data using the properties on objects.
```C#
class Custom
{
public int Value1 { get; set; }
public string Value2 { get; set; }
}
var tableActions = new TableActions(database.ConnectionString);
var items = new List<Custom>
{
new Custom { Value1 = 10, Value2 = "a" },
new Custom { Value1 = 20, Value2 = "b" }
};
var tableData = new ObjectPopulatedTableData(items);
tableActions.Insert(tableName, tableData);
```
or, if you have a `TableDefinition` object:
tableDefinition.Insert(database, tableData);
Views can be created as a front to a single table; the single table can model the same structure as the 'real' view.
var tableActions = new TableActions(database.ConnectionString);
tableActions.CreateView("t1", "v1");
or, if you have a `TableDefinition` object:
tableDefinition.CreateView(database, "v1");
Dependency tests can be created that will compare the expected table structure with that of the 'real' table to ensure that it has not changed structure (which would invalidate the primary test cases). `VerifyMatch` will throw an exception if the two structures don't match.
The C# code to create a `TableDefinition` can be generated from a 'real' table using the C# code generator described at the end of this document.
var column1 = new IntegerColumnDefinition("c1", SqlDbType.Int);
var column2 = new StringColumnDefinition("c2", SqlDbType.NVarChar) { Size = 100 };
var definition = new TableDefinition(tableName, new[] { column1, column2 });
definition.VerifyMatch(database);
Some systems under test will write data (to a fake table); this can be verified by comparing the expected data with the results from a data reader (this example also needs `using System.Data.SqlClient;`):
var expected = new TableData();
expected.ColumnNames.Add("c1");
expected.Rows.Add(new List<object> { "v1" });
TableData actual;
using (SqlConnection connection = new SqlConnection(database.ConnectionString))
{
using (SqlCommand command = connection.CreateCommand())
{
command.CommandText = "SELECT c1 FROM t1";
connection.Open();
using (var reader = command.ExecuteReader())
{
actual = new DataReaderPopulatedTableData(reader);
}
}
}
expected.VerifyMatch(actual, TableDataComparers.UnorderedRowNamedColumn);
There are a number of different matching strategies depending on how strict you want to be, how your system will access the real data, and what changes to the 'real' table are tolerable. The built-in comparers allow for combinations of:
- Columns: by ordinal, by name, or by name as a subset of the returned columns
- Rows: by ordinal, by any sequence, or by any sequence as a subset of the returned rows
- Values: currently the only matcher for values is by `ToString()`

Custom comparers can be built using the classes in the `TableComparision` namespace.
- If your system is executing a `SELECT *` you'll probably want the ordinal column comparer
- If your system can tolerate new columns being added to a table or view you can use the subset column comparer
- If you're inserting rows, you probably want your test to fail if new columns are added, so an exact-match column comparer would be best
- If you're not selecting with an `ORDER BY` you'll probably want the unordered rows comparer, as the order is not guaranteed
- Subset rows is probably only useful for checking against a 'real' table with known values
As for table structures, views can be tested to ensure that the 'real' view matches an expected structure. The method of retrieving the structure of a view differs from that of a table: for a view the first row is selected and the resulting data reader is used for the comparison. `VerifyMatch` will throw an exception if the two structures don't match.
var column1 = new IntegerColumnDefinition("c1", SqlDbType.Int);
var column2 = new StringColumnDefinition("c2", SqlDbType.NVarChar) { Size = 100 };
var viewDefinition = new TableDefinition(viewName, new[] { column1, column2 });
var checker = new ViewCheck(this.database.ConnectionString);
checker.VerifyMatch(viewDefinition);
### Creating procedures
Procedures can be created with the same definition as the 'real' stored procedure but with predictable return values.
ProcedureDefinition definition = new ProcedureDefinition(procedureName)
{
Body = @"set @p2 = 'ok'
return 5"
};
definition.Parameters.AddStandard("@p1", SqlDbType.Int).Direction = ParameterDirection.Input;
definition.Parameters.AddString("@p2", SqlDbType.NVarChar);
definition.CreateOrReplace(database);
- A convenient way to create parameters of the correct type is to use the `ProcedureParameterFactory` factory, although the resultant object will need to be cast to the specific type so that the extended properties can be changed
- The default direction when using `AddStandard`, `AddString`, etc. is `InputOutput` (see the sketch after this list for overriding it)
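A minimal sketch of overriding that default direction on the returned parameter, reusing the pattern from the procedure example above:

```C#
// The Parameters.Add* methods default to ParameterDirection.InputOutput;
// set Direction explicitly for an input-only parameter.
var parameter = procedureDefinition.Parameters.AddStandard("@p1", SqlDbType.Int);
parameter.Direction = ParameterDirection.Input;
```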
Most parameters have no special properties.
var parameter = new StandardProcedureParameter("c1", SqlDbType.DateTime, ParameterDirection.InputOutput);
procedureDefinition.Parameters.Add(parameter);
or
var parameter = procedureDefinition.Parameters.AddStandard("@c1", SqlDbType.DateTime);
Parameters with a `SqlDbType` of `Int`, `BigInt`, `SmallInt` or `TinyInt`.
var parameter = new IntegerProcedureParameter("c1", SqlDbType.Int, ParameterDirection.InputOutput);
or
var parameter = procedureDefinition.Parameters.AddInteger("@c1", SqlDbType.Int);
Parameters with a `SqlDbType` of `Decimal` (also shown in SQL Server as Numeric) can include `Precision` and `Scale`. See decimal and numeric (Transact-SQL) for more details on usage.
var parameter = new DecimalProcedureParameter("c1", ParameterDirection.InputOutput)
{
Precision = 18,
Scale = 0
};
or
var parameter = procedureDefinition.Parameters.AddDecimal("@c1");
Variable size string-like parameters, that is those with a `SqlDbType` of `Char`, `VarChar`, `NChar` or `NVarChar`, can include `Size`.
The property `IsMaximumSize` is a convenient way to set the parameter to the maximum size.
var parameter = new StringProcedureParameter("c1", SqlDbType.NVarChar, ParameterDirection.InputOutput)
{
Size = 100
};
or
var parameter = procedureDefinition.Parameters.AddString("@c1", SqlDbType.NVarChar);
Variable length binary parameters (`SqlDbType` of `Binary` or `VarBinary`) can include `Size`.
The property `IsMaximumSize` is a convenient way to set the parameter to the maximum size.
var parameter = new BinaryProcedureParameter("c1", SqlDbType.Binary, ParameterDirection.InputOutput)
{
Size = 1000
};
or
var parameter = procedureDefinition.Parameters.AddBinary("@c1", SqlDbType.Binary);
Dependency tests can be created that will compare the expected stored procedure definition with that of the 'real' procedure to ensure that it has not changed definition (which would invalidate the primary test cases). `VerifyMatch` will throw an exception if the two definitions don't match.
var parameter1 = new StandardProcedureParameter("c1", SqlDbType.Int);
ProcedureDefinition definition = new ProcedureDefinition(procedureName, new[] { parameter1 });
definition.VerifyMatch(database);
### Schemas
var database = new DatabaseActions(connectionString);
database.CreateSchema("schemaName");
Objects can be created in schemas other than `dbo` (the default schema) by creating a schema and then passing a `DatabaseObjectName` instead of a string.
For example, to create a table definition for the 'Test' schema (assuming it has already been created):
var definition = new TableDefinition(new DatabaseObjectName("Test", "Table1"));
An alternative would be to use the combined schema and object name:
var definition = new TableDefinition(DatabaseObjectName.FromName("Test.Table1"));
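Putting those pieces together, a minimal sketch (schema, table and column names are illustrative) of creating a table in a non-default schema:

```C#
// Create the schema, then create a table inside it using the combined name.
var database = new DatabaseActions(connectionString);
database.CreateSchema("Test");

var definition = new TableDefinition(DatabaseObjectName.FromName("Test.Table1"));
definition.Columns.AddInteger("Id", SqlDbType.Int);
definition.CreateOrReplace(database);
```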
Specflow provides a behaviour-driven development structure that is ideally suited to integration/acceptance testing, so this library has been designed to work well with it. There are helper extension methods included with Specflow which can be accessed by using the namespace `TechTalk.SpecFlow.Assist`.
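As a sketch of the scaffolding the step definitions below live in (the class name is illustrative, and the `database` field is assumed to be initialised elsewhere, for example in a `[BeforeScenario]` hook):

```C#
using System.Linq;
using TechTalk.SpecFlow;
using TechTalk.SpecFlow.Assist; // provides table.CreateSet<T>() used in the steps below
using IntegrationTestingLibraryForSqlServer;

[Binding]
public class DatabaseSteps
{
    // Assumed to be created in test setup, e.g. a [BeforeScenario] hook.
    private DatabaseActions database;

    // ...step definitions from the following sections go here...
}
```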
### Table creation with Specflow
Given the table "test" is created
| Name | Data Type | Size | Decimal Places | Allow Nulls |
| Id | int | | | false |
| Name | nvarchar | 50 | | true |
| Number | decimal | 10 | 5 | true |
[Given(@"the table ""(.*)"" is created")]
public void GivenTheTableIsCreated(string tableName, Table table)
{
var definition = new TableDefinition(tableName);
definition.Columns.AddFromRaw(table.CreateSet<ColumnDefinitionRaw>());
definition.CreateOrReplace(database);
}
Then the definition of table "test" should match
| Name | Data Type | Size | Decimal Places | Allow Nulls |
| Id | int | | | false |
| Name | nvarchar | 50 | | true |
| Number | decimal | 10 | 5 | true |
[Then(@"the definition of table ""(.*)"" should match")]
public void ThenTheDefinitionOfTableShouldMatch(string tableName, Table table)
{
var definition = new TableDefinition(tableName);
definition.Columns.AddFromRaw(table.CreateSet<ColumnDefinitionRaw>());
definition.VerifyMatch(database);
}
Given table "test" is populated
| Id | Name |
| 1 | First |
| 2 | Second |
[Given(@"table ""(.*)"" is populated")]
public void GivenTableIsPopulated(string tableName, Table table)
{
var tableActions = new TableActions(database.ConnectionString);
var tableData = new CollectionPopulatedTableData(table.Header, table.Rows.Select(x => x.Values));
tableActions.Insert(tableName, tableData);
}
Then the definition of view "test" should match
| Name | Data Type | Size | Decimal Places |
| Id | int | | |
| Name | nvarchar | 50 | |
| Number | decimal | 10 | 5 |
[Then(@"the definition of view ""(.*)"" should match")]
public void ThenTheDefinitionOfViewShouldMatch(string viewName, Table table)
{
var definition = new TableDefinition(viewName);
definition.Columns.AddFromRaw(table.CreateSet<ColumnDefinitionRaw>());
var checker = new ViewCheck(this.database.ConnectionString);
checker.VerifyMatch(definition);
}
### Procedure creation with Specflow
Given the procedure "test" is created with body "return 0"
| Name | Data Type | Size | Decimal Places | Direction |
| Id | int | | | Input |
| Name | nvarchar | 50 | | Input |
| Number | decimal | 10 | 5 | Input |
[Given(@"the procedure ""(.*)"" is created with body ""(.*)""")]
public void GivenTheProcedureIsCreatedWithBody(string procedureName, string body, Table table)
{
ProcedureDefinition definition = new ProcedureDefinition(procedureName);
definition.Parameters.AddFromRaw(table.CreateSet<ProcedureParameterRaw>());
definition.Body = body;
definition.CreateOrReplace(database);
}
Then the definition of procedure "test" should match
| Name | Data Type | Size | Decimal Places | Direction |
| Id | int | | | Input |
| Name | nvarchar | 50 | | Input |
| Number | decimal | 10 | 5 | Input |
[Then(@"the definition of procedure ""(.*)"" should match")]
public void ThenTheDefinitionOfProcedureShouldMatch(string procedureName, Table table)
{
ProcedureDefinition definition = new ProcedureDefinition(procedureName);
definition.Parameters.AddFromRaw(table.CreateSet<ProcedureParameterRaw>());
definition.VerifyMatch(database);
}
### C# code generator
To make creating tests easier, the code snippet below can be adapted and pasted into the C# Interactive window to generate a `TableDefinition` code blob for an existing table. This code can then be used as a 'fake' table or in a dependency test to verify that the captured structure matches the current table structure.
#r "IntegrationTestingLibraryForSqlServer.dll"
using IntegrationTestingLibraryForSqlServer;
string connectionString = @"server=(localdb)\MSSQLLocalDB;database=Test;integrated security=True";
string tableName = "T1";
string output = TableCodeBuilder.CSharpTableDefinition(DatabaseObjectName.FromName(tableName), connectionString);
Console.WriteLine(output);
### Migrating from v1
There are a few breaking changes between version 1 and version 2, specifically:
- The `ColumnDefinition` class can no longer be initialised; use `ColumnDefinitionRaw` instead and convert it (see Table creation with Specflow for example usage), or use a specific concrete class (see Creating tables for usage)
- Likewise, the `ProcedureParameter` class is now initialised by converting the `ProcedureParameterRaw` class (see Procedure creation with Specflow for example usage), or again by using a specific concrete class (see Creating procedures for usage)
- Database schemas are now better supported and standardised through the `DatabaseObjectName` class; existing overloads have been replaced to use this class (see Schemas for usage)
- To grant users access to the database, the method `GrantDomainUserAccess` has been replaced with `GrantUserAccess`, which accepts a new `DomainAccount` class (see Setting up and tearing down databases for usage); it is expected that SQL authentication will be supported in the future