Integration testing with C# and RoundhousE


Recently at work, I was tasked with creating some integration tests from scratch for a project that only had unit tests. The way we run integration tests on other projects was too slow (adding data via Entity Framework), so they wanted a new approach. A colleague suggested RoundhousE.

RoundhousE

https://github.com/chucknorris/roundhouse

RoundhousE (that E is really annoying me) is, in their own words, “a database migrations engine that uses plain old SQL Scripts to transition a database from one version to another”. It has versioning, so you can easily tell whether a database needs upgrading, or roll back to a previous version if necessary (provided you’ve set it up properly).

However, all I want it to do is set up local databases for integration testing, then delete them afterwards.

Creating a database

Firstly, whatever machine you’re running this on needs MS SQL Server installed and running.

RoundhousE packages rh.exe, which when called will create the database (if it doesn’t already exist) and run your SQL scripts against it, recording what it ran in its own versioning tables.

Easy right? I needed 2 databases for my integration tests, so I ran through the process twice.

This is the bat script for creating a database:

    SET DIR=%~d0%~p0%

    SET database.name=%1
    SET sql.files.directory=%DIR%db\%1
    SET server.database="(local)"

    :: roundhouse call
    "%DIR%console\rh.exe" /d=%database.name% /f=%sql.files.directory% /s=%server.database% /silent /drop
    "%DIR%console\rh.exe" /d=%database.name% /f=%sql.files.directory% /s=%server.database% /silent /simple

Note that I am dropping the database first, just in case it is still hanging around (for example, if the last run failed before tear down).

Generating data scripts

I’m using Microsoft SQL Server, so I used its functionality to generate scripts for creating the tables. One of my databases was very small, so I dumped the entirety of the data into scripts for RoundhousE to use.

The other database - our main database at work - was much too large to dump in its entirety. Technically it would work, but the tests would be far too slow to set up. Instead, I dumped individual tables of static data and created some scripts of specific user data. This was quite slow - I need a better tool for it, and once my proof of concept is finished I’ll try to find one.

Ignoring versioning

Part of the appeal of RoundhousE is its versioning tools. If you try to deploy the same version number to a database you’ve already deployed to, and any of the files have changed, RoundhousE will complain. Equally, if it detects that the files haven’t changed, it assumes there is nothing to update.

In my use case, I just want it to deploy afresh each time. RoundhousE stores its versioning info in tables in the database itself, so before each test run I drop the databases (if they don’t exist, it just carries on). This way it redeploys every time and I can ignore RoundhousE’s versioning tools entirely.

Piecing it all together

I used xUnit to run my tests. For this, you define a collection definition that hooks up a ‘database fixture’, like so:

    [CollectionDefinition("Database collection")]
    public class IntegrationTestBase : ICollectionFixture<DatabaseFixture>
    {
        // deliberately left empty
    }

We then define the DatabaseFixture class, which is responsible for creating and dropping the databases. In its constructor, I call the bat file that creates a database (one call for each database).

DatabaseFixture implements IDisposable, so in its Dispose() method you can call a bat file that drops those databases again (assuming you want them dropped after each run). A sketch of the whole fixture is below.
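Something like this minimal sketch would do it (the script names, database names and folder layout are placeholders of mine, not the real project’s):

    using System;
    using System.Diagnostics;
    using System.IO;

    public class DatabaseFixture : IDisposable
    {
        // Hypothetical database names - one per folder of SQL scripts under db\.
        private static readonly string[] DatabaseNames = { "FirstTestDb", "SecondTestDb" };

        public DatabaseFixture()
        {
            // Create each database before any test in the collection runs.
            foreach (var name in DatabaseNames)
            {
                RunBat("create_database.bat", name);
            }
        }

        public void Dispose()
        {
            // Drop the databases again once every test in the collection has finished.
            foreach (var name in DatabaseNames)
            {
                RunBat("drop_database.bat", name);
            }
        }

        private static void RunBat(string script, string databaseName)
        {
            var path = Path.Combine(AppContext.BaseDirectory, script);

            // cmd /c runs the bat file (passing the database name as %1) and then exits.
            using (var process = Process.Start("cmd.exe", $"/c \"{path}\" {databaseName}"))
            {
                process.WaitForExit();

                if (process.ExitCode != 0)
                {
                    throw new InvalidOperationException(
                        $"{script} {databaseName} failed with exit code {process.ExitCode}");
                }
            }
        }
    }

Throwing when a script fails means setup problems show up as a clear test failure rather than a mysteriously empty database.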

For any test class, we annotate it with:

    [Collection("Database collection")]

This way xUnit treats them all as part of the same collection and runs the creation/deletion just once.
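As a rough example (the class name and test body are made up), a test class in that collection could look like this:

    using Xunit;

    [Collection("Database collection")]
    public class ExampleIntegrationTests
    {
        private readonly DatabaseFixture _fixture;

        public ExampleIntegrationTests(DatabaseFixture fixture)
        {
            // xUnit injects the single shared fixture, so every test class in the
            // collection runs against the same freshly created databases.
            _fixture = fixture;
        }

        [Fact]
        public void Reads_seeded_data_from_the_test_database()
        {
            // ... connect to one of the RoundhousE-created databases and assert on it ...
        }
    }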

End result - all working.

Using RoundhousE to run tests in parallel

If you want to run tests in parallel, just decouple the database setup and tear down from the test run itself. Say you have four processes all running different tests: create the database, kick off each process, and once they’ve all finished, drop the database again. I did play around with creating a different database for each test run and got it working - but it was a bit messy and ultimately pointless.
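Here’s a rough sketch of that kind of orchestration, reusing the placeholder bat scripts from the fixture sketch above - the database name and test commands are also placeholders:

    using System.Diagnostics;
    using System.Linq;
    using System.Threading.Tasks;

    public static class ParallelTestRun
    {
        public static async Task Main()
        {
            // Set up the shared database once, before any test process starts.
            RunCommand("create_database.bat MyTestDb");

            try
            {
                // Placeholder commands - one per test process you want to run in parallel.
                var testCommands = new[]
                {
                    "dotnet test tests/SliceOne",
                    "dotnet test tests/SliceTwo",
                    "dotnet test tests/SliceThree",
                    "dotnet test tests/SliceFour"
                };

                // Start every process and wait until they have all finished.
                await Task.WhenAll(testCommands.Select(c => Task.Run(() => RunCommand(c))));
            }
            finally
            {
                // Drop the database once everything is done, even if a run failed.
                RunCommand("drop_database.bat MyTestDb");
            }
        }

        private static void RunCommand(string command)
        {
            // cmd /c runs the command and then exits.
            using (var process = Process.Start("cmd.exe", "/c " + command))
            {
                process.WaitForExit();
            }
        }
    }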