Continuous integration
Published 10 March 2016
This page provides an overview of the DLM Automation cmdlets for setting up continuous integration on a development database.
By setting up a trigger and build step in your build server, each time a change to the development database schema is committed to source control, your build server can run DLM Automation cmdlets that carry out the following tasks:
- Validate: check that the database can be built successfully from scratch using the schema in the source control scripts folder.
- Test: run tSQLt tests on the development schema.
- Sync: update a test database to match the development schema.
- Package and publish: add the development schema to a database package and publish it for later use in your deployment process.
Validate
The first task in the continuous integration process is to validate the scripts folder in your source control repository by checking the database can be built successfully from scratch. You can use the Invoke-DlmDatabaseSchemaValidation cmdlet to do this.
Invoke-DlmDatabaseSchemaValidation
Example
$validatedSchema = Invoke-DlmDatabaseSchemaValidation "C:\Work\scripts"
In the example above, we've assigned the output of the Invoke-DlmDatabaseSchemaValidation cmdlet to the $validatedSchema variable, so we can reuse it as the input for other DLM Automation cmdlets.
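For example, the validated schema can be passed straight to the test cmdlets described later on this page. This sketch simply reuses the $validatedSchema variable from the example above and the output path from the Test section below:

# Sketch: reuse the validated schema as the input for the Test step.
Invoke-DlmDatabaseTests $validatedSchema | Export-DlmDatabaseTestResults -OutputFile "C:\Work\TestResults\scripts.junit.xml"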
The DLM Automation add-ons for build servers, such as TeamCity, have a "Build" step. As well as validating the scripts folder, the Build step also places the schema in a database package, ready to be published to a release management system.
If you want to replicate the behavior of the Build step, you can add cmdlets that build the database package and export it to disk, as in the example below:
Example
$validatedSchema = Invoke-DlmDatabaseSchemaValidation "C:\Work\scripts"
$databasePackage = New-DlmDatabasePackage $validatedSchema -PackageId "WidgetShop" -PackageVersion "1.0"
Export-DlmDatabasePackage $databasePackage -Path "c:\packages"
However, the DLM Automation cmdlets are more versatile than the DLM Automation add-ons, and we recommend you leave building your database package until after the other continuous integration tasks of testing and syncing have succeeded. To find out more about building your database, see Package and publish.
Test
There are two DLM Automation cmdlets related to running tSQLt tests: Invoke-DlmDatabaseTests and Export-DlmDatabaseTestResults.
Invoke-DlmDatabaseTests
This cmdlet runs any tSQLt tests included in the scripts folder, such as static analysis, unit tests or integration tests.
Example
$testResults = Invoke-DlmDatabaseTests "C:\Work\scripts"
Don't confuse Invoke-DlmDatabaseTests with the Test-DlmDatabaseConnection cmdlet that's used for testing a database connection (see Setting up database connections).
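If you do want to check a connection before running other cmdlets against it, a minimal sketch (using the same test database as the Sync section below) looks like this:

# Sketch: create a connection object and check that it works before using it.
$testdb = New-DlmDatabaseConnection -ServerInstance "test01\sql2014" -Database "WidgetIntegration"
Test-DlmDatabaseConnection $testdb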
Export-DlmDatabaseTestResults
This cmdlet exports the output of the Invoke-DlmDatabaseTests cmdlet to disk.
Example
Invoke-DlmDatabaseTests "C:\Work\scripts" | Export-DlmDatabaseTestResults -OutputFile "C:\Work\TestResults\scripts.junit.xml"
In the example above, we've used the pipe (|) symbol to take the output of Invoke-DlmDatabaseTests and use it as the input for Export-DlmDatabaseTestResults.
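If you prefer not to chain the cmdlets, you can assign the test results to a variable first and pipe that variable to Export-DlmDatabaseTestResults instead; for example:

# The same two steps without chaining the cmdlets on one line.
$testResults = Invoke-DlmDatabaseTests "C:\Work\scripts"
$testResults | Export-DlmDatabaseTestResults -OutputFile "C:\Work\TestResults\scripts.junit.xml"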
Sync
Once you've validated the development schema and run tSQLt tests on it, you may want to use it to update a target database, which you can then use to run integration tests with your application. You can use the Sync-DlmDatabaseSchema cmdlet to do this.
Sync-DlmDatabaseSchema
This cmdlet updates a target database to match a source schema.
Example
$testdb = New-DlmDatabaseConnection -ServerInstance "test01\sql2014" -Database "WidgetIntegration"
Sync-DlmDatabaseSchema -Source "C:\Work\scripts" -Target $testdb
In the example above, we've used Sync-DlmDatabaseSchema to update the schema of a test database to match the schema of the scripts folder.
The Sync-DlmDatabaseSchema cmdlet is a versatile tool. You can also use it to update a target database to match a source database:
Example
$targetdb = New-DlmDatabaseConnection -ServerInstance "test01\sql2014" -Database "WidgetProduction"
$sourcedb = New-DlmDatabaseConnection -ServerInstance "test01\sql2014" -Database "WidgetPreProduction"
Sync-DlmDatabaseSchema -Source $sourcedb -Target $targetdb
In the example above, we've used Sync-DlmDatabaseSchema to update the WidgetProduction database so that it matches the schema of WidgetPreProduction.
However, when you're deploying your production database, we recommend you use the New-DlmDatabaseRelease cmdlet. This has a number of advantages, such as allowing you to review the changes to your production database before proceeding with the deployment (see Deployment).
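As a rough sketch only (the -Source and -Target parameter names below are assumptions for illustration; see the Deployment page for the authoritative syntax), a release-based deployment might start something like this:

# Sketch only: assumes New-DlmDatabaseRelease takes -Source and -Target,
# mirroring Sync-DlmDatabaseSchema. Check the Deployment page before using.
$productiondb = New-DlmDatabaseConnection -ServerInstance "test01\sql2014" -Database "WidgetProduction"
$release = New-DlmDatabaseRelease -Source $validatedScriptsFolder -Target $productiondb
# The resulting release object can be reviewed before you deploy it,
# which is the advantage described above.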
Package and publish
After you've validated, tested and synced the latest version of your development schema, you can use DLM Automation cmdlets to add the validated schema to a database package for later use. If you use a tool with a NuGet feed, such as Octopus Deploy, you can publish the package to this feed.
You can also use these cmdlets to publish schema and database documentation information to Redgate's free DLM monitoring tool, DLM Dashboard.
New-DlmDatabasePackage
This cmdlet packages the validated database schema that's produced by the Invoke-DlmDatabaseSchemaValidation cmdlet.
Example
$scriptsFolder = "C:\Work\scripts"
$validatedScriptsFolder = Invoke-DlmDatabaseSchemaValidation $scriptsFolder
$databasePackage = New-DlmDatabasePackage $validatedScriptsFolder -PackageId "MyDatabase" -PackageVersion "1.0.0"
In the example above, we've used the New-DlmDatabasePackage cmdlet to create a NuGet package from the output of Invoke-DlmDatabaseSchemaValidation.
The PackageId parameter specifies the unique identifier for the package, and the PackageVersion parameter specifies the version of the package. Both are required.
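On a build server, you'll usually want each build to produce a uniquely versioned package. One way to do that (a sketch; BUILD_NUMBER is a placeholder for whichever counter variable your build server exposes) is to build the PackageVersion value from a build counter:

# Sketch: derive PackageVersion from a build counter so each CI build creates
# a uniquely versioned package. BUILD_NUMBER is a placeholder variable.
$packageVersion = "1.0.$env:BUILD_NUMBER"
$databasePackage = New-DlmDatabasePackage $validatedScriptsFolder -PackageId "MyDatabase" -PackageVersion $packageVersion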
Export-DlmDatabasePackage
Use this cmdlet if you want to export the NuGet database package to an output folder.
Example
$scriptsFolder = "C:\Work\scripts"
$validatedScriptsFolder = Invoke-DlmDatabaseSchemaValidation $scriptsFolder
$databasePackage = New-DlmDatabasePackage $validatedScriptsFolder -PackageId "MyDatabase" -PackageVersion "1.0.0"
Export-DlmDatabasePackage $databasePackage -Path "C:\packages"
In the example above, on line 4, the Export-DlmDatabasePackage cmdlet produces the file MyDatabase.1.0.0.nupkg in the folder C:\packages. The file name for the database package is produced by combining the PackageId and PackageVersion parameters of the New-DlmDatabasePackage cmdlet.
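Because the file name follows this PackageId.PackageVersion.nupkg convention, you can work out the path of the exported file if a later build step needs it; for example:

# The exported file is named <PackageId>.<PackageVersion>.nupkg, so its full
# path can be computed with standard PowerShell.
$packagePath = Join-Path "C:\packages" "MyDatabase.1.0.0.nupkg"
Test-Path $packagePath   # True once Export-DlmDatabasePackage has run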
Publish-DlmDatabasePackage
Use this cmdlet if you want to publish the database package to the NuGet feed of a release management tool, such as Octopus Deploy.
Example
$scriptsFolder = "C:\Work\scripts"
$validatedScriptsFolder = Invoke-DlmDatabaseSchemaValidation $scriptsFolder
$databasePackage = New-DlmDatabasePackage $validatedScriptsFolder -PackageId "MyDatabase" -PackageVersion "1.0.0"
Publish-DlmDatabasePackage $databasePackage -NuGetFeedUrl "http://localhost:4000/nuget/packages" -NuGetApiKey "ed6d7c98-9847-4599-b5a8-323f215b5c89"
In the example above, on line 4, we've used the Publish-DlmDatabasePackage cmdlet to send the database package to a NuGet feed that requires an API key.
You can also use Publish-DlmDatabasePackage to send schema and database documentation information to DLM Dashboard. For more information, see the Publish-DlmDatabasePackage reference page.
An example script for continuous integration
Let's now combine all the continuous integration tasks we've looked at into a single PowerShell script:
Example
# Validate the scripts folder
$scriptsFolder = "C:\Work\scripts"
$validatedScriptsFolder = Invoke-DlmDatabaseSchemaValidation $scriptsFolder

# Run tSQLt tests
Invoke-DlmDatabaseTests $validatedScriptsFolder | Export-DlmDatabaseTestResults -OutputFile "C:\Work\TestResults\scripts.junit.xml"

# Sync a test database
$testdb = New-DlmDatabaseConnection -ServerInstance "test01\sql2014" -Database "WidgetIntegration"
Test-DlmDatabaseConnection $testdb
Sync-DlmDatabaseSchema -Source $validatedScriptsFolder -Target $testdb

# Package and publish the schema
$databasePackage = New-DlmDatabasePackage $validatedScriptsFolder -PackageId "MyDatabase" -PackageVersion "1.0.0"
Publish-DlmDatabasePackage $databasePackage -NuGetFeedUrl "http://localhost:4000/nuget/packages" -NuGetApiKey "ed6d7c98-9847-4599-b5a8-323f215b5c89"
In the example above:
- On line 2, we've assigned the path to the scripts folder to the $scriptsFolder variable.
- On line 3, we've used Invoke-DlmDatabaseSchemaValidation to validate the scripts folder and assigned the output of this cmdlet to the $validatedScriptsFolder variable.
- On line 6, we've used Invoke-DlmDatabaseTests to run tSQLt tests on the validated scripts folder and then used Export-DlmDatabaseTestResults to export the results to disk.
- On line 9, we've used New-DlmDatabaseConnection to assign the connection details for our test database to the $testdb variable.
- On line 10, we've used Test-DlmDatabaseConnection to check we can connect to the test database.
- On line 11, we've used Sync-DlmDatabaseSchema to update the schema of the test database with the schema of the validated scripts folder.
- On line 14, we've used New-DlmDatabasePackage to package the validated schema.
- On line 15, we've used Publish-DlmDatabasePackage to publish the package to a NuGet feed.
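When you run a script like this as a build step, you'll usually want a failure in any task to stop the script, so that a package isn't built or published from a schema that failed validation or testing. One simple way to do that (a general PowerShell technique, not specific to DLM Automation) is to make errors terminating at the top of the script:

# Placed at the top of the CI script, this turns any cmdlet error into a
# terminating error, so later steps such as Publish-DlmDatabasePackage
# don't run after a failure and the build server reports the build as failed.
$ErrorActionPreference = "Stop"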
What next?
We've now looked at the full continuous integration process that validates, tests, syncs, packages and publishes the schema in a scripts folder. You can use your build server to set up a trigger and build step to run these tasks every time there's a change to the schema in source control. To learn how to do this using TeamCity, see Setting up continuous integration with TeamCity.
However, that's not the end of using the DLM Automation PowerShell module. It also has a set of cmdlets for deploying your database. To find out more, see Deployment.
Cmdlet reference
For full details about all the DLM Automation cmdlets, see the DLM Automation cmdlet reference.