
Update both SQL Server 2019 and Azure SQL Database objects using one commit


In this post I will show you how you can update both SQL Server 2019 and Azure SQL Database objects using one commit, after testing various ways of doing this myself in Azure DevOps the other day.

To clarify, what I mean by a commit is when you first save changes to a database project that is part of a git repository. From there you commit your change and can sync it with a repository hosted online. For example, Azure DevOps or GitHub.

In my last post here I talked about updating multiple databases in both Azure DevOps and GitHub using just one dacpac. However, to update objects in SQL Server and Azure SQL databases using one source you might have to do things differently.

I cover the easy way and the building blocks at the start of this post. If you want to go straight to the more technical Kevin method instead, just click here.

In this post I use SQL Server 2019 as an example. However, as you can see here you can use this with other versions of SQL Server.

The easy way

To easily update both SQL Server 2019 and Azure SQL Database objects using one commit, you can set your database project to use the SQL Azure V12 Database Schema Provider.

You can easily change this in whichever GUI you are using to manage your database project. You can also change it by setting the DSP line in your sqlproj file to the below before creating the dacpac.

<DSP>Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider</DSP>

What if somebody is adamant they want the SQL Server 2019 databases updated using a SQL Server 2019 specific dacpac and Azure SQL Databases updated using an Azure SQL Database specific dacpac?

After putting in some thought about this I came up with a really simple method.

Solution

My solution for this is very simple in concept, and I will demonstrate it now in easy steps. In reality, my original pipeline is more complicated. However, I will show this simplified version to make the method easier to understand.

First of all, I imported a database project in Azure Data Studio Insiders build based on a sample Azure SQL Database.

Once I had imported the project I edited the sqlproj file, changing the DSP setting so that the database project targeted SQL Server 2019, as below.

<DSP>Microsoft.Data.Tools.Schema.Sql.Sql150DatabaseSchemaProvider</DSP>

Afterwards, I synchronized the local database project with a Git repository in Azure Repos as below:

SQL Server 2019 database project in Azure Repos

From there I created an initial pipeline with a single stage, which built the project and created a dacpac for me as per the below image. I used a Microsoft-hosted agent pool for this stage.

Build SQL 2019 stage completed
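To give an idea of the shape of that stage, below is a minimal sketch of it in pipeline YAML. I am assuming the project sits at the repository root and is called MyDatabaseProject; the artifact name and paths are also illustrative rather than taken from the original pipeline.

stages:
- stage: BuildSQL2019Dacpac
  displayName: 'Build SQL 2019 dacpac'
  jobs:
  - job: Build
    pool:
      vmImage: 'windows-latest' # Microsoft-hosted agent
    steps:
    # Build the database project; MSBuild creates the dacpac under bin\release
    - task: MSBuild@1
      inputs:
        solution: 'MyDatabaseProject.sqlproj' # illustrative project name
        configuration: 'release'
    # Copy the build output to a subfolder of the artifact staging directory
    - task: CopyFiles@2
      inputs:
        SourceFolder: '$(Build.SourcesDirectory)/bin/release'
        Contents: '**'
        TargetFolder: '$(Build.ArtifactStagingDirectory)/SQL2019'
    # Publish the dacpac as a pipeline artifact for later stages to use
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)/SQL2019'
        ArtifactName: 'SQL2019'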

If you drill down into this artifact you can find the created dacpac.

Contents of dacpac

Before we go any further I just want to point out that throughout this process the files went to a subfolder of the artifact staging directory, to keep that directory nicely organized.

Deploying to SQL Server 2019

Next I wanted to test deployment to a SQL Server 2019 instance that I had running inside a local container. I found that the best way to do this was to configure my laptop as a self-hosted Azure DevOps agent. You can read how to do that in detail here.

Once I had done that I was able to deploy database updates to a local SQL Server 2019 container. When looking at the results you can see I did this in two separate stages within my pipeline as below.

Working SQL Server 2019 pipeline
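The screenshots do not show the exact tasks used, so below is only a rough sketch of what one of those deployment stages could look like, assuming the 'SQL Server database deploy' task (SqlDacpacDeploymentOnMachineGroup) and placeholder server and credential values.

- stage: DeployToSQL2019
  displayName: 'Deploy to SQL 2019'
  dependsOn: BuildSQL2019Dacpac
  jobs:
  - job: Deploy
    pool:
      name: 'Default' # self-hosted agent pool running on the laptop
    steps:
    # Download the dacpac published by the build stage
    - task: DownloadBuildArtifacts@0
      inputs:
        artifactName: 'SQL2019'
        downloadPath: '$(System.ArtifactsDirectory)'
    # Deploy the dacpac to the local SQL Server 2019 container
    - task: SqlDacpacDeploymentOnMachineGroup@0
      inputs:
        TaskType: 'dacpac'
        DacpacFile: '$(System.ArtifactsDirectory)/SQL2019/MyDatabaseProject.dacpac'
        TargetMethod: 'server'
        ServerName: 'localhost,1433' # placeholder for the local container
        DatabaseName: 'MyDatabase' # placeholder
        AuthScheme: 'sqlServerAuthentication'
        SqlUsername: '$(SqlUser)'
        SqlPassword: '$(SqlPassword)'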

From here you can update multiple SQL Servers within your network in multiple stages using the same dacpac. But what if your company starts migrating those servers to Azure?

How can you extend this pipeline so that both SQL Server 2019 installations and Azure SQL Databases get the same updates using different dacpacs?

Various tests

As I mentioned before, attempting to deploy updates to both SQL Server 2019 and Azure SQL Database using the dacpac set for SQL Server does not work. To highlight this, I tested it especially for this demo.

To prove a point, I added a stage called ‘Deploy to Azure SQL DB’ to deploy the same dacpac and got the below error:

“[error]A project which specifies SQL Server 2016 as the target platform cannot be published to Microsoft Azure SQL Database v12.”

So, I tested changing the target platform to other on-premises versions of SQL Server and none of them worked. In addition, I added an extra setting to the additional arguments to treat it as a SQL Data Warehouse. Still the same error occurred.

I also tried creating the Azure SQL Database in advance, setting it to a higher version and changing the compatibility level. That also failed.

Afterwards, I tested adding the '/p:AllowIncompatiblePlatform=true' parameter to the AdditionalArguments section in the task. This works; however, bear in mind that doing so could hide legitimate issues.
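For illustration, below is a minimal sketch of where that parameter would sit, assuming the 'Azure SQL Database deployment' task (SqlAzureDacpacDeployment) with placeholder connection details.

- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'MyServiceConnection' # placeholder service connection
    AuthenticationType: 'server'
    ServerName: 'myserver.database.windows.net' # placeholder
    DatabaseName: 'MyDatabase' # placeholder
    SqlUsername: '$(AzureSqlUser)'
    SqlPassword: '$(AzureSqlPassword)'
    deployType: 'DacpacTask'
    DacpacFile: '$(System.ArtifactsDirectory)/SQL2019/MyDatabaseProject.dacpac'
    # Lets a SQL Server dacpac publish to Azure SQL Database, but can hide legitimate issues
    AdditionalArguments: '/p:AllowIncompatiblePlatform=true'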

From there I went to try the other method I had thought of.

The Kevin Method

In case anybody was wondering, I call this the Kevin method because it’s the nickname I gave it whilst explaining it to Sander Stad one evening. As you will see below it is simple yet effective.

First I added a new stage to create a dacpac for Azure SQL Database updates. I decided to make this the first stage in the pipeline just in case the Azure SQL Database build has inconsistency errors. Doing this means your pipeline fails fast if there is a problem.

Of course, if using this method for older versions of SQL Server instead of SQL Server 2019, you would want to swap the stages around, making sure they were complete before doing the Azure deployments.

I named it the 'Build Azure SQL DB dacpac' stage, and it performs the below tasks.

  1. First it sets all my variables for that stage and which agent pool to use.
  2. Afterwards it copies all the files in the repository to the staging area, apart from the yaml file.
  3. It then runs an inline PowerShell task which replaces the DSP line in the copy of the sqlproj file in the staging area, setting it to target Azure SQL Database instead (see the sketch after this list).
  4. It then runs a build against the new sqlproj file with the changed DSP line.
  5. Once the build has completed, a new artifact is published with only the contents of the 'bin\release' folder (release was what I called my configuration).
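Below is a minimal sketch of how that stage could look, again assuming an illustrative project name of MyDatabaseProject and a pipeline file called azure-pipelines.yml.

- stage: BuildAzureSQLDBDacpac
  displayName: 'Build Azure SQL DB dacpac'
  jobs:
  - job: Build
    pool:
      vmImage: 'windows-latest' # Microsoft-hosted agent
    steps:
    # Copy everything in the repository to the staging area, apart from the pipeline yaml
    - task: CopyFiles@2
      inputs:
        SourceFolder: '$(Build.SourcesDirectory)'
        Contents: |
          **
          !azure-pipelines.yml
        TargetFolder: '$(System.ArtifactsDirectory)/AzureBuild'
    # Replace the DSP line in the copied sqlproj so the build targets Azure SQL Database
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          $sqlproj = "$(System.ArtifactsDirectory)/AzureBuild/MyDatabaseProject.sqlproj"
          (Get-Content $sqlproj) -replace 'Sql150DatabaseSchemaProvider', 'SqlAzureV12DatabaseSchemaProvider' |
            Set-Content $sqlproj
    # Build the modified project
    - task: MSBuild@1
      inputs:
        solution: '$(System.ArtifactsDirectory)/AzureBuild/MyDatabaseProject.sqlproj'
        configuration: 'release'
    # Publish only the contents of the bin\release folder as a new artifact
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(System.ArtifactsDirectory)/AzureBuild/bin/release'
        ArtifactName: 'AzureSQLDB'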

Once that stage completed I had three stages and two artifacts published as below.

Added Build Azure SQL Database dacpac stage

All that was left to do was to create a 'Deploy to Azure SQL DB' stage, using the 'Azure SQL Database deployment' task and the new dacpac I had just created, which, lo and behold, worked.

Fully working pipeline

Deploy to Azure DB stage
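Below is a minimal sketch of that stage as well, with the same placeholder connection details as before.

- stage: DeployToAzureSQLDB
  displayName: 'Deploy to Azure SQL DB'
  dependsOn: BuildAzureSQLDBDacpac
  jobs:
  - job: Deploy
    pool:
      vmImage: 'windows-latest'
    steps:
    # Download the Azure-specific dacpac built earlier in the pipeline
    - task: DownloadBuildArtifacts@0
      inputs:
        artifactName: 'AzureSQLDB'
        downloadPath: '$(System.ArtifactsDirectory)'
    # Deploy it to the Azure SQL Database; no AllowIncompatiblePlatform needed this time
    - task: SqlAzureDacpacDeployment@1
      inputs:
        azureSubscription: 'MyServiceConnection' # placeholder service connection
        AuthenticationType: 'server'
        ServerName: 'myserver.database.windows.net' # placeholder
        DatabaseName: 'MyDatabase' # placeholder
        SqlUsername: '$(AzureSqlUser)'
        SqlPassword: '$(AzureSqlPassword)'
        deployType: 'DacpacTask'
        DacpacFile: '$(System.ArtifactsDirectory)/AzureSQLDB/MyDatabaseProject.dacpac'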

As shown above, you can update both SQL Server 2019 and Azure SQL Database objects using one commit. This means that if you save a database project in Visual Studio and sync it with Azure DevOps, both types of databases can be updated with their relevant dacpacs.

In this example, I did it using Azure Data Studio Insiders build instead of Visual Studio.

To make the pipeline above tidy I set the ‘Deploy to Azure DB Stage’ to run in parallel with the ‘Build SQL 2019 dacpac’ stage.

However, if you want full peace of mind you might want to set only one stage to run after the 'Build Azure SQL DB dacpac' stage, with the 'Build SQL 2019 dacpac' stage running after it. You can do this using the 'dependsOn' option.
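For example, using the illustrative stage names from the earlier sketches, the relevant lines would look something like the below.

- stage: DeployToAzureSQLDB
  dependsOn: BuildAzureSQLDBDacpac # runs once the Azure build has succeeded
- stage: BuildSQL2019Dacpac
  dependsOn: DeployToAzureSQLDB # only builds the SQL 2019 dacpac after the Azure deployment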

Of course, like I said earlier if you decide to do this for older versions of SQL Server you can put their stages first.

One source of truth for database objects

Now, as you can see in the above images, two artifacts got published: one dacpac is specific to SQL Server 2019 and the other is specific to Azure SQL Database. Below, you can view the contents of the two artifacts to prove that only they were used.

Contents of the dacpacs after updating both SQL Server 2019 and Azure SQL Database objects using one commit

To clarify though, both dacpacs used the SQL code from exactly the same source: the database project files stored in the repository.

So instead of the one source of truth being a single dacpac as in my last post here, the one source of truth is now the database project in this repository. Plus, if any new files are added to the database project they are automatically included, because of the syntax that I used in my copy files task.

Advantages

In reality, there are some advantages to using this method.

For instance, you can cater for any version of SQL Server that does not work well with the SQL Azure V12 database schema provider, not just SQL Server 2019, by changing the DSP line to the relevant version.

Because the build files for the Azure SQL Database deployments are created behind the scenes, you don't have to duplicate the database schema in your repository.

So, you don’t have to worry about managing duplicate copies of the schema. In addition, you avoid the risk of somebody updating one of the copies by accident.

Use case

I think its strongest use case is in the workplace, because you could use this method whilst you start migrating SQL Server 2019 databases (or any other version of SQL Server, of course) to Azure and want to keep the same database schema.

For example, during migration you could migrate your database in a staging environment to an Azure SQL Database whilst keeping your Production database on-premises. Once done you can simply change your pipeline and allow them both to be updated from the same source.

Additional logic

Now I must stress again here that this demo only gives you a basic introduction to the method. Even though Azure SQL Database has come a long way, it may not agree with every database update.

Like I said earlier, that is why the Azure SQL Database dacpac build is the first stage: to catch potential issues early in the pipeline.

You will have to put some additional tasks in place to cater for things like unsupported features in Azure SQL Database, which you can find out more about in detail here.

Of course, if doing this for older versions of SQL Server you may also have to consider putting additional tasks in place.

Tips

Now I do have a couple of tips for you if you intend to do this.

Firstly, use variables in your Azure DevOps pipelines, because they will make your life a lot easier.
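For example, a minimal sketch of a variables block with illustrative names, which the tasks could then reference as $(configuration), $(projectName) and so on.

variables:
  configuration: 'release' # build configuration used by the MSBuild tasks
  projectName: 'MyDatabaseProject' # illustrative database project name
  azureStagingFolder: '$(System.ArtifactsDirectory)/AzureBuild'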

Secondly, make sure you know the syntax for the Copy Files task in Azure DevOps, which you can find out more about in detail here.

Thirdly, make sure you keep copying the sqlproj file when the database project is updated, because this file keeps a reference to all the database objects required and is updated when new objects are added.

Finally, I had some interesting issues when attempting to use the Agent.TempDirectory location to create the staging files for the Azure build. So, I decided to use the System.ArtifactsDirectory location instead since it gets purged before each new build.

Final word

I hope this post on how to update both SQL Server 2019 and Azure SQL Database objects using one commit helps some of you, because I think both methods can be useful.

Of course, if you want to deploy the same update to both SQL Server 2019 and Azure SQL Database, the easiest option is to set the project to use SQL Azure V12. However, the Kevin method is useful if you build on this basic demo for other versions of SQL Server.

I think the best thing about the Kevin method is that it is simple and effective. Of course, I would love to hear any alternative methods used. Feel free to add a comment if you use another method to update both at the same time.

In addition, if you have any questions about this feel free to reach out to me.
