
Same dacpac for Azure Pipelines and GitHub Actions deployments


In this post I want to cover using the same dacpac for Azure Pipelines and GitHub Actions deployments, because it is something I did a demo about recently and it can be useful.

Recap

Just a quick recap about what Azure Pipelines and GitHub Actions are. Azure Pipelines is a service within Azure DevOps that you can use to do your CI/CD deployments, which you can read more about in detail here.

GitHub Actions is the GitHub alternative for doing the same thing. You can find out more about GitHub Actions in detail here.

Now these two are very similar, because they are both based on YAML. However, the syntax you use for them is slightly different. For example, in Azure Pipelines you use the ‘task’ syntax to define a task, whereas in GitHub Actions you use ‘uses’ to define an action.
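As a quick illustration of that syntax difference, here is roughly the same kind of step (publishing a build output) written for each. This is just a sketch; the artifact names and action versions are assumptions for illustration.

```yaml
# Azure Pipelines: a step is a 'task', referenced by name and version
steps:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'dacpac'
```

```yaml
# GitHub Actions: the equivalent step 'uses' an action instead
steps:
- uses: actions/upload-artifact@v2
  with:
    name: dacpac
    path: output/
```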

I was told last week that somebody is working on something that will convert Azure Pipelines YAML to GitHub Actions syntax. I’m looking forward to that service appearing.

Back to the demo

Recently I did a demo where I first updated a database project in the Azure Data Studio Insiders build. You can read more about using the Insiders build with database projects in detail here.

From there we synced the updates to a Git repository in Azure DevOps, which then started an Azure Pipeline that first created a dacpac from the SQL Server database project in the repository.

From there that dacpac file was published as an artifact within Azure DevOps, which made it easier to share in other stages of the pipeline.
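The build steps could look something like the sketch below: compile the database project, then publish the resulting dacpac as a pipeline artifact so later stages can reuse it. The task versions, paths and artifact name are assumptions for illustration, not the exact pipeline from the demo.

```yaml
# Sketch of the build steps (assumed names/paths):
steps:
- task: MSBuild@1
  inputs:
    solution: '**/*.sqlproj'       # building the project produces the dacpac
    configuration: 'Release'
- task: CopyFiles@2
  inputs:
    Contents: '**/*.dacpac'        # gather the built dacpac
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'dacpac'         # shared with later stages
```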

Afterwards, the created artifact was also copied to a shared location outside of Azure DevOps, for reasons that will become clear later in this post.
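Copying the dacpac out to a shared folder could be done with one more step along these lines. The UNC path here is purely a placeholder for wherever you want the file to land.

```yaml
# Sketch: copy the built dacpac to a shared location outside Azure DevOps
# so other tooling (in this case GitHub Actions) can pick it up.
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)'
    Contents: '**/*.dacpac'
    TargetFolder: '\\MyLaptop\SharedDacpacs'   # placeholder path
```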

From that single dacpac, a database on another SQL Server instance was updated and then used for unit tests.

Afterwards the pipeline updated another three SQL Server databases that were on three SQL Server instances, with each instance hosted in a separate Docker container on my laptop. Each instance was updated as part of a separate named stage within Azure DevOps, representing staging, integration and production environments.

This means all these SQL Server updates were done using that one single dacpac that had been created during the initial build process.

Each database was deployed in a different stage of the pipeline, all using the ‘SQL Server database deploy’ task within Azure DevOps.
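One of those deployment stages could be sketched roughly as below, assuming the ‘SQL Server database deploy’ task (SqlDacpacDeploymentOnMachineGroup) running against a container on the self-hosted agent. The server, database and variable names are placeholders for illustration.

```yaml
# Sketch of one deployment stage (assumed names throughout):
- stage: Staging
  jobs:
  - job: DeployStaging
    steps:
    - task: SqlDacpacDeploymentOnMachineGroup@0
      inputs:
        TaskType: 'dacpac'
        DacpacFile: '$(Pipeline.Workspace)/**/*.dacpac'
        TargetMethod: 'server'
        ServerName: 'localhost,1433'        # staging container
        DatabaseName: 'DemoDatabase'
        AuthScheme: 'sqlServerAuthentication'
        SqlUsername: '$(SqlUser)'           # pipeline variables
        SqlPassword: '$(SqlPassword)'
```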

GitHub Actions

In addition to the above, the very same repository was synchronized with a GitHub repository. After it had synchronized, a workflow would start in GitHub Actions.

Now this GitHub Actions workflow was fairly simple. It ran the sqlpackage executable to update another database on one of the Docker containers, using the same dacpac as the other databases, which I had published to the shared location.
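A workflow along these lines would do it: run on the self-hosted runner and call sqlpackage with the dacpac from the shared folder. The shared path, server, database name and secret name are all placeholders for illustration, not the exact workflow from the demo.

```yaml
# Sketch of the GitHub Actions workflow (assumed names/paths):
name: Deploy dacpac
on: push
jobs:
  deploy:
    runs-on: self-hosted      # the runner on my laptop
    steps:
    - name: Publish dacpac with sqlpackage
      run: >
        sqlpackage /Action:Publish
        /SourceFile:"\\MyLaptop\SharedDacpacs\DatabaseProject.dacpac"
        /TargetServerName:"localhost,1436"
        /TargetDatabaseName:"DemoDatabase"
        /TargetUser:"sa"
        /TargetPassword:"${{ secrets.SQL_PASSWORD }}"
```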

To clarify, from one update in the Azure Data Studio Insiders build I created a dacpac file, which was used to automatically update three separate databases in three separate Docker containers using Azure Pipelines.

From there, another database in one of those Docker containers was automatically updated using GitHub Actions. All done using one single dacpac file.

Of course, this was purely for demo purposes and in reality it would be more complicated if done for production use. For instance, you would use different branching strategies.

How

I was able to test deployments locally using Azure DevOps because I had my laptop configured as a self-hosted agent. In addition, I was able to do the same for GitHub Actions because I had also configured my laptop as a self-hosted runner.
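Registering both on the same machine looks broadly like this (config.sh on Linux/macOS, config.cmd on Windows). The organisation, repository and token values below are placeholders; the GitHub token comes from the repository’s Settings > Actions > Runners page.

```shell
# Azure DevOps self-hosted agent (run from the agent's folder):
./config.sh --url https://dev.azure.com/<organisation> --auth pat

# GitHub Actions self-hosted runner (run from the runner's folder):
./config.sh --url https://github.com/<owner>/<repo> --token <token>
```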

I can vouch for the fact that the installation of both is very similar.

Why

I updated all of the databases this way to show that this can all be done using one source of truth, which was that one dacpac file created at the start of the Azure DevOps pipeline.

That dacpac then went on to update all the SQL Server databases mentioned above, making your support and auditing of database deployments a lot easier.

It’s important to strive for one source of truth these days to reduce complexity, so keep this in mind when you look to create your pipelines.

As you can see, even in scenarios like this it is possible with some imagination.

Alternative ways

Of course, there are alternative ways you can create one source of truth between Azure Pipelines and GitHub Actions.

For example, you could add another GitHub Action to create a new dacpac file instead of sharing the one that was originally created. However, if you were to do this you would first have to ensure that nothing in GitHub would be able to change your database structure.

Plus, it does make your support and auditing a bit more complicated.

Another way I could have done it was to create the dacpac file in the Azure Data Studio Insiders build, which you can read some tips about in detail here.

That would have allowed the dacpac to be the one source of truth through the complete flow.

However, it does mean you must create the dacpac locally every time and sync it with the main repository. That might be OK for just one person, but can you expect it every time from a team of six?

Of course, you would also use branching strategies in both Azure DevOps and GitHub to manage the deployments better.

Final word

I hope you found my post about using the same dacpac for Azure Pipelines and GitHub Actions deployments useful. My main point was to highlight the fact that you should use one source of truth where possible, even in complicated scenarios like the above example.

If you do this another way you are more than welcome to share below with a comment. Of course, the same goes for any other views you have about this post as well.
