You can consider this post a Fabricator's guide to Microsoft Fabric deployment pipelines. It includes my own insights and shows how deployment pipelines tie in with Microsoft's latest document about CI/CD workflow options in Fabric.
I make a point of stating Microsoft Fabric deployment pipelines in the title of this post because other types of pipelines can be interpreted as deployment pipelines. For example, pipelines created to perform CI/CD in Azure Pipelines within Azure DevOps.
In this post I provide insights into various topics relating to Microsoft Fabric deployment pipelines, ranging from the initial setup to working with the APIs.
In addition, I cover how deployment pipelines feature in Microsoft's article about CI/CD workflow options.
Plus, there is content to help those studying for either the DP-600 exam or the DP-700 exam, because both exams require knowledge of how to create and configure deployment pipelines. Along the way I share plenty of links.
In fact, one of the reasons I published this guide to Microsoft Fabric deployment pipelines is that it completes a trilogy of posts related to the knowledge required to implement lifecycle management in Fabric for the DP-700 exam.
Those looking to take the DP-600 exam can view the offer to get certified for free. Plus, those looking to take the DP-700 beta exam can prepare now with my checklist for the DP-700 beta exam.
Microsoft Fabric deployment pipelines in a nutshell
Microsoft Fabric deployment pipelines were originally introduced in Power BI, before the days of Microsoft Fabric. I know this from experience, as they were part of the PL-300 exam.
Microsoft has a reasonable amount of documentation explaining what deployment pipelines are, which you can go through in detail.
However, the short version is that Microsoft Fabric deployment pipelines allow you to deploy updates to supported items across multiple workspaces that represent different environments, all in a GUI-based manner.
Within Microsoft Fabric deployment pipelines these environments are represented graphically as stages. As you can see below, where I highlight the workspace that is linked to the development stage.
In-between the development and test stages you can see a tick surrounded by a looping circle, which indicates that the items in the test stage are synchronized with the items in the development stage.
Whereas the cross in-between the test and production stages indicates that the items in the production stage are out of sync with the items in the test stage.
When you go to synchronize supported items, you can either synchronize all the supported items at once or only specific ones.
Deployment rules
You can also perform additional tasks. For instance, you can create deployment rules to use for deployments to different stages. Depending on the items in your workspace, you can create three types of deployment rules.
- Data source rules to change the data source to point to another location. Note that you can only change the location, not the source type itself. For example, you can change the source database, but not switch to a different type of source.
- Parameter rules to change the parameters for certain items.
- Default lakehouse rules to determine the default lakehouse for a notebook, instead of the Spark configuration settings that I mentioned in a previous post about running notebooks.
For example, if a workspace that represents the development stage contains a notebook, you can click on the deployment rules icon (the lightning bolt) for the test stage, which shows you which deployment rules you can look to change.
Provided you are the owner of the notebook in the test stage, you can click on the notebook and add a rule to change the default lakehouse in the test workspace.
If the alternative lakehouse already exists and you are comfortable selecting it this way, that is great. However, you might want to consider working with the Spark session configuration magic command instead.
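To illustrate the alternative, below is a minimal sketch of the `%%configure` cell magic that can set the default lakehouse for a Fabric notebook session. The lakehouse name and the two identifiers are placeholders that you would replace with your own values.

```
%%configure
{
    "defaultLakehouse": {
        "name": "MyLakehouse",
        "id": "<lakehouse-id>",
        "workspaceId": "<workspace-id>"
    }
}
```

Run this in the first cell of the notebook so that the session starts with the intended default lakehouse.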
Additional supported items
Another key point to remember is that the list of supported Fabric items is constantly growing. In my opinion there are three main ways to check when new items are supported:
- Regularly check the list of supported items for updates.
- Configure a deployment pipeline with the default stages, add items in the development stage and check when they show as supported.
- Keep an eye on the Microsoft Fabric Updates Blog for announcements.
Microsoft Fabric deployment pipeline updates worth highlighting
Microsoft does a good job of announcing updates to Microsoft Fabric deployment pipelines in the Microsoft Fabric Updates Blog. I still want to highlight a couple of deployment pipeline updates in this guide, since they are both significant.
Additional deployment pipeline stages
When deployment pipelines were first introduced, your pipeline could only link workspaces across three stages, representing the development, test and production environments shown in the previous image in this post.
However, you now get the ability to add additional stages when you first create your deployment pipeline. Like in the below example, which shows how you can set up a deployment pipeline for typical Development, Test, Acceptance and Production (DTAP) stages.
New deployment pipelines user interface
All the images in this post are based on the existing deployment pipelines user interface. Microsoft is currently working on a new deployment pipelines user interface, which some of you may have seen in Microsoft Fabric.
I must admit that the new user interface looks good. However, be aware that at the time of writing the new interface is temporarily disabled while Microsoft resolves an issue.
Keep an eye on the new deployment pipelines user interface page for further updates relating to this situation.
Deployment pipeline APIs
You can manage Microsoft Fabric deployment pipelines through a set of deployment pipeline APIs. For example, you can call an API to deploy stage content.
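As a quick sketch of what that looks like, the Python snippet below builds the request for the "Deploy Stage Content" endpoint of the Fabric REST API. The helper function, the GUIDs and the note are my own illustrative choices; the real call also needs an `Authorization: Bearer <token>` header and returns a long-running operation that you poll via the `Location` response header.

```python
import json

# Assumption: the current v1 base URL for the Fabric REST API.
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_deploy_request(pipeline_id: str, source_stage_id: str,
                         target_stage_id: str, note: str = "") -> tuple[str, str]:
    """Build the URL and JSON body for the Deploy Stage Content API call.

    Deploying without an "items" list deploys all supported items from
    the source stage to the target stage.
    """
    url = f"{FABRIC_API}/deploymentPipelines/{pipeline_id}/deploy"
    body = {
        "sourceStageId": source_stage_id,
        "targetStageId": target_stage_id,
        "note": note,
    }
    return url, json.dumps(body)

# Example usage with placeholder GUIDs.
url, body = build_deploy_request(
    "00000000-0000-0000-0000-000000000000",   # deployment pipeline id
    "11111111-1111-1111-1111-111111111111",   # e.g. development stage id
    "22222222-2222-2222-2222-222222222222",   # e.g. test stage id
    note="Deploy from Development to Test",
)
print(url)
```

You would then POST the body to the URL with your preferred HTTP client, authenticating with a token for either a user or a service principal.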
Microsoft published a post earlier this year that covered a variety of Fabric CI/CD announcements. Within that post there is a link to their article that covers how to automate your deployment pipeline with Fabric APIs.
Within that article is the link to the "DeploymentPipelines-DeployAll.ps1" PowerShell script, which I covered in a previous post on how to update Fabric deployment pipeline stages with YAML pipelines in Azure DevOps.
In addition, the post included a link to an end-to-end lifecycle management tutorial, which covers concepts that also appear in Microsoft's recent article about suggested CI/CD workflow options.
Microsoft Fabric deployment pipelines in relation to Microsoft’s CI/CD workflow options
Recently Microsoft released an article with their advice on how to choose the best Fabric CI/CD workflow for you.
Within that article they highlight four different release options, including one to deploy using Microsoft Fabric deployment pipelines.
Microsoft recommends that you first implement the development process that I highlighted in my previous post about version control to populate a workspace that represents the development stage.
From there, manage the deployments to the workspaces representing other stages/environments with Microsoft Fabric deployment pipelines.
Like in the below diagram, where you perform the typical development process in the left section to populate the workspace that represents the development stage. From there, you orchestrate deployments to the different stages with Microsoft Fabric deployment pipelines.
Orchestrating deployments outside of Microsoft Fabric
Orchestrating the deployments with the user interface within Microsoft Fabric is fine, as long as working directly in Microsoft Fabric suits your requirements.
However, for more advanced scenarios you may want to consider orchestrating the deployments to the Microsoft Fabric deployment pipeline stages via Azure Pipelines or GitHub Actions instead.
Doing this introduces more possibilities. For instance, implementing an approvals process, and implementing DataOps to populate empty lakehouses once they are deployed to a new stage.
Like in the below diagram, which again highlights a workspace representing the development stage being updated in the left section. However, this time the right section highlights a couple of the advantages of orchestrating deployments with Azure Pipelines instead.
I mention YAML Pipelines as they are my preference. You can read why in my post where I share my thoughts about disabling classic pipelines in Azure DevOps.
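To give a flavour of what this can look like, below is a minimal sketch of a YAML pipeline stage in Azure DevOps that runs the "DeploymentPipelines-DeployAll.ps1" script. The repository path, the variable names and the script arguments are all placeholders based on my assumptions, so check the script itself for its actual parameters before using anything like this.

```yaml
# Sketch of orchestrating a Fabric deployment pipeline stage from Azure DevOps.
trigger:
  - main

stages:
  - stage: DeployToTest
    jobs:
      - deployment: DeployFabricItems
        # Approvals can be configured on the environment in Azure DevOps.
        environment: Test
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: PowerShell@2
                  displayName: Deploy items to the test stage
                  inputs:
                    # Placeholder path to where you store the script.
                    filePath: 'scripts/DeploymentPipelines-DeployAll.ps1'
                    # Placeholder arguments; align these with the script.
                    arguments: >
                      -deploymentPipelineName "$(FabricPipelineName)"
                      -sourceStageName "Development"
                      -targetStageName "Test"
```

Using a deployment job against an environment is what unlocks the approvals process mentioned above.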
To manage expectations, be aware that there are currently some limitations when looking to orchestrate through the APIs, particularly as far as working with API calls and service principals is concerned.
Anyway, as you can see, orchestrating the deployments with either Azure DevOps or GitHub Actions introduces more flexibility. So it is definitely something that I recommend considering.
Final words about this guide to Microsoft Fabric deployment pipelines
I do hope that this Fabricator's guide to Microsoft Fabric deployment pipelines proves to be insightful.
The aim of this post was to provide further insights, as well as cover knowledge that can be helpful for both the DP-600 and DP-700 exams.
Of course, if you have any comments or queries about this post feel free to reach out to me.