
Deploying to Development, Test and Production environments in Microsoft Fabric

Reading Time: 7 minutes

In this post I want to cover some options for deploying to Development, Test and Production environments in Microsoft Fabric.

I wanted to do this post since I still see a lot of questions about this topic, so I thought I would provide some solid options as to what you can use to deploy to different environments.

By the end of this post, you will know various options for deploying to Development, Test and Production environments in Microsoft Fabric.

To manage expectations, in this post I focus on the mainstream methods for deploying to the different environments, specifically in relation to the DP-600 exam, along with some honorable mentions of other methods.

You can see the sections covered in this post below.

  • Number of workspaces required
  • Microsoft Fabric Git integration
  • Working with Git integration via REST APIs
  • Deployment pipelines
  • Assigning a workspace to a stage
  • Using Git integration and Deployment Pipelines together
  • Honorable mentions

In reality, I have covered some of the content in other posts. With this in mind, you can consider this post as a summary that contains some updates and plenty of links.

In addition, this post is also a good refresher for those of you studying for the DP-600 exam, since it covers some of the analytics development lifecycle requirements that I covered in a previous post.

Number of workspaces required to cater for Development, Test and Production environments

One question that I frequently get asked is how many workspaces are required. In reality, the answer is that it depends.

However, if you want your solution to be flexible and loosely coupled, I do recommend at the very least one Microsoft Fabric workspace per environment.

Microsoft Fabric Git integration

The first method I want to cover is Microsoft Fabric Git integration, which allows you to introduce version control for various items within a Microsoft Fabric workspace.

It does this by allowing you to connect a workspace to a branch within a Git repository that is stored in the Azure Repos service in Azure DevOps.

To help with some of the terms in this section, you can go through my other post, which is a Microsoft Fabric Git integration jargon guide for Fabricators.

Anyway, Microsoft Fabric Git integration has made some good progress since I first touched on it in my spreading your SQL Server wings with Microsoft Fabric post last year.

It currently supports Lakehouses, notebooks, reports, paginated reports and semantic models (datasets) at varying levels. I say varying levels because you can do more with some items than with others.

You can work with it to deploy supported items to multiple workspaces as well as some other interesting possibilities.

For example, thanks to the introduction of Power BI Projects, you can work with a report locally in Power BI Desktop and save the report and the semantic model as metadata. In other words, save them as code.

You can then turn the folder that the project is stored in into a Git repository, which you can synchronize with Azure DevOps and work with to deploy to multiple Microsoft Fabric workspaces, like in the below diagram.

Git integration for Power BI objects
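To illustrate, below is roughly what the saved project folder looks like on disk. This is a simplified sketch with placeholder names, and the exact files vary between Power BI Desktop versions:

```
Sales.pbip
Sales.Report/
    definition.pbir
    report.json
Sales.SemanticModel/
    definition.pbism
    model.bim
```

Because everything is stored as text-based metadata, the folder can be committed to a Git repository and reviewed like any other code.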

In reality, there are multiple variations for working solely with Git integration. For instance, you can look to introduce a level of continuous integration.

Microsoft published a continuous integration guide for Power BI Projects. In addition, I published a post about some performance tests that I performed based on that guide.

Working with Git integration via REST APIs

Last week, Microsoft announced that Microsoft Fabric REST APIs are now available.

These allow you to configure Microsoft Fabric Git integration by using APIs, without having to work manually in the Microsoft Fabric portal.

I highly recommend going through the guide on how to automate Git integration by using APIs and Azure DevOps.
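To give you an idea of what that looks like, below is a minimal Python sketch that updates a workspace from its connected branch via the Git integration REST APIs. To be clear, this is an illustration rather than production code: the workspace ID is a placeholder and I assume you have already acquired a suitable Microsoft Entra access token.

```python
import requests

# Placeholders: a real workspace ID and a Microsoft Entra access token
# (acquired via MSAL, Azure CLI, etc.) are required.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
HEADERS = {"Authorization": "Bearer <access token>"}

GIT_URL = f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/git"

# Compare the workspace with the connected branch.
status = requests.get(f"{GIT_URL}/status", headers=HEADERS)
status.raise_for_status()

# Update the workspace with the latest commit from the branch.
update = requests.post(
    f"{GIT_URL}/updateFromGit",
    headers=HEADERS,
    json={"remoteCommitHash": status.json()["remoteCommitHash"]},
)
update.raise_for_status()
print("Update from Git accepted:", update.status_code)
```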

Using Deployment pipelines to deploy to Development, Test and Production

Deployment pipelines allow you to deploy various items to multiple workspaces through various stages. For example, a workspace for the Development stage, another workspace for the Test stage, etc.

Deployment pipelines are by no means new. In fact, they were initially introduced in Power BI and have been adapted to work with various items in Microsoft Fabric.

However, one thing those from a Power BI background might appreciate is that deployment pipelines work with any Fabric capacity.

Currently, deployment pipelines support various items such as Datamarts, Lakehouses, notebooks and reports. However, make sure you check the list of unsupported items.

This is especially important if you are looking to create your own reports and semantic models in Microsoft Fabric, because that can be an issue.

One thing to note is that a while ago support was added to allow you to customize stages. However, be aware that this must be done when you first create the deployment pipeline.

Customizing deployment pipeline stages

You can work with the common Development, Test, Acceptance and Production (DTAP) environments thanks to this update.

Assigning a workspace to a stage

When you go to assign a workspace to your stage, Microsoft Fabric will tell you if the workspace contains an item that is unsupported. In addition, you will get the same message if you attempt a manual deploy to the next stage.

Warning about unsupported items
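As a side note, you can also automate this step. Below is a minimal Python sketch that assigns a workspace to a stage via the Power BI deployment pipelines REST API, with placeholder IDs and an assumed pre-acquired access token:

```python
import requests

# Placeholders: real pipeline and workspace IDs plus an access token are required.
PIPELINE_ID = "00000000-0000-0000-0000-000000000000"
WORKSPACE_ID = "11111111-1111-1111-1111-111111111111"
HEADERS = {"Authorization": "Bearer <access token>"}

# Stage order is zero-based: 0 = Development in the default three-stage setup.
STAGE_ORDER = 0

response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}"
    f"/stages/{STAGE_ORDER}/assignWorkspace",
    headers=HEADERS,
    json={"workspaceId": WORKSPACE_ID},
)
response.raise_for_status()
```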

After assigning workspaces to your stages, you can look to deploy between stages.

Deployment pipeline example

Depending on the items in your workspaces, you may be able to configure some deployment rules as well. For example, to change source databases.

Configuring deployment rules for various items

Currently, there are some interesting nuances with deployment pipelines. For example, when a Lakehouse is deployed it is empty by default, which is the same behavior that you get with Git integration.

In addition, I must highlight that deployment pipeline REST APIs are available.
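For instance, below is a minimal Python sketch that triggers a deployment from one stage to the next with the Deploy All operation. Again, the pipeline ID is a placeholder and I assume you have already acquired an access token:

```python
import requests

# Placeholders: a real pipeline ID and an access token are required.
PIPELINE_ID = "00000000-0000-0000-0000-000000000000"
HEADERS = {"Authorization": "Bearer <access token>"}

# Deploy everything from the Development stage (order 0) to the next stage.
payload = {
    "sourceStageOrder": 0,
    "options": {
        "allowCreateArtifact": True,    # create items missing in the target
        "allowOverwriteArtifact": True, # overwrite items that already exist
    },
}

response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll",
    headers=HEADERS,
    json=payload,
)
response.raise_for_status()
print("Deployment request accepted:", response.status_code)
```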

As you can see, this can be a nice way to deploy to your development, test and production environments, depending on which items you need to move.

Using Git integration and Deployment Pipelines together

One pattern that is being advertised is using Git integration and Deployment Pipelines together. Like in the below example.

  1. Work in a separate workspace that has Git integration configured.
  2. Commit the changes for the branch the workspace is working with.
  3. Create a pull request in Azure DevOps from that branch to the branch that the Development stage in your Deployment Pipeline is working with.
  4. Wait for the Continuous Integration tests to complete for the pull request.
  5. Once the pull request has completed, work with Deployment Pipelines to deploy to the other stages.

Now, this is a perfectly valid pattern. However, you can customize it to suit your needs. For example, you can look to run continuous integration tests before any workspaces are updated.
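To show what I mean by automating the pattern above, below is a Python sketch that chains the APIs covered earlier: it updates the Development workspace from Git once the pull request has merged and then triggers the deployment pipeline stage by stage. As before, the IDs are placeholders and a valid access token is assumed; in practice you would run something like this from an Azure DevOps pipeline:

```python
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
POWERBI_API = "https://api.powerbi.com/v1.0/myorg"
HEADERS = {"Authorization": "Bearer <access token>"}

# Placeholders for the Development stage workspace and the pipeline.
DEV_WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
PIPELINE_ID = "11111111-1111-1111-1111-111111111111"


def update_dev_workspace_from_git() -> None:
    """Sync the Development workspace with the latest commit on its branch."""
    git_url = f"{FABRIC_API}/workspaces/{DEV_WORKSPACE_ID}/git"
    status = requests.get(f"{git_url}/status", headers=HEADERS)
    status.raise_for_status()
    requests.post(
        f"{git_url}/updateFromGit",
        headers=HEADERS,
        json={"remoteCommitHash": status.json()["remoteCommitHash"]},
    ).raise_for_status()


def deploy_to_next_stage(source_stage_order: int) -> None:
    """Deploy all supported items from the given stage to the next one."""
    requests.post(
        f"{POWERBI_API}/pipelines/{PIPELINE_ID}/deployAll",
        headers=HEADERS,
        json={"sourceStageOrder": source_stage_order},
    ).raise_for_status()


# Both calls are long-running operations, so in a real pipeline you would
# poll each one for completion before moving on to the next call.
update_dev_workspace_from_git()
deploy_to_next_stage(0)  # Development -> Test
deploy_to_next_stage(1)  # Test -> Production
```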

Honorable mentions of other ways to deploy to Development, Test and Production environments

Now, there are various other methods you can use to deploy updates for specific items to Development, Test and Production environments.

For example, there are other methods to test and deploy updates to semantic models.

These include the deployment solution covered in the post about Power BI datasets CI/CD by Stephanie Bruno (l/x/b) and the Microsoft Fabric DataOps Patterns solution by John Kerski (l/x/b).

Alternatively, you can look to test the tmdl-devops example by Rui Romano (l/x) now that you can enable the Tabular Model Definition Language (TMDL) format feature in Power BI Desktop.

One key point to bear in mind when looking to implement various CI/CD methods for semantic models is that a lot of them require the XMLA endpoint to be enabled for read-write.

Another method I want to mention caters for Data Warehouses. Even though Data Warehouses are currently listed as being supported by Deployment Pipelines, they do not work in at least one of my tenants.

Instead, you can look to work with database projects and deploy updates to the different environments like you would with SQL Server related databases.

I cover how you can perform CI/CD for Microsoft Fabric Data Warehouses in various posts, including one that covers CI/CD for Microsoft Fabric Data Warehouses using YAML Pipelines, where I also shared a GitHub repository you can download and use as a template.

CI/CD for Microsoft Fabric Data Warehouses using YAML Pipelines, which you can work with to deploy to Development, Test and Production environments
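To give a flavor of what a database project deployment looks like, below is a minimal Python sketch that publishes a dacpac built from a database project by calling SqlPackage. The file name and connection string are placeholders, and in my posts I perform this step with a YAML pipeline task rather than Python:

```python
import subprocess

# Placeholders: point these at your built dacpac and the SQL connection
# string of the target Data Warehouse.
DACPAC = "WarehouseProject.dacpac"
CONNECTION_STRING = "Server=<warehouse endpoint>;Database=<warehouse>;..."

# Publish the dacpac, which deploys any schema changes to the target.
subprocess.run(
    [
        "SqlPackage",
        "/Action:Publish",
        f"/SourceFile:{DACPAC}",
        f"/TargetConnectionString:{CONNECTION_STRING}",
    ],
    check=True,
)
```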

I will mention that there is a new SqlPackage update available. However, it does not include any significant new updates as far as Microsoft Fabric Data Warehouses are concerned.

Final words about deploying to Development, Test and Production environments in Microsoft Fabric

I hope this post covering some different ways that you can deploy to dev, test and production environments in Microsoft Fabric has given some of you food for thought.

In reality, there are a variety of options to deploy to development, test and production environments. However, I wanted to make sure everybody had a good overview after being asked about this frequently. Plus, cover some features mentioned in the DP-600 study guide.

As I mentioned in a previous post, I am keeping an eye on the Microsoft Fabric release plan. Since I am looking forward to future updates for both Git integration and deployment pipelines.

Of course, if you have any comments or queries about this post feel free to reach out to me.

Published in: DP-600, Microsoft Fabric, Version Control
