
Git integration and deployment pipeline updates on the Microsoft Fabric roadmap


In this post I want to cover some interesting Git integration and deployment pipeline updates on the Microsoft Fabric roadmap that I am looking forward to.

Just so that everybody is aware, the official name for the Microsoft Fabric roadmap is the Microsoft Fabric release plan, which Microsoft announced earlier this month.

As some of you can probably tell, I am looking forward to the expected Git integration and deployment pipeline updates, mainly due to my background working with CI/CD within the Microsoft Data Platform.

By the end of this post, you will know some of the updates that I find interesting, and why. Plus, I share plenty of links in this post as well.

I have split this post into two main sections. One for Git integration updates and the other for the deployment pipeline updates.

One key point to remember is that the dates provided in this post are estimates and are subject to change.

Another key point to note is that Microsoft Fabric is now generally available. You can read more about this in detail in the official post by Ryan Majidimehr.

Microsoft Fabric Git integration (ADO)

Some Microsoft Fabric Git integration (ADO) updates are estimated to be released in the fourth quarter of 2023.

Those of you who read my post about working with Microsoft Fabric Git integration and multiple workspaces know that some elements are already there.

Git integration for Power BI objects

However, Microsoft has publicly stated that it intends to add more items, which I am excited about. In addition, public REST APIs are coming, which will introduce some interesting possibilities, like using those REST APIs with Azure Pipelines or GitHub Actions.
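As a sketch of the kind of automation those APIs could enable, the snippet below builds a call that asks a workspace to sync from its connected Git branch. The base URL, endpoint path, and payload shape are assumptions on my part, since the APIs were not public at the time of writing; something like this could then be triggered from an Azure Pipelines or GitHub Actions job.

```python
import json
import urllib.request

# Assumed base URL for the upcoming Fabric REST APIs; check the official
# documentation once the APIs are released.
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_update_from_git_request(workspace_id: str, token: str) -> urllib.request.Request:
    """Build a request that asks a workspace to sync from its connected branch.

    The endpoint path and body are hypothetical, based on the announced
    (not yet public) Git integration REST APIs.
    """
    url = f"{FABRIC_API}/workspaces/{workspace_id}/git/updateFromGit"
    body = json.dumps({
        # Assumed conflict-handling option: prefer what is in the workspace.
        "conflictResolution": {"conflictResolutionType": "Workspace"}
    }).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",  # Azure AD token from your pipeline
            "Content-Type": "application/json",
        },
    )
```

In a CI/CD job, the pipeline would acquire a token, build this request, and send it with `urllib.request.urlopen`, so a merge to the connected branch can flow straight into the workspace.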

In addition, Microsoft has stated some updates for individual items, as covered below.

Git integration for Data Engineering items

Git integration for Data Engineering items is another update that is estimated to be released in the fourth quarter of this year.

One thing that caught my eye is that it will include additional source files, which I assume means other file types, like Python scripts, will be included. Having worked with Databricks Repos in the past, I am very keen to see this in action.

Data Factory Git integration for data pipelines and dataflows

I suspect those who already use native Git integration in Azure Data Factory or Azure Synapse Analytics will look forward to Data Factory Git integration for data pipelines as much as I do.

Even more so if you are looking to implement Metadata Driven Pipelines for Microsoft Fabric across multiple workspaces.

Plus, there is going to be Data Factory Git integration for dataflows as well. However, I have noticed something interesting about the estimated release dates.

Currently, the Git integration support for data pipelines is expected in the first quarter of next year. However, the dataflows support is estimated for the second quarter of next year.

I really hope that they are both introduced around the same time to allow the hybrid scenario of working with both items.

It will be interesting to see if there are any graceful methods for migrating the pipelines from Azure Data Factory and Azure Synapse Analytics over from their respective Git repositories.

Git integration support for data science items

CI/CD support for data science items is expected in the second quarter of 2024.

I am curious to see exactly which data science items will be supported with Git integration.

I expect notebooks will be supported, since that is the case with other services. However, it will be interesting to see if items such as models and experiments will also be supported.

Eventstream items in Real-Time Analytics

Eventstream items in Real-Time Analytics will be supported by Git integration. Currently CI/CD support for Eventstream items is estimated to be released in the first quarter of next year.

I think it is a promising first step towards other items within Real-Time Analytics eventually being supported.

Deployment pipelines

Finally, we will get the chance to add more stages to deployment pipelines.

For those of you wondering why I am so excited about it, allow me to explain with four little words: development, testing, acceptance and production. Also known as DTAP, these are four common environments for those who work with developer technologies.

Those who watched Sander Stad and me co-present sessions over the last few years will know what I am talking about. For those who don’t, here is a diagram to help you visualize a common scenario.

Power BI example that can be used with Git integration and deployment pipeline updates
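To illustrate how promoting content through DTAP stages could be scripted, here is a minimal sketch based on the existing Power BI deployment pipelines "Deploy All" REST API. Whether the same request shape will carry over to every new Microsoft Fabric item is an assumption on my part.

```python
import json
import urllib.request

PBI_API = "https://api.powerbi.com/v1.0/myorg"

# With four stages (DTAP), stage orders 0..3 would map to
# development, testing, acceptance and production.
STAGES = ["development", "testing", "acceptance", "production"]

def build_deploy_request(pipeline_id: str, source_stage: int, token: str) -> urllib.request.Request:
    """Build a 'deploy all' call that promotes content from one stage to the next.

    Based on the existing Power BI deployment pipelines REST API; support for
    new Fabric item types in the same call is assumed, not confirmed.
    """
    url = f"{PBI_API}/pipelines/{pipeline_id}/deployAll"
    body = json.dumps({
        "sourceStageOrder": source_stage,  # deploys into stage source_stage + 1
        "options": {
            "allowCreateArtifact": True,
            "allowOverwriteArtifact": True,
        },
    }).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
```

Promoting all the way to production would then be a matter of sending this request once per stage boundary, in order.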

In addition to the above update, support for more Microsoft Fabric items is coming. Some of which I cover below.

Deployment pipelines for Data Engineering items

I am really interested in seeing what items will be supported when deployment pipelines for Data Engineering items become available, because it can introduce some interesting possibilities.

I am very curious to see what items can be used with deployment pipeline rules.

Data warehouse in deployment pipelines

I am really looking forward to seeing support for Data warehouse in deployment pipelines.

For years, Power BI folks have either relied on others to deploy updates to source databases or learned how to use a service like Azure DevOps to deploy their own schema updates. This has caused a bit of friction over the years, especially as far as working practices are concerned.

Because Microsoft Fabric is pushing forward new innovations, it is going to be interesting to see what happens when this is available for everybody. I suspect it will simplify ongoing deployments of Data warehouse updates.

For example, when combined with the ability to add new stages, you can orchestrate a Data warehouse deployment pipeline directly in Microsoft Fabric, as below, without the need to work with Azure Pipelines.

Data warehouse example

Of course, whether or not you work this way can change depending on your architecture principles, because they might require you to work with infrastructure as code.

If you still need to work with Azure Pipelines now, you can look to adopt the REST APIs. Another option is to wait until a version of SqlPackage is available that supports the new data warehouse target platform, which I covered in a previous post.
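As a rough sketch of that second option: once a SqlPackage release supports the new target platform, publishing a data warehouse dacpac from a pipeline might look like the below. The `Publish` action and flags shown are existing SqlPackage parameters; Fabric data warehouse support for them is the assumption here.

```python
import subprocess  # only needed when actually running the command

def build_sqlpackage_publish_args(dacpac_path: str, connection_string: str) -> list[str]:
    """Assemble a SqlPackage publish command for a data warehouse dacpac.

    /Action:Publish, /SourceFile and /TargetConnectionString are existing
    SqlPackage parameters; whether a Fabric data warehouse can be the target
    depends on a future SqlPackage release, as noted in the post.
    """
    return [
        "sqlpackage",
        "/Action:Publish",
        f"/SourceFile:{dacpac_path}",
        f"/TargetConnectionString:{connection_string}",
    ]

# Once a supporting SqlPackage version is installed, a pipeline step could run:
# subprocess.run(build_sqlpackage_publish_args("dw.dacpac", "<connection string>"), check=True)
```

Building the argument list separately like this keeps the deployment step easy to unit test before it is wired into a pipeline.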

Support for data science items in deployment pipelines

As I mentioned earlier, CI/CD support for data science items is expected in the second quarter of 2024, which includes deployment pipeline support.

I am looking forward to seeing what items will be supported in deployment pipelines. Especially after looking into MLflow with another service in the past.

Support for eventstream items in Real-Time Analytics

As I mentioned earlier, the estimated release of CI/CD support for Eventstream items is in the first quarter of next year.

I am excited about the fact that support for eventstream items is coming to deployment pipelines. It will be interesting to see if you can change anything in eventstreams using deployment pipeline rules.

Personally, I hope that this is a first step towards deployment pipelines supporting KQL databases. That way, we could have a deployment pipeline that spans them across multiple workspaces while still having the flexibility to choose whether to enable various options within them, for example, the availability of data in OneLake.

Final words about Git integration and deployment pipeline updates

I hope this look at some interesting Git integration and deployment pipeline updates on the Microsoft Fabric roadmap has been an interesting insight for some of you.

In reality, there are a lot of other good updates coming up so it is worth looking at the Microsoft Fabric release plan.

Of course, if you have any comments or queries about this post feel free to reach out to me.
