I wanted to do this post for a few reasons.
The first reason is the number of questions I get about it, since I tend to do a lot of work in the CI/CD area.
The second reason is that I wanted to do a follow-up to my comment about this in a previous post, where I shared my thoughts about the DP-600 exam for the new Microsoft Fabric certification.
Thirdly, because I think that there are some gaps in the recommended Microsoft Learn material for the exam. Most noticeably for Power BI projects, which are still in preview. So, I want to highlight where some of my previous posts cover these items.
By the end of this post, you will know more about what is involved. Plus, I share plenty of links along the way. In addition, I mention a couple of upcoming sessions where I share details about this area as well.
But first, allow me to dispel a myth for those yet to read some of my previous posts.
CI/CD for Data Warehouses
Some believe that CI/CD is not possible for Microsoft Fabric Data Warehouses. In reality, you can do it in a similar fashion to SQL Server databases and Azure Synapse SQL Pools.
I covered how you can do this in a couple of posts, including one that covers CI/CD for Microsoft Fabric Data Warehouses using YAML Pipelines, where I also shared a GitHub repository you can download and use as a template.
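To give a flavour of what such a pipeline can look like, below is a minimal sketch that builds an SDK-style database project into a dacpac and publishes it to a Fabric Data Warehouse with SqlPackage. The project name, variables and service principal authentication details are assumptions for illustration; they are not the exact contents of my template.

```yaml
# Minimal sketch only - the project name and the pipeline variables
# ($(WarehouseSqlEndpoint), $(WarehouseName), $(ClientId), $(ClientSecret))
# are placeholders you would define in your own Azure DevOps setup.
trigger:
  - main

pool:
  vmImage: 'windows-latest'

steps:
  # Build the SQL database project into a dacpac
  - script: dotnet build WarehouseProject.sqlproj --configuration Release
    displayName: 'Build dacpac'

  # Publish the dacpac to the SQL endpoint of the Fabric Data Warehouse
  - script: >
      SqlPackage /Action:Publish
      /SourceFile:"bin/Release/WarehouseProject.dacpac"
      /TargetConnectionString:"Server=$(WarehouseSqlEndpoint);Initial Catalog=$(WarehouseName);Authentication=Active Directory Service Principal;User ID=$(ClientId);Password=$(ClientSecret)"
    displayName: 'Publish dacpac to the Data Warehouse'
```

The template in my GitHub repository covers the full setup, including multiple stages and approvals.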
However, it is not mentioned at all in the study guide, which suggests that you do not need to know it for the exam. I just wanted to highlight that it is possible.
Anyway, now I will focus on what is required for the analytics development lifecycle according to the DP-600 study guide.
Below is an overview of the different statements in the study guide for the analytics development lifecycle. However, I cover them in a slightly different order.
Create and manage a Power BI project (.pbip)
Power BI projects are mentioned in the study guide, even though the feature itself is still in preview.
You can think of Power BI projects along the same lines as SQL Server Database projects. Because when you save a report as a Power BI project it saves the metadata (code) of your Power BI report and the semantic model within a folder.
You can then work with that folder in other applications such as Visual Studio Code, like in the below example that I shared in a post about my initial Microsoft Fabric Git integration tests for Power BI reports.
You can then take this one step further by initializing the folder that the files are saved in to make it a Git repository, which you can synchronize with a Git repository stored in the Azure Repos service in Azure DevOps and use along with the next section in this post.
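As a minimal sketch, the local side of that workflow looks like the below. The folder name, file and remote URL are all placeholders; in practice you would run this inside the folder that saving the report as a Power BI project created.

```shell
# Demo in a scratch folder; in practice the folder already contains the
# files Power BI Desktop saved (e.g. the .Report and .SemanticModel folders)
mkdir pbip-demo && cd pbip-demo
echo '{}' > definition.pbir        # stand-in for the real PBIP files

# Turn the folder into a Git repository and commit its contents
git init
git add .
git -c user.name="demo" -c user.email="demo@example.com" commit -m "Initial commit of Power BI project files"

# Then connect it to a repository in Azure Repos (placeholder URL) and push:
#   git remote add origin https://dev.azure.com/MyOrg/MyProject/_git/MyRepo
#   git push -u origin main
```

From there, changes saved in Power BI Desktop show up as normal Git diffs.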
I also cover that in my post about my initial Microsoft Fabric Git integration tests for Power BI reports.
Implement version control for a workspace
The first item listed in the ‘Manage the analytics development lifecycle‘ section of the study guide mentions version control for a Microsoft Fabric workspace. In other words, Git integration, where you connect a Microsoft Fabric workspace to a Git repository.
From there you can work directly with source control for various items in the Microsoft Fabric workspace. Note that the list of supported items changes often, so it is worth checking occasionally.
Currently, Git integration is only supported with the Azure Repos service in Azure DevOps. I have covered how you can configure this in various posts, including one that shows how you can work with Microsoft Fabric Git integration and multiple workspaces.
Plus, Git integration introduces other possibilities as well. For example, it allows you to perform Continuous Integration (CI) tests on Power BI reports utilizing Azure Pipelines within Azure DevOps.
Microsoft produced a Power BI Project (PBIP) and Azure DevOps build pipelines for continuous integration guide on how to do this. Covering how to automate checking reports with both Tabular Editor 2 and PBI-Inspector.
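As a rough sketch, the relevant steps in such a pipeline end up looking something like the below, once the guide's scripts have downloaded the two tools. The paths, rule files and command-line flags are assumptions that may differ between tool versions, so treat Microsoft's guide itself as the reference.

```yaml
steps:
  # Run Tabular Editor 2's Best Practice Analyzer against the semantic model
  # (-A points at a rules file, -V writes Azure DevOps friendly output)
  - script: >
      TabularEditor.exe "$(Build.SourcesDirectory)/MyReport.SemanticModel"
      -A "BPARules.json" -V
    displayName: 'Check semantic model with Tabular Editor 2'

  # Run PBI-Inspector rules against the report definition
  - script: >
      CLI.exe -pbipath "$(Build.SourcesDirectory)/MyReport.Report"
      -rulespath "Base-rules.json" -formats "ADO"
    displayName: 'Check report with PBI-Inspector'
```

Both tools can fail the pipeline when a rule is violated, which is what makes them useful as CI gates.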
Plus, I covered Power BI Project (PBIP) and Azure DevOps CI performance tests in a previous post as well.
Plan and implement deployment solutions
In reality, the description for this section in the study guide can be interpreted in various ways. However, I think it predominantly refers to both Git integration and deployment pipelines.
I covered Git integration above. So, I will focus more on deployment pipelines here.
However, I do want to mention that you can work with both together by enabling Git integration for a workspace and then specifying it as the first stage in a deployment pipeline.
Anyway, deployment pipelines allow you to deploy various items to multiple workspaces through various stages. For example, one workspace for the Dev stage, another workspace for the Test stage, etc.
They were initially introduced in Power BI and have been adapted to work with various items in Microsoft Fabric.
One of the recommended modules covers how to create and manage a Power BI deployment pipeline. However, that module was initially created for Power BI.
With this in mind I do recommend also going through the overview of Fabric deployment pipelines provided by Microsoft as well.
Plus, some of the updates I covered last year in a post about Git integration and deployment pipeline updates on the Microsoft Fabric roadmap are now available. So, it is worth keeping an eye on the list of supported items.
Perform impact analysis
Impact analysis is a very interesting topic within Microsoft Fabric for a variety of reasons. Yet it is not as popular as some other topics.
It basically allows you to view the potential impact of any changes you make to certain items, by showing you which items rely on them further downstream. This helps you avoid introducing any breaking changes, or at least resolve them quickly.
Take for example a workspace that I use to test deployment pipelines. When I switch to lineage view within the workspace I see the below.
If I then click on the card for the SQL Server database as indicated above the below impact analysis appears.
As you can see, this can help a lot when identifying downstream dependencies.
Deploy and manage semantic models by using the XMLA endpoint
On that note, just a reminder that datasets were renamed to semantic models a while back. I kept the original title of the post above for context.
Anyway, the XMLA endpoint allows you to connect to semantic models stored in Microsoft Fabric with a variety of tools, including Tabular Editor and SQL Server Management Studio.
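For example, in SQL Server Management Studio you connect as if it were an Analysis Services server, using the workspace's XMLA connection address (swap in your own workspace name):

```
Server type:    Analysis Services
Server name:    powerbi://api.powerbi.com/v1.0/myorg/<Workspace Name>
Authentication: Microsoft Entra MFA
```

The same address works in Tabular Editor and other tools that understand XMLA endpoints.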
I showed how to enable read-write for XMLA endpoints within various capacities in my last post about the Microsoft Fabric admin portal.
Patrick LeBlanc shows the entire journey in action in a video that covers how to edit your Direct Lake Datasets (now semantic models) from Tabular Editor.
Create and deploy reusable assets
In reality, this section will be familiar to those who have studied for other Power BI exams, because it states that you need to know about some of the different Power BI file types and assets.
In particular, the analytics development lifecycle requirements mention the below types of assets:
- Power BI template (.pbit) files, which allow you to create report templates within Power BI Desktop.
- Power BI data source (.pbids) files, which allow you to save data source connection details in a file, so that others can open it and connect to the data to start working on reports.
- Shared semantic models, so that multiple people can create reports based on the same model. Within a Microsoft Fabric workspace it is easy to share a semantic model, as long as you have the right permissions in the workspace of course.
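As an illustration, a .pbids file is just a small JSON document. A minimal sketch for a SQL Server source looks like the below, with the server and database names as placeholders:

```json
{
  "version": "0.1",
  "connections": [
    {
      "details": {
        "protocol": "tds",
        "address": {
          "server": "myserver.database.windows.net",
          "database": "SalesDb"
        }
      }
    }
  ]
}
```

Opening the file in Power BI Desktop starts a new report already pointed at that data source.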
Upcoming sessions relating to the analytics development lifecycle requirements
Towards the end of the week, I am speaking at two separate events relating to this post, in two slightly different ways.
On Thursday February 1st I am speaking online for the Birmingham Microsoft Data Platform Group, where I am presenting a session called “Microsoft Fabric and Azure DevOps – The story so far”.
You can register to attend virtually on Meetup.
At the second event, we are presenting a brand-new session called “Two-person boblet introduction to the DP-600 exam”, which will be a unique online experience.
It is a topic that we both know well, due to the fact that we deliver custom training about the exam for our employer.
Final words about the Analytics development lifecycle requirements for the DP-600 exam
I do hope my own personal take on the analytics development lifecycle requirements for the DP-600 Microsoft Fabric exam helps some of you.
I wanted to fill in some gaps with this post. Plus, I wanted to make sure I shared plenty of links for those studying for the exam at the moment.
If you want other study resources for the exam, I recommend that you read my previous post. Of course, if you have any comments or queries about this post, feel free to reach out to me.