
Analytics development lifecycle requirements for the DP-600 Microsoft Fabric exam


In this post I want to cover the analytics development lifecycle requirements for the DP-600 Microsoft Fabric exam, based on the contents of the DP-600 study guide.

I wanted to do this post for a few reasons.

The first reason is the number of questions I get about it, since I tend to do a lot of work in the CI/CD area.

The second reason is that I wanted to follow up on my comment about this in a previous post, where I shared my thoughts about the DP-600 exam for the new Microsoft Fabric certification.

Thirdly, because I think that there are some gaps in the recommended Microsoft Learn material for the exam. Most notably for Power BI projects, which are still in preview. So, I want to highlight where some of my previous posts cover these items.

By the end of this post, you will know more about what is involved. Plus, I share plenty of links along the way. In addition, I mention a couple of upcoming sessions where I share details about this area as well.

But first, allow me to dispel a myth about CI/CD for Data Warehouses.

CI/CD for Data Warehouses

Before I go any further, I want to make it clear that CI/CD for Microsoft Fabric Data Warehouses is possible. However, it is currently not supported by Microsoft Fabric Git integration.

I cover how you can do this in a couple of posts, including one that covers CI/CD for Microsoft Fabric Data Warehouses using YAML Pipelines, where I also share details about a GitHub repository that you can download and use as a template.

However, CI/CD for Data Warehouses is currently not mentioned at all in the study guide, which suggests that you do not need to know it for the exam. I just wanted to highlight that it is possible.

Anyway, below I will focus on what is required for the analytics development lifecycle according to the DP-600 study guide. However, I cover the requirements in a slightly different order.

Create and manage a Power BI project (.pbip)

I want to cover Power BI projects first because they are not covered in the recommended Microsoft Learn material for the exam. However, Power BI projects are mentioned in the study guide, even though the feature itself is still in preview.

You can think of Power BI projects along the same lines as SQL Server Database projects. Because when you save a report as a Power BI project it saves the metadata (code) of your Power BI report and the semantic model within a folder.

Selecting Power BI project files

You can then work with these files in other applications such as Visual Studio Code, like in the below example that I shared in a post about my initial Microsoft Fabric Git integration tests for Power BI reports.

Viewing Power BI project in Visual Studio Code
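To make the folder idea more concrete, here is a minimal sketch that lists the metadata files a Power BI project saves. The project name and path are placeholders for wherever you saved your own project:

```python
from pathlib import Path

# Path to the folder where the Power BI project was saved (placeholder).
project_root = Path(r"C:\PowerBI\MyReport")

# Walk the project folder to see the report and semantic model metadata
# files that Power BI Desktop created when saving the .pbip project.
for item in sorted(project_root.rglob("*")):
    if item.is_file():
        print(item.relative_to(project_root))
```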

Alternatively, you can now enable the Tabular Model Definition Language (TMDL) format feature in Power BI Desktop for your semantic model, which also allows you to upgrade an existing Power BI project to the TMDL file format.

New TMDL folder structure

Note that the above structure will differ if you create a new Power BI project now, because the folder name now reflects the fact that datasets have been renamed to semantic models.

Anyway, you can take the above one step further by initializing the folder that the files are saved in to make it a Git repository. You can then synchronize it with a Git repository stored in the Azure Repos service in Azure DevOps and use it along with the Git integration covered in the next section of this post.

I also cover that in my post about my initial Microsoft Fabric Git integration tests for Power BI reports.
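As a rough sketch, initializing the saved folder and pointing it at Azure Repos only takes a few Git commands. Below they are wrapped in Python for consistency with the other examples in this post; the folder path and remote URL are placeholders for your own organization, project and repository:

```python
import subprocess

# Folder where the Power BI project files were saved (placeholder path).
project_folder = r"C:\PowerBI\MyReport"

# Placeholder Azure Repos URL; replace with your own organization, project and repo.
remote_url = "https://dev.azure.com/MyOrg/MyProject/_git/MyRepo"

commands = [
    ["git", "init"],                                  # make the folder a Git repository
    ["git", "add", "."],                              # stage the Power BI project files
    ["git", "commit", "-m", "Initial Power BI project commit"],
    ["git", "branch", "-M", "main"],                  # make sure the branch is called main
    ["git", "remote", "add", "origin", remote_url],
    ["git", "push", "-u", "origin", "main"],
]

for args in commands:
    subprocess.run(args, cwd=project_folder, check=True)
```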

Implement version control for a workspace

The first item listed in the ‘Manage the analytics development lifecycle‘ section of the study guide mentions version control for a Microsoft Fabric workspace. In other words, Git integration, where you connect a Microsoft Fabric workspace to a Git repository.

Workspace settings for Git integration

From there you can work directly with source control for various items in the Microsoft Fabric workspace. Note that the list of supported items changes often, so it is worth checking occasionally.

Currently, Git integration is only supported with the Azure Repos service in Azure DevOps. I have covered how you can configure this in various posts, including one that shows how you can work with Microsoft Fabric Git integration and multiple workspaces.
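For those who prefer to script the connection instead of using workspace settings, the Fabric REST API exposes Git endpoints. Below is a minimal sketch based on the endpoint and payload shapes documented at the time of writing; treat the exact shapes, plus the placeholder IDs and names, as assumptions to verify against the current documentation:

```python
import requests

# Placeholders: your workspace ID and a bearer token with the right scopes.
workspace_id = "00000000-0000-0000-0000-000000000000"
token = "<access token acquired via Microsoft Entra ID>"

# Connect the workspace to a branch in an Azure Repos repository.
response = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/git/connect",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "gitProviderDetails": {
            "gitProviderType": "AzureDevOps",
            "organizationName": "MyOrg",
            "projectName": "MyProject",
            "repositoryName": "MyRepo",
            "branchName": "main",
            "directoryName": "/",
        }
    },
)
response.raise_for_status()
```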

Plus, Git integration introduces other possibilities as well. For example, it allows you to perform Continuous Integration (CI) tests on Power BI reports, utilizing Azure Pipelines within Azure DevOps.

Microsoft produced a Power BI Project (PBIP) and Azure DevOps build pipelines for continuous integration guide on how to do this, covering how to automate checking reports with both Tabular Editor 2 and PBI-Inspector.

Plus, I covered Power BI Project (PBIP) and Azure DevOps CI performance tests in a previous post as well.
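To give a flavour of what such a CI check can look like, below is a rough sketch that runs Tabular Editor 2’s command-line Best Practice Analyzer against the model.bim file in a Power BI project. The paths and rules file are placeholders, and the command-line switch should be verified against the Tabular Editor 2 documentation for the version you use:

```python
import subprocess
import sys

# Placeholders: adjust to where Tabular Editor 2, the model file and the
# Best Practice Analyzer rules live on your build agent.
tabular_editor = r"C:\Tools\TabularEditor\TabularEditor.exe"
model_file = r"C:\PowerBI\MyReport.Dataset\model.bim"  # semantic model part of the PBIP
rules_file = r"C:\PowerBI\BPARules.json"

# -A runs the Best Practice Analyzer with the given rules file.
result = subprocess.run([tabular_editor, model_file, "-A", rules_file])

# Pass the exit code on so that rule violations fail the pipeline step.
sys.exit(result.returncode)
```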

Plan and implement deployment solutions

In reality, the description for this section in the study guide can be interpreted in various ways. However, I think it predominantly refers to both Git integration and deployment pipelines.

I covered Git integration above. So, I will focus more on deployment pipelines here.

However, I do want to mention that you can work with both together by enabling Git integration for a workspace and then specifying that workspace as the first stage in a deployment pipeline.

Anyway, deployment pipelines allow you to deploy various items to multiple workspaces through various stages. For example, one workspace for the Dev stage, another workspace for the Test stage, and so on.

The feature was initially introduced in Power BI and has been adapted to work with various items in Microsoft Fabric.

One of the recommended modules covers how to create and manage a Power BI deployment pipeline. However, that module was initially created for Power BI.

With this in mind, I do recommend also going through the overview of Fabric deployment pipelines provided by Microsoft.

Plus, some of the updates I covered last year in a post about Git integration and deployment pipeline updates on the Microsoft Fabric roadmap are now available. So, it is worth keeping an eye on the list of supported items.
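Deployments can also be triggered programmatically. Below is a minimal sketch using the Power BI REST API’s deployment pipeline “Deploy All” operation; the pipeline ID and token are placeholders, and the payload shape should be checked against the current API reference:

```python
import requests

# Placeholders: your deployment pipeline ID and an access token.
pipeline_id = "00000000-0000-0000-0000-000000000000"
token = "<access token>"

# Deploy everything from the first stage (Dev, order 0) to the next stage.
response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deployAll",
    headers={"Authorization": f"Bearer {token}"},
    json={"sourceStageOrder": 0},
)
response.raise_for_status()
print(response.status_code)  # 202 means the deployment request was accepted
```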

Perform impact analysis

Impact analysis is a very interesting topic within Microsoft Fabric for a variety of reasons. Yet it is not as popular as some other topics.

It basically allows you to view the potential impact of any changes you make to certain items, by showing you which items rely on them further downstream.

This helps you avoid introducing any breaking changes, or at least resolve them quickly.

Take for example a workspace that I use to test deployment pipelines. When I switch to lineage view within the workspace I see the below.

Lineage view in a workspace

If I then click on the card for the SQL Server database as indicated above, the below impact analysis appears.

Impact analysis for changes to database

As you can see, this can help a lot when identifying downstream dependencies.

Deploy and manage semantic models by using the XMLA endpoint

Last year Microsoft introduced the ability to connect to XMLA endpoints to work with semantic models directly within Microsoft Fabric, including XMLA support for Direct Lake datasets.

On that note, just a reminder that datasets were renamed to semantic models a while back. I kept the original title of the post above for context.

Anyway, this allows you to connect to semantic models stored in Microsoft Fabric with a variety of tools, including Tabular Editor and SQL Server Management Studio.

I showed how to enable read-write for XMLA endpoints within various capacities in my last post about the Microsoft Fabric admin portal.

Patrick LeBlanc shows the entire journey in action in a video that covers how to edit your Direct Lake Datasets (now semantic models) from Tabular Editor.
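To give an idea of what working with the XMLA endpoint looks like outside of those tools, here is a rough sketch that runs a DAX query against a semantic model using the third-party pyadomd package (which requires the ADOMD.NET client libraries on a Windows machine). The workspace name, model name and measure are placeholders:

```python
from pyadomd import Pyadomd

# XMLA endpoint connection string; workspace and model names are placeholders.
conn_str = (
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace;"
    "Initial Catalog=MySemanticModel;"
)

# Run a simple DAX query against the semantic model.
with Pyadomd(conn_str) as conn:
    with conn.cursor().execute('EVALUATE ROW("Total", [Total Sales])') as cur:
        print(cur.fetchall())
```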

Create and deploy reusable assets

In reality, this section will be familiar to those who have studied for other Power BI exams, because it states that you need to know about some of the different Power BI file types and assets.

In particular, the analytics development lifecycle requirements mention the below types of assets:

  • Power BI template (.pbit) files, which allow you to create report templates within Power BI Desktop.
  • Power BI data source (.pbids) files, which allow you to save data sources in a file so that others can open it and connect to the data to start working on reports. I share a sketch of one after this list.
  • Shared semantic models, so that multiple people can create reports on the same one. Within a Microsoft Fabric workspace it is easy to share a semantic model, as long as you have the right permissions in the workspace of course.
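To make the .pbids format more concrete, the sketch below writes a minimal data source file for a hypothetical SQL Server source. The server and database names are placeholders, and the structure follows the format Microsoft documents for .pbids files:

```python
import json

# Minimal .pbids content for a hypothetical SQL Server data source.
pbids = {
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "tds",  # protocol identifier for SQL Server sources
                "address": {
                    "server": "myserver.database.windows.net",  # placeholder
                    "database": "SalesDB",                      # placeholder
                },
            },
            "mode": "Import",
        }
    ],
}

# Save the file; anyone opening it in Power BI Desktop is prompted to
# connect to this data source straight away.
with open("SalesDB.pbids", "w") as f:
    json.dump(pbids, f, indent=2)
```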

Upcoming sessions relating to the analytics development lifecycle requirements

Towards the end of the week, I am speaking at two separate events relating to this post, in two slightly different ways.

On Thursday February 1st I am speaking online for the Birmingham Microsoft Data Platform Group, where I am presenting a session called “Microsoft Fabric and Azure DevOps – The story so far”.

You can register to attend virtually on Meetup.

Plus, two days later on the Saturday I am presenting again with Pragati Jain (l/t) at Data Toboggan.

We are presenting a brand-new session called “Two-person boblet introduction to the DP-600 exam”, which will be a unique online experience.

It is a topic that we both know well, because we deliver custom training about the exam for our employer.

Final words about the Analytics development lifecycle requirements for the DP-600 exam

I do hope my own personal take on the analytics development lifecycle requirements for the DP-600 Microsoft Fabric exam helps some of you.

I wanted to fill in some gaps with this post. Plus, I wanted to make sure I shared plenty of links for those studying for the exam at the moment.

If you want other study resources for the exam, I recommend that you read my previous post. Of course, if you have any comments or queries about this post feel free to reach out to me.

