
Initial tests to copy a Direct Lake semantic model to another workspace using Microsoft Fabric Git integration

Reading Time: 7 minutes

In this post I want to share the results of some initial tests to copy a Direct Lake semantic model to another workspace using Microsoft Fabric Git integration.

For example, to support Development, Test and Production environments, just like in my previous post about deploying to Development, Test and Production environments.

I wanted to cover this since I have been asked about topics relating to it a couple of times.

By the end of this post, you will know how to copy a Direct Lake semantic model to another workspace using Microsoft Fabric Git integration. Along the way I share plenty of links.

If you intend to follow along with this post it might be a good idea to have files from another Power BI Project locally. Like the one I mentioned in my previous post about upgrading an existing Power BI Project to include TMDL file format.

In addition, a separate Microsoft Fabric workspace to create the new objects.

The current issue with copying a Direct Lake semantic model

Currently, semantic models that use Direct Lake are not natively supported by Git integration.

For example, below I have created a custom semantic model that uses Direct Lake mode. I can tell that it is using Direct Lake mode because when I hover over the table it tells me. Plus, when I select the semantic model it shows compatibility level 1604.

Initial Direct Lake semantic model

However, when I look in the workspace that has Git integration configured, I can see it is unsupported.

Manually created Direct Lake semantic model is unsupported

Which means that a manually created semantic model cannot be natively saved to the Azure Repos repository so that it can be set up in another workspace.

Initial tests to copy a Direct Lake semantic model by creating an additional report

However, I can create a report based on the semantic model and download the ‘.pbix’ file for it.

I did this by selecting File -> Download this file. In the window that appeared I selected the second option, as below. Otherwise, the report will not open in Power BI Desktop.

Selecting second option in Power BI Desktop

The report opened in Power BI Desktop. When I went to save it as a Power BI Project, the project only contained the files for the report. However, the ‘definition.pbir’ file pointed to the semantic model used in Microsoft Fabric, as you can see below.

{
  "version": "1.0",
  "datasetReference": {
    "byPath": null,
    "byConnection": {
      "connectionString": "Data Source=powerbi://api.powerbi.com/v1.0/myorg/xxxxx;Initial Catalog=DLTestDev;Access Mode=readonly;Integrated Security=ClaimsToken",
      "pbiServiceModelId": null,
      "pbiModelVirtualServerName": "sobe_wowvirtualserver",
      "pbiModelDatabaseName": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
      "name": "EntityDataSource",
      "connectionType": "pbiServiceXmlaStyleLive"
    }
  }
}
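
As a side note, you can quickly check whether a downloaded report is live-connected by inspecting the ‘datasetReference’ block. A minimal sketch in Python, with the JSON inlined for illustration (the connection string here is shortened):

```python
import json

# definition.pbir contents from above, inlined for illustration;
# in practice read the file from the saved Power BI Project folder
pbir = """
{
  "version": "1.0",
  "datasetReference": {
    "byPath": null,
    "byConnection": {
      "connectionString": "Data Source=powerbi://api.powerbi.com/v1.0/myorg/xxxxx;Initial Catalog=DLTestDev",
      "connectionType": "pbiServiceXmlaStyleLive"
    }
  }
}
"""

ref = json.loads(pbir)["datasetReference"]
# A live connection to the service sets byConnection; a local model sets byPath
mode = "live connection" if ref.get("byConnection") else "local model"
print(mode)  # live connection
```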

Out of curiosity, I decided to test the ‘Make changes to this model’ option in the Model section and save the result as a different Power BI Project. Doing this gave me the project files used in the semantic model, as you can see below.

Folder structure after selecting to make changes to the model

However, this also means that the semantic model uses DirectQuery mode against the original semantic model when published into Fabric using Git integration.

In the ‘expressions.tmdl’ file I can see that the source in the expression references the original semantic model.

Source = AnalysisServices.Database("powerbi://api.powerbi.com/v1.0/myorg/xx", "DLTestDev"),

Copying the Direct Lake semantic model using Tabular Editor 3

I then decided to do some testing with Tabular Editor 3. So, I connected to the original Direct Lake semantic model that I wanted to copy to another workspace, as you can see below.

Connection to original semantic model

I first tested deploying the model directly from Tabular Editor 3 to a workspace that has Git integration configured. This works; however, after committing the change in the workspace I saw in the Azure Repos service in Azure DevOps that a ‘model.bim’ file had been created.

Contents of Git repository when deploying a semantic model into Microsoft Fabric using Tabular Editor 3

Which in itself is fine. However, I wanted a solution that used the TMDL file format, which I covered in my previous post about upgrading an existing Power BI Project.

So, I decided to see what happens if I save the semantic model to a folder using Tabular Editor 3 instead.

Saving the semantic model in Tabular Editor 3

First, note that saving to folder in the TMDL format needs to be enabled in Tabular Editor 3, via Preferences -> File Formats -> Save-to-folder -> Serialization mode -> TMDL (preview). I then changed the save preference in Tabular Editor 3 to save as the 1604 compatibility level, which is the current compatibility level for Direct Lake.

Changing compatibility level in Tabular Editor 3

I then saved the files for the model locally into a ‘DLTest.dataset\definition’ folder structure that I had created.

Saved contents from Tabular Editor 3

When I checked the ‘database.tmdl’ file I saw that the correct compatibility level was there.

database DLTestDev
	id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
	compatibilityLevel: 1604
	language: 1033

In addition, I changed the name of the database value in the ‘database.tmdl’ file to reflect the new name I wanted for the semantic model.

database DLTest
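
Checks and renames like this can also be scripted rather than done by hand. A minimal sketch, working on the ‘database.tmdl’ contents inlined for illustration (in practice you would read and write the file, for example with pathlib):

```python
# database.tmdl contents from above, inlined for illustration
tmdl = """database DLTestDev
\tid: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
\tcompatibilityLevel: 1604
\tlanguage: 1033
"""

# Confirm the Direct Lake compatibility level survived the save
assert "compatibilityLevel: 1604" in tmdl

# Rename the database object to the new semantic model name
renamed = tmdl.replace("database DLTestDev", "database DLTest", 1)
print(renamed.splitlines()[0])  # database DLTest
```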

Afterwards, I added the below three files from another Power BI Project that contained a dataset to the root of the ‘DLTest.dataset’ folder.

  • definition.pbidataset
  • item.config.json
  • item.metadata.json

Once done, I changed the displayName value in the ‘item.metadata.json’ file as below.


  "displayName": "DLTest"

I then saved the files, committed the changes and synchronized the repository with the one in Azure Repos.

When I refreshed my workspace in Microsoft Fabric, I got informed that I had pending updates.

So, I went into Source control and selected ‘Update all’.

Resolving UTF-8 issue

At first, I got a parsing error, which showed an error code of ‘Workload_FailedToParseTmdlFile’.

When I looked in Azure Data Studio, I noticed that the TMDL files that had been saved by Tabular Editor were being recognized as ‘UTF-8 with BOM’. So, I changed the TMDL files to UTF-8 using Azure Data Studio.
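
Changing the encoding file by file gets tedious with a larger model, so this step can be scripted. A minimal sketch that re-saves every TMDL file under a folder as UTF-8 without a BOM (the folder path is an assumption based on my structure):

```python
from pathlib import Path

def strip_bom(folder: str) -> None:
    """Re-save every .tmdl file under folder as UTF-8 without a BOM."""
    for tmdl in Path(folder).rglob("*.tmdl"):
        # utf-8-sig silently drops the BOM, if present, when decoding
        text = tmdl.read_text(encoding="utf-8-sig")
        tmdl.write_text(text, encoding="utf-8")
```

For example, `strip_bom("DLTest.dataset/definition")` would fix all the files saved by Tabular Editor in one go.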

I then committed the change and synchronized again. This time it worked.

When I looked in Microsoft Fabric it all seemed well. Until I decided to open up the data model and it showed an error.

Data model showing an error

In addition, when I hovered over the table it gave me a very interesting last refresh date from the year 1699.

So, I clicked on the ‘New report’ button to see if I got any immediate details. Which is when I got the below error.

This data source 2c01b432-24d0-43d6-9975-6edc4cfc3d52 is missing credentials. Add the data source credentials by using the dataset settings page.

At this stage I could have just amended the credentials to point to the original Lakehouse to resolve this. However, I wanted the semantic model to point to a new Lakehouse that was in the same workspace as the new semantic model.

Pointing the copied Direct Lake semantic model to a new Lakehouse

In order to test connecting to a new Lakehouse I created a new Lakehouse in the same workspace as the semantic model and uploaded the same data into it.

I then navigated to the SQL analytics endpoint for the Lakehouse and copied the SQL connection string for it.

Afterwards, I edited the ‘expressions.tmdl’ file in my local clone (copy) of the repository that stores the semantic model to the below.

expression DatabaseQuery =
		let
		    database = Sql.Database("{NEW CONNECTION STRING}", "2c01b432-24d0-43d6-9975-6edc4cfc3d52")
		in
		    database
	lineageTag: 617129de-adbe-4f7b-abea-edc4174518e2

	annotation PBI_IncludeFutureArtifacts = False
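
Repointing the expression can also be scripted rather than edited by hand. A minimal sketch that swaps the server and database arguments inside the `Sql.Database` call (the endpoint names below are made up):

```python
import re

def repoint(expression_text: str, new_server: str, new_database: str) -> str:
    """Replace the server and database arguments of the Sql.Database call."""
    return re.sub(
        r'Sql\.Database\("[^"]*",\s*"[^"]*"\)',
        f'Sql.Database("{new_server}", "{new_database}")',
        expression_text,
    )

# Line from expressions.tmdl, inlined for illustration
old = 'database = Sql.Database("old-endpoint.example.com", "11111111-1111-1111-1111-111111111111")'
new = repoint(old, "new-endpoint.example.com", "2c01b432-24d0-43d6-9975-6edc4cfc3d52")
print(new)
```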

I then committed my changes locally and synchronized my local repository with the one in Azure Repos. Afterwards, I went back into Microsoft Fabric and committed and updated my outstanding changes.

I then went back to my new semantic model and opened up the data model again. It still showed an error, so I clicked on the ‘New report’ button again. However, this time the report worked, and I could select data.

So, I went back into the data model, which was reporting an issue. It was still showing an interesting refresh date.

Data model showing interesting refresh date

To resolve this, I went back into the workspace and clicked on “Refresh now”.

Doing this resolved this particular issue. Because when I went to look at the data model again everything looked good.

When I hovered over the table it showed that it is using Direct Lake mode. Plus, it still showed the correct compatibility level for Direct Lake mode.

Data model with sensible refresh date

Verifying my initial tests to copy a Direct Lake semantic model

I tested creating a couple of reports afterwards to check that it really was getting data from the model. Including one which was just a table that showed all the fields.

I then compared these reports with ones based on the original semantic model. To confirm that my attempt to copy a Direct Lake semantic model to another workspace using Microsoft Fabric Git integration worked. It all worked as expected.

Final words about my initial tests to copy a Direct Lake semantic model to another workspace

I hope that sharing my initial tests to copy a Direct Lake semantic model to another workspace using Microsoft Fabric Git integration has inspired some of you. Especially since I know that some people were wondering about this.

As you can see, I performed some manual tasks to get this to work gracefully. However, some of these can be automated. In addition, I am aware that there are other ways this can be achieved as well.

Of course, if you have any comments or queries about this post feel free to reach out to me. Especially if you test this yourself.

Published in Azure DevOps, Microsoft Fabric

3 Comments

  1. Erfan Mahmoodnejad

    Hi Kevin,

    Awesome post as always! good job!
    Would you please mention in your post that the TMDL option needs to be enabled on Tabular Editor as well? It took me a while to find it.
    Preferences -> File Formats -> Save-to-folder-> Serialization mode -> TMDL(preview)

    Thanks

    Erfan

    • Kevin Chant

      Hi Erfan,

      I double checked today and it is already stated in the post at the start of the “Saving the semantic model in Tabular Editor 3” section.

      Kind regards

      Kevin
