In this post I show one way you can deploy a working Direct Lake semantic model with fabric-cicd and Fabric CLI as part of your Microsoft Fabric CI/CD strategy.
To clarify, by “working” I mean that the Direct Lake semantic model points to the correct SQL analytics endpoint for a Lakehouse in the new workspace, as in the below example for the popular FUAM monitoring solution.

This matters because when you deploy a Direct Lake semantic model based on the contents of a Git repository, by default it still points to the original SQL endpoint and database. That causes errors when the model is opened and prevents any refreshes.

You must change both the SQL endpoint and the database for the Direct Lake semantic model to work properly.
In this post I show one way to do this with a GitHub workflow, based on the FUAM deployment accelerator that I created.
To manage expectations, I only cover the important steps rather than the entire GitHub workflow.
In addition, I do not cover how to create the tables in the deployed Lakehouse as part of this process, which is a requirement for your Direct Lake semantic model to work. You can see an example of how to do that in my GitHub-FUAM-Deploymenator GitHub repository.
Nor do I dive into the setup of variables and parameters in a GitHub Actions workflow. You can read more about those in my post about managing Fabric connections with Fabric CLI when performing CI/CD.
However, I do share plenty of details and links.
Deploy a Direct Lake semantic model with fabric-cicd and Fabric CLI
This example assumes that a workspace is available and that a service principal has been provisioned. I have split this section into three parts.
The first part covers what is required in the “parameter.yml” file, the second covers the typical preparation steps in the GitHub Actions workflow itself, and the third covers the logic steps for the main deployment.
Find_replace values required for parameter file
When you work with fabric-cicd you can manage parameterization through a “parameter.yml” file. This particular example uses a combination of a generic find_replace operation and a dynamic replacement.
The generic find_replace operation replaces the database id value, which we find by other means, while the dynamic replacement substitutes the SQL connection string for the SQL endpoint.
With this in mind, the “parameter.yml” file should contain both, as in the below example.
find_replace:
  # Replace SQL Endpoint connection string
  - find_value: "madeupconnection.datawarehouse.fabric.microsoft.com" # Original SQL endpoint connection string
    replace_value:
      Test: "$items.Lakehouse.FUAM_Lakehouse.sqlendpoint" # New SQL Endpoint connection string
      Prod: "$items.Lakehouse.FUAM_Lakehouse.sqlendpoint" # New SQL Endpoint connection string
  # Replace SQL Endpoint database id
  - find_value: "44444444-4444-4444-4444-444444444444" # Original SQL endpoint database id value
    replace_value:
      Test: "databaseid" # New SQL Endpoint database id
      Prod: "databaseid" # New SQL Endpoint database id
    item_type: "SemanticModel"
Other replacement values
In reality, the “parameter.yml” file will also contain other values. However, the above highlights the two key sets of replacement values required for a Direct Lake semantic model to work with this method.
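As an illustration of what those other values might look like, the hypothetical entry below repoints notebooks at the new Lakehouse id using the same $items variable syntax. The GUID and the Notebook scope here are made up for the example and are not part of the FUAM solution.

```yaml
find_replace:
  # Hypothetical extra entry: repoint notebooks at the new Lakehouse id
  - find_value: "11111111-1111-1111-1111-111111111111" # Original Lakehouse id (made up)
    replace_value:
      Test: "$items.Lakehouse.FUAM_Lakehouse.id" # New Lakehouse id
      Prod: "$items.Lakehouse.FUAM_Lakehouse.id" # New Lakehouse id
    item_type: "Notebook"
```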
Preparation steps in your GitHub workflow
The first step specifies the Python version to work with.
- name: Setup Python
  uses: actions/setup-python@v5.5.0
  with:
    # Version range or exact version of Python or PyPy to use, using SemVer's version range syntax. Reads from .python-version if unset.
    python-version: "3.12"
Once done, the GitHub workflow installs the necessary libraries on the GitHub Runner.
# Install necessary libraries
- name: Install necessary python libraries
  run: |
    python -m pip install --upgrade pip
    pip install ms-fabric-cli
    pip install fabric-cicd
Afterwards, a step authenticates as a service principal for Fabric CLI.
- name: Authenticate as Service Principal for Fabric-CLI
  run: |
    # Setting plain-text fallback which is recommended for CI/CD on GitHub runners
    fab config set encryption_fallback_enabled true
    fab auth login -u ${{github.event.inputs.Client_ID}} -p ${{github.event.inputs.Client_Secret}} --tenant ${{github.event.inputs.Azure_Tenant_ID}}
Another step checks out the Git repository on the GitHub runner, so that the workflow can change files in the repository properly.
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v4.2.2
Afterwards, a step authenticates as a service principal to work with fabric-cicd.
# Authenticate as Service Principal for fabric-cicd
- name: Authenticate as Service Principal for fabric-cicd
  run: |
    Install-Module -Name Az.Accounts -AllowClobber -Force
    $SecureStringPwd = ConvertTo-SecureString ${{github.event.inputs.Client_Secret}} -AsPlainText -Force
    $pscredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList ${{github.event.inputs.Client_ID}}, $SecureStringPwd
    Connect-AzAccount -ServicePrincipal -Credential $pscredential -Tenant ${{github.event.inputs.Azure_Tenant_ID}}
Logic steps to deploy a Direct Lake semantic model with fabric-cicd and Fabric CLI
In order to deploy a Direct Lake semantic model that points to a newly deployed SQL analytics endpoint for a Lakehouse, both the SQL endpoint connection string and the database id value are required.
Currently, only the SQL endpoint connection string can be dynamically replaced with fabric-cicd. With this in mind, I came up with a double-tap approach, which requires two deployments with fabric-cicd, as per the below diagram.

Performing two deployments means that, after the first deployment, the GitHub workflow can retrieve the details of the SQL analytics endpoint deployed alongside the Lakehouse. It then has the correct values in place for the Direct Lake semantic model deployment.
One thing I must stress is that the source tables for the Direct Lake semantic model must exist in the new Lakehouse before any work is done with the semantic model. Creating them can happen at any stage after the first fabric-cicd deployment.
I do not cover adding the tables in this post. However, you can see an example of how to do that in my GitHub-FUAM-Deploymenator GitHub repository.
The first logical step in this process deploys all of the Fabric items in the Git repository apart from the reports and semantic models, so that the relevant Lakehouse and SQL analytics endpoint are deployed first.
# Run script to deploy all items APART FROM reports and semantic models with fabric-cicd to new workspace
- name: Run script to get the workspace Id and deploy FUAM with fabric-cicd to new workspace
  run: |
    $WorkspaceId = fab get ${{github.event.inputs.WorkspaceName}}.Workspace -q id
    python auth_spn_secret_AzDo.py --WorkspaceId $WorkspaceId --Environment "Prod" --RepositoryDirectory ".\workspace" --ItemsInScope ${{env.FirstItemsInScope}}
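The auth_spn_secret_AzDo.py script itself is not shown in this post. As a rough illustration of what such a script might do with fabric-cicd's public API, here is a minimal sketch; treat the function names and structure as assumptions, not the actual script.

```python
# Minimal sketch of what a script like auth_spn_secret_AzDo.py might do with
# fabric-cicd. This is an illustration, not the actual script from the repo.

def parse_scope(items_in_scope: str) -> list[str]:
    """Turn a comma-separated --ItemsInScope value into the list fabric-cicd expects."""
    return [item.strip() for item in items_in_scope.split(",") if item.strip()]

def deploy(workspace_id: str, environment: str,
           repository_directory: str, items_in_scope: str) -> None:
    # Imported here so parse_scope stays usable without fabric-cicd installed
    from fabric_cicd import FabricWorkspace, publish_all_items

    workspace = FabricWorkspace(
        workspace_id=workspace_id,
        environment=environment,
        repository_directory=repository_directory,
        item_type_in_scope=parse_scope(items_in_scope),
    )
    publish_all_items(workspace)
```

In the workflow above, the command-line arguments (--WorkspaceId and so on) would be parsed, for example with argparse, and passed to a function like deploy.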
Once done, another step issues a fab api command to get the database id value for the deployed SQL endpoint, which then replaces the static “databaseid” placeholder in the “parameter.yml” file.
# Run script to find the databaseid values for the new SQL endpoints and replace them in the parameter file
- name: Run script to find the databaseid values for the new SQL endpoints
  run: |
    $WorkspaceId = fab get ${{github.event.inputs.WorkspaceName}}.Workspace -q id
    $response = fab api -X get "/workspaces/$WorkspaceId/items?itemType=SQLEndpoint"
    $data = $response | ConvertFrom-Json
    $databaseid = ($data.text.value | Where-Object {
        $_.displayName -eq "FUAM_Lakehouse" -and $_.type -eq "SQLEndpoint"
    }).id
    # Path to parameter.yml file
    $filePath = "workspace\parameter.yml"
    # Read file, replace value, overwrite file
    (Get-Content $filePath) -replace 'databaseid', $databaseid | Set-Content $filePath
The final step deploys the reports and semantic models stored in the Git repository with fabric-cicd, ensuring that the Direct Lake semantic model uses the SQL endpoint connection string and database id values specified in the “parameter.yml” file.
- name: Run script to get the workspace Id and deploy FUAM with fabric-cicd to new workspace
  run: |
    $WorkspaceId = fab get ${{github.event.inputs.WorkspaceName}}.Workspace -q id
    python auth_spn_secret_AzDo.py --WorkspaceId $WorkspaceId --Environment "Prod" --RepositoryDirectory ".\workspace" --ItemsInScope ${{env.SecondItemsInScope}}
Testing that the Direct Lake semantic model works
Providing that the above steps have completed and the tables exist in the newly provisioned Lakehouse, you can go into the Microsoft Fabric workspace and check that the new semantic model is working.
You can also perform a quick test by starting a semantic model refresh.
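If you prefer to script that refresh test rather than use the portal, one option is the Power BI REST API's on-demand dataset refresh endpoint. The sketch below only builds and sends the request; the workspace id, semantic model id, and token acquisition are placeholders you would supply yourself, and this step is not part of the workflow above.

```python
# Hedged sketch: trigger a semantic model refresh via the Power BI REST API.
# The ids and the bearer token are placeholders, not values from this post.
import urllib.request

def refresh_url(workspace_id: str, dataset_id: str) -> str:
    """Build the Power BI on-demand refresh endpoint for a semantic model."""
    return (
        "https://api.powerbi.com/v1.0/myorg/groups/"
        f"{workspace_id}/datasets/{dataset_id}/refreshes"
    )

def trigger_refresh(workspace_id: str, dataset_id: str, token: str) -> None:
    request = urllib.request.Request(
        refresh_url(workspace_id, dataset_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    urllib.request.urlopen(request)  # HTTP 202 means the refresh was queued
```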
Final words
I hope this method to deploy a working Direct Lake semantic model with fabric-cicd and Fabric CLI has proven to be insightful.
In reality, there are other techniques you can work with. However, this one seems the most sensible when you need to deploy a large number of items stored in a Git repository.
For further advice about implementing FUAM feel free to read my other post that covers tips on implementing FUAM in Microsoft Fabric.
Of course, if you have any comments or queries about this post, feel free to reach out to me.