
Update Fabric Deployment Pipeline stages with DeploymentPipelines-DeployAll and YAML Pipelines

Reading Time: 7 minutes

In this post I want to cover how you can update Fabric Deployment Pipeline stages with DeploymentPipelines-DeployAll and YAML Pipelines within Azure DevOps.

To clarify, “DeploymentPipelines-DeployAll.ps1” is one of two sample PowerShell scripts that you can download from Microsoft on their page that covers automating deployment pipelines via APIs.

You can implement this script to deploy all the items in a workspace to stages that you create in a Microsoft Fabric deployment pipeline.

I made some amendments to the script to show one way you can get it to work with YAML Pipelines, including how to customize it to deploy to multiple stages.

By the end of this post, you will know what changes are required to update Fabric Deployment Pipeline stages with DeploymentPipelines-DeployAll and YAML Pipelines. Plus, see it working. Along the way I share plenty of links.

You can find a copy of the amended script and a sample YAML pipeline in my AzureDevOps-AutomateFabricDeploymentPipeline GitHub repository, which you can import into Azure DevOps and use as a template.

To manage expectations, Microsoft states that only Power BI items are currently supported by service principals. I have tested the below method with a Data Warehouse and it does appear to be the case for now.

So for now, you can deploy items like reports and semantic models until everything is supported by service principals. Alternatively, look to use another way to authenticate short-term.

In addition, knowing the below helps you understand how approvals work in Azure DevOps.

Why use DeploymentPipelines-DeployAll and YAML Pipelines with Fabric deployment pipelines?

Working with deployment pipelines within Microsoft Fabric can be a great experience, because it provides a nice GUI-based way for you to deploy updates to multiple stages within the Microsoft Fabric environment.

Example of a deployment pipeline in Microsoft Fabric

However, there are scenarios that require you to manage the deployments to different stages outside of the Microsoft Fabric environment.

For instance, when a more automated approach is required, or approvals are required to deploy to different workspaces. For example, when an approval is required for a deployment to a Production workspace.

To answer this scenario, Microsoft recently introduced Fabric REST APIs for deployment pipelines, which allow you to orchestrate deployment pipelines using services such as Azure Pipelines in Azure DevOps.

These APIs are great if you want to cover custom scenarios relating to your deployment pipelines outside of the Microsoft Fabric environment.

However, if you just want a simple way to deploy to multiple stages outside of Microsoft Fabric without developing too deeply with the APIs, you can use the DeploymentPipelines-DeployAll PowerShell script from Microsoft instead.

With a couple of tweaks, you can make it flexible enough to work with multiple stages in your deployment pipeline. Plus, it is a quick way to introduce an approvals process.
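Under the hood, the script is a wrapper around the Power BI deployment pipelines REST API. Below is a simplified, hypothetical sketch of the core calls it builds up to, not the full script: it assumes you are already authenticated with Connect-AzAccount, "MyDP" is a placeholder pipeline name, and error handling is omitted.

```powershell
# Acquire a token for the Power BI service (assumes Connect-AzAccount has run)
$token = (Get-AzAccessToken -ResourceUrl "https://analysis.windows.net/powerbi/api").Token
$headers = @{ Authorization = "Bearer $token" }

# Resolve the deployment pipeline ID from its display name
$pipelines = (Invoke-RestMethod -Uri "https://api.powerbi.com/v1.0/myorg/pipelines" -Headers $headers).value
$pipelineId = ($pipelines | Where-Object { $_.displayName -eq "MyDP" }).id

# Deploy all items from the source stage (0 = first stage) to the next stage
$body = @{ sourceStageOrder = 0; note = "Deployment completed" } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "https://api.powerbi.com/v1.0/myorg/pipelines/$pipelineId/deployAll" `
    -Headers $headers -ContentType "application/json" -Body $body
```

The real script adds parameter validation, stage-name-to-order resolution and polling of the deploy operation, which is why using it beats rolling your own.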

Customizing DeploymentPipelines-DeployAll to work with YAML pipelines

To customize the DeploymentPipelines-DeployAll PowerShell script, I first created a new repository in Azure Repos in Azure DevOps, deselecting the option to create a README file.

Create new repository in Azure DevOps

Once done, I created a new folder locally to store the local clone (copy) of the Git repository and opened it in Visual Studio Code.

I then downloaded the PowerShell script from the Microsoft page that covers automating deployment pipelines via APIs into a subfolder called “scripts”.

I then made the following code changes. You can choose alternative modifications if you wish.

First of all, I commented out the static parameters and added them as parameter values that can be entered externally, as you can see below.

# Old static parameters
# $deploymentPipelineName = "MyDP"              # The name of the deployment pipeline
# $sourceStageName = "Development"              # The name of the source stage
# $targetStageName = "Test"                     # The name of the target stage
# $deploymentNote = "Deployment completed"      # The deployment note (Optional)

# New ones
param(
    [string]$deploymentPipelineName,
    [string]$sourceStageName,
    [string]$targetStageName,
    [string]$deploymentNote
)

I did this so that the script would be flexible enough to handle dynamic parameters provided by the YAML pipeline.

The second change I made was to comment out the Connect-AzAccount command, as below.

    # Login to Azure
    #Connect-AzAccount | Out-Null

I did this so that the authentication can be handled by the YAML Pipeline instead.

After I had done this, I saved the file and initialized the folder as a Git Repository in Visual Studio Code.

Initialize Repository

I then performed an initial commit for the Git repository and synchronized it with the new repository in Azure DevOps. I covered how to do both of these in a previous post where I covered my initial Microsoft Fabric Git integration tests for Power BI Reports.

Preparing for the YAML Pipeline

Once I verified that the synchronization worked, I went into Azure Pipelines in Azure DevOps and created two new variable groups. One called GIDemoNS to store non-sensitive values such as stage names and another called GIDemoS to store sensitive values.

I then entered various values in the GIDemoNS variable group and sensitive values in the GIDemoS variable group.

Even though you can enter sensitive values manually, I highly recommend that you connect the variable group to Azure Key Vault instead and select secrets from there.

Afterwards, I created two Azure Pipeline environments to represent Test and Production stages. To show the value of the approvals process I added a new approval for the Production environment.

As you can see below, you can configure a lot of checks before deploying to an environment. It shows how powerful Azure Pipeline environments are.

Environment checks in Azure DevOps

One final thing I did before creating a YAML pipeline was to create a service principal and a client secret for it, since that is a prerequisite for working with the APIs.
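As a hypothetical example, one way to create the service principal and client secret is with the Azure CLI ("fabric-deployment-sp" is a placeholder name; you still need to grant the principal access to the relevant workspaces and the deployment pipeline afterwards):

```shell
# Creates an app registration plus service principal, and returns a new
# client secret (password), app ID and tenant ID in the JSON output
az ad sp create-for-rbac --name "fabric-deployment-sp"
```

Store the returned secret straight into Azure Key Vault rather than in a pipeline variable, so the Key Vault-linked variable group can pick it up.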

Creating the YAML Pipeline to update the Fabric Deployment Pipeline stages

After the above preparations, I went into Pipelines and clicked the “New pipeline” button, selecting the “Azure Repos Git” option and then my new repository to create a new YAML file.

I then selected “Starter pipeline” and removed the default contents to start adding YAML.

I first added YAML to create a trigger for updates to the main branch. In addition, I referenced the two variable groups and stated which pool to work with.

trigger:
- main

variables:
- group: GIDemoS
- group: GIDemoNS

pool:
  # vmImage: 'windows-latest'
  name: $(agentpool)

I opted to work with a self-hosted agent; however, you can work with a Microsoft-hosted agent instead and add logic to install the relevant PowerShell modules where required.
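For example, a hypothetical sketch of what the Microsoft-hosted variant could look like (the script relies on Connect-AzAccount, so the Az.Accounts module needs to be present on the agent):

```yaml
pool:
  vmImage: 'windows-latest'

steps:
# Microsoft-hosted agents may not have every Az module preinstalled,
# so install what the script needs before running it
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Install-Module -Name Az.Accounts -Scope CurrentUser -Force
```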

Afterwards, I added my first stage to orchestrate the deployment from the Dev stage to the Test stage of the deployment pipeline from the YAML pipeline itself, as highlighted below.

Dev and Test deployment pipeline stages

I first declared the stage, a deployment job and the test environment as below. I used a deployment job in order to specify the environment.

stages:
- stage: Test
  displayName: 'Deploy to Test'
  jobs:
    - deployment: 'DeployTest'
      displayName: 'Deploy to Test'
      environment: Test

Afterwards, I declared a RunOnce strategy before running the below PowerShell task to authenticate the service principal.
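For reference, within the deployment job the strategy wraps the tasks like this (a sketch; note that deployment jobs do not check out the repository by default, so the repository containing the script needs an explicit checkout):

```yaml
      strategy:
        runOnce:
          deploy:
            steps:
            # Deployment jobs skip the implicit checkout, so fetch the
            # repository that contains the PowerShell script explicitly
            - checkout: self
            # The PowerShell authentication and deployment tasks go here
```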

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $SecureStringPwd = ConvertTo-SecureString $(servicePrincipalKey) -AsPlainText -Force
      $pscredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $(servicePrincipalId), $SecureStringPwd
      
      Connect-AzAccount -ServicePrincipal -Credential $pscredential -Tenant $(tenantId)

As you can see, the above code authenticates using the service principal credentials.

I decided to authenticate in each stage to allow flexibility of which agents run each stage.

I then declared the below PowerShell to run after the authentication.

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: '.\scripts\DeploymentPipelines-DeployAll.ps1 -deploymentPipelineName $(deploymentPipelineName) -sourceStageName $(DevStageName) -targetStageName $(TestStageName) -deploymentNote $(deploymentNote)'

This calls the PowerShell script with the below parameters:

  • deploymentPipelineName – Name of the deployment pipeline in Microsoft Fabric
  • DevStageName – The Dev stage name which is the source for the deployment in the deployment pipeline.
  • TestStageName – The Test stage name, which is the target stage for the deployment in the deployment pipeline.
  • deploymentNote – Any note I may wish to add.

Adding Production to the YAML pipeline

I then added a Production stage as below. Stating the Production environment.

- stage: Prod
  displayName: 'Deploy to Production'
  jobs:
    - deployment: 'DeployProd'
      displayName: 'Deploy to Prod'
      environment: Production

Afterwards, I added a task to authenticate again as before. However, this time the PowerShell task used different parameters, because this time around the Test stage was the source and the Production stage was the target.

script: '.\scripts\DeploymentPipelines-DeployAll.ps1 -deploymentPipelineName $(deploymentPipelineName) -sourceStageName $(TestStageName) -targetStageName $(ProdStageName) -deploymentNote $(deploymentNote)'

You can view the final YAML pipeline in the “automate-fabric-deployment-pipelines.yml” file in the GitHub repository.

After I had completed the YAML, I ran the pipeline in Azure DevOps, granting permissions to access resources where required.

As expected, an approval was required due to the setting for the Production environment.

Approval required for Production environment

After the pipeline had completed, I went back into the deployment pipeline in Microsoft Fabric. I then checked the deployment history to make sure the latest deployments had taken place.

Viewing deployment history in deployment pipelines

As you can see, it states that the service principal initiated the deployments, since we authenticated with that account.

Final words

I hope me sharing how you can update Fabric Deployment Pipeline stages with DeploymentPipelines-DeployAll and YAML Pipelines within Azure DevOps helps some of you.

Especially if you want a way to perform approvals for deployment stages outside of Microsoft Fabric.

Of course, if you have any comments or queries about this post feel free to reach out to me.

Published in Azure DevOps, Microsoft Fabric

7 Comments

  1. Jacob

    Hi Kevin,

    I’ve tried replicating most of this, however I have problems getting the yaml pipeline to work.

    A deployment pipeline with the requested name: ‘testPipeline’ was not found.
    Failed to deploy. Error reponse:
    {
    “requestId”: “d6424cef-2489-49a3-b171-bfb10ea5331e”,
    “errorCode”: “InvalidInput”,
    “moreDetails”: [
    {
    “errorCode”: “InvalidParameter”,
    “message”: “The value \u0027stages\u0027 is not valid for Guid.”
    }
    ],
    “message”: “The request has an invalid input”
    }

    Any help would be helpful thank you

    • Kevin Chant

      Sure, are you using the right name for your Microsoft Fabric deployment pipeline in the parameters?

  2. Ali

    Hi Kevin,

    After following your guide I have finally managed to get the scripts etc. to work, however my issue is, whenever my service principle is running the powershell, it doesn’t get returned any pipelines. After checking the documentation I can see they have this as a pre-requisite: Required Delegated Scopes
    Pipeline.Read.All or Pipeline.ReadWrite.All

    When I try to run the script with my own account it works fine and I get the deployment pipeline returned etc.

    I could not find the delegated scopes anywhere, can you maybe explain how you managed the service principle app registration so that it could access the pipeline and such?

    Thanks

      • Ali

        Hi Kevin,

        I have assigned the workspace admin to my SP and also admin on the deployment pipeline.
        I keep getting the error:
        "errorCode": "PrincipalTypeNotSupported",
        "message": "The operation is not supported for the principal type"

        I tried creating a service principle by creating an app registration with Tenant.ReadWriteAll granted access on PowerBI service.

        I added the app reg to a security group and added the security group as the admin on workspaces and deployment pipelines.

        The app reg is both owner and member of the Security group

        • Kevin Chant

          Hi,

          Which item types are in your workspace? Because service principals only work with Power BI items at this moment in time.

          Kind regards

          Kevin
