
Migrate serverless SQL Pool objects to a Microsoft Fabric Data Warehouse

Reading Time: 5 minutes

In this post I want to cover one way you can migrate serverless SQL Pool objects to a Microsoft Fabric Data Warehouse using Azure DevOps, since I was able to get this working.

To clarify, when I say serverless SQL Pools I mean the ones that are part of Azure Synapse Analytics, which I covered in a previous post.

For those looking to find out more about Microsoft Fabric itself and how to get started, I recommend looking at one of my previous posts, which covers how you can create your own Microsoft Fabric environment.

By the end of this post, you will know how to migrate serverless SQL Pool objects to a Microsoft Fabric Data Warehouse using Azure DevOps. Along the way I share plenty of links and advice.

One key point to note is that Microsoft Fabric is now generally available. You can read more about this in detail in the official post by Ryan Majidimehr.

Managing expectations

To manage expectations, this post only covers database schema objects. Plus, I need to highlight that at the moment the T-SQL surface area in Microsoft Fabric is limited, which severely limits what you can migrate. To the extent that for this post I migrated a very simple view.

Even though the options are limited, I still wanted to do this post. So that I can highlight a few important facts. Plus, share a template you can use to test this yourself. With this in mind, you can consider this post a precursor for future posts.

Please also be aware that I use a YAML pipeline within Azure DevOps for this post. So that I can make the code publicly available for you to work with. You can download the code used in this post by going to the AzureDevOps-serverlessSQLPoolToFabricDW GitHub repository.

My advice is to import the repository into Azure Repos and then connect it to Azure Pipelines. I go into details about how to do this in a post I wrote about how to connect a Database Project in Azure Repos to Azure Pipelines.

Prerequisites for this example

For this post I created a database in a serverless SQL Pool. In addition, I created a new Data Warehouse in Microsoft Fabric. As you can see below.

New Data Warehouse in Microsoft Fabric

I also have a YAML pipeline created in Azure Pipelines within Azure DevOps so that I can perform CI/CD, which allows me to either do the migration once or continuously. Below is a diagram to help you visualize how this works.

Diagram to visualize CI/CD

In order for the pipeline to work, it uses a YAML file which is stored in a Git repository in Azure Repos.
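
To give you an idea of how the snippets later in this post fit together in that YAML file, below is a minimal sketch of one possible layout. The trigger branch, the variable group name and the name of the extract stage are placeholders of my own rather than part of the template, so adjust them to suit your setup.

trigger:
  - main

variables:
  # Assumed variable group holding the values referenced in the later snippets, such as
  # SQLPoolEndPoint, SourceDB, TargetFile, SQLPoolartifactname, agentpool, AzureSubscription,
  # aadSqlUsername, aadSqlpw, DestSQLConnString and DestinationDW
  - group: 'FabricMigration'

stages:
  - stage: ExtractSQLPool
    displayName: 'Extract serverless SQL Pool objects'
    jobs:
      - job: ExtractDacpac
        pool:
          name: $(agentpool)
        steps:
          # The extract and publish artifact steps shown in the next snippet go here
          - script: echo "Extract steps go here"

  - stage: FabricDW
    displayName: 'Deploy Data Warehouse'
    dependsOn: ExtractSQLPool
    jobs:
      # The deployment job shown later in this post goes here
      - job: Placeholder
        steps:
          - script: echo "Deployment job goes here"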

In addition, I am using a self-hosted Azure Pipelines agent, which has the latest version of SqlPackage installed.

You can read more about SqlPackage options for pipelines in a blog post I wrote about how to install SqlPackage to work with dacpacs for serverless SQL Pools.
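
If your agent does not have SqlPackage on it yet, one option is to install it as a .NET global tool. The step below is a minimal sketch of how that could look within a pipeline; it is not part of the template in the repository and it assumes the .NET SDK is already available on the agent.

steps:
  - task: PowerShell@2
    displayName: 'Install SqlPackage as a dotnet tool'
    inputs:
      targetType: 'inline'
      script: |
        # Installs SqlPackage globally on the agent; if it is already installed, run
        # 'dotnet tool update --global microsoft.sqlpackage' instead to pick up the latest version
        dotnet tool install --global microsoft.sqlpackage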

Migrating serverless SQL Pool objects to Microsoft Fabric Data Warehouse

To begin, I first tested extracting objects from the serverless SQL Pool within Azure Pipelines.

As you can see below, the YAML syntax I used to extract the schema is the same syntax that I used in a previous post, where I covered how you can deploy a dacpac to a serverless SQL Pool.

steps:
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: |
        # PowerShell to extract contents of a database in a serverless SQL Pool into a dacpac file
        SqlPackage /Action:Extract /TargetFile:$(Build.ArtifactStagingDirectory)\$(TargetFile) /p:VerifyExtraction=true /SourceServerName:$(SQLPoolEndPoint) /SourceDatabaseName:$(SourceDB) /SourceUser:$(SQLPooluser) /SourcePassword:$(SQLPoolpw)

  - task: PublishBuildArtifacts@1
    displayName: 'Publishes dacpac as an artifact'
    # Publishes the dacpac as part of an artifact within Azure DevOps
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: $(SQLPoolartifactname)
      publishLocation: 'Container'

After checking that worked, I began to test deploying the dacpac to a Microsoft Fabric Data Warehouse.

Challenges deploying the dacpac

To be honest, I encountered various challenges to get the deployment working. Including the below:

  • I changed the authentication to be ‘Active Directory – Password’ so that I could authenticate properly. Due to the setup of my trial Microsoft Fabric environment.
    Depending on your environment you might be able to use ‘Active Directory Integrated’ instead. Read the Azure SQL Database Deployment Readme for further details.
  • In addition, I created a new Azure Active Directory account that was exempt from using Azure Active Directory multi-factor authentication. I do not recommend doing this. I did it purely for this post.
  • Afterwards, I made the account a contributor to the Microsoft Fabric workspace that contains the Data Warehouse.
  • I had to add an additional argument based on the SqlPackage Publish documentation, in order to resolve a collation issue.
    A special thanks to Andy Cutler for his post, which I found during a quick online search and which steered me in the right direction.
  • Finally, I had to remove various objects from the source database, including an external data source, due to the limited T-SQL surface area in Microsoft Fabric. Alternatively, you might be able to skip some of them at publish time instead, as shown in the sketch after this list.
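
As an alternative to removing unsupported objects from the source database, you might be able to get SqlPackage to skip them during the publish instead. Below is a sketch of how the additional arguments in the deployment task could look if you also set the ExcludeObjectTypes publish property. I have not tested every object type against a Microsoft Fabric Data Warehouse, so treat this as a starting point rather than a definitive list.

- task: SqlAzureDacpacDeployment@1
  displayName: 'Install DACPAC on Fabric DataWarehouse'
  inputs:
    # Other inputs stay the same as in the full snippet further down this post
    deployType: 'DacpacTask'
    DeploymentAction: 'Publish'
    AdditionalArguments: '/p:VerifyCollationCompatibility=False /p:ExcludeObjectTypes=ExternalDataSources'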

YAML code to deploy to a Microsoft Fabric Data Warehouse

After I had addressed the above challenges, the below code worked and I was able to deploy the contents of the dacpac to a Microsoft Fabric Data Warehouse.

- stage: FabricDW
  displayName: 'Deploy Data Warehouse'
  jobs:
    - deployment: 'FabricDW'
      displayName: 'Fabric Data Warehouse'
      environment: Production

      pool:
        name: $(agentpool)

      strategy:
        runOnce:
          deploy:
            steps:
              - task: DownloadBuildArtifacts@0
                displayName: 'Download Artifacts'
                inputs:
                  buildType: 'current'
                  downloadType: 'specific'
                  artifactName: '$(SQLPoolartifactname)'
                  downloadPath: '$(System.ArtifactsDirectory)'

              - task: SqlAzureDacpacDeployment@1
                displayName: 'Install DACPAC on Fabric DataWarehouse'
                inputs:
                  azureSubscription: $(AzureSubscription)
                  AuthenticationType: 'aadAuthenticationPassword'
                  aadSqlUsername: '$(aadSqlUsername)'
                  aadSqlPassword: '$(aadSqlpw)'
                  ServerName: $(DestSQLConnString)
                  DatabaseName: '$(DestinationDW)'
                  deployType: 'DacpacTask'
                  DeploymentAction: 'Publish'
                  DacpacFile: '$(System.ArtifactsDirectory)\$(SQLPoolartifactname)\$(Targetfile)'
                  AdditionalArguments: '/p:VerifyCollationCompatibility=False'

To verify this, I checked that the pipeline worked in Azure DevOps.

Completed pipeline in Azure DevOps

Out of curiosity, I downloaded the dacpac that had been created. I then renamed it to a zip file and extracted its contents.

I did this so that I could view the ‘model.xml’ file, in order to confirm that SqlPackage still used the Database Schema Provider for serverless SQL Pools that I discovered in a previous post, which is ‘Microsoft.Data.Tools.Schema.Sql.SqlServerlessDatabaseSchemaProvider’.
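
If you want to perform that check without unzipping the dacpac by hand, below is a minimal sketch of an extra step you could add to the steps list after the extract. It is my own addition rather than part of the template, and it assumes the dacpac is still in the artifact staging directory.

- task: PowerShell@2
  displayName: 'Show the Database Schema Provider used in the dacpac'
  inputs:
    targetType: 'inline'
    script: |
      # A dacpac is a zip file, so copy it with a .zip extension before expanding it
      Copy-Item "$(Build.ArtifactStagingDirectory)\$(TargetFile)" "$(Build.ArtifactStagingDirectory)\dacpaccontents.zip"
      Expand-Archive -Path "$(Build.ArtifactStagingDirectory)\dacpaccontents.zip" -DestinationPath "$(Build.ArtifactStagingDirectory)\dacpaccontents"
      # model.xml records which Database Schema Provider was used
      Select-String -Path "$(Build.ArtifactStagingDirectory)\dacpaccontents\model.xml" -Pattern 'DatabaseSchemaProvider' | Select-Object -First 1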

Anyway, after checking that the pipeline worked in Azure DevOps I looked at the below Data Warehouse in Microsoft Fabric and checked that the view was there.

View in a Microsoft Fabric Data Warehouse

As you can see in the data preview the view is rather limited. However, it works.

Final words about migrating serverless SQL Pool objects to a Microsoft Fabric Data Warehouse

I hope this post about one way that you can migrate serverless SQL Pool objects to a Microsoft Fabric Data Warehouse using Azure DevOps has been an interesting insight.

Like I mentioned earlier, the functionality is rather limited at the moment. So, you can consider this post a precursor for a future post.

Hopefully, the T-SQL surface area will be expanded in the future. In addition, I hope Microsoft introduce a smooth mechanism to migrate objects over.

Of course, if you have any comments or queries about this post feel free to reach out to me.

Published in: Azure Synapse Analytics, Microsoft Fabric
