Build a simple and extensible multi-stage Azure DevOps pipeline with YAML and PowerShell for Microsoft Dataverse solutions.

Aymeric Mouillé
5 min read · Feb 7, 2021


OK, here’s the scenario: you need to implement continuous integration up to a UAT environment, and then deploy to production with approvals.

Simple multi-stage pipeline

Thanks to Azure DevOps multi-stage pipelines, you can design your delivery lifecycle with YAML. Keeping the pipeline as YAML code in the repository simplifies pipeline management, and the multi-stage view gives you a better way to follow the overall process.

The following approach introduces the ability to script any custom task directly in the YAML file with PowerShell commands, so you can implement full automation thanks to the PowerDataOps module, which provides more than 100 cmdlets for Dataverse:
https://github.com/AymericM78/PowerDataOps

Prerequisites
Before proceeding, make sure you have the following prerequisites:

  • Azure DevOps: you need an Azure DevOps project (it’s free for up to 5 users) with a Build Administrator account.
  • Dataverse: you need the “System Administrator” role on each instance.
  • Solutions: you need to know which solution(s) to move across instances, in which order, and whether they should be imported as managed (recommended) or unmanaged.

Configuration
In order to provide all the contextual information to the export and import operations, you need to configure Azure DevOps variable groups and environments.

The first variable group is a common one that describes parameters applying to the whole process:
- “Solutions”: the unique names of the solutions to export.
- “Solutions.ImportOrder”: the unique names of the solutions to import (yes, the same as above), but here you specify the import order, and in some cases a solution name can appear twice (for example, a security role solution may need to be imported both before and after the entities).
- “Version”: the version number to apply to the solutions before export (X is replaced by the build ID; you can also use a datetime format like yyyy.MMdd).
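As a sketch, the common variable group could contain values like the following (the solution names and version pattern here are illustrative assumptions, not values from this article):

```
Solutions             = MyCoreSolution,MySecuritySolution
Solutions.ImportOrder = MySecuritySolution,MyCoreSolution,MySecuritySolution
Version               = 1.0.X
```

Note how the security solution appears twice in the import order, matching the “before and after entities” scenario described above.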

Azure DevOps Variable Groups: Common variables definition

And for each environment, you create a dedicated variable group with a single variable holding the connection string (https://docs.microsoft.com/en-us/powerapps/developer/data-platform/xrm-tooling/use-connection-strings-xrm-tooling-connect).
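For reference, an XRM tooling connection string for an application user typically looks like this (the URL, client ID, and secret below are placeholders; see the Microsoft documentation linked above for the full list of supported parameters):

```
AuthType=ClientSecret;Url=https://yourorg.crm.dynamics.com;ClientId=<app-registration-id>;ClientSecret=<client-secret>
```

Mark this variable as secret in the variable group so the credentials are not exposed in logs.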

Azure DevOps Variable Groups: Environment variables definition

Then you need to describe your environments (DEV, UAT, PROD) in the Environments section:

Azure DevOps Environment definition

You can specify approvals or other checks, such as business hours:

Azure DevOps Approval and Checks
Azure DevOps Environment Checks

Important: for production, apply a short approval timeout so that the build can complete without waiting for a long period.

Azure DevOps Production Approval

Pipeline creation

Now you can proceed to the YAML-based pipeline creation.

You will need 3 YAML files to produce the following structure:
- BuildAndDeploy.yml: the overall process.
- Build.yml: the first stage, which exports the solutions from DEV.
- Deploy.yml: the next stages, which import the solutions from the build artifacts into a target environment.
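For example, the three files can live side by side in the repository (the folder name below is just an assumption; adapt it to your own conventions — what matters is that Build.yml and Deploy.yml sit next to BuildAndDeploy.yml, since they are referenced by relative path):

```
/Pipelines
    BuildAndDeploy.yml
    Build.yml
    Deploy.yml
```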

BuildAndDeploy.yml

# ===========================================================
# PowerDataOps MultiStage Deployment
# -----------------------------------------------------------
# Build : Update solutions version and export to artifacts
# Deploy : Import solution on specific environment
# for each stage
# ===========================================================
trigger: none

schedules:
- cron: "0 6 * * *"
  displayName: Daily build
  branches:
    include:
    - master # your branch name
  always: true

pool:
  vmImage: 'windows-latest'

variables:
- group: 'MyProjectName.Common'

stages:
- template: Build.yml
  parameters:
    environment: 'DEV'
    variableGroup: 'MyProjectName.DEV'
- template: Deploy.yml
  parameters:
    environment: 'UAT'
    variableGroup: 'MyProjectName.UAT'
- template: Deploy.yml
  parameters:
    environment: 'PROD'
    variableGroup: 'MyProjectName.PROD'

As you can see, this pipeline is not true Continuous Integration, as it is triggered by a cron schedule at 6 AM every day. You can explore other PowerDataOps and Azure DevOps features to trigger on commit instead, and then deploy plugins and web resources from the repository (for example).
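As a sketch, a commit-based trigger could replace the cron schedule along these lines (the branch name and path filter below are assumptions; adjust them to your repository layout):

```yaml
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - Solutions/*
```

With this in place, the `schedules` section can be removed, and every push touching the filtered paths starts a new build.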

Build.yml

# ===========================================================
# PowerDataOps Build Stage
# -----------------------------------------------------------
# Update solutions version and export to artifacts
# ===========================================================
parameters:
- name: 'environment'
  default: ''
  type: string
- name: 'variableGroup'
  default: ''
  type: string

stages:
- stage: Build
  variables:
  - group: ${{ parameters.variableGroup }}
  jobs:
  - job: Package
    displayName: 'Package Solutions'
    steps:
    - task: PowerShell@2
      displayName: 'Install module PowerDataOps'
      inputs:
        targetType: 'inline'
        script: |
          Install-Module PowerDataOps -Force -Scope CurrentUser

    - task: PowerShell@2
      displayName: 'Upgrade solutions version'
      inputs:
        targetType: 'inline'
        script: |
          Import-Module PowerDataOps;
          Set-XrmSolutionsVersionBuild;

    - task: PowerShell@2
      displayName: 'Export solutions'
      inputs:
        targetType: 'inline'
        script: |
          Import-Module PowerDataOps;
          Export-XrmSolutionsBuild;

    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: drop'
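The publish step above relies on the task’s defaults. If you prefer to be explicit, the standard inputs of PublishBuildArtifacts@1 can be spelled out (the staging directory path shown here is the task’s documented default, so this is equivalent to the shorthand above):

```yaml
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```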

Deploy.yml

# ===========================================================
# PowerDataOps Deploy Stage
# -----------------------------------------------------------
# Deploy : Import solution on specific environment
# for each stage
# ===========================================================
parameters:
- name: 'environment'
  default: ''
  type: string
- name: 'variableGroup'
  default: ''
  type: string

stages:
- stage: 'Deploy_${{ parameters.environment }}'
  displayName: 'Deploy to ${{ parameters.environment }}'
  jobs:
  - deployment: Deploy
    environment: '${{ parameters.environment }}'
    variables:
    - group: '${{ parameters.variableGroup }}'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            displayName: 'Install module PowerDataOps'
            inputs:
              targetType: 'inline'
              script: |
                Install-Module PowerDataOps -Force -Scope CurrentUser

          - task: DownloadBuildArtifacts@0
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'drop'
              downloadPath: '$(System.DefaultWorkingDirectory)\Solutions\drop\'

          - task: PowerShell@2
            displayName: 'Deploy solutions'
            inputs:
              targetType: 'inline'
              script: |
                Import-Module PowerDataOps;
                Import-XrmSolutionsBuild

Finally, you will be able to follow each run and retry the production stage if needed.

Azure DevOps Multi-Stage Pipeline

Written by Aymeric Mouillé

Microsoft Power Platform Architect and DevOps addict
