Recently, I had the opportunity to set up an Azure DevOps pipeline to automate the deployment of Managed Solutions in Dynamics 365 Customer Engagement (Dataverse). The goal was to streamline the release process across multiple environments (ITG → PreProd → Production) with minimal manual intervention.
While the process involves a bit of configuration and understanding of Microsoft’s Power Platform tooling, once it’s in place, it can save a lot of time and reduce deployment errors. In this post, I’ll walk through the approach I followed and point to the official documentation that helped me along the way.
Objectives
- Export and store the Managed Solution from a source environment (e.g., Dev or ITG)
- Use an Azure DevOps pipeline to deploy the solution into target environments
- Parameterize the deployment using variables
- Ensure auditability and automation across environments
Tools & Prerequisites
To begin with, here are the core tools and prerequisites:
- Azure DevOps project
- Power Platform Build Tools extension (install from the Marketplace)
- Service principal or application user with environment-level access
- Power Platform CLI (PAC CLI) – optional for local testing; see the quick sketch below
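Before touching the pipeline, I recommend a quick local dry run with PAC CLI. Here's a minimal sketch, assuming the same service principal credentials used later for the service connection (the <...> values are placeholders, and flag names can differ slightly between PAC CLI versions):

# Authenticate to the source environment as the service principal
pac auth create --url https://myenv.crm.dynamics.com --applicationId <client-id> --clientSecret <client-secret> --tenant <tenant-id>

# Export the solution as Managed – the same operation the pipeline task will perform
pac solution export --name MyManagedSolution --path ./MyManagedSolution.zip --managed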
Solution Setup in DevOps
1. Install Power Platform Build Tools
This extension provides Azure DevOps tasks for working with Power Platform and Dataverse. You can install it from the Visual Studio Marketplace:
🔗 Power Platform Build Tools
2. Create a Service Connection
Go to Project Settings > Service Connections > New Service Connection > Power Platform.
You’ll need:
- Client ID
- Client Secret
- Tenant ID
- Environment URL
Follow the guide:
📘 Set up a service principal
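If you don't have a service principal yet, one quick way to generate the Client ID / Client Secret / Tenant ID values is the Azure CLI. A sketch only; the app name below is made up, and you still need to add the resulting app as an application user with a security role in each Dataverse environment, as the guide above describes:

# Creates an app registration + service principal; the JSON output maps to the
# service connection fields: appId -> Client ID, password -> Client Secret, tenant -> Tenant ID
az ad sp create-for-rbac --name "d365-solution-deployer"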
3. Define Pipeline Structure
Here’s an outline of how the pipeline is structured:
Stage 1: Export Solution from Source Environment
- Power Platform Tool Installer
- Export Solution task (set Export As = Managed)
- Upload Artifacts (store the .zip file)
Stage 2: Import Solution into Target Environment
- Download Artifacts
- Import Solution task
- Publish All Customizations (optional but recommended)
You can also enable "Check Solution" before import for validation.
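If you enable that check, here's a sketch of what the task could look like in YAML. The ruleset GUID is the Solution Checker ruleset ID used in Microsoft's examples; verify it and the input names against the task documentation for your extension version:

- task: PowerPlatformChecker@2
  inputs:
    PowerPlatformSPN: 'MyServiceConnection'
    FilesToAnalyze: '$(Build.ArtifactStagingDirectory)/$(SolutionName).zip'
    RuleSet: '0ad12346-e108-40b8-a956-9a8f95ea18c9'  # Solution Checker ruleset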
4. Sample YAML Pipeline Snippet
trigger:
- main

pool:
  vmImage: 'windows-latest'

variables:
  SolutionName: 'MyManagedSolution'
  EnvironmentUrl: 'https://myenv.crm.dynamics.com'

stages:
- stage: ExportSolution
  jobs:
  - job: Export
    steps:
    - task: PowerPlatformToolInstaller@2
    - task: PowerPlatformExportSolution@2
      inputs:
        authenticationType: 'PowerPlatformSPN'
        PowerPlatformSPN: 'MyServiceConnection'
        solutionName: '$(SolutionName)'
        solutionOutputFile: '$(Build.ArtifactStagingDirectory)/$(SolutionName).zip'
        managed: true
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: 'solutionArtifact'

- stage: ImportSolution
  dependsOn: ExportSolution
  jobs:
  - job: Import
    steps:
    # Each job runs on a fresh agent, so the tools must be installed here too
    - task: PowerPlatformToolInstaller@2
    - task: DownloadBuildArtifacts@0
      inputs:
        artifactName: 'solutionArtifact'
        downloadPath: '$(Pipeline.Workspace)'  # so the import path below resolves
    - task: PowerPlatformImportSolution@2
      inputs:
        authenticationType: 'PowerPlatformSPN'
        PowerPlatformSPN: 'MyServiceConnection'
        environmentUrl: '$(EnvironmentUrl)'
        solutionInputFile: '$(Pipeline.Workspace)/solutionArtifact/$(SolutionName).zip'
Tips and Best Practices
- Use pipeline variables and variable groups for managing credentials and URLs
- Split pipelines by environment stages for approvals (see the sketch below)
- Set the Async Operations timeout for large solutions
- Use PAC CLI locally for troubleshooting before automating
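To make the first two tips concrete, here's a sketch of a PreProd import stage that pulls its settings from a variable group and runs as a deployment job, so approvals configured on the Azure DevOps "PreProd" environment gate it before anything is imported. The group and environment names are illustrative:

- stage: ImportToPreProd
  dependsOn: ExportSolution
  variables:
  - group: d365-preprod              # hypothetical variable group holding EnvironmentUrl etc.
  jobs:
  - deployment: Import
    environment: 'PreProd'           # approvals/checks on this environment gate the stage
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current        # fetches the artifact into $(Pipeline.Workspace)/solutionArtifact
            artifact: solutionArtifact
          - task: PowerPlatformToolInstaller@2
          - task: PowerPlatformImportSolution@2
            inputs:
              authenticationType: 'PowerPlatformSPN'
              PowerPlatformSPN: 'MyServiceConnection'
              environmentUrl: '$(EnvironmentUrl)'
              solutionInputFile: '$(Pipeline.Workspace)/solutionArtifact/$(SolutionName).zip'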
Final Thoughts
This experience really showcased the power of DevOps in enterprise Dynamics 365 development. With a properly set up pipeline, deployments become repeatable, reliable, and consistent. If you're just getting started, I highly recommend playing with PAC CLI locally before configuring your pipelines.
Have you tried DevOps with Dynamics 365? Feel free to share your thoughts or questions!