Tuesday, May 27, 2025

Setting Up Azure DevOps Pipeline for Dynamics 365 Managed Solution Deployment

 

Recently, I had the opportunity to set up an Azure DevOps pipeline to automate the deployment of Managed Solutions in Dynamics 365 Customer Engagement (Dataverse). The goal was to streamline the release process across multiple environments (ITG → PreProd → Production) with minimal manual intervention.

While the process involves a bit of configuration and understanding of Microsoft’s Power Platform tooling, once it’s in place, it can save a lot of time and reduce deployment errors. In this post, I’ll walk through the approach I followed and point to the official documentation that helped me along the way.


 Objectives

  • Export and store Managed Solution from source environment (e.g., Dev or ITG)

  • Use Azure DevOps Pipeline to deploy the solution into target environments

  • Parameterize the deployment using variables

  • Ensure auditability and automation across environments


 Tools & Prerequisites

To begin with, here are the core tools and prerequisites:

  • Azure DevOps Project

  • Power Platform Build Tools Extension (Install from Marketplace)

  • Service Principal or Application User with environment-level access

  • Power Platform CLI (PAC CLI) – optional for local testing
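If you want to rehearse the export locally before wiring up the pipeline, the PAC CLI can perform the same steps; a sketch with placeholder environment and credential values:

```shell
# Authenticate once with the service principal (placeholder values)
pac auth create --url https://myenv.crm.dynamics.com \
  --applicationId <client-id> --clientSecret <client-secret> --tenant <tenant-id>

# Export the solution as managed, mirroring the pipeline's export task
pac solution export --name MyManagedSolution --path ./MyManagedSolution.zip --managed
```

Running this locally first is a quick way to confirm the service principal actually has access before debugging the same failure inside a pipeline run.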


 Solution Setup in DevOps

1. Install Power Platform Build Tools

This extension provides Azure DevOps tasks for working with Power Platform and Dataverse. You can install it from the Visual Studio Marketplace:

🔗 Power Platform Build Tools


2. Create a Service Connection

Go to Project Settings > Service Connections > New Service Connection > Power Platform.

You’ll need:

  • Client ID

  • Client Secret

  • Tenant ID

  • Environment URL

Follow the guide:
📘 Set up a service principal
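If you don't already have an app registration, the Azure CLI can create one and issue the secret. A hedged sketch (the display name is a placeholder, and the app must still be added as an application user in each Dataverse environment):

```shell
# Create the app registration used by the pipeline
az ad app create --display-name "d365-deploy-pipeline"

# Generate a client secret for it (use the appId returned above)
az ad app credential reset --id <app-id>
```

The appId, password, and tenant values in the output map to the Client ID, Client Secret, and Tenant ID fields of the service connection.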


3. Define Pipeline Structure

Here’s an outline of how the pipeline is structured:

Stage 1: Export Solution from Source Environment

  • Power Platform Tool Installer

  • Export Solution Task (Set Export As = Managed)

  • Upload Artifacts (Store .zip file)

Stage 2: Import Solution into Target Environment

  • Download Artifacts

  • Import Solution Task

  • Publish All Customizations (optional but recommended)

You can also enable "Check Solution" before import for validation.


4. Sample YAML Pipeline Snippet

trigger:
- main

pool:
  vmImage: 'windows-latest'

variables:
  SolutionName: 'MyManagedSolution'
  EnvironmentUrl: 'https://myenv.crm.dynamics.com'

stages:
- stage: ExportSolution
  jobs:
  - job: Export
    steps:
    - task: PowerPlatformToolInstaller@2
    - task: PowerPlatformExportSolution@2
      inputs:
        authenticationType: 'PowerPlatformSPN'
        PowerPlatformSPN: 'MyServiceConnection'
        solutionName: '$(SolutionName)'
        solutionOutputFile: '$(Build.ArtifactStagingDirectory)/$(SolutionName).zip'
        managed: true
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: 'solutionArtifact'

- stage: ImportSolution
  dependsOn: ExportSolution
  jobs:
  - job: Import
    steps:
    - task: DownloadBuildArtifacts@0
      inputs:
        buildType: 'current'
        artifactName: 'solutionArtifact'
        downloadPath: '$(Pipeline.Workspace)'
    - task: PowerPlatformImportSolution@2
      inputs:
        authenticationType: 'PowerPlatformSPN'
        PowerPlatformSPN: 'MyServiceConnection'
        environmentUrl: '$(EnvironmentUrl)'
        solutionInputFile: '$(Pipeline.Workspace)/solutionArtifact/$(SolutionName).zip'

 Tips and Best Practices

  • Use variables/groups for managing credentials and URLs

  • Split pipelines by environment stages for approvals

  • Set Async Operations timeout for large solutions

  • Use PAC CLI locally for troubleshooting before automating
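For the async-timeout tip, the import task exposes a MaxAsyncWaitTime input (in minutes); a sketch reusing the service connection from the sample above:

```yaml
- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyServiceConnection'
    solutionInputFile: '$(Pipeline.Workspace)/solutionArtifact/$(SolutionName).zip'
    maxAsyncWaitTime: '120'  # minutes; raise this for large solutions
```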




 Final Thoughts

This experience really showcased the power of DevOps in enterprise Dynamics 365 development. With a properly set up pipeline, deployments become repeatable, reliable, and consistent. If you're just getting started, I highly recommend playing with PAC CLI locally before configuring your pipelines.

Have you tried DevOps with Dynamics 365? Feel free to share your thoughts or questions!


Tuesday, May 13, 2025

From Campaign Chaos to Customer Connection: How the New Dynamics 365 Marketing Changes Are Revolutionizing Engagement

 

🌟 The Backstory: A Marketing Manager’s Wake-Up Call

Last quarter, Amanda, a seasoned Marketing Manager at a mid-sized retail company, was frustrated.

She had just wrapped up a campaign that should have driven conversions. But the numbers told a different story—email open rates were low, customer journeys weren’t converting, and real-time decisions were more of a dream than reality.

That’s when she saw the update notification for Microsoft Dynamics 365 Marketing.

Curious (and a bit desperate), Amanda clicked in—and what she found changed how her team approached marketing forever.

Welcome to the new Dynamics 365 Marketing, where personalization is smarter, journeys are real-time, and data drives every move.


🚀 What’s New in Dynamics 365 Marketing (Now Part of Customer Insights - Journeys)

Microsoft has integrated Dynamics 365 Marketing into Customer Insights – Journeys, creating a seamless platform for data-driven, real-time customer experiences.

Here are the game-changing updates:


💡 1. Real-Time Marketing is Now the Default Experience

Old Way: Outbound marketing was the primary model—scheduled campaigns, static segments.

New Way: Real-time marketing is now front and center, allowing businesses to trigger journeys instantly based on customer actions.

Key Features:

  • Event-triggered journeys (e.g., form submissions, purchases)

  • Dynamic content personalization

  • AI-powered channel selection

🔗 Learn more about real-time marketing


🔄 2. Unified Customer Data with Dynamics 365 Customer Insights

By combining Customer Insights - Data and Customer Insights - Journeys, marketers now have a 360° customer view that updates in real-time.

💥 Benefits:

  • Unified profiles from multiple data sources

  • AI-enriched segments

  • Seamless handoff between sales and marketing teams

🔗 Unified Customer Profile Documentation


✍️ 3. Enhanced Personalization with Copilot

Microsoft’s Copilot for Dynamics 365 Marketing introduces AI-powered content creation and suggestions.

🧠 Capabilities:

  • Auto-generate email content based on prompts

  • Recommend subject lines to increase open rates

  • Analyze campaign effectiveness with natural language queries

📘 Explore Copilot in Marketing


🔍 4. New Channel Integrations

Customers don’t live in just one inbox—and now, neither do your campaigns.

📱 Channel Updates:

  • SMS powered by Azure Communication Services

  • In-app messaging and push notifications

  • Microsoft Teams events integration

📘 Channel Capabilities in Real-Time Marketing


📊 5. Revamped Analytics & Reporting

Goodbye basic charts—hello deep insights. The new updates bring:

  • Journey-level insights

  • Customer path analysis

  • A/B testing results

  • Power BI integration for advanced dashboards

📘 Analyze Customer Journeys


🧠 From Amanda’s Lens: What Changed for Her Team?

With real-time journeys and AI-powered insights, Amanda’s team:

  • Reduced campaign creation time by 40%

  • Increased email open rates by 27%

  • Personalized outreach for every lead, at scale

In her words:

“It feels like we finally have a marketing engine that thinks with us, not just for us.”


💬 Final Thoughts: Why You Should Care

Whether you're a marketing director at a global enterprise or a CRM lead at a non-profit, the new Dynamics 365 Marketing experience empowers you to connect, convert, and captivate like never before.

🔗 Get started with Customer Insights – Journeys


📣 Ready to Level Up?

If you're already using Dynamics 365 Marketing, explore the upgrade path to Customer Insights – Journeys. If not, this might be the perfect time to reimagine your marketing tech stack.

👉 For a hands-on feel, start with Microsoft’s Customer Insights – Journeys Trial.

Monday, May 12, 2025

How I Modernized an 8-Year-Old .NET Solution Using .NET Upgrade Assistant

“20+ projects. .NET Framework 4.6. Legacy dependencies. And a deadline to move everything to .NET 8.”

Recently, I began the ambitious journey of migrating an 8-year-old enterprise solution—a massive monolith with over 20 interconnected projects, built on .NET Framework 4.6, using WCF, old-school config files, and long-abandoned NuGet packages.

The goal?
✔️ Modernize the solution
✔️ Make it cloud-ready
✔️ Target .NET 8
✔️ Run cross-platform

At first, it felt like staring at a mountain with no visible trail. Rewriting everything from scratch wasn’t an option. I needed a guided, gradual approach—something to help me lift and shift without breaking everything in the process.

That's when I discovered the .NET Upgrade Assistant.


🚀 Enter: .NET Upgrade Assistant

This tool, provided by Microsoft, is a command-line utility that guides you through upgrading your existing .NET Framework or .NET Core projects to modern .NET versions (6, 7, or 8).

What makes it special is that it doesn’t just convert your project file—it walks through your entire solution, project by project, making incremental, intelligent suggestions and transformations.

📘 Official Docs:

👉 https://learn.microsoft.com/en-us/dotnet/upgrade-assistant/


💡 Why Use .NET Upgrade Assistant?

Here’s what stood out to me:

  • Step-by-step CLI guidance instead of just scripts

  • Automated backup of your solution

  • Dependency analysis and recommendations

  • Target framework upgrades to .NET 6, 7, or 8

  • Support for ASP.NET MVC, WPF, WinForms, Class Libraries, etc.

  • Great integration with modern tools like try-convert, analyzers, and Roslyn-based refactoring


🛠️ Step-by-Step: My Upgrade Journey

Step 1: Installing the Assistant

I started by installing the tool globally:

dotnet tool install -g upgrade-assistant

To ensure I always had the latest version:

dotnet tool update -g upgrade-assistant

Step 2: Running the Upgrade

I moved into my solution directory and ran:

upgrade-assistant upgrade MyLegacySolution.sln

It launched an interactive experience—asking which project to upgrade first, what changes to apply, and whether I wanted to proceed step-by-step.

The tool processed each project by:

  • Updating TargetFramework to net8.0

  • Migrating from packages.config to <PackageReference>

  • Replacing web.config with appsettings.json (for ASP.NET)

  • Notifying me of deprecated APIs

  • Updating project references
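To make the packages.config migration concrete, here is roughly the before and after shape (the package name and versions are illustrative):

```xml
<!-- Before: packages.config next to the old-style .csproj -->
<packages>
  <package id="Newtonsoft.Json" version="9.0.1" targetFramework="net46" />
</packages>

<!-- After: a PackageReference in the SDK-style .csproj -->
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
</ItemGroup>
```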


Step 3: Handling the Unexpected

Of course, some projects didn’t upgrade cleanly:

  • A few third-party NuGet packages weren’t compatible with .NET 8.

  • WCF references required rewrites or moving to gRPC or REST.

  • Some AppDomain and Remoting APIs had no direct equivalent.

For each issue, the Upgrade Assistant flagged the problem and left comments in the code or console to guide manual intervention.



Step 4: Testing & Tuning

After upgrading:

  • I ran all unit and integration tests (yes, we had a decent suite!)

  • Manually tested critical app paths

  • Removed obsolete dependencies

  • Modernized logging to use Microsoft.Extensions.Logging

  • Replaced Web API methods with Minimal APIs
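As an illustration of that last step, a controller-based Web API endpoint can collapse into a minimal API in Program.cs. A sketch assuming .NET 8 with implicit usings, where Order and OrderService are hypothetical stand-ins for our real types:

```csharp
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddSingleton<OrderService>();

var app = builder.Build();

// One mapped endpoint replaces an entire OrdersController
app.MapGet("/orders/{id:int}", (int id, OrderService orders) =>
    orders.Find(id) is { } order ? Results.Ok(order) : Results.NotFound());

app.Run();

record Order(int Id, string Product);

class OrderService
{
    private readonly List<Order> _orders = new() { new(1, "Widget") };
    public Order? Find(int id) => _orders.FirstOrDefault(o => o.Id == id);
}
```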


✅ Benefits I Saw Firsthand

Using the Upgrade Assistant saved me weeks of effort and helped me maintain confidence throughout the process.

Here’s what I gained:

Benefit | Impact
⚙️ Incremental upgrades | Focused on one project at a time
⏱️ Time savings | Automated 60–70% of migration work
🧠 Developer awareness | Better understanding of what changed in modern .NET
🔒 Security | Moved away from unsupported frameworks
☁️ Cloud-ready | Easy to deploy via Docker + Azure App Service

🧭 My Advice for You

  • Take a backup (or commit everything before you start)

  • Use upgrade-assistant analyze to scan your solution beforehand

  • Upgrade class libraries first, then consumers like web apps

  • Expect manual tweaks—this tool gets you 80% there

  • Don't forget to update your DevOps pipeline, if applicable
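The analyze step from the advice above is its own command and makes no changes; run it first to get a readiness report:

```shell
# Scans the solution and reports blockers, incompatible packages, and risky APIs
upgrade-assistant analyze MyLegacySolution.sln
```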


🧵 Conclusion

Migrating a large legacy .NET Framework solution to .NET 8 might seem overwhelming—but it doesn't have to be a full rewrite or a leap into the unknown. With the .NET Upgrade Assistant, I found a structured, guided path that made the journey manageable and insightful.

It won’t magically fix every line of code, but it will illuminate the path, project by project, and help you bring your software into the modern .NET era.

If you're holding off on your upgrade—this is your sign to get started.



Thursday, April 24, 2025

.NET 8 vs .NET 9: Which One Should You Choose in 2025?

 

As we move deeper into 2025, many developers and architects are evaluating whether to adopt .NET 8 or make the leap to .NET 9. With both versions available—and each offering distinct advantages—the choice isn’t always straightforward.

This post breaks down the key differences between .NET 8 and .NET 9, outlines when to use each, and provides recommendations based on real-world scenarios.


📌 Support and Stability

Let’s start with what matters most for enterprise applications: support lifecycle.

  • .NET 8 (LTS)

    • Long-Term Support (LTS) until November 2026

    • Ideal for stable, long-running applications

  • .NET 9 (STS)

    • Standard-Term Support (STS) until May 2026

    • For fast-paced, innovation-driven projects

✅ Choose .NET 8 if you need long-term reliability.
⚠️ Opt for .NET 9 if you're exploring cutting-edge features and can upgrade regularly.


🚀 Performance and Optimizations

.NET 9 introduces hundreds of improvements, especially for runtime performance and deployment optimization:

Notable upgrades in .NET 9:

  • Better JIT optimizations and vectorization

  • Reduced memory footprint via Native AOT and trimming

  • Performance wins in JSON, LINQ, GC, and interop

๐Ÿ† If performance is your top priority—especially for ML, gaming, or high-load services—.NET 9 will impress.


🧰 New Features and Developer Experience

Here’s what’s new from a developer’s perspective:

.NET 8 introduced C# 12, expanded Native AOT publishing, and Blazor’s unified full-stack web model.

.NET 9 builds on that with C# 13, further AOT and trimming improvements, and continued evolution of .NET Aspire.

💡 .NET 9 is ideal for early adopters and those who want to shape the future of .NET development.


📊 Quick Comparison Table

Feature | .NET 8 (LTS) | .NET 9 (STS)
Support lifecycle | Nov 2026 | May 2026
Stability | ✅ High | ⚠️ Moderate
Performance | ⚡ Fast | ⚡⚡ Faster (esp. trimming & JIT)
Cloud-native (Aspire) | ✅ Supported | ✅ Evolving
Use case fit | Long-term core apps | High-performance, fast iteration
Upgrade frequency | Every 2–3 years | Every 12 months

✅ Recommendation

Use this decision matrix:

  • Use .NET 8 if:

    • You want a stable platform with long-term support.

    • Your app is customer-facing or high-reliability.

    • You’re adopting .NET Aspire for microservices.

  • Use .NET 9 if:

    • You need bleeding-edge performance.

    • You’re building for ML, AI, gaming, or analytics.

    • You can handle yearly upgrades and testing.

Both support side-by-side deployments—so you don’t have to choose globally. Mix and match based on service type.
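Side-by-side targeting is usually pinned per repository with a global.json at the repo root; a minimal sketch (the SDK version shown is an example):

```json
{
  "sdk": {
    "version": "8.0.100",
    "rollForward": "latestFeature"
  }
}
```

Repos without a global.json simply use the newest SDK installed, so pinning is what lets .NET 8 and .NET 9 services coexist on one build machine.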


🧠 Final Thoughts

.NET 8 provides the foundation for building large, reliable systems.
.NET 9 pushes boundaries with performance and modern development techniques.

Before upgrading, always consult the .NET release schedule to plan your lifecycle.

Tuesday, April 15, 2025

🚀 Introducing .NET 9 – Smarter, Faster, and Built for the Future

Over the past few weeks, I’ve made it a point to stay more up to date with the latest developments in the .NET ecosystem. As someone who’s been working with .NET for years, I realized that keeping pace with each new release is not only exciting but essential—especially with how fast Microsoft is evolving the platform.

So, I dove into the newest release: .NET 9. And wow—this update brings some major improvements across cloud-native development, performance, and cross-platform UI frameworks like MAUI. Whether you're building modern web apps, services, or desktop/mobile apps, .NET 9 has something powerful to offer.

In this post, I’m summarizing some of the key highlights I found most exciting and useful. Hopefully, this helps you get up to speed quickly too.


🌟 What's New in .NET 9?

🔧 Cloud-Native by Default

.NET 9 doubles down on container-first, cloud-native development: .NET Aspire matures alongside this release, and the official container images keep getting smaller and faster to pull.


⚡ Performance: .NET Gets Even Faster

.NET has always prioritized performance, and .NET 9 continues this trend with a faster exception-handling implementation, smarter server GC defaults, and broader vectorization in the core libraries.


🧱 ASP.NET Core: The Web Stack Evolves

ASP.NET Core in .NET 9 brings several developer-friendly updates:

  • Blazor unification complete: You can now mix server-side and WebAssembly rendering seamlessly.
    🔗 Blazor Documentation

  • SignalR over HTTP/3: Better real-time experiences.
    🔗 SignalR Overview

  • Enhanced middleware pipeline with performance-first patterns.
    🔗 ASP.NET Core Middleware

  • Built-in endpoint filters and validation: Cleaner code, better DX.
    🔗 Minimal API Filters


🖥️ .NET MAUI Matures

.NET 9 continues to polish the cross-platform UI framework:

  • More stable and responsive UI across iOS, Android, macOS, and Windows.

  • Improved Hot Reload and Live Preview support in Visual Studio.

  • Simplified lifecycle management and background tasks.
    🔗 Get Started with .NET MAUI


🧪 Better Testing, Observability & DevOps

  • Improved support for test containers: Run integration tests against ephemeral containers.
    🔗 .NET Test Container Docs

  • dotnet monitor now comes with auto-discovery and UI enhancements.
    🔗 dotnet monitor

  • Better profiling and diagnostics support in Visual Studio and CLI.
    🔗 .NET Diagnostics Tools


💡 C# 13: Smarter, Cleaner Code

.NET 9 ships with C# 13, bringing:

  • params collections – params parameters can now be Span<T>, IEnumerable<T>, and other collection types

  • The new System.Threading.Lock type for cleaner, faster locking

  • Implicit "from the end" (^) index access in object initializers

  • Partial properties and indexers
    🔗 What's New in C# 13

These features reduce boilerplate and make your code more readable and expressive.



🧭 Final Thoughts

.NET 9 isn’t just an update—it’s a refinement of a developer-first, cloud-ready, performance-oriented ecosystem. Microsoft has listened to the community and delivered a platform that’s ready for the next generation of applications.

Whether you're a backend engineer, cloud architect, or full-stack developer, .NET 9 will save you time, reduce complexity, and future-proof your apps.

🎯 Ready to dive in?
👉 Download .NET 9
👉 Try the new templates: dotnet new list
👉 Upgrade your existing projects with the .NET Upgrade Assistant: upgrade-assistant upgrade

Sunday, February 9, 2025

Mastering Parallel Processing in .NET: Unlocking Performance with Parallel.ForEach

 


In today’s high-performance applications, processing large datasets efficiently is crucial. Traditional sequential loops can become a bottleneck, especially when dealing with CPU-bound operations. This is where .NET's Parallel.ForEach comes into play, offering a powerful way to execute loops concurrently and significantly reduce execution time.

In this article, we’ll explore how to use Parallel.ForEach, its advantages, potential pitfalls, and best practices.


Understanding Parallel.ForEach

The Parallel.ForEach method in .NET enables parallel execution of loop iterations using multiple threads from the ThreadPool. It’s particularly useful for CPU-intensive tasks, such as processing large collections, performing calculations, or handling file operations.

Basic Syntax

Parallel.ForEach(collection, item =>
{
    // Process each item in parallel
    Console.WriteLine($"Processing {item} on thread {Thread.CurrentThread.ManagedThreadId}");
});

When to Use Parallel.ForEach

Best Use Cases:

  • CPU-bound operations – Data processing, mathematical computations, or image transformations.

  • Processing large collections – Handling millions of records efficiently.

  • Independent iterations – When each loop iteration doesn’t depend on the result of another.

When NOT to Use:

  • I/O-bound operations – Parallelizing network or database calls can lead to thread starvation.

  • Order-dependent processing – If you need strict ordering, consider PLINQ instead.

  • Mutating shared state – Risk of race conditions if multiple threads modify the same resource.
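For the I/O-bound case above, a better fit on .NET 6+ is Parallel.ForEachAsync, which awaits work instead of blocking ThreadPool threads; a small sketch:

```csharp
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

var processed = new ConcurrentBag<int>();
var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };

// Each iteration awaits simulated I/O instead of tying up a thread
await Parallel.ForEachAsync(Enumerable.Range(1, 20), options, async (i, ct) =>
{
    await Task.Delay(10, ct); // stand-in for a network or database call
    processed.Add(i);
});

Console.WriteLine(processed.Count); // 20
```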


Optimizing Performance with Parallel.ForEach

1. Controlling Parallelism with ParallelOptions

By default, Parallel.ForEach will try to use all available CPU cores, which might not always be optimal. You can control the degree of parallelism using ParallelOptions:

var options = new ParallelOptions
{
    MaxDegreeOfParallelism = Environment.ProcessorCount / 2
};

Parallel.ForEach(Enumerable.Range(1, 100), options, i =>
{
    Console.WriteLine($"Processing {i} on thread {Thread.CurrentThread.ManagedThreadId}");
});

This limits the number of concurrent tasks, preventing CPU overuse.


2. Handling Exceptions Gracefully

When running tasks in parallel, multiple exceptions might occur. Instead of failing at the first error, wrap your logic in a try-catch and use AggregateException to handle them properly.

try
{
    Parallel.ForEach(Enumerable.Range(1, 10), i =>
    {
        if (i == 5) throw new Exception("Error at 5");
        Console.WriteLine($"Processing {i}");
    });
}
catch (AggregateException ex)
{
    foreach (var e in ex.InnerExceptions)
    {
        Console.WriteLine($"Caught exception: {e.Message}");
    }
}

3. Breaking Out of Parallel.ForEach

If you need to stop processing early, use the ParallelLoopState parameter:

Parallel.ForEach(Enumerable.Range(1, 100), (i, state) =>
{
    if (i == 50)
    {
        Console.WriteLine("Stopping execution...");
        state.Break();
    }
    Console.WriteLine($"Processing {i}");
});

This ensures minimal wasted work when an early exit condition is met.


4. Using Partitioner for Large Collections

For massive datasets, Partitioner optimizes performance by efficiently distributing workloads across threads.

// The load-balancing overload of Partitioner.Create requires an array or
// IList, so materialize the range first
var partitioner = Partitioner.Create(Enumerable.Range(1, 1000).ToArray(), loadBalance: true);

Parallel.ForEach(partitioner, i =>
{
    Console.WriteLine($"Processing {i}");
});

This approach enhances cache efficiency and reduces thread contention.


Final Thoughts

Parallel processing in .NET can significantly boost application performance when used correctly. By leveraging Parallel.ForEach, controlling parallelism, handling exceptions, and using partitioning techniques, you can maximize efficiency while avoiding common pitfalls.

🔹 Key Takeaways:
✔ Use Parallel.ForEach for CPU-bound tasks.
✔ Control concurrency with ParallelOptions.
✔ Handle exceptions with AggregateException.
✔ Break out early when needed.
✔ Optimize large collections with Partitioner.

With these best practices, you can write high-performance, scalable .NET applications that fully utilize modern multi-core processors. 🚀


What are your thoughts on Parallel.ForEach? Have you used it in your projects? Share your experiences in the comments below!