Earlier today, fellow Umbraco package dev Nathan Woulfe blogged about his struggles with getting versioned deployments working in Azure DevOps, and in the spirit of sharing, I thought I'd outline my approach to solving the same problem.
Problem
As package devs, you really want to spend as little time preparing deployments as possible: the quicker it is to deploy, the less of a problem it is when you need to push changes out quickly.
The best way to achieve this is through automation, using tools and scripts to generate builds and deploy them for you.
With our package Vendr, our approach has evolved over the years, but today we have a pretty nice setup. Releasing nightly/unstable builds is as simple as committing changes to our code repository (we regularly share these builds with developers so they can quickly review code fixes), and pushing out a release is just a matter of merging into our main branch and tagging that merge with a version number.
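To make that concrete, here's a rough sketch of that release flow which you can run in a scratch directory. Everything here is a hypothetical stand-in (the file, the commit messages, and a local bare repository playing the role of the hosted remote); the real repository just does the same merge-and-tag dance.

```shell
# Sketch of the release flow in a throwaway repository.
set -e
demo=$(mktemp -d) && cd "$demo"

# A local bare repo stands in for the hosted remote.
git init -q --bare origin.git
git clone -q origin.git work && cd work
git config user.email "dev@example.com" && git config user.name "Dev"

# Day-to-day commits on the dev branch produce nightly/unstable builds.
git checkout -q -B main
echo "v1" > package.txt && git add . && git commit -qm "Initial release"
git push -q origin main
git checkout -q -b dev
echo "v2" > package.txt && git commit -qam "Fix"

# Releasing: merge dev into main and tag the merge with a version number.
git checkout -q main
git merge -q --no-edit dev
git tag v1.2.0
git push -q origin main --tags
git ls-remote --tags origin
```

The pushed tag is what the versioning tooling (covered later in this post) picks up to calculate the release version.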
NUKE Build
The first step to getting things automated is setting up a build script. One requirement I've learned is important over the years is the ability to run the build script locally.
Previously we used MSBuild for this; however, as it is XML based, it's really quite cumbersome and takes a lot of trial and error to get working. More recently we've started to use an excellent tool called NUKE Build, a C# based build automation tool that lets us write our build scripts in C#, benefiting from strongly typed variables and the ability to use any existing C# library for custom build requirements.
I won't go into a ton of detail on how to use it, but essentially you install the global build tool via the command
dotnet tool install Nuke.GlobalTool --global
and then run the command nuke :setup
to initialize your project. This will take you through a wizard to set up the build project in your solution.
Once this has run, a number of folders and script files will have been added to your solution. In Visual Studio you'll see a new _build project, containing a Build.cs file that is your build script.
NUKE will have already stubbed out a build script for you, but you can use the following as a basic script.
using Nuke.Common;
using Nuke.Common.CI;
using Nuke.Common.Execution;
using Nuke.Common.IO;
using Nuke.Common.ProjectModel;
using Nuke.Common.Tools.DotNet;
using Nuke.Common.Tools.GitVersion;
using Nuke.Common.Utilities.Collections;
using static Nuke.Common.IO.FileSystemTasks;
using static Nuke.Common.IO.PathConstruction;
using static Nuke.Common.Tools.DotNet.DotNetTasks;
[CheckBuildProjectConfigurations]
[ShutdownDotNetAfterServerBuild]
class Build : NukeBuild
{
    public static int Main () => Execute<Build>(x => x.Pack);

    [Parameter("Configuration to build - Default is 'Debug' (local) or 'Release' (server)")]
    readonly Configuration Configuration = IsLocalBuild ? Configuration.Debug : Configuration.Release;

    [Solution]
    readonly Solution Solution;

    [GitVersion(Framework = "net5.0")]
    readonly GitVersion GitVersion;

    AbsolutePath SourceDirectory => RootDirectory / "src";
    AbsolutePath ArtifactsDirectory => RootDirectory / "artifacts";

    // =================================================
    // Clean
    // =================================================
    Target Clean => _ => _
        .Executes(() =>
        {
            SourceDirectory.GlobDirectories("**/bin", "**/obj").ForEach(DeleteDirectory);
            EnsureCleanDirectory(ArtifactsDirectory);
        });

    // =================================================
    // Compile
    // =================================================
    Target Restore => _ => _
        .DependsOn(Clean)
        .Executes(() =>
        {
            DotNetRestore(s => s
                .SetProjectFile(Solution));
        });

    Target Compile => _ => _
        .DependsOn(Restore)
        .Executes(() =>
        {
            DotNetBuild(s => s
                .SetProjectFile(Solution)
                .SetConfiguration(Configuration)
                .SetAssemblyVersion(GitVersion.AssemblySemVer)
                .SetFileVersion(GitVersion.AssemblySemFileVer)
                .SetInformationalVersion(GitVersion.InformationalVersion)
                .EnableNoRestore());
        });

    // =================================================
    // Pack
    // =================================================
    Target Pack => _ => _
        .DependsOn(Compile)
        .Produces(ArtifactsDirectory)
        .Executes(() =>
        {
            DotNetPack(c => c
                .SetProject(Solution)
                .SetConfiguration(Configuration)
                .SetVersion(GitVersion.NuGetVersionV2)
                .SetOutputDirectory(ArtifactsDirectory)
                .SetNoBuild(true));
        });
}
There is a lot going on here, but ultimately it's defining a number of "Targets", each performing an individual task in the build process. The tasks we have defined are:
- Clean - Cleans out your build artifacts directory (where your package files will be generated)
- Restore - Restores your solution's NuGet packages
- Compile - Compiles your solution (Release on a build server, Debug locally, as per the Configuration parameter)
- Pack - Generates your NuGet packages
NB this script assumes you are using the new .NET SDK style projects and your solution is committed to a git repository
With this build script defined, you can then run the command build pack (via the build.cmd / build.sh wrapper scripts NUKE added to your project root) to trigger your build process starting from the Pack target. Once the build script runs, you should find your generated NuGet packages in your artifacts directory.
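Because each step is just a Target, extending the pipeline later is straightforward. As a hypothetical example (assuming your solution contains test projects, which the script above doesn't show), a Test target could be slotted in between Compile and Pack using NUKE's DotNetTest task:

```csharp
// Hypothetical addition: run the solution's test projects after compiling.
// To wire it into the chain, Pack would then declare .DependsOn(Test).
Target Test => _ => _
    .DependsOn(Compile)
    .Executes(() =>
    {
        DotNetTest(s => s
            .SetProjectFile(Solution)
            .SetConfiguration(Configuration)
            .EnableNoBuild());
    });
```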
GitVersion
When automating build scripts like this, one of the biggest pains, and the pain Nathan had in his build setup, is managing the version number of the artifacts being generated. For us, a tool called GitVersion is a godsend.
GitVersion is a tool that looks into your git repository and uses a number of conventions to determine the next version number to generate. It's a pretty powerful tool with quite a lot to it, too much for one blog post, so I'll just suggest you head over to the GitVersion website to find out more about the intricacies.
For our purposes though, it's enough to know that whenever we commit to the dev branch, it will generate an alpha build with the build number appended on the end, and when we push a git tag such as v1.2.0 on the main branch, it will create a full release build. The very presence of the v1.2.0 tag will also increment the dev branch versions from that point on, so our version numbers always stay in the correct build order.
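GitVersion's defaults cover these conventions out of the box, but for illustration, a minimal GitVersion.yml in the repository root might look something like this (a hypothetical sketch, not our exact config; GitVersion's built-in develop branch configuration already matches a branch named dev):

```yaml
# Hypothetical GitVersion.yml (GitVersion 5.x syntax)
mode: ContinuousDeployment
branches:
  develop:
    tag: alpha   # pre-release label for dev-branch builds, e.g. 1.3.0-alpha.14
```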
A nice thing about NUKE Build is that GitVersion is a first-class citizen, so it requires minimal setup to get working.
In the build script above, you can see GitVersion used in a few locations. At the very top, we define a GitVersion field, which tells NUKE we want to use GitVersion, and then in both the Compile and Pack targets we access its properties to set things like the compiled assembly versions (GitVersion.AssemblySemVer) and the version of the generated NuGet packages (GitVersion.NuGetVersionV2).
All of these version numbers are automatically generated, each in the format required for its specific use case, so we just use the properties to set the version numbers and we are done.
Azure DevOps
With the above in place, we have the ability to build our packages locally, which is really useful for testing or creating one-off builds for someone. However, we don't want to be running the builds manually every time, and this is where Azure DevOps comes into play.
Azure DevOps is a lot of things (we actually use it to host our private git repos too), but the thing we are most interested in for automated deployments is Azure Pipelines. Azure Pipelines is basically a configurable build server that can monitor our git repository and trigger builds automatically when certain criteria are met.
Azure Pipelines can be configured via the Azure portal, but it's often easier to use an azure-pipelines.yml file in the root of your project.
In Vendr, for our payment providers, we have something like this.
trigger:
  branches:
    include:
      - dev
      - hotfix/*
      - release/*
  tags:
    include:
      - v*

variables:
  - group: 'vendr'
  - name: 'vmImageName'
    value: 'vs2017-win2016'
  - name: 'nuGetOrgServiceCreds'
    value: 'NuGet.org (Vendr)'

stages:
  - stage: build
    displayName: Build
    dependsOn: [ ]
    pool:
      vmImage: $(vmImageName)
    jobs:
      - job: build
        displayName: 'Build'
        dependsOn: [ ]
        steps:
          - task: CmdLine@2
            inputs:
              script: './build.cmd Pack'
          - task: PublishBuildArtifacts@1
            inputs:
              pathToPublish: './artifacts'
              artifactName: artifacts
  - stage: deploy
    displayName: Deploy
    condition: succeeded()
    dependsOn: [ build ]
    jobs:
      - deployment: deploy
        displayName: Deploy
        environment: 'development'
        pool:
          vmImage: $(vmImageName)
        strategy:
          runOnce:
            deploy:
              steps:
                # Unstable Deploy
                - task: NuGetCommand@2
                  displayName: 'Deploy to unstable feed'
                  inputs:
                    command: 'push'
                    packagesToPush: '$(Pipeline.Workspace)/artifacts/**/*.nupkg;!$(Pipeline.Workspace)/artifacts/**/*.snupkg'
                    nuGetFeedType: 'internal'
                    publishVstsFeed: '{project_name}/{feed_name}'
                # Release Deploy
                - task: NuGetCommand@2
                  displayName: 'Deploy to NuGet.org'
                  condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
                  inputs:
                    command: push
                    nuGetFeedType: external
                    packagesToPush: '$(Pipeline.Workspace)/artifacts/**/*.nupkg;!$(Pipeline.Workspace)/artifacts/**/*.snupkg'
                    publishFeedCredentials: '$(nuGetOrgServiceCreds)'
Again, there is a fair bit to dig into here, but ultimately this script first configures which changes in the git repository should trigger the build, then sets and imports some variables, before moving on to the build process.
For the build stage, we trigger the build script we previously defined, then publish the generated artifacts as Azure Pipeline artifacts (this just makes them available to other stages).
In the deploy stage, we then publish the NuGet package files to our unstable NuGet feed (another thing we use Azure DevOps for), and if the build was triggered by the push of a git tag in the format vX.X.X, we also publish the NuGet packages to nuget.org.
With this script defined and committed to your repository, you can go through the Azure Pipelines setup wizard in Azure DevOps to point it at your repository. It should automatically pick up the azure-pipelines.yml config, and from then on it will monitor your repository, run the build script and publish your assets as changes are made.
Conclusion
I've tried to summarise a lot of information here, so there is probably a lot you might need to look into further, but I hope I've given just enough to show how things work together and the benefits of doing things this way.