Wednesday, July 29, 2020

Managing ADO Licenses from Azure DevOps CLI

My last post introduced using JMESPath with the az devops cli, which hopefully gave you some insight into using the az cli and the az devops extension. Today I want to highlight how you can easily pull the az devops cli into PowerShell to unlock some amazing scripting ability.

A good example of this is how we can use PowerShell + az devops cli to manage Azure DevOps User Licenses.

Background

Azure DevOps is a licensed product, and while you can have unlimited free Stakeholder licenses, any developer that needs access to code repositories needs a Basic license, which costs about $7 CAD / month.

It's important to note that this cost is not tied to usage, so if you've allocated a license that sits idle, you're still paying for it. Interestingly, this is not a fixed monthly cost, but is prorated on a daily basis within the billing period. So if you can convert Basic licenses to Stakeholder licenses, you’ll only pay for the days in the billing period when the license was active.
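To put rough numbers to that proration (a sketch using a hypothetical $7 license over a 30-day billing period – check your own billing for actual rates), the math looks like this:

```powershell
# Hypothetical figures: $7 CAD over a 30-day billing period
$dailyRate  = 7 / 30        # cost per day while the license is Basic
$daysActive = 12            # days before we downgraded to Stakeholder
$prorated   = [math]::Round($dailyRate * $daysActive, 2)
$prorated                   # roughly 2.80 instead of the full 7.00
```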

If you establish a process to revoke licenses when they're not being used, you can save your organization a few dollars that would otherwise be wasted. It might not be much, but when you consider that 10 user licenses for the year cost about $840, the costs do add up – something you could argue should be added to your end-of-year bonus.

Integrating the Azure DevOps CLI into PowerShell

To kick things off and to show how incredibly easy this is, let's start with this snippet:

$users = az devops user list | ConvertFrom-Json

Write-Host $users.totalCount
Write-Host $users.items[0].user.principalName

Boom. No special magic. We just call the az devops cli directly in our PowerShell script, converting the JSON result into an object by piping it through the ConvertFrom-Json cmdlet. We can easily interrogate object properties and build up some conditional logic. Fun.
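One thing worth guarding against (a defensive sketch, not part of the original snippet): az writes errors to stderr and returns a non-zero exit code, so it pays to check $LASTEXITCODE before piping into ConvertFrom-Json:

```powershell
# Capture the raw output first so we can validate the call succeeded
$json = az devops user list
if ($LASTEXITCODE -ne 0) {
    throw "az devops user list failed - are you logged in (az login)?"
}
$users = $json | ConvertFrom-Json
```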

Use JMESPath to Simplify Results

While we could work with the results of this object directly, the result objects have a complex structure, so I’m going to flatten them down to make it easier to get at the properties we need. I only want the user’s email, license, and the dates when the user account was created and last accessed.

JMESPath makes this easy. If you missed the last post, go back and have a read to get familiar with this syntax.

function Get-AzureDevOpsUsers() {

    $query = "items[].{license:accessLevel.licenseDisplayName, email:user.principalName, dateCreated:dateCreated, lastAccessedDate:lastAccessedDate }"
    $users = az devops user list --query "$query" | ConvertFrom-Json

    return $users
}
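Running that function, each flattened item comes back looking something like this (values invented for illustration):

```json
{
  "license": "Basic",
  "email": "jane.doe@example.com",
  "dateCreated": "2019-06-01T14:30:00.000000+00:00",
  "lastAccessedDate": "2020-07-15T09:12:00.000000+00:00"
}
```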

Ok. That structure’s looking a lot easier to work with.

Finding Licenses to Revoke

Now all we need to do is find the licenses we want to convert from Basic to Stakeholder. Admittedly, this could create headaches for us if we're randomly taking away licenses that people might need in the future, but we should be safe if we target people who've never logged in or haven't logged in for 21 days.

I’m going to break this down into two separate functions. One to filter the list of users based on their current license, and another function to filter based on the last access date.

function Where-License()
{
    param(
        [Parameter(Mandatory=$true, ValueFromPipeline)]
        $Users,

        [Parameter(Mandatory=$true)]
        [ValidateSet('Basic', 'Stakeholder')]
        [string]$license
    )

    BEGIN{}
    PROCESS
    {
        $users | Where-Object -FilterScript { $_.license -eq $license }
    }
    END{}
}

function Where-LicenseAccessed()
{
    param(
        [Parameter(Mandatory=$true, ValueFromPipeline)]
        $Users,

        [Parameter()]
        [int]$WithinDays = 21,

        [Parameter()]
        [switch]$NotUsed = $false
    )
    BEGIN
    {
        $today = Get-Date
    }
    PROCESS
    {
        $Users | Where-Object -FilterScript {

            $lastAccess = (@( $_.lastAccessedDate, $_.dateCreated) | 
                            ForEach-Object { [datetime]$_ } |
                            Measure-Object -Maximum | Select-Object Maximum).Maximum

            $timespan = New-TimeSpan -Start $lastAccess -End $today

            if (($NotUsed -and $timespan.Days -gt $WithinDays) -or ($NotUsed -eq $false -and $timespan.Days -le $WithinDays)) {
                Write-Host ("User {0} last accessed {1} days ago." -f $_.email, $timespan.Days)
                return $true
            }

            return $false
        }
    }
    END {}
}

If you're new to PowerShell, the BEGIN, PROCESS and END blocks may look peculiar, but they're essential here: PROCESS runs once for each item received from the pipeline, which is what lets these functions chain together. There’s a really good write-up on this here. But to demonstrate, we can now chain these functions together, like so:
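As a minimal illustration of why the PROCESS block matters (a toy function, not part of the license script), each item flowing through the pipeline hits PROCESS exactly once:

```powershell
function Double-Value {
    param(
        [Parameter(ValueFromPipeline)]
        [int]$Value
    )
    PROCESS {
        # runs once per item received from the pipeline
        $Value * 2
    }
}

1, 2, 3 | Double-Value    # emits 2, 4, 6
```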

Get-AzureDevOpsUsers | 
    Where-License -license Basic | 
        Where-LicenseAccessed -NotUsed -WithinDays 21

Revoking Licenses

And then we use the ever so important function to revoke licenses. This is simply a wrapper around the az devops user update command to improve readability. (Note that the underlying API uses "express" as the license type corresponding to Basic.)

function Set-AzureDevOpsLicense()
{
    param(
        [Parameter(Mandatory=$true)]
        $User,

        [Parameter(Mandatory=$true)]
        [ValidateSet('express','stakeholder')]
        [string]$license
    )
    Write-Host ("Setting User {0} license to {1}" -f $User.email, $license)
    az devops user update --user $User.email --license-type $license | ConvertFrom-Json
}

Putting it all Together

So we've written all these nice little functions, let's put them together into a little PowerShell haiku:

$users = Get-AzureDevOpsUsers | 
                 Where-License -license Basic | 
                 Where-LicenseAccessed -NotUsed -WithinDays 21 | 
                 ForEach-Object { Set-AzureDevOpsLicense -User $_ -license stakeholder }
Write-Host ("Changed {0} licenses." -f $users.length)

Nice. Az you can see, that wasn't hard at all, and it probably saved my boss a few bucks. There are a few ways we can make this better...

Dealing with lots of Users

The az devops user list command can return up to 1000 users but only returns 100 by default. If you have a large organization, you'll need to make a few round trips to get all the data you need.

Let’s modify our Get-AzureDevOpsUsers function to retrieve all the users in the organization in batches.

function Get-AzureDevOpsUsers() {

    param(
        [Parameter()]
        [int]$BatchSize = 100
    )

    $query = "items[].{license:accessLevel.licenseDisplayName, email:user.principalName, dateCreated:dateCreated, lastAccessedDate:lastAccessedDate }"
    
    $users = @()

    $totalCount = az devops user list --query "totalCount"
    Write-Host "Fetching $totalCount users" -NoNewline

    $intervals = [math]::Ceiling($totalCount / $BatchSize)

    for($i = 0; $i -lt $intervals; $i++) {

        Write-Host -NoNewline "."

        $skip = $i * $BatchSize;
        $results = az devops user list --query "$query" --top $BatchSize --skip $skip | ConvertFrom-Json

        $users = $users + $results
    }   

    return $users
}

Giving licenses back

If the script can taketh licenses away, it should also giveth them back. To do this, we need the means to identify who should have a license. This can be accomplished using an Azure AD User Group populated with all users that should have licenses.

To get the list of these users, the Azure CLI comes to the rescue again. Once again, knowing JMESPath really helps, because we can simplify the entire result into a basic string array:

az login --tenant <tenantid>
$licensedUsers = az ad group member list -g <groupname> --query "[].otherMails[0]" | ConvertFrom-Json

Note that I'm using the otherMails property to get the email address, your mileage may vary, but in my Azure AD, this setting matches Members and Guests with their email address in Azure DevOps.
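If all goes well, $licensedUsers ends up as a plain string array, something like this (addresses invented):

```json
[
  "jane.doe@example.com",
  "sanjay.patel@example.com"
]
```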

With this magic array of users, my haiku can now reassign users their license if they've logged in recently without a license (sorry mate):

$licensedUsers = az ad group member list -g ADO_LicensedUsers --query "[].otherMails[0]" | ConvertFrom-Json

$users = Get-AzureDevOpsUsers
$reactivatedUsers = $users | Where-License -license Stakeholder | 
                            Where-LicenseAccessed -WithinDays 3 | 
                            Where-Object -FilterScript { $licensedUsers.Contains($_.email) } | 
                            ForEach-Object { Set-AzureDevOpsLicense -User $_ -license express }

$deactivatedUsers = $users | Where-License -license Basic | 
                            Where-LicenseAccessed -NotUsed -WithinDays 21 | 
                            ForEach-Object { Set-AzureDevOpsLicense -User $_ -license stakeholder }

Write-Host ("Reviewed {0} users" -f $users.Length)
Write-Host ("Deactivated {0} licenses." -f $deactivatedUsers.length)
Write-Host ("Reactivated {0} licenses." -f $reactivatedUsers.length)

Wrapping up

In the last few posts, we've looked at the Azure DevOps CLI, understanding JMESPath and now integrating both into PowerShell to unleash some awesome. If you're interested in the source code for this post, you can find it here.

In my next post, we'll build upon this and show you how to integrate this script magic into an Azure Pipeline that runs on a schedule.

Happy coding.

Monday, July 27, 2020

Azure DevOps CLI Examples

I've always been a fan of Azure DevOps's extensive REST API -- it's generally well documented, consistent, and it seems like you can do pretty much anything you can do from within the web-interface. As much as I love the API, I hate having to bust it out. Nowadays, my go-to tool is the Azure DevOps CLI.

The Azure DevOps CLI is actually an extension of the Azure CLI. It contains a good number of common functions that you would normally use on a daily basis, plus features that I would normally rely on the REST API for. Its real power is unlocked when it's combined with your favourite scripting language. I plan to write a few posts on this topic, so stay tuned, but for today, we'll focus on getting up and running plus some cool things you can do with the tool.

Installation

Blurg. I hate installation blog posts. Let's get this part over with:

choco install azure-cli -y
az extension add --name azure-devops

Whew. If you don't have Chocolatey installed, go here: https://chocolatey.org/install

Get Ready

Ok, so we're almost there. Just a few more boring bits. First we need to login:

az login --allow-no-subscription

A quick note on the above statement. There are a number of different login options available but I've found az login with the --allow-no-subscription flag supports the majority of use cases. It'll launch a web-browser and require you to login as you normally would, and the --allow-no-subscription supports scenarios where you have access to the AD tenant to login but you don't necessarily have a subscription associated to your user account, which is probably pretty common for most users who only have access to Azure DevOps.

This next bit lets us store some commonly used parameters so we don't have to keep typing them out.

az devops configure --defaults organization=https://dev.azure.com/<organization>

In case you're curious, this config file is stored in %UserProfile%\.azure\azuredevops\config

Our First Command

Let's do something basic, like getting a list of all projects:

az devops project list

If you've configured everything correctly, you should see a boatload of JSON fly by. The CLI supports different options for output, but JSON works best when paired with our automation, and there are some really cool things we can do with the result output by passing a JMESPath statement using the --query flag.

Understanding JMESPath

The JavaScript kids get all the cool tools. There are probably already a few dozen different ways of querying and manipulating JSON data, and JMESPath (pronounced James Path) is no different. The syntax is a bit confusing at first and it takes a little bit of tinkering to master it. So let's do some tinkering.

The best way to demonstrate this is to use the JSON output from listing our projects. Our JSON looks something like this:

{
   "value": [
     {
        "abbreviation": null,
        "defaultTeamImageUrl": null,
        "description": null,
        "id": "<guid>",
        "lastUpdateTime": "<date>",
        "name": "Project Name",
        "revision": 89,
        "state": "wellFormed",
        "url": "<url>",
        "visibility": "private"
     },
     ...
   ]
}

It's a single object with a property called "value" that contains an array. Let's do a few examples...

Return the contents of the array

Assuming that we want the details of the projects and not the outside wrapper, we can discard the "value" property and just get its contents, which is an array.

az devops project list --query "value[]"
[
    {
        "abbreviation": null,
        "defaultTeamImageUrl": null,
        "description": null,
        "id": "<guid>",
        "lastUpdateTime": "<date>",
        "name": "Project Name",
        "revision": 89,
        "state": "wellFormed",
        "url": "<url>",
        "visibility": "private"
    },
    ...
]

Return just the first element

Because the "value" property is an array, we can get the first element.

az devops project list --query "value[0]"

{
    "abbreviation": null,
    "defaultTeamImageUrl": null,
    "description": null,
    "id": "<guid>",
    "lastUpdateTime": "<date>",
    "name": "Project Name",
    "revision": 89,
    "state": "wellFormed",
    "url": "<url>",
    "visibility": "private"
}

You can also specify ranges:

  • [:2] = the first two items (up to, but not including, index 2)
  • [1:3] = the 2nd and 3rd items
  • [1:] = everything from the 2nd item onward
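For example, applying a slice to our project list (reusing the query shape from above) would return only the 2nd and 3rd projects:

```shell
az devops project list --query "value[1:3].{ id:id, name:name }"
```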

Return an array of properties

If we just wanted the id property of each element in the array, we can specify the property we want. The result assumes there are only 4 projects.

az devops project list --query "value[].id"

[
    "<guid>",
    "<guid>",
    "<guid>",
    "<guid>"
]

Return specific properties

This is where JMESPath gets a tiny bit odd. In order to get just a handful of properties we need to do a "projection" which is basically like stating what structure you want the JSON result to look like. In this case, we're mapping the id and name property to projectId and projectName in the result output.

az devops project list --query "value[].{ projectId:id, projectName:name }"

[
    {
        "projectId": "<guid>",
        "projectName": "Project 1"
    },
    {
        "projectId": "<guid>",
        "projectName": "Project 2"
    },
    ...
]

Filter the results

Here's where things get really interesting. We can put functions inside the JMESPath query to filter the results. This allows us to mix and match the capabilities of the API with the output filtering capabilities of JMESPath. This returns only the projects that are public.

az devops project list --query "value[?visibility=='public'].{ id:id, name:name }"
[
    {
        "id": "<guid>",
        "name": "Project 3"
    }
]

We could have also written this as:

--query "value[?contains(visibility,'public')].{id:id, name:name}"

Piping the results

In the above example, JMESPath assumes that the results will be an array. We can pipe the result to further refine it. In this case, we want just the first object in the resulting array.

az devops project list --query "value[?visibility=='public'].{ id:id, name:name} | [0]"

{
   "id": "<guid>",
   "name": "Project 3"
}

Piping can improve the readability of the query similar to a functional language. For example, the above could be written as a filter, followed by a projection, followed by a selection.

--query "value[?contains(visibility,'public')] | [].{id:id, name:name} | [0]"

Wildcard searches

Piping the results becomes especially important if we want just the single value of a wildcard search. For this example, I need a different JSON structure, specifically a security descriptor:

[
  {
    "acesDictionary": {
      "Microsoft.IdentityModel.Claims.ClaimsIdentity;<dynamic-value>": {
        "allow": 16,
        "deny": 0,
        "descriptor": "Microsoft.IdentityModel.Claims.ClaimsIdentity;<dynamic-value>",
        "extendedInfo": {
          "effectiveAllow": 32630,
          "effectiveDeny": null,
          "inheritedAllow": null,
          "inheritedDeny": null
        }
      }
    },
    "includeExtendedInfo": true,
    "inheritPermissions": true,
    "token": "repoV2"
  }
]

In this structure, I'm interested in getting the "allow", "deny" and "token" values, but the first element in the acesDictionary contains a dynamic value. We can use a wildcard "*" to substitute for properties we don't know at runtime.

Let's try to isolate that "allow". The path would seem like [].acesDictionary.*.allow, but because JMESPath has no idea whether the wildcard matches a single element, it returns an array:

[
    [
        16
    ]
]

If we pipe the result, [].acesDictionary.*.allow | [0] we'll get a single value.

[
    16
]

Following suit and jumping ahead a bit so that I can skip to the answer, I can grab the "allow", "deny" and "token" with the following query. At this point, I trust you can figure this out by referencing all the examples I've provided. The query looks like:

--query "[].{allow:acesDictionary.*.allow | [0], deny:acesDictionary.*.deny | [0], token:token } | [0]"
{
    "allow": 16,
    "deny": 0,
    "token": "repoV2"
}

Ok! That is waay too much JMESPath. Let's get back on topic.

Using the Azure DevOps CLI

The Azure DevOps CLI is designed with commands and subcommands and has a few entry points. At each level, there are the obvious inclusions (list, add, delete, update, show), but there are a few additional commands per level.

  • az devops
    • admin
      • banner
    • extension
    • project
    • security
      • group
      • permission
        • namespace
    • service-endpoint
      • azurerm
      • github
    • team
    • user
    • wiki
      • page
  • az pipelines
    • agent
    • build
      • definition
      • tag
    • folder
    • pool
    • release
      • definition
    • runs
      • artifact
      • tag
    • variable
    • variable-group
  • az boards
    • area
      • project
      • team
    • iteration
      • project
      • team
    • work-item
      • relation
  • az repos
    • import
    • policy
      • approver-count
      • build
      • case-enforcement
      • comment-required
      • file-size
      • merge-strategy
      • required-reviewer
      • work-item-linking
    • pr
      • policy
      • reviewer
      • work-item
    • ref
  • az artifacts
    • universal

I won’t go into all of these commands and subcommands, but I can showcase a few of the ones I’ve used the most recently…

List of Projects

az devops project list --query "value[].{id:id, name:name}"

List of Repositories

az repos list --query "[].{id:id, defaultBranch:defaultBranch, name:name}" 

List of Branch Policies

az repos policy list --project <name> --query "[].{name: type.displayName, required:isBlocking, enabled:isEnabled, repository:settings.scope[0].repositoryId, branch:settings.scope[0].refName}"

Service Connections

az devops service-endpoint list --project <name> --query "[].name"

One More Thing

So while the az devops cli is pretty awesome, it has a hidden gem. If you can't find a supporting command in the az devops cli, you can always call the REST API directly from the tool using the az devops invoke command. There's a bit of hunting through documentation and available endpoints to find what you're looking for, but you can get a full list of what's available using the following:

az devops invoke --query "[?contains(area,'build')]"
az devops invoke --query "[?area=='build' && resourceName=='timeline']"

[
  {
    "area": "build",
    "id": "8baac422-4c6e-4de5-8532-db96d92acffa",
    "maxVersion": 6.0,
    "minVersion": 2.0,
    "releasedVersion": "5.1",
    "resourceName": "Timeline",
    "resourceVersion": 2,
    "routeTemplate": "{project}/_apis/{area}/builds/{buildId}/{resource}/{timelineId}"
  }
]

We can invoke this REST API call by passing in the appropriate area, resource, route and query-string parameters. Assuming I know the buildId of a recent pipeline run, the following shows me the state and status of all the stages in that build:

az devops invoke 
    --area build 
    --resource Timeline 
    --route-parameters project=myproject buildId=2058 timelineid='' 
    --query "records[?contains(type,'Stage')].{name:name, state:state, result:result}"

Tip: the route and query parameters specified in the routeTemplate are case-sensitive.

More to come

Today's post outlined how to make sense of JMESPath and covered some cool features of the Azure DevOps CLI. In my next few posts, I'll dig deeper into using the CLI in your favourite scripting tool.

Happy coding.

Wednesday, July 15, 2020

Exclusive Lock comes to Azure Pipelines


As part of Sprint 171, the Azure DevOps team introduced a much needed feature for Multi-Stage YAML Pipelines, the Exclusive Lock "check" that can be applied to your environments. This feature silently slipped into existence without any mention of it in the release notes, but I was personally thrilled to see this. (At the time this post was written, Sprint 172 announced this feature was available)

Although Multi-Stage YAML Pipelines have been available for a while, there are still some subtle differences between their functionality and what's available through Classic Release Pipelines. Fortunately over the last few sprints we've seen a few incremental features to help close that feature parity gap, with more to come. One of the missing features is something known as "Deployment Queuing Settings" -- a Classic Release pipeline feature that dictates how pipelines are queued and executed. The Exclusive Lock check solves a few pain points but falls short on some of the more advanced options.

In this post, I'll walk through what Exclusive Locks are, how to use them and some other thoughts for consideration.

Deployments and Environments

Let's start with a multi-stage pipeline with a few stages, where we perform CI activities and each subsequent stage deploys into an environment. Although we could write our YAML to build and deploy using standard tasks, we're going to use the special "deployment" job that tracks builds against Environments.

trigger:
- master

stages:
- stage: ci_stage
  ...steps to compile and produce artifacts

- stage: dev_stage
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
  dependsOn: ci_stage
  jobs:
  - deployment: dev_deploy
    environment: dev
    strategy:
      runOnce:
        deploy:
          ...steps to deploy

- stage: test_stage
  dependsOn: dev_stage
  ...

If we were to run this hypothetical pipeline, the code would compile in the CI stage and then immediately start deploying into each environment in sequence. Although we definitely want to have our builds deploy into the environments in sequence, we might not want them to advance into the environments automatically. That's where Environment Checks come in.

Environment Checks

As part of multi-stage yaml deployments, Azure DevOps has introduced the concept of Environments which are controlled outside of your pipeline. You can set special "Checks" on the environment that must be fulfilled before the deployment can occur. On a technical note, environment checks bubble up from the deployment task to the stage, so the checks must be satisfied before the stage is allowed to start.

For our scenario, we're going to assume that we don't want to automatically go to QA, so we'll add an Approval Check that allows our testing team to approve the build before deploying into their environment. We'll add approval checks for the other stages, too. Yay workflow!

approval-checks

At this point, everything is great: builds deploy to dev automatically and then pause at the test_stage until the testing team approves. Later, we add more developers to our project and the frequency of the builds starts to pick up. Almost immediately, the single agent build pool starts to fill up with builds and the development team start to complain that they're waiting a really long time for their build validation to complete.

Obviously, we add more build agents. Chaos ensues.

What just happen'd?

When we introduced additional build agents, we were expecting multiple CI builds to run simultaneously but we probably weren't expecting multiple simultaneous deployments! This is why the Exclusive Lock is so important.

By introducing an Exclusive Lock, all deployments are forced to happen in sequence. Awesome. Order is restored.

There unfortunately isn't a lot of documentation available for the Exclusive Lock, but according to the description:

“Adding an exclusive lock will only allow a single run to utilize this resource at a time. If multiple runs are waiting on the lock, only the latest will run. All others will be canceled.”

Most of this is obvious, but what does 'All others will be canceled' mean?

Canceling Queued Builds

My initial impression of the "all other [builds] will be canceled" got me excited -- I thought this was similar to the “deploy latest and cancel the others” setting of Deployment Queuing Settings:

deployment-queue-settings

Unfortunately, this is not the intention of the Exclusive Lock. It focuses only on sequencing of the build, not on the pending queue. To understand what the “all others will be canceled” means, let's assume we have 3 available build agents and we'll use the az devops CLI to trigger three simultaneous builds.

az pipelines run --project myproject --name mypipeline 
az pipelines run --project myproject --name mypipeline 
az pipelines run --project myproject --name mypipeline

In this scenario, all three CI builds happen simultaneously, but the fun happens when all three pipeline runs hit the dev_stage. As expected, the first pipeline takes the exclusive lock on the development environment while the deployment runs, and the remaining two builds queue up waiting for the exclusive lock to be released. When the first build completes, the second build is automatically marked as canceled and the last build begins deployment.

exclusive-lock-queuing

This is awesome. However I was really hoping that I could combine the Exclusive Lock with the Approval Gate to recreate the same functionality of the Deployment Queuing option: approving the third build would cancel the previous builds. Unfortunately, this isn’t the case. I’m currently evaluating whether I can write some deployment automation in my pipeline to cancel other pending builds.

Wrapping Up

In my opinion, Exclusive Locks are a hidden gem of Sprint 171 as they’re essential if you’re automatically deploying into an environment without an Approval Gate. This feature recreates the “deploy all in sequence” feature of Classic Release Pipelines. The jury is still out on canceling builds from automation. I’ll keep you posted.

Happy coding!

Tuesday, July 14, 2020

Using Templates to improve Pull Requests and Work-Items (Part 2)

In my previous post, I outlined how to setup templates for pull-requests. Today we’ll focus on how to configure work-items with some project-specific templates. We’ll also look at how you can create these customizations for all projects within the enterprise and the motivations for doing so.

While having a good pull-request template can improve clarity and reduce the effort needed to approve pull-requests, having well-defined work-items is equally important, as they can drive and shape the work that needs to happen. We can define templates for our work-items to encourage work-item authors to provide the right level of detail (such as steps to reproduce a defect), or we can use templates to reduce effort for commonly created work-items (such as fields that are commonly set when creating a technical-debt work-item).

Creating Work-Item Templates

Although you can define pull-request templates as files in your git repository, Azure DevOps doesn’t currently support managing work-item templates as source files. This is largely due to the complexity of the work-item structure and the level of customization available, so our only option to date is to manipulate the templates through the Azure Boards user-interface. Fortunately, it’s relatively simple and there are a few different ways you can set up and customize your templates – you can either specify customizations through the Teams configuration for your Project, or you can extract a template from an existing work-item.

As extracting from an existing work-item is easier, we’ll look at this first.

Creating Templates from Existing Work-Items

To create a template from an existing work item, simply create a new work-item that represents the content that you’d like to see in your template. The great news is that our template can capture many different elements, ranging from the description and other commonly used fields to more specialized fields like Sprint or Labels.

It’s important to note that templates are team specific, so if you’re running a project with multiple scrum teams, each team can self-organize and create templates that are unique to their needs.

Here’s an example of user story with the description field pre-defined:

work-item-example

Once we like the content of the story, we can convert it into a template using the ellipsis menu (…) Templates –> Capture:

work-item-capture-template

The capture dialog allows us to specify which fields we want to include in our template. This typically populates with the fields that have been modified, but you can remove or add any additional fields you want:

capture-template-dialog

As some fields are stored in the template as HTML, using this technique of creating a template from an existing work-item is especially handy.

Customizing Templates

Once you’ve defined the templates, you’ll find them in Settings –> Team Configuration. There’s a sub-navigation item for Templates.

edit-template

Applying Templates

Once you have the template(s) created, there are a few ways you can apply them to your work-items: you can apply the template to the work-item while you’re editing it, or you can apply it to the work-item from the backlog. Both activities are achieved using the ellipsis menu: Templates –> <template-name>.

The latter option of applying the template from the Backlog is extremely useful because you can apply the template to multiple items at the same time.

assign-template-from-backlog

With some creative thinking, templates can be used like macros for commonly performed activities. For example, I created a “Technical Debt” template that adds a TechDebt tag, lowers the priority and changes the Value Area to Architectural.

Creating Work-Items from Templates

If you want to apply the template to work-items as you create them, you’ll need to navigate to a special URL that is provided with each template (Settings –> Boards: Team Configuration –> Templates).

get-link-for-template

The Copy Link option copies the unique URL to the template to the clipboard, which you can circulate to your team. Personally, I like to create a Markdown widget on my dashboard that allows team members to navigate to this URL directly.

create-work-item-from-dashboard

Going Further – Set defaults for Process Template

Unfortunately, there’s no mechanism to specify which work-item template should be used as the default for a team. You can however provide these customizations at the Process level, which applies these settings for all teams using that process template. Generally speaking, you should only make these changes for enterprise-wide changes.

Note that you can’t directly edit the default process templates, you will need to create a new process template based on the default: Organization Settings –> Boards –> Process:

process-template

Within the process, you can bring up any of the work-items into an editor that lets you re-arrange the layout and contents of the work-item. To edit the Description field to have a default value, we select the Edit option in the ellipsis menu:

edit-process-template

Remembering that certain fields are HTML, we can set the default for our user story by modifying the default options:

edit-process-template-field

Wrapping up

Hopefully the last two posts on providing templates for pull-requests and work-items have given you some ideas on how to quickly bring some consistency to your projects.

Happy coding!