Saturday, June 06, 2020

Downloading Artifacts from YAML Pipelines

Azure DevOps multi-stage YAML pipelines are pretty darn cool. You can describe a complex continuous integration pipeline that produces an artifact and then describe the continuous delivery workflow to push that artifact through multiple environments in the same YAML file.

In today’s scenario, we’re going to suppose that our quality engineering team is using their own dedicated repository for their automated regression tests. What’s the best way to bring their automated tests into our pipeline? Let’s assume that our test automation team has their own pipeline that compiles their tests and produces an artifact so that we can run these tests with different runtime parameters in different environments.

There are several approaches we can use. I’ll describe them from most-generic to most-awesome.

Download from Azure Artifacts

A common DevOps approach, evangelized in Jez Humble’s Continuous Delivery book, is pushing binaries to an artifact repository and consuming those artifacts in an ad-hoc manner in your pipelines. Azure DevOps has Azure Artifacts, which can be used for this purpose, but in my opinion it’s not a great fit. Azure Artifacts is better suited for maven, npm and nuget packages that are consumed as part of the build process.

Don’t get me wrong, I’m not calling out a problem with Azure Artifacts that will require you to find an alternative like JFrog’s Artifactory; my point is that it’s perhaps too generic. If we dumped our compiled assets into the artifact repository, how would our pipeline know which version to use? And how long should we keep these artifacts around? In my opinion, you’d want better metadata about the artifact, like the source commits and the build that produced it, and you’d want these artifacts to stick around only while they’re in use. Although decoupling is advantageous, when you strip something of all semantic meaning you put the onus on something else to remember it, and that often leads to manual processes that break down…

If your artifacts have a predictable version number and you only ever need the latest version, there are tasks for downloading these types of artifacts. Azure Artifacts refers to these loose files as “Universal Packages”:

- task: UniversalPackages@0
  displayName: 'Universal download'
  inputs:
    command: download
    vstsFeed: '<projectName>/<feedName>'
    vstsFeedPackage: '<packageName>'
    vstsPackageVersion: 1.0.0
    downloadDirectory: '$(Build.SourcesDirectory)\someFolder'

Download from Pipeline

Next up: the DownloadPipelineArtifact task is a full-featured, built-in task that can download artifacts from different sources, such as an artifact produced in an earlier stage, a different pipeline within the project, or other projects within your ADO Organization. You can even download artifacts from projects in other ADO Organizations if you provide the appropriate Service Connection.

- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    project: 'c7233341-a9ff-4e76-9367-909816bcd16g'
    pipeline: 1
    runVersion: 'latest'
    targetPath: '$(Pipeline.Workspace)'

Note that if you’re downloading an artifact from a different project, you’ll need to adjust the authorization scope of the build agent. This is found under Project Settings –> Pipelines –> Settings. If the setting is disabled here, you’ll need to adjust it at the Organization level first.


This works exactly as you’d expect, and the artifacts are downloaded to $(Pipeline.Workspace). Note that in the example above I’m using the project guid and pipeline id, which are populated by the Pipeline Editor, but you can specify them by name as well.
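For example, here’s the same task referencing the project and pipeline by name instead. ‘QualityEngineering’ and ‘RegressionTests’ are hypothetical names standing in for your own project and pipeline:

- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    project: 'QualityEngineering'   # project name instead of its guid
    pipeline: 'RegressionTests'     # pipeline name instead of its numeric id
    runVersion: 'latest'
    targetPath: '$(Pipeline.Workspace)'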

My only concern is that nothing indicates our pipeline depends on another project. The pipeline dependency is silently being consumed… which feels sneaky.

[screenshot: build_download_without_dependencies]

Declared as a Resource

The technique I’ve recently been using is declaring the pipeline artifact as a resource in the YAML. This makes the pipeline reference much more obvious in the pipeline code and surfaces the dependency in the build summary.

Although this supports the ability to trigger our pipeline when new builds are available, we’ll skip that for now and only download the latest version of the artifact at runtime.

resources:
  pipelines:
    - pipeline: my_dependent_project
      project: 'ProjectName'
      source: PipelineName
      branch: master
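
As an aside, if you did want new runs of the dependent pipeline to trigger ours, the resource declaration is also where that would go. Here’s a minimal sketch that limits the trigger to builds from master:

resources:
  pipelines:
    - pipeline: my_dependent_project
      project: 'ProjectName'
      source: PipelineName
      # trigger this pipeline whenever the referenced pipeline completes a run on master
      trigger:
        branches:
          include:
            - master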

To download artifacts from that pipeline, we can use the download alias for DownloadPipelineArtifact. The syntax is more terse and easier to read. This example downloads the published artifact 'myartifact' from the declared pipeline reference. The download alias doesn’t appear to let you specify the download location; in this example, the artifact is downloaded to $(Pipeline.Workspace)\my_dependent_project\myartifact

- download: my_dependent_project
  artifact: myartifact
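
To tie this back to our regression-test scenario, a follow-up step can run the tests straight out of that folder. Here’s a minimal sketch, assuming the artifact contains a .NET test assembly named RegressionTests.dll (the assembly name and runner are assumptions for illustration):

# Run the downloaded regression tests from the download location above
- script: dotnet vstest RegressionTests.dll
  displayName: 'Run regression tests'
  workingDirectory: '$(Pipeline.Workspace)/my_dependent_project/myartifact'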

With this in place, the artifact and its change history show up in the build summary.

[screenshot: build_download_with_pipeline_resource]

Update: 2020/06/18! Pipelines now appear as Runtime Resources

At the time this article was written, there was an outstanding defect for referencing pipelines as resources. With that defect resolved, you can now specify the version of the pipeline resource to consume when manually kicking off a pipeline run.

  1. Start a new pipeline run.

     [screenshot: manually-trigger-build-select-resource]

  2. Open the list of resources and select the pipeline resource.

     [screenshot: manually-trigger-build-select-resource-pipeline]

  3. From the list of available versions, pick the version of the pipeline to use.

     [screenshot: manually-trigger-build-select-resource-pipeline-version]

With this capability, we now have full traceability and flexibility to specify which pipeline resource we want!

Conclusion

So there you go. Three different ways to consume artifacts.

Happy coding!
