
Foundations in Azure DevOps YAML pipelines

Parameters, variables, & dependencies in Azure DevOps YAML pipelines — a short introduction

Photo by Jeremy Bishop on Unsplash

CI/CD is the name of the new game, and if you build everything as code, you can version it, reuse it, share it, and improve it as you go along, using the tools you already love. Next, I will try to make your life easier in setting up your pipelines in Azure DevOps, with a focus on their simplest aspects (which, as you will find out yourself, are in fact the most complex): parameters, variables, and dependencies.

From my perspective, the foundation of every piece of logic lies in what you give as input (parameters), what you know as fact (variables), and the correlation between them and everything else (dependencies). Now, there is a slightly misleading concept in Azure DevOps related to "dependency", so I will try to shed some light on it and choose very specific wording for what I describe next.

Dependencies in the context of Azure DevOps are relations between different layers of functional code, that is, tasks, jobs, or stages. Dependency also refers to the "dependsOn" directive, which gives extra control over when a stage or a job runs. I will continue to use dependency for both, making the distinction between them as they come up.

Correlation will be used next in a broader context of relations between objects like parameters and variables and the surrounding specific part of the code where we use them.

Parameters

Defining parameters needs to take into consideration their type and usage. If they are intended for user input, then the recommendation is to use the expanded way of defining them for better control:

parameters:
  - name: version
    displayName: "The version"
    type: string
    default: '0.0.0'
  - name: type
    displayName: "The type of deployment"
    type: string
    values:
      - patch
      - minor
      - major
    default: 'patch'
  - name: environment
    type: object
    default:
      - env: 'test1'
        tester: 'build'
      - env: 'test2'
        tester: 'test1'
  - name: dep
    displayName: Check environment?
    type: boolean
    default: true

If the parameters are used for passing values between templates like from main to child, or from child1 to child2, a simpler approach can be used.

In the main or bigger child template you will have something like:

- template: /templates/your_template.yml
  parameters:
    Parameter1: 'Parameter1_Value'
    Parameter2: 'Parameter2_Value'
    Parameter3: 'Parameter3_Value'
    Parameter4: 'Parameter4_Value'
    Parameter5: 'Parameter5_Value'

In the template referenced above /templates/your_template.yml you must have something like:

parameters:
  Parameter1: 'Parameter1_Value'
  Parameter2: 'Parameter2_Value'
  Parameter3: 'Parameter3_Value'
  Parameter4: 'Parameter4_Value'
  Parameter5: 'Parameter5_Value'

For more complex tasks you may still need the expanded notation of parameters inside your templates, and that is also valid.
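To make this concrete, here is a minimal sketch of a child template that declares typed parameters and uses them in its steps. The file path, parameter names, and values are illustrative, not part of the article's examples:

```yaml
# /templates/deploy.yml - illustrative child template
parameters:
  - name: environment
    type: string
    default: 'test1'
  - name: version
    type: string
    default: '0.0.0'

steps:
  # the expanded notation above gives type checking when the main
  # template passes values in
  - script: echo "Deploying version ${{ parameters.version }} to ${{ parameters.environment }}"
    displayName: 'Deploy'
```

The main pipeline would then reference it with `- template: /templates/deploy.yml` and a `parameters:` mapping, exactly as shown earlier.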

To access the parameters, we will use this notation in usual cases:

${{ parameters.ParameterName }}

If the parameter is used inside a template-expression condition, then, depending on the parameter type, only the bare parameters.ParameterName form appears inside the expression:

${{ if ne(parameters.ParameterName, 'A') }}
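For instance, such a condition can include or skip a whole step at template-expansion time based on a parameter value. This is a sketch reusing the "type" parameter defined earlier; the step itself is illustrative:

```yaml
parameters:
  - name: type
    type: string
    default: 'patch'

steps:
  # the step below is only inserted into the pipeline when type is not 'patch'
  - ${{ if ne(parameters.type, 'patch') }}:
      - script: echo "Bumping the ${{ parameters.type }} version"
        displayName: 'Version bump'
```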

If we are looping through a parameter object, then the notation will be done using the object structure and taking into consideration the things targeted.

A simple example:

# simple object parameter
parameters:
  - name: environments
    type: object
    default:
      - "test1"
      - "test2"
      - "test3"

stages:
  - ${{ each environment in parameters.environments }}:
      - stage: ${{ environment }}
        displayName: "${{ environment }} Stage"
        ...

A more complex example:

# complex object parameter
parameters:
  - name: environments
    type: object
    default:
      - name: "test1"
        test: "ok"
        validated: "yes"
      - name: "test2"
        test: "notOK"
        validated: "yes"
      - name: "test3"
        test: "undefined"
        validated: "no"

stages:
  - ${{ each environment in parameters.environments }}:
      - stage: ${{ environment.name }}
        displayName: "${{ environment.name }} Stage"
        ...
# inside the loop, each element's fields are reached as
# environment.name, environment.test, environment.validated
# if a specific element is needed outside the loop, index the parameter itself,
# e.g. parameters.environments[0].name (the name of the first environment)

I will not go deeper into the conditions and how you can use parameters in them, because that is not in scope right now, and I will discuss it in a later article.

Things you must know about parameters:

  • They can be defined/accessed as user input only in the main template that is triggered
  • You can NOT define a template that holds only the parameters used by the main pipeline, due to the run time dependency of them (the user input parameters need to be defined locally in the main pipeline)
  • You can abstract parameters (compose them and pass them to other templates without being visible to the user as input)
  • They can be overwritten by another parameter or a variable
  • They can be used in complex conditions (I prefer using them instead of variables)
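As a sketch of the "abstracted parameters" point above (the template path and parameter names are hypothetical), the main pipeline can compose a value from user input and hand it to a child template without the composed value ever appearing at queue time:

```yaml
parameters:
  - name: version          # visible to the user at queue time
    type: string
    default: '0.0.0'

stages:
  - stage: Build
    jobs:
      - template: /templates/build_job.yml
        parameters:
          # composed here, never shown to the user as input
          artifactName: 'app-${{ parameters.version }}'
```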

Good documentation:

  • Runtime parameters
  • Template types & usage (passing parameters to your templates)
  • YAML schema reference for Azure Pipelines (good to know the inner schematics of the pipeline as code)

Variables

Defining variables can be done inside the main template or in a separate template, and it is as easy as:

variables:
  VariableName1: VariableName1_Value
  VariableName2: VariableName2_Value
  VariableName3: VariableName3_Value
  VariableName4: VariableName4_Value
  VariableName5: VariableName5_Value

If the above is not in your main pipeline but in a separate template, the main pipeline needs to reference the templates by their location:

variables:
  - template: /vars/generic_vars.yml
  - template: /vars/specific_vars.yml

Variables can be used generally as:

$(VariableName)

If the variable represents something that needs to be available at run time and it is not defined locally in the main template, then we need to reference it like:

${{ variables.VariableName }}

Things you must know about variables:

  • They can be defined in a separate (external) template and be reused by multiple pipelines
  • They can be overwritten by another parameter or a variable
  • If you are using them from an external template and reference them in the main pipeline, their existence is passed automatically to any child template. Ex: $(variableName) present in the main template can be accessed in the same way as $(variableName) by any child template without any other change
  • If you have special variables like service connections that are required for the pipeline run and your variables are defined externally to the main pipeline (in another template) you will need to reference them as run time components. Ex: $(variableName) notation will not work in this case, because the variable needs to be available at run time, so you will use the ${{variables.variableName}} notation
  • If you are composing another variable or parameter based on the variable defined, make sure that quotes or single quotes are used correctly for your desired result
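A minimal sketch of the service-connection point above (the template path and connection variable name are hypothetical): because the variable comes from an external template and the task input must be resolved for the run, the template-expression form is used instead of the macro syntax:

```yaml
# main pipeline
variables:
  - template: /vars/generic_vars.yml   # assumed to define azureServiceConnection

steps:
  - task: AzureCLI@2
    inputs:
      # $(azureServiceConnection) would not work here; use the expression form
      azureSubscription: ${{ variables.azureServiceConnection }}
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: az account show
```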

Good documentation:

  • Variables types
  • Predefined variables
  • Set variables in scripts

Dependencies

Passing a variable through a dependency can be done between two or more jobs, or between two or more stages. To achieve this you will need:

  1. The “dependsOn” directive to mark the correlation between jobs or stages that produce and consume respectively the variable
  2. The exposure of the variable that is generated at the job or stage level
  3. The dependency declaration that makes possible the usage of the variable next

1. Using dependsOn

Dependency between different jobs and between different stages is fairly simple, using the “dependsOn” directive:

- job: FirstJob
  displayName: 'FirstJob - displayName'
  steps:
    ...
- job: SecondJob
  dependsOn: FirstJob
  displayName: 'SecondJob - displayName'
  steps:
    ...

The SecondJob above will continue to function if the FirstJob is successful or skipped.

- stage: FirstStage
  displayName: 'FirstStage - displayName'
  jobs:
    ...
- stage: SecondStage
  dependsOn: FirstStage
  displayName: 'SecondStage - displayName'
  jobs:
    ...

The SecondStage above will continue to function if the FirstStage is successful or skipped.

By default, if "dependsOn" is missing, stages run in the sequence they are written in (jobs within a stage, by contrast, run in parallel unless a dependency says otherwise). If several jobs or stages all depend on the same single job or stage, they will run in parallel once it completes. If a job or stage needs more than one "dependsOn" entry, the syntax is as follows:

dependsOn:
  - FirstPhase
  - SecondPhase

The FirstPhase and SecondPhase can be jobs or stages. Now that we covered the “dependsOn” directive, we will continue with exposing a variable to be available for future usage.

2. Exposing a variable

We have two possibilities to do this depending on our preference:

Bash

echo "##vso[task.setvariable variable=VariableName;isOutput=true]VariableValue"

PowerShell

Write-Host "##vso[task.setvariable variable=VariableName;isOutput=true]VariableValue"

In both of the above examples, we are creating the variable VariableName which will have the value of VariableValue. Here “isOutput=true” is required to pass the variable to future jobs or stages.

Let’s see this unfold:

jobs:
  - job: FirstJob
    displayName: 'FirstJob - displayName'
    steps:
      - bash: |
          echo "##vso[task.setvariable variable=VariableName;isOutput=true]VariableValue"
        name: TaskThatExposesTheVariable
  - job: SecondJob
    dependsOn: FirstJob
    displayName: 'SecondJob - displayName'
    steps:
      ...

Above, we already have the dependency expressed as "dependsOn" and a bash task that creates VariableName. Giving a name to the task where you use the Bash or PowerShell command is required in order to reference it later.

3. The dependency declaration (wrap up)

The dependency is expressed as a variable declaration at the job or stage level. For job dependencies we will use "dependencies", and for stage dependencies, "stageDependencies".

Job dependency schema

$[ dependencies.JobName.outputs['TaskName.VariableName'] ]

Stage dependency schema

$[ stageDependencies.StageName.JobName.outputs['TaskName.VariableName'] ]

Job and stage dependency using variables

$[ dependencies.$(JobName).outputs['$(TaskName).$(VariableName)'] ]
$[ stageDependencies.$(StageName).$(JobName).outputs['$(TaskName).$(VariableName)'] ]

Job and stage dependency using parameters

$[ dependencies.${{ parameters.JobName }}.outputs['${{ parameters.TaskName }}.${{ parameters.VariableName }}'] ]
$[ stageDependencies.${{ parameters.StageName }}.${{ parameters.JobName }}.outputs['${{ parameters.TaskName }}.${{ parameters.VariableName }}'] ]

Let’s try to wrap up everything in a more complex example:

- stage: FirstStage
  displayName: 'FirstStage - displayName'
  jobs:
    - job: FirstJob
      displayName: 'FirstJob - displayName'
      steps:
        - bash: |
            echo "##vso[task.setvariable variable=VariableName;isOutput=true]VariableValue"
          name: TaskThatExposesTheVariable
    - job: SecondJob
      dependsOn: FirstJob
      displayName: 'SecondJob - displayName'
      variables:
        VariableName: $[ dependencies.FirstJob.outputs['TaskThatExposesTheVariable.VariableName'] ]
      steps:
        ...
- stage: SecondStage
  dependsOn: FirstStage
  displayName: 'SecondStage - displayName'
  variables:
    VariableName: $[ stageDependencies.FirstStage.FirstJob.outputs['TaskThatExposesTheVariable.VariableName'] ]
  jobs:
    ...

Now we referenced the variable across jobs and stages.

Things you must know about dependencies:

The “dependsOn” directive:

  • If not mentioned it will default to a sequence of stages or jobs, where the previous phase needs to be successful to continue
  • If mentioned, the job or stage will depend by default on the success of the mentioned stage or job
  • If multiple jobs or stages depend on a single job or stage, the flow will change from sequence to parallel
  • The directive can hold multiple jobs or stages and as such, the evaluation will be done on all of them to be successful before continuing
  • If one or more of the dependent jobs or stages are successful or skipped, the next phase will run; dependsOn will block the run only if a previous job or stage has failed

Dependencies between tasks, jobs, and stages:

  • Tasks in the same job context share local variables without the need of referencing them
  • Any dependency at the job or stage level is required to be implemented with the addition of the dependsOn directive. Ex: you want to pass variable A from stage 1 to stage 2. To do that, besides the reference of variable A in stage 2, you will need stage 2 to have a dependsOn directive pointing to stage 1
  • To pass a variable from one job to another job you will use dependencies
  • To pass a variable from one stage to another, you will use stageDependencies
  • To use any dependency, you will need to expose the variable from the stage or job that generates it to the job or stage that can consume it. This is done using the VSO task.setvariable with isOutput=true
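The first point above, tasks in the same job sharing variables, can be sketched like this: no isOutput and no dependsOn are needed, only the macro syntax in a later task (the variable name and value below are illustrative):

```yaml
jobs:
  - job: SingleJob
    steps:
      # no isOutput=true: the variable stays local to this job
      - bash: echo "##vso[task.setvariable variable=buildTag]myTagValue"
        name: SetTag
      # same job context, so the plain macro syntax resolves it
      - bash: echo "The tag is $(buildTag)"
        name: UseTag
```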

Good documentation:

  • Stages, dependencies, & conditions
  • How to pass variables in Azure Pipelines YAML tasks
  • Conditions (I know this is not in the scope of this article but it is a must-read for creating your pipelines)

Last but not least

I know this may feel like a lot, and the level of flexibility is extreme: you can assign a parameter to a variable and vice versa and create all sorts of correlations beyond imagination. But this is good! This is awesome! At the end of the day your flow, your pipeline, will do exactly as you please, in the way you want it.

Before you leave…

I hope you enjoyed reading this as much as I loved writing it! In the next articles I will try to focus on things that I’ve discovered while tackling the possibilities of Azure DevOps and other “techy” things. Stay tuned, follow, subscribe, share, leave a comment, be as social as possible for the sake of the Social Media Gods! Will sign out now and walk my dog. :-)


Foundations in Azure DevOps YAML pipelines was originally published in ING Hubs Romania on Medium, where people are continuing the conversation by highlighting and responding to this story.


