GitLab can be a bit of a challenge if you have never written any CI/CD pipelines. Forget your software development mindset and quickly adapt to the pipeline mindset with these 5 easy-to-remember tips.
Templates!
Pipelines can get messy: their top-down execution means you need to carefully analyze which rules apply to each job before it runs. You can simplify most of these rules with templates.
Templates can be defined in separate repos and then used in your project's .gitlab-ci.yml for extra convenience and abstraction. You can enable this behavior with include and extends.
ci-cd-templates repo
templates.yml
.echo-statement:
  before_script:
    - echo "this is an inherited step!"
my-project repo
.gitlab-ci.yml
include:
  - project: 'ci-cd-templates'
    file: 'templates.yml'
    ref: (tag or branch)

example-job:
  extends:
    - .echo-statement
  script:
    - echo "this is a normal step!"
Important caveat
- Commands in your job are inherited from the template; if you define the same keys in example-job, they overwrite the template's keys and values (see the sketch below). This is why my templates normally use a before/after script.
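For instance (a minimal sketch, with override-job as a made-up job name): if the extending job declares its own before_script, the template's before_script is replaced entirely rather than merged.
override-job:
  extends:
    - .echo-statement
  before_script:
    - echo "this replaces the template's before_script"  # the inherited echo never runs
  script:
    - echo "the job's own script is untouched by the template"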
Abstract Logins
Let's use the freshly acquired knowledge from the last tip to simplify our pipelines. When writing pipelines you will inevitably reach a point where you need to log into something: a cloud provider, an API, the GitLab API, anything really.
Keep your pipelines cleaner by abstracting the logins into your templates. Since logins almost always generate files from which access tokens can be read, combine this with artifacts to pass the credentials along.
ci-cd-templates repo
templates.yml
.aws-login:
  variables:
    ROLE: MY-ROLE
    ROLE_SESSION_NAME: dev
    ACCOUNT_ID: 0000000
  before_script:
    - aws sts assume-role --role-arn "arn:aws:iam::${ACCOUNT_ID}:role/${ROLE}" --role-session-name "${ROLE_SESSION_NAME}" > creds.tmp.json
    - echo AWS_ACCESS_KEY_ID=$(cat creds.tmp.json | jq -r ".Credentials.AccessKeyId") > build.env
    - echo AWS_SECRET_ACCESS_KEY=$(cat creds.tmp.json | jq -r ".Credentials.SecretAccessKey") >> build.env
    - echo AWS_SESSION_TOKEN=$(cat creds.tmp.json | jq -r ".Credentials.SessionToken") >> build.env
    - rm creds.tmp.json
  artifacts:
    reports:
      dotenv: build.env
Attach this templated login job to your deployment jobs.
deploy-dev:
  extends: .aws-login
  script:
    - docker build -t [...]
    - docker push [...]
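Note that docker push still needs an authenticated registry session, which the assumed-role keys alone do not give you. A minimal sketch of bridging the two, assuming an ECR registry, a placeholder region (eu-west-1) and the build.env file written by .aws-login:
deploy-dev:
  extends: .aws-login
  script:
    # the template's before_script wrote the assumed-role keys to build.env; export them here
    - export $(cat build.env | xargs)
    # authenticate docker against the registry before pushing
    - aws ecr get-login-password --region eu-west-1 | docker login --username AWS --password-stdin "${ACCOUNT_ID}.dkr.ecr.eu-west-1.amazonaws.com"
    - docker build -t [...]
    - docker push [...]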
Keep jobs simple
When developing jobs, keep them as purpose-fit as possible; this ensures readability and maintainability.
Declare the same stages every time and give them individual jobs. Those who have read "The Pragmatic Programmer" will kill me for saying this, but remember: forget about your programmer mindset.
A template we use in almost every project:
gitlab-ci.yml
stages:
  - test
  - build
  - deploy

make-test: # executes unit tests / integration tests
  stage: test

init-dev-dependencies: # downloads dependencies
  stage: build

build-dev: # builds a docker image / gz / tar / etc.
  stage: build

deploy-dev: # deploys / uploads a docker image / gz / tar / etc.
  stage: deploy

init-tst-dependencies: # same steps, different environment
  stage: build

build-tst:
  stage: build

deploy-tst:
  stage: deploy

[ other environments ]
Keep jobs neatly linked
Neatly tie jobs together to form dependent pipelines with needs and dependencies.
- needs: requires the listed jobs to finish before this job starts; defined correctly, this gives each environment its own independent chain of jobs.
- dependencies: fetches artifacts only from the listed jobs.
gitlab-ci.yml
make-test:
  stage: test

init-dev-dependencies:
  stage: build
  needs:
    - make-test

build-dev:
  stage: build
  needs:
    - init-dev-dependencies
  dependencies:
    - init-dev-dependencies

deploy-dev:
  stage: deploy
  needs:
    - build-dev
  dependencies:
    - build-dev

init-tst-dependencies:
  stage: build
  needs:
    - make-test

build-tst:
  stage: build
  needs:
    - init-tst-dependencies
  dependencies:
    - init-tst-dependencies

deploy-tst:
  stage: deploy
  needs:
    - build-tst
  dependencies:
    - build-tst
For a clearer view of what needs actually does:
With needs: [pipeline graph screenshot]
Without needs: [pipeline graph screenshot]
So, in essence, needs breaks strict stage ordering: a job starts as soon as the jobs it needs have finished, instead of waiting for the entire previous stage.
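In code terms, a minimal sketch of that difference, written as two alternative definitions of the deploy-dev job from the example above:
# With needs, deploy-dev starts as soon as build-dev has finished,
# even if build-tst (same stage) is still running:
deploy-dev:
  stage: deploy
  needs:
    - build-dev

# Without needs, deploy-dev follows plain stage ordering and waits
# for every job in the build stage, build-dev and build-tst alike:
deploy-dev:
  stage: deploy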
Keep important steps manual
Lastly, keep the most important or dangerous steps manual by adding when: manual.
when: manual makes sure a user has to press the button in the GitLab UI for the job to run. Of course, jobs that are linked to a manual job via needs will not run until the button is pressed. It is also highly recommended to tie the initial production job to the acceptance/test deployment, for extra protection against unwanted deployments.
gitlab-ci.yml
init-prd-dependencies: # no need for unit tests when deploying to prod, since it should already be tested in the dev/tst/acc pipelines
  stage: build
  when: manual

build-prd:
  stage: build
  needs:
    - init-prd-dependencies
    - deploy-acc
  dependencies:
    - init-prd-dependencies

deploy-prd:
  stage: deploy
  needs:
    - build-prd
  dependencies:
    - build-prd
These are some tips from things we have learned over the past months; these GitLab features have helped our team create better, safer, more readable and more robust pipelines.
Top comments (2)
:o
Hello 👋
A few remarks:
1) Your example in 'Abstract Logins' could be more understandable; you jump directly from AWS keys to 'docker build'; there is at least a 'docker login' command missing in between.
2) In 'Keep jobs simple', one of the 'stages' should be 'jobs' I guess:
3) In 'Keep jobs neatly linked' you say that 'needs' parallelizes. This is not the case: remove the needs keywords in your example, and you will get the exact same first workflow, not the second one, since 'deploy-dev' and 'deploy-tst' are in the same stage.