Over time, as an organization (or company) grows, the number of applications increases. This slows down delivery because each application has a different runtime environment (e.g., Kubernetes, Lambda), and someone has to add a new pipeline for each application. This article shows how to build a scalable deployment system for organizations with open-source tools.
Goal
- We manage all deployments in one place, even though the runtime environments are different.
- Any developer can add a new deployment pipeline, even a newcomer.
Overview
I'll build an event-driven deployment system around the GitHub deployment API. The deployment API loosely decouples the trigger from the execution, so every deployment is triggered in the same way while the tooling executes it differently for each runtime environment.
This article shows you how to build a deployment system by integrating Gitploy (the trigger) with Spinnaker (the executor).
Prerequisites
Unified Helm Chart
Helm is the package manager for Kubernetes, and it has a fantastic template feature. Chart users can customize the template files located in the `templates/` directory by overriding values. For example, you can override the image tag if you want to deploy a specific tag. This article overrides values to customize the template files differently for each environment, such as `dev`, `qa`, and `production`.
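For instance, a deployment template can reference the image tag through a value. Here's a minimal, hypothetical chart excerpt (the chart structure and value names are illustrative, not from the article):

```yaml
# templates/deployment.yaml (excerpt) — hypothetical chart snippet
apiVersion: apps/v1
kind: Deployment
metadata:
  name: {{ .Release.Name }}
spec:
  replicas: {{ .Values.replicaCount }}
  template:
    spec:
      containers:
        - name: app
          # the image tag is injected from values, so each environment
          # (or each deployment) can pin a different tag
          image: "{{ .Values.image.repository }}:{{ .Values.image.tag }}"
```

Overriding `image.tag` in a values file (or via `helm upgrade --set image.tag=<tag>`) then deploys that specific tag.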
I prepared a Helm chart for deploying an application. I recommend maintaining a single Helm chart across all applications and customizing it per application by overriding values.
After creating it, I hosted the Helm chart on GitHub Pages. Check this article for the details.
Values File
To override the values of the Helm chart, I added values files for each environment under the `release/` directory in the git repository. I decided to keep the values files in the git repository because it lets us track infrastructure changes and makes rollback easy.
```
release
├── values.dev.yaml
└── values.production.yaml
```
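As a hedged example, `values.dev.yaml` might look like the following (all keys and values are illustrative and depend on what the chart exposes):

```yaml
# release/values.dev.yaml — hypothetical contents
replicaCount: 1
image:
  repository: ghcr.io/my-org/my-service   # assumed registry and image name
  tag: "1.2.3"
resources:
  requests:
    cpu: 100m
    memory: 128Mi
```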
Docker Image
I built the CI with a GitHub Actions workflow that builds a Docker image for every commit by listening for the push event, so there is a Docker image corresponding to every commit.
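A minimal sketch of such a workflow, assuming the image is pushed to GitHub Container Registry and tagged with the commit SHA (the registry and tagging scheme are assumptions, not from the article):

```yaml
# .github/workflows/ci.yml — hedged sketch
name: ci
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v3
      - uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v4
        with:
          context: .
          push: true
          # tag each image with its commit SHA so a deployment can pin an exact build
          tags: ghcr.io/${{ github.repository }}:${{ github.sha }}
```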
Step 1: Trigger
As mentioned above, GitHub provides the deployment API for building an event-driven deployment system, but unfortunately it doesn't provide a UI for users. So I installed Gitploy, an open-source third-party tool, to build the deployment system around the deployment API. It supports advanced deployment features such as rollback, promotion, and review.
I activated a repository and added a new `deploy.yml` file, which defines the `dev` and `production` environments.
```yaml
# deploy.yml
envs:
  - name: dev
    auto_merge: false
    # To avoid the context verification.
    # required_contexts: []
  - name: production
    auto_merge: false
```
Step 2: Execution
Pipeline
I chose Spinnaker as the deployment tooling. Spinnaker is an open-source deployment tool created by Netflix, and pipelines are its essential concept for controlling how your application is deployed. I'll explain how to build a pipeline that integrates with the deployment API.
When you create a new pipeline, you'll find the configuration stage, where you define how the pipeline is triggered. I set the trigger type to webhook to listen for events from GitHub and added payload constraints so the pipeline runs only when the repository name and the environment match.
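Conceptually, the trigger section of the pipeline looks something like this (rendered as YAML for readability; Spinnaker stores pipelines as JSON, and the constraint keys and values here are illustrative):

```yaml
# Hedged sketch of a Spinnaker webhook trigger
triggers:
  - type: webhook
    enabled: true
    source: app   # the endpoint becomes /api/v1/webhooks/webhook/app
    payloadConstraints:
      # run only when the payload's repository and environment match
      repository: my-org/my-service
      environment: dev
```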
The next stage is bake. It builds the Kubernetes manifests from the Helm chart and the values file for the environment. I specified the Helm chart version and set the path and the version of the values file. The key point is that I used a pipeline expression to fetch the values file at the commit SHA of the deployment (i.e., `${trigger['payload']['deployment']['sha']}`).
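In practice this means the values file is declared as an expected artifact whose version is bound to the deployment's commit SHA; a hedged sketch of that artifact declaration (artifact names are illustrative):

```yaml
# Hedged sketch: pin the values-file artifact to the deployment's commit SHA
expectedArtifacts:
  - matchArtifact:
      type: github/file
      name: release/values.dev.yaml
    defaultArtifact:
      type: github/file
      name: release/values.dev.yaml
      version: ${trigger['payload']['deployment']['sha']}
    useDefaultArtifact: true
```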
After the bake stage, Spinnaker deploys to Kubernetes. Is it finished then? No, Spinnaker still has to update the deployment status through the deployment API. Spinnaker provides a simple way to add a custom stage that makes API calls as part of a pipeline. To create the custom webhook stage, I added its configuration to orca-local.yml and applied it. Spinnaker then shows the stage with a few variables such as owner, repo, deployment_id, description, and state. I set the variables to string values, except for deployment_id, which is a pipeline expression (i.e., `${trigger['payload']['deployment']['id']}`).
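For reference, a preconfigured webhook stage in orca-local.yml looks roughly like this (a hedged sketch following Spinnaker's preconfigured-webhook format; the stage type, label, and token handling are assumptions):

```yaml
# orca-local.yml — hedged sketch of a preconfigured webhook stage
webhook:
  preconfigured:
    - label: Update GitHub Deployment Status
      type: updateGithubDeploymentStatus
      enabled: true
      method: POST
      url: https://api.github.com/repos/${parameterValues['owner']}/${parameterValues['repo']}/deployments/${parameterValues['deployment_id']}/statuses
      customHeaders:
        Authorization: ["token <GITHUB_TOKEN>"]   # assumed auth scheme
        Accept: ["application/vnd.github+json"]
      payload: |-
        {
          "state": "${parameterValues['state']}",
          "description": "${parameterValues['description']}"
        }
      parameters:
        - name: owner
        - name: repo
        - name: deployment_id
        - name: description
        - name: state
```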
After completing the pipeline, I created a new GitHub webhook to dispatch events when Gitploy triggers a deployment. The payload URL should be `http://GATE_HOST/api/v1/webhooks/webhook/app`. I configured the content type as `application/json` and selected only the `deployment` event.
Finally, when I deploy in Gitploy, I can follow how the deployment status is updated, and when I click `View detail`, it redirects me to the executed Spinnaker pipeline.
Pipeline Template
Creating a new pipeline manually for each application is redundant and slows down delivery. To resolve this issue, Spinnaker provides pipeline templates, which you can share with your teams within a single application or across different applications.
I created the pipeline template by following the documentation. Now I can quickly create a new pipeline just by supplying a few variables.
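A hedged sketch of what such a (v2) pipeline template can look like, saved with the `spin` CLI (the template id, variables, and scopes are illustrative):

```yaml
# template.yml — hedged sketch of a v2 pipeline template
# (save with: spin pipeline-templates save --file template.yml)
schema: v2
id: gitploy-deploy
metadata:
  name: Deploy via Gitploy
  description: Webhook-triggered bake-and-deploy pipeline
  owner: platform-team@example.com
  scopes: [global]
variables:
  - name: repository
    type: string
    description: GitHub repository (owner/name)
  - name: environment
    type: string
    defaultValue: dev
pipeline:
  triggers:
    - type: webhook
      source: app
      enabled: true
  stages: []   # the bake, deploy, and deployment-status stages go here
```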
Conclusion
If a developer wants to build a new deployment pipeline, they just need to follow these steps:
- Add a `deploy.yml` file to activate Gitploy, and add values files for each environment.
- Add a new pipeline from the pipeline template with some variables.
Therefore, I can achieve the goals mentioned at the beginning:
- I can manage all deployments in Gitploy even though the runtime environments are different.
- Any developer can easily create a new pipeline in a few minutes.
Building a scalable deployment system is a living challenge at every enterprise-level company, and I believe this approach could be a good solution.
Thanks for reading, and please leave me a comment about your opinion! :)