Continuous Integration (CI) pipelines need a target infrastructure to which the CI artifacts are deployed. Deployments can be handled by the CI pipeline itself, or delegated to Continuous Deployment (CD) pipelines. Modern architectures use automation tools such as Terraform and Ansible to provision the target infrastructure; this style of provisioning is called Infrastructure as Code (IaC).
Usually CI/CD and IaC do not run in tandem. Often we want to trigger the CI pipeline only when the target infrastructure is ready to be bootstrapped with the software components the CI/CD pipelines require.
In this DIY blog, let us tackle the aforementioned problem with a use case.
Use Case
As a CI/CD user, I would like to provision a Kubernetes cluster on Google Cloud Platform (GKE) using Terraform. Successful provisioning of the cluster should notify a CI pipeline to start bootstrapping Argo CD onto GKE.
What you need
- A Terraform Cloud account. Create a workspace on Terraform Cloud to be used for this exercise.
- A Google Cloud account, used to create the Google Kubernetes Engine (GKE) cluster.
- Though any CI platform would work, for this demo we will use Harness CI as our CI platform. You can sign up for the free tier from here.
Demo Sources
The demo uses the following git repositories as sources,
- IaC vanilla-gke: the Terraform source repository that will be used with Terraform Cloud to provision GKE.
- Kubernetes manifests bootstrap-argocd: the repository that holds the Kubernetes manifests to bootstrap Argo CD onto the GKE cluster.
- Harness CI pipeline tfc-notification-demo: the repository that holds the pipeline we will import into Harness.
Fork and Clone the Sources
To make forking and cloning easier we will use the gh CLI. Download gh and add it to your $PATH.
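Before cloning anything, it is worth a quick sanity check that the CLI is installed and authenticated; a minimal sketch:

```shell
# check that the gh CLI is on the PATH and authenticated before forking/cloning
if command -v gh >/dev/null 2>&1; then
  # 'gh auth status' exits non-zero when you are not logged in
  gh auth status || echo "run 'gh auth login' to authenticate with GitHub" >&2
else
  echo "gh CLI not found on PATH; see https://cli.github.com for install steps" >&2
fi
```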
Let us create a directory where we want to place all our demo sources,
mkdir -p "$HOME/tfc-notification-demo"
cd "$HOME/tfc-notification-demo"
export DEMO_HOME="$PWD"
IaC
Clone and fork the vanilla-gke repo,
gh repo clone harness-apps/vanilla-gke
cd vanilla-gke
gh repo fork
export TFC_GKE_REPO="$PWD"
Bootstrap Argo CD Sources
Clone and fork the bootstrap-argocd repo,
cd ..
gh repo clone harness-apps/bootstrap-argocd
cd bootstrap-argocd
gh repo fork
export ARGOCD_BOOTSTRAP_REPO="$PWD"
Harness CI Pipeline
Clone and fork the tfc-notification-demo repo,
cd ..
gh repo clone harness-apps/tfc-notification-demo
cd tfc-notification-demo
gh repo fork
export TFC_DEMO_REPO="$PWD"
For the rest of the blog we will reference the repositories vanilla-gke, bootstrap-argocd and tfc-notification-demo as $TFC_GKE_REPO, $ARGOCD_BOOTSTRAP_REPO and $TFC_DEMO_REPO respectively.
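As a quick sanity check, you can verify that each of the three variables points at an actual clone on disk; a small sketch (the `check_repo` helper is just for this check, not part of the demo sources):

```shell
# warn for any repo variable that is unset or does not point to a directory
check_repo() {
  # $1 = variable name, $2 = its value
  if [ -d "$2" ]; then
    echo "$1 -> $2"
  else
    echo "warning: $1 is not set or does not point to a directory" >&2
  fi
}
check_repo TFC_GKE_REPO "${TFC_GKE_REPO:-}"
check_repo ARGOCD_BOOTSTRAP_REPO "${ARGOCD_BOOTSTRAP_REPO:-}"
check_repo TFC_DEMO_REPO "${TFC_DEMO_REPO:-}"
```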
Harness CI
In the following sections we will define and create the resources required to define a CI pipeline using Harness platform.
Create Harness Project
Create a new Harness project named terraform_integration_demos using the Harness Web Console,
Update its details as shown,
Follow the wizard, leaving the rest to defaults, and on the last screen choose Continuous Integration,
Click Go to Module to go to project home page.
Define New Pipeline
Click Pipelines to define a new pipeline,
For this demo we will be doing a manual clone, hence disable the clone,
Click on Pipelines and delete the default Build pipeline,
Add harnessImage Docker Registry Connector
As part of the pipelines we will be pulling images from Docker Hub. The harnessImage Docker Registry Connector helps pull public Docker Hub images as an anonymous user.
Let us configure the harnessImage connector as described in docker registry connectors. The pipelines we create in a later section will use this connector.
Configure GitHub
GitHub Credentials
Create a GitHub PAT for the account where you have forked the repositories $TFC_GKE_REPO and $ARGOCD_BOOTSTRAP_REPO. We will refer to the token as $GITHUB_PAT.
From the Project Setup click Secrets,
Update the encrypted text secret details as shown,
Click Save to save the secret,
Connector
As we need to clone the sources from GitHub, we need to define a GitHub Connector, from the Project Setup click Connectors,
From connector list select GitHub,
Enter the name as GitHub,
Click Continue to enter the connector details,
Click Continue and update the GitHub Connector credentials,
When selecting the Personal Access Token, make sure you select the GitHub PAT that we defined in the previous section,
Click Continue and select Connect through Harness Platform,
Click Save and Continue to run the connection test; if all went well the connection should be successful,
Google Cloud Service Account Secret
We need Google Service Account(GSA) credentials(JSON Key) to query the GKE cluster details and create resources on it.
Set the environment variables (GSA_NAME is the name of the service account we create in the next step),
export GCP_PROJECT="the Google Cloud Project where the Kubernetes cluster is created"
export GSA_KEY_FILE="path where to store the key file"
export GSA_NAME="gke-user"
Create SA
gcloud iam service-accounts create gke-user \
--description "GKE User" \
--display-name "gke-user"
IAM Binding
Add permissions to the service account so it can provision Kubernetes resources,
gcloud projects add-iam-policy-binding $GCP_PROJECT \
--member="serviceAccount:$GSA_NAME@$GCP_PROJECT.iam.gserviceaccount.com" \
--role="roles/container.admin"
Download And Save GSA Key
IMPORTANT: Only security admins can create JSON keys. Ensure the Google Cloud user you are using has the Security Admin role.
gcloud iam service-accounts keys create "${GSA_KEY_FILE}" \
--iam-account="gke-user@${GCP_PROJECT}.iam.gserviceaccount.com"
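Optionally, you can verify the downloaded key actually works by activating it locally. This is just a local check and does not change anything in the project; the guard skips it quietly when gcloud or the key file is unavailable:

```shell
# verify the downloaded key is usable by activating the service account locally
if command -v gcloud >/dev/null 2>&1 && [ -f "${GSA_KEY_FILE:-}" ]; then
  gcloud auth activate-service-account \
    "gke-user@${GCP_PROJECT}.iam.gserviceaccount.com" \
    --key-file="${GSA_KEY_FILE}"
else
  echo "gcloud or the key file not available; skipping key check" >&2
fi
```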
GSA Secret
Get back to the Project Setup and click Secrets,
Add the GSA secret details as shown,
IMPORTANT: When you browse and select, make sure you select the $GSA_KEY_FILE as the file for the secret.
Click Save to save the secret,
Terraform Workspace
On your Terraform Cloud account create a new workspace called vanilla-gke. Update the workspace settings to use Version Control and point it at $TFC_GKE_REPO.
Configure the workspace with following variables,
For more details on available variables, check Terraform Inputs.
IMPORTANT: GOOGLE_CREDENTIALS is a Google Service Account JSON key with permissions to create the GKE cluster. Please check https://github.com/harness-apps/vanilla-gke#pre-requisites for the required roles and permissions. This key will be used by Terraform to create the GKE cluster. When you add the key as a Terraform variable, you need to strip its newlines so the JSON fits on a single line, e.g. cat YOUR_GOOGLE_CREDENTIALS_KEY_FILE | tr -d '\n'
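For example, assuming $GSA_KEY_FILE holds the downloaded JSON key from the earlier step, the following writes a single-line copy whose contents you can paste into the workspace variable:

```shell
# strip newlines so the JSON key fits a single-line variable value;
# skipped quietly when the key file is not present
if [ -f "${GSA_KEY_FILE:-}" ]; then
  tr -d '\n' < "${GSA_KEY_FILE}" > "${GSA_KEY_FILE}.oneline"
  echo "single-line key written to ${GSA_KEY_FILE}.oneline"
fi
```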
Going forward we will refer to the Terraform workspace as $TF_WORKSPACE.
Look up your Terraform Cloud organization and set its value in the variable $TF_CLOUD_ORGANIZATION.
Next we need a Terraform API token that can be used to pull the outputs of a Terraform Cloud run. From your Terraform user settings, create an API token,
and save the API token in the variable $TF_TOKEN_app_terraform_io. We will use this variable in the CI pipeline.
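To confirm the token works, you can look up the workspace over the Terraform Cloud v2 API (a sketch; the endpoint shape follows the public TFC API, and the check is skipped when the token is not exported in the current shell):

```shell
# look up the workspace via the Terraform Cloud API to confirm the token works
TFC_API="https://app.terraform.io/api/v2"
if [ -n "${TF_TOKEN_app_terraform_io:-}" ]; then
  curl -sS \
    -H "Authorization: Bearer ${TF_TOKEN_app_terraform_io}" \
    -H "Content-Type: application/vnd.api+json" \
    "${TFC_API}/organizations/${TF_CLOUD_ORGANIZATION}/workspaces/${TF_WORKSPACE}"
else
  echo "TF_TOKEN_app_terraform_io not set; skipping API check" >&2
fi
```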
Harness CI Pipeline
Getting back to Harness web console, navigate to your project terraform_integration_demos, click Pipelines and Create a Pipeline --> Import From Git,
Update the pipeline details as shown,
IMPORTANT: Make sure the name of the pipeline is bootstrap argocd pipeline to make the import succeed with defaults.
Click the bootstrap argocd pipeline
from the list to open the Pipeline Studio and click on the stage Bootstrap Argo CD to bring up the pipeline steps,
You can click on each step to see the details.
The Pipeline uses the following secrets,
- google_application_credentials: the GSA credentials used to manipulate GKE
- terraform_cloud_api_token: the value of $TF_TOKEN_app_terraform_io
- terraform_workspace: the value of $TF_WORKSPACE
- terraform_cloud_organization: the value of $TF_CLOUD_ORGANIZATION
We already added the google_application_credentials secret as part of an earlier section. Following the same pattern, let us add terraform_cloud_api_token, terraform_workspace and terraform_cloud_organization as text secrets.
HINT: From the Project Setup, click Secrets,
TIP: You can also skip adding terraform_workspace and terraform_cloud_organization; we can extract those values from the webhook payload using the expressions <+trigger.payload.workspace_name> and <+trigger.payload.organization_name> respectively.
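For reference, the notification payload Terraform Cloud POSTs to the webhook looks roughly like the abridged sketch below (the organization name acme and the truncated run URL are placeholders; the real payload carries more fields):

```json
{
  "payload_version": 1,
  "run_url": "https://app.terraform.io/app/acme/vanilla-gke/runs/run-...",
  "workspace_name": "vanilla-gke",
  "organization_name": "acme",
  "notifications": [
    {
      "trigger": "run:completed",
      "run_status": "applied"
    }
  ]
}
```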
Notification Trigger
For the Harness CI pipeline to listen to Terraform Cloud events we need to define a Trigger. Navigate back to pipelines and select the bootstrap argocd pipeline --> Triggers,
Click Add New Trigger to add a new webhook trigger (Type: Custom),
On the Configuration page, enter the name of the trigger as tfc notification,
Leave rest of the fields to defaults and click Continue, leave the Conditions to defaults and click Continue.
On the Pipeline Input, set the Pipeline Reference Branch to main.
NOTE: The Pipeline Reference Branch has no real effect in this demo as we do a manual clone of the sources.
Click Create Trigger to create and save the trigger.
Copy Webhook URL
Let us refer to this value as $TRIGGER_WEBHOOK_URL.
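Before wiring up Terraform Cloud, you can smoke-test the trigger by POSTing a minimal payload yourself (the field names mirror the trigger expressions used earlier; replace <your-org> with your organization, and note the check is skipped when the URL is not exported):

```shell
# fire the custom webhook manually with a minimal TFC-like payload
if [ -n "${TRIGGER_WEBHOOK_URL:-}" ]; then
  curl -sS -X POST "${TRIGGER_WEBHOOK_URL}" \
    -H "Content-Type: application/json" \
    -d '{"workspace_name":"vanilla-gke","organization_name":"<your-org>"}'
else
  echo "TRIGGER_WEBHOOK_URL not set; skipping manual trigger test" >&2
fi
```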
Terraform Notification
On your terraform cloud console navigate to the workspace Settings --> Notifications,
Click Create Notification and select Webhook as the Destination,
Update the notification details as shown,
Since we need to bootstrap Argo CD only on create events, we set the notification to trigger only on Completed,
Click Create Notification to finish the creation of notification.
NOTE: Creating the notification fires a test notification; if the cluster is not ready yet, that pipeline run will have failed.
Congratulations! With this setup, any new commit or update to $TFC_GKE_REPO will trigger a plan and apply on Terraform Cloud. A Completed run will trigger the bootstrap argocd pipeline to run and apply the manifests from $ARGOCD_BOOTSTRAP_REPO on the GKE cluster.
An example of a successful pipeline run,
Summary
By using the Terraform Cloud notifications feature, we were able to make CI pipelines listen to IaC events and run only when needed.