Today I will keep working on part 2
Clone or fork the project repository
https://github.com/aks60808/AiGames
Make sure you have this project in your own repository!
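For example, after forking it on GitHub you can clone your copy (YOUR_GITHUB_USERNAME below is a placeholder for your own account):
git clone https://github.com/YOUR_GITHUB_USERNAME/AiGames.git
cd AiGames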
Jenkins Plugin Install
Go to your Jenkins page at http://YOUR_CICD_SERVER_IP:8080/
The following plugins are needed for this walkthrough:
- Ansible
- Docker
- Docker Pipeline
- NodeJS
- ThinBackup
- Google Cloud Storage
GO TO Dashboard ⇒ Manage Plugins
Search for and tick all the required plugins to install (click Download now and install after restart)
Configure Global Tools
Once you have the required plugins installed, GO TO Dashboard ⇒ Global Tool Configuration
We are going to configure some useful tools for building the pipeline.
Name the NodeJS installation MyNodeJS (this must match the name used in the Jenkinsfile)
Name the Docker installation MyDocker (this must match the name used in the Jenkinsfile)
Configure the Deployment Server
To get into the deployment server with Ansible, we need to configure the hosts/inventory file again.
Generate SSH keys for the CICD server
First, open a terminal and SSH into the CICD server:
ssh YOUR_CICD_SERVER_IP
In your home directory, run:
ssh-keygen -t rsa -f ~/.ssh/id_rsa
Read the public key file:
cat ~/.ssh/id_rsa.pub
Copy the content from the terminal, then paste it into the Google Compute Engine page ⇒ Metadata ⇒ SSH Keys and save (like what we did in part 1).
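As an optional sanity check, you can confirm the new key lets the CICD server reach the deployment server (the username and IP below are placeholders for your own):
# run this on the CICD server
ssh -i ~/.ssh/id_rsa YOUR_SSH_USERNAME@YOUR_DEPLOYMENT_SERVER_IP 'echo connection ok'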
Put the SSH private key into Jenkins
During the Jenkins build process, I found that it is hard to access the user’s home directory unless you specify the absolute path. Hence, the more secure and better-managed way is to store credentials in Jenkins-managed Credentials.
Click Manage Credentials → System → Global credentials → Add Credentials
CICD ssh private key
We need to add the private key we created previously (~/.ssh/id_rsa).
Select SSH Username with private key and specify your credential ID, SSH login username, and the private key content.
Please note that the ID must match the credential ID set up in the Jenkinsfile pipeline script.
cat .ssh/id_rsa
Copy the content of the private key, paste it into the Key section, then save.
Configure hosts on CICD server
We will use Ansible to configure the deployment server throughout the CICD pipeline.
Go to the Ansible folder:
cd /etc/ansible/
Edit the hosts file with sudo:
sudo vim hosts
Put the following content into the hosts file:
[depolyment]
x.x.x.x
# replace above with the deployment server IP address
[depolyment:vars]
ansible_user=tommylin
# replace above with your login username
ansible_ssh_common_args='-o StrictHostKeyChecking=no'
Don’t forget to put the deployment server IP and your SSH login username in the file!
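Optionally, you can verify that Ansible can reach the host group defined above (the group name matches the [depolyment] entry in the hosts file):
ansible depolyment -m ping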
Configure Deployment server
In your local project directory, use a text editor to open ansible_playbook/deployment_server_init_config.yml:
---
- name: Install Docker
  hosts: depolyment
  become: yes
  vars:
    username: "tommylin"
    # replace the above username with yours.
Replace the username with your SSH login username; it is used to add $USER to the docker group.
After that, in the project directory run the following (if you are using the default /etc/ansible/hosts file we just edited, you can drop the -i inventory flag):
ansible-playbook -i inventory ansible_playbook/deployment_server_init_config.yml
The playbook will configure the deployment server by installing Docker.
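If you want to confirm the installation, an ad-hoc Ansible command (a quick sketch, assuming the same [depolyment] group) can check the Docker version on the deployment server:
ansible depolyment -b -m command -a "docker --version"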
Create a pipeline job
Enable Discard Old builds
We create a Docker image release on every build, so we don’t need to keep every build; keeping them all would pile up large builds and increase the backup burden.
Tick GitHub hook trigger
This is where the GitHub webhook magic is enabled on Jenkins’ side!
GET the pipeline script from GitHub
Put your project repo URL into the Repository URL field. If your repo is private, credentials are needed (you can configure them using Jenkins-managed credentials). I left the other fields at their default settings.
GITHUB WEBHOOK setup
In your GitHub Repo page, click Settings icon ⇒ Webhooks ⇒ Add webhook
PUT IN http://CICD_SERVER_IP:8080/github-webhook/ into the Payload URL field.
DON’T FORGET to select application/json as your Content type
Click Add webhook button then we are almost there!!
Configure the place where you store releases
Go to the Google Container Registry console to set the visibility to public.
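If you prefer the command line, GCR images for the gcr.io host live in a Cloud Storage bucket named artifacts.PROJECT-ID.appspot.com, so public read access can also be granted with gsutil (PROJECT-ID is a placeholder for your own project):
gsutil iam ch allUsers:objectViewer gs://artifacts.PROJECT-ID.appspot.com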
Build Action triggered by GitHub push!
You can make a dummy commit and push it to master to test the automation. You can click the build number to get detailed information in the Console Output.
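For example, an empty commit is enough to fire the webhook:
git commit --allow-empty -m "trigger CI build"
git push origin master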
Check the release on Google Container Registry after the build succeeds.
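You can also list the images from the command line (PROJECT_ID is a placeholder for your own project):
gcloud container images list --repository=gcr.io/PROJECT_ID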
Well done! CHECK THE DEPLOYMENT SERVER!
You can visit http://YOUR_DEPLOYMENT_SERVER_IP:3000/AiGames/game/sudoku and have some fun with these games.
Backup
Setup ThinBackup
In this walkthrough, Google Cloud Storage is used to back up Jenkins.
GO TO the Jenkins Dashboard and scroll down; you will see the ThinBackup plugin we just installed.
Click Settings
Configure the following
Set Backup directory:
${HOME}/.jenkins-backup
Backup schedule (I back up at 7:00 AM every day)
I only back up build results, the userContent folder, and the plugin archives, since I don’t need any build artifacts backed up from Jenkins; I only back up my workspace configuration.
After clicking the Save button, ThinBackup will back up Jenkins on your VM.
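You can verify the backup folders show up on the VM (assuming the backup directory configured above sits in the Jenkins user’s home):
ls -lt ~/.jenkins-backup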
Bring Backup to Google Cloud Storage
PUT the service account key into Jenkins-managed credentials
Specify your GCP project ID and the service account key .json file, then click Create.
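If you don’t have a key file yet, one way to create it is with gcloud (a sketch assuming the service account already exists; SA_NAME and PROJECT_ID are placeholders):
gcloud iam service-accounts keys create sa-key.json \
  --iam-account=SA_NAME@PROJECT_ID.iam.gserviceaccount.com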
Create a pipeline job
This job will start at 9:00 AM every day, right after ThinBackup!
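One way to get the daily 9:00 AM start (a sketch, assuming you use Jenkins’ built-in scheduler) is to tick Build Triggers ⇒ Build periodically with a cron spec like:
0 9 * * *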
In the Pipeline Script section, please copy and paste this snippet.
pipeline {
    environment {
        JENKINS_BACKUP_FILE_NAME = ''
        SA_KEY_CRED_ID = 'YOUR PROJECT ID(JENKINS MANAGED CRED)'
        GCS_BUCKET_URL = 'YOUR GCS BUCKET URL'
    }
    agent any
    stages {
        stage('Backing up Jenkins') {
            steps {
                script {
                    JENKINS_BACKUP_FILE_NAME = sh (
                        script: '''#!/bin/bash
                        # find latest backup
                        thinbackup_path="$JENKINS_HOME/.jenkins-backup"
                        latest_backup_folder=$(ls -t $thinbackup_path | head -n1)
                        full_path="$thinbackup_path/$latest_backup_folder"
                        # pack it up
                        backup_name="$latest_backup_folder.tar.gz"
                        tar cvzf $backup_name $full_path
                        echo ${backup_name}
                        ''',
                        returnStdout: true
                    ).trim().tokenize().last()
                }
                step([$class: 'ClassicUploadStep', credentialsId: "${SA_KEY_CRED_ID}",
                      bucket: "${GCS_BUCKET_URL}",
                      pattern: JENKINS_BACKUP_FILE_NAME])
                sh '''
                rm ''' + JENKINS_BACKUP_FILE_NAME + '''
                '''
            }
        }
    }
}
Go to Google Cloud Storage
You can directly click Build Now to run the upload job. You should see backup files like these in your cloud storage.
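You can also list them from the command line (replace YOUR_GCS_BUCKET with the bucket name used in the pipeline environment block):
gsutil ls gs://YOUR_GCS_BUCKET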
Please note that in cicd_server.tf I set a lifecycle_rule with age = 2 days for the Cloud Storage bucket. Don’t get confused if it only keeps two copies at any time.
Cleanup
In the end, don’t forget to clean up using:
terraform destroy
DON’T FORGET TO ENTER YES!
Summary:
In part 2, I used Jenkins, Ansible, and Docker to host a web app on Google Compute Engine. I understand that there are tons of things that could be improved to obtain a more resilient and reliable pipeline; this is the very first pipeline I have ever built. If you are a beginner like me, I hope this walkthrough gives you some feeling for how a pipeline is constructed. Thank you for taking the time to read this :)