Introduction
In the previous post I created a Hugo blog that is deployed to Cloudflare Pages. After spending some time playing around and tuning it, I generated quite a few deployments. All of them are visible in the Cloudflare dashboard under the Workers & Pages section for each site. Why would I need all of them, and how can I clean up that mess?
Value of Deployment History
For production deployments - history is useful if the latest deployment has problems: it is easy and straightforward to roll back to a previous version. This can be done from the Cloudflare dashboard by finding the deployment to roll back to and selecting Rollback to this deployment in the "..." menu.
For preview deployments - while a new feature or piece of content is being created, it is good to test how the site behaves in different environments after specific changes, track overall development progress, and share work with other people for review. But once active development is finished, the content is ready, and everything is deployed to production and tested, I see little value in keeping the deployment history of work-in-progress changes. The analogous git practice is to squash commits and delete source branches when merging features or bugfixes into the default branch.
Cleaning up Deployment History
As stated above, I see value in:
- keeping a couple of the latest production deployments, just in case
- keeping preview deployments while actively working on them
Let's now see how this can be achieved.
Cleaning from Cloudflare
Deployments can be deleted manually from the Cloudflare dashboard under the Workers & Pages section for each site. I've tried this approach: if only a few deployments need to be deleted, it is fine and failsafe, but deleting many deployments is a tedious and annoying process, because:
- deployments can only be deleted one by one, there is no bulk delete
- each delete operation has to be confirmed
- if a deployment has an alias, the alias name must be typed into an edit field to confirm deletion
Automating Cleanup
I was hoping that the wrangler
tool would have something to manage existing deployments, but no: it can only list deployments in a human-readable format, not in any standard machine-friendly format.
Fortunately, Cloudflare has Pages APIs that provide full control over deployments, including listing and deleting. Official examples are here.
I will use these APIs to create a script that cleans up obsolete deployments older than a certain number of days, while always keeping a specific number of the latest deployments in case a rollback is needed.
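As a rough sketch of the API interaction, listing and deleting deployments boils down to two endpoints. This is a stdlib-only approximation, not the actual script (which uses the requests library); the env query parameter, pagination fields, and response shape reflect my reading of the Pages API and should be verified against the official docs:

```python
import json
import urllib.request

API_BASE = "https://api.cloudflare.com/client/v4"

def pages_url(account_id: str, project: str, suffix: str = "") -> str:
    """Build a Pages deployments endpoint URL."""
    return f"{API_BASE}/accounts/{account_id}/pages/projects/{project}/deployments{suffix}"

def api_call(url: str, token: str, method: str = "GET") -> dict:
    """Perform one authenticated Cloudflare API request and decode the JSON body."""
    req = urllib.request.Request(
        url, method=method, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

def list_deployments(account_id: str, project: str, env: str, token: str) -> list[dict]:
    """Fetch all deployments for one environment, following pagination."""
    deployments, page = [], 1
    while True:
        body = api_call(pages_url(account_id, project, f"?env={env}&page={page}"), token)
        deployments += body["result"]
        # result_info.total_pages tells us when to stop paging
        if page >= body.get("result_info", {}).get("total_pages", 1):
            return deployments
        page += 1

def delete_deployment(account_id: str, project: str, deployment_id: str, token: str) -> None:
    """Delete a single deployment by its id."""
    api_call(pages_url(account_id, project, f"/{deployment_id}"), token, method="DELETE")
```

The same two calls, list then delete, are all the script really needs; everything else is filtering on the client side.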
With ChatGPT doing most of the typing, I've created the following Python script:
usage: cleanup-deployments.py [-h] --environment {production,preview} --count COUNT --days DAYS [--dry-run]

Fetch and delete obsolete page deployments

options:
  -h, --help            show this help message and exit
  --environment {production,preview}
                        deployment environment
  --count COUNT         number of deployments to keep
  --days DAYS           number of days to keep
  --dry-run             perform a dry run
Apart from command-line arguments, the script also expects certain environment variables to be defined: CLOUDFLARE_API_TOKEN and CLOUDFLARE_ACCOUNT_ID for authentication, and CLOUDFLARE_PROJECT_NAME to select the Pages project. The names were chosen to match variables that are already available in my GitLab CI/CD environment and used by the deployment jobs and the wrangler tool.
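The core selection logic, keep the N newest deployments and delete the rest only once they are older than D days, is simple enough to sketch. This is my approximation of what the script does, not its exact code; the created_on field and its ISO-8601 format come from the API responses:

```python
from datetime import datetime, timedelta, timezone

def _created(deployment: dict) -> datetime:
    """Parse the deployment's created_on timestamp (ISO-8601, possibly Z-suffixed)."""
    return datetime.fromisoformat(deployment["created_on"].replace("Z", "+00:00"))

def select_obsolete(deployments: list[dict], keep_count: int, keep_days: int) -> list[dict]:
    """Return deployments to delete: skip the keep_count newest,
    then drop anything still younger than keep_days."""
    newest_first = sorted(deployments, key=_created, reverse=True)
    cutoff = datetime.now(timezone.utc) - timedelta(days=keep_days)
    return [d for d in newest_first[keep_count:] if _created(d) < cutoff]
```

Both conditions must hold before a deployment is deleted, which is why running with --count 2 never touches the two latest production deployments, no matter how old they are.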
Integrating into CI/CD
I took the .gitlab-ci.yml from the previous article and made the following modifications to it:
- define deployment cleanup jobs for production and preview environment with their specific options
- make cleanup jobs runnable on schedule
- exclude build and deployment jobs from scheduled pipelines
stages:
  - build
  - deploy
  - cleanup

workflow:
  rules:
    ...
    # Always run on schedule
    - if: $CI_PIPELINE_SOURCE == "schedule"

.build:
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"
      when: never
  ...

deploy:preview:
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"
      when: never
  ...

deploy:production:
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"
      when: never
  ...

.cleanup:
  image: python:3.11-alpine
  stage: cleanup
  variables:
    CLEANUP_KEEP_DAYS: 7
  before_script:
    - pip install requests python-dateutil
  script:
    - python scripts/cleanup-deployments.py --environment ${CLEANUP_ENVIRONMENT} --days ${CLEANUP_KEEP_DAYS} --count ${CLEANUP_KEEP_COUNT}
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"

cleanup:preview:
  extends: [.cleanup]
  variables:
    CLEANUP_ENVIRONMENT: "preview"
    CLEANUP_KEEP_COUNT: 0

cleanup:production:
  extends: [.cleanup]
  variables:
    CLEANUP_ENVIRONMENT: "production"
    CLEANUP_KEEP_COUNT: 2
The complete .gitlab-ci.yml
and cleanup-deployments.py
source code is available on GitHub.
Next, I configured a scheduled pipeline in my GitLab project under the Build
-> Pipeline schedules
menu:
Now everything is set up: either wait for the next scheduled run to happen, or press the "Play" button to create the scheduled pipeline manually.
This is how a successful scheduled pipeline should look:
And runner log:
...
Installing collected packages: urllib3, six, idna, charset-normalizer, certifi, requests, python-dateutil
Successfully installed certifi-2024.2.2 charset-normalizer-3.3.2 idna-3.7 python-dateutil-2.9.0.post0 requests-2.31.0 six-1.16.0 urllib3-2.2.1
$ python scripts/cleanup-deployments.py --environment ${CLEANUP_ENVIRONMENT} --days ${CLEANUP_KEEP_DAYS} --count ${CLEANUP_KEEP_COUNT}
Fetching all production page deployments...
Found 2 production page deployments.
Deleting obsolete production page deployments older than 7 days, while keeping 2 latest...
0 obsolete production page deployments have been deleted.
Cleaning up project directory and file based variables
00:01
Job succeeded
Conclusion
- Cloudflare is a great service for hosting static content, and it keeps a complete history of page deployments
- Rolling back a production deployment can be done from the Cloudflare dashboard
- Cleaning up deployments from the Cloudflare dashboard is a tedious task
- Cloudflare provides a flexible API that allows listing and deleting deployments
- Scripting skills are required to make use of the Cloudflare API
- A cleanup job can be added to GitLab CI/CD and configured to run on a schedule