Introduction
In the data-driven domain of content creation, having a real-time pulse on audience engagement is invaluable. My quest was to construct a tool that epitomizes both efficiency and elegance—a solution where a simple URL would serve as a portal to the metrics of my Dev.to articles.
The Challenge
Manual monitoring was out of the question. The goal was to create an automated system that provides live updates on comments and reactions without the hassle of repeatedly checking each article. And importantly, this system had to be cost-free.
The Solution
My solution took the form of a memorable URL: bit.ly/dev-mon, a nod to both its daemon-like operation and its ease of recall. The choice of hosting fell on the sturdy shoulders of GitHub Pages, a free platform that became the stage for my engagement dashboard.
TL;DR: Quick Start Guide
Eager to see a dashboard like this?
Follow these steps:
1. Clone the Repository: Visit InteractoGraph and clone the repo.
2. Customize for Your Profile: Navigate to ./cypress/e2e/0-mine/all_in_one.cy.js#L4 in your cloned repo and replace realvorl in https://dev.to/realvorl with your DEV.TO username (a sketch of that one-line change follows the list).
3. Commit and Push: Commit the changes with a message like "Update username in Cypress test" and push to main.
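That one line is presumably just the cy.visit call pointing at your profile; with a placeholder username it would read:

cy.visit('https://dev.to/<your-dev-username>');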
Voilà! Once the workflow has run, visit https://<YOUR-GITHUB-USERNAME>.github.io/InteractoGraph/ for your very own engagement dashboard. Of course, you can now extend it to include anything you can think of and are able to scrape using the Cypress API.
For a deeper understanding of the setup, let's dive into the subsequent sections.
Automating Data Collection with Cypress
The Cypress test below scrapes engagement data from my Dev.to profile, illustrating the ease of automation:
describe('Article Metrics on dev.to', () => {
  it('captures comments and reactions for multiple articles', () => {
    // Visit the user's main page
    cy.visit('https://dev.to/realvorl');

    // Define an array to hold our article metrics
    let articleMetrics = [];

    // Wait for the articles to load
    cy.get('.crayons-story__body').each(($body, index, $bodies) => {
      // Extract the article title, comments, and reactions
      const title = $body[0].getElementsByTagName('h2')[0].textContent.trim();
      const links = $body[0].getElementsByTagName('a');
      const reactionsCount = links[links.length - 2].innerText.trim();
      const commentsCount = links[links.length - 1].innerText.trim();

      // Add the metrics to our array
      articleMetrics.push({
        title,
        'commentNo': parseInt(commentsCount, 10) || 0,
        'reactionNo': parseInt(reactionsCount, 10) || 0
      });

      // After the last article, write the metrics to a JSON file
      if (index === $bodies.length - 1) {
        cy.writeFile('./cypress/e2e/0-mine/data.json', {
          'dateISO': new Date().toISOString(),
          'articles': articleMetrics
        }, { flag: 'w+' });
      }
    });
  });
});
It captures titles, comments, and reactions and consolidates them into a JSON file, timestamped for traceability.
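For orientation, the resulting data.json has a shape along these lines (titles and numbers here are purely illustrative, not my real metrics):

{
  "dateISO": "2024-01-05T06:00:00.000Z",
  "articles": [
    { "title": "My first article", "commentNo": 4, "reactionNo": 27 },
    { "title": "My second article", "commentNo": 1, "reactionNo": 12 }
  ]
}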
Integrating Data with Visualization
A Node.js script, updateHtml.js, injects the data collected by Cypress into a Google Chart inside an HTML template, chart.tmpl.
const fs = require('fs');
const tmplFilePath = 'chart.tmpl'; // Template file path
const htmlFilePath = 'index.html'; // The final HTML file path
const jsonFilePath = 'data.json'; // The JSON data file path
// Read the template HTML content and the JSON data
const htmlContent = fs.readFileSync(tmplFilePath, 'utf8');
const jsonData = fs.readFileSync(jsonFilePath, 'utf8');
// Replace the placeholder in the template with the actual JSON data
const updatedHtmlContent = htmlContent.replace('// JSON_PLACEHOLDER',
`var json = ${jsonData}`);
// Write the updated HTML content to the final file
fs.writeFileSync(htmlFilePath, updatedHtmlContent);
This transforms the JSON data into an interactive chart, ready for publishing.
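The template itself is not reproduced in this post, but to make the injection concrete, here is a minimal sketch of what a chart.tmpl could look like; the element id, chart type, and options are my assumptions and not necessarily what the repository ships:

<html>
  <head>
    <script src="https://www.gstatic.com/charts/loader.js"></script>
    <script>
      // updateHtml.js replaces the placeholder below with `var json = { ... }`
      // JSON_PLACEHOLDER

      google.charts.load('current', { packages: ['corechart'] });
      google.charts.setOnLoadCallback(drawChart);

      function drawChart() {
        // One row per article: title, reactions, comments
        const rows = json.articles.map(a => [a.title, a.reactionNo, a.commentNo]);
        const data = google.visualization.arrayToDataTable(
          [['Article', 'Reactions', 'Comments']].concat(rows)
        );
        const chart = new google.visualization.BarChart(document.getElementById('chart'));
        chart.draw(data, { title: 'Engagement as of ' + json.dateISO });
      }
    </script>
  </head>
  <body>
    <div id="chart" style="height: 600px;"></div>
  </body>
</html>

The only contract that matters is the // JSON_PLACEHOLDER comment, which updateHtml.js swaps for the var json declaration; everything else in the template is free to change.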
Continuous Integration and Deployment with GitHub Actions
The workflow I've established with GitHub Actions automates data scraping, chart updating, and deployment to GitHub Pages. Here's the workflow:
name: CI

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]
  workflow_dispatch:
  schedule:
    # Runs at minute 0 past every 6th hour
    - cron: '0 */6 * * *'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Install and Run Cypress
        run: |
          # Set up Git credentials
          git config --global user.email "6973263+realvorl@users.noreply.github.com"
          git config --global user.name "(Vorl)-Bot"

          # Install dependencies and run tests
          yarn
          yarn cy-all

          # Commit and push the results
          git add .
          git commit -am "chore (ci): update e2e test results $(date +%s)"
          git push

          # Branching strategy for GitHub Pages
          git checkout -b gh-pages
          git push --set-upstream origin gh-pages --force

      - name: Deploy to GitHub Pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_branch: gh-pages
          publish_dir: ./cypress/e2e/0-mine

      - name: Upload a Build Artifact
        uses: actions/upload-artifact@v4.0.0
        with:
          name: chart
          path: ./cypress/e2e/0-mine/index.html
          retention-days: 14
The branching strategy for GitHub Pages involves force-pushing to a gh-pages branch, using the peaceiris/actions-gh-pages action to deploy the updated chart, and uploading the chart as a build artifact for quick access.
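One moving part the workflow leans on is the yarn cy-all script, which is not shown in this post. My assumption is that it runs the Cypress spec headlessly and then invokes updateHtml.js from the folder that holds the template, roughly like this hypothetical package.json entry:

{
  "scripts": {
    "cy-all": "cypress run --spec cypress/e2e/0-mine/all_in_one.cy.js && cd cypress/e2e/0-mine && node updateHtml.js"
  }
}

The cd keeps the relative paths in updateHtml.js (chart.tmpl, data.json, index.html) resolving next to the spec, which lines up with the publish_dir used in the deploy step.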
The first unintended consequence, and actually my favorite feature of this implementation, is the diff you get when you look at the git history. You can simply click the parent link in that view and navigate through the history of the changes.
Another unintended feature is a way to declutter the notifications on your phone. Because this setup can monitor virtually anything, you can disable the annoying push notifications and instead get an hourly or daily digest of the engagement you receive online.
Conclusion
This automated system scrapes new data and refreshes the dashboard every 6 hours, eliminating manual effort and providing a continuously updated view of article engagement. Of course, it is not limited to DEV.TO: for virtually any platform you are interested in (LinkedIn, for example), you can set up a Cypress test, enrich the JSON with that information, and follow the pattern laid out in this article.