When it comes to modern web applications, the upload is just the beginning. Every image that lands on your server, whether it's a user’s avatar, a product screenshot, or the hero cover of an article, needs to be refined and prepared for production. But here’s the catch: optimizing images isn’t a one-size-fits-all task. You need a seamless workflow, one that identifies which images need optimization and avoids re-processing those that are already production-ready. Today, we’re setting out on an automated journey to craft exactly that—without cutting corners.
The Foundation: Tracking Optimization Status with a Database Flag
First things first: we need a way to distinguish between images that are fresh off the upload and those that have already been processed. For this, a simple flag in your database does the trick. We're calling it `is_optimized`, a boolean column in the `images` table that defaults to `false` whenever a new record is added. Once an image is processed and optimized, the flag is set to `true`.
Here’s the SQL to create it:
```sql
ALTER TABLE images ADD COLUMN is_optimized BOOLEAN DEFAULT FALSE;
```
With this in place, each image has a clear status indicator: no guesswork, no over-optimization. This `is_optimized` flag becomes our guide, telling us exactly which images to optimize and which are already production-ready.
Automation, Bash, and Node.js: A Match Made for Efficiency
We could use Bash alone to loop over each image, but for a task as nuanced as image optimization, we need a bit more finesse. Enter Node.js. Instead of letting Bash handle everything, we’ll set it up to launch a Node.js script that manages the database querying, image processing, and updates.
So, let’s prepare our environment. First, we create a directory where our script will live:
```bash
mkdir -p /mnt/image_optimization
cd /mnt/image_optimization
```
Inside this directory, initialize a package with `npm init -y` and set `"type": "module"` in `package.json` (the script below uses ES module `import` syntax). Then install the necessary Node.js packages:

```bash
npm install pg sharp imagemin imagemin-mozjpeg imagemin-pngquant
```
Each of these packages has a specific role in our workflow:
- `pg`: handles database interactions.
- `sharp`: manages resizing and basic image transformations.
- `imagemin`: provides the compression pipeline that plugins hook into.
- `imagemin-mozjpeg` and `imagemin-pngquant`: compress JPEG and PNG images, respectively, with minimal quality loss.
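To make the division of labor concrete, here's a minimal sketch of the format dispatch the script will rely on. `compressorFor` is a hypothetical helper of our own (not part of any of these packages) that maps a file's extension to the plugin that should handle it:

```javascript
import path from 'path';

// Hypothetical helper: decide which imagemin plugin applies to a file.
// Returns null for formats the pipeline should skip.
const compressorFor = (filePath) => {
  const ext = path.extname(filePath).toLowerCase();
  if (ext === '.jpg' || ext === '.jpeg') return 'mozjpeg';
  if (ext === '.png') return 'pngquant';
  return null;
};

console.log(compressorFor('hero.JPG'));  // mozjpeg
console.log(compressorFor('logo.png')); // pngquant
console.log(compressorFor('anim.gif')); // null
```

Lowercasing the extension first means `photo.JPG` and `photo.jpg` take the same path, which matters on case-sensitive filesystems.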
The Heart of the System: Our Image Optimization Script
With the environment set up, it's time to build the Node.js script. This script will do the heavy lifting: querying the database for images that need optimization, processing each one, and updating the database to mark the image as optimized. We'll call this script `optimize_images.js`.
Database Connection Setup
First, let’s set up the connection to our PostgreSQL database:
```javascript
import pkg from 'pg';
const { Client } = pkg;

const client = new Client({
  host: 'HOST',
  user: 'USER',
  password: 'PASSWORD',
  database: 'DATABASE',
});

await client.connect();
```
This connection lets us query for images and update records after they’re processed.
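Hardcoding credentials is fine for a walkthrough, but a safer variant derives them from the environment. Here's a sketch using a helper of our own, `configFromEnv`, which falls back to placeholder defaults; note that `pg`'s `Client` also reads the standard `PG*` environment variables itself when fields are omitted, so this is belt-and-braces:

```javascript
// Sketch: derive the connection config from an environment object,
// falling back to placeholder defaults for local development.
const configFromEnv = (env) => ({
  host: env.PGHOST || 'localhost',
  user: env.PGUSER || 'postgres',
  password: env.PGPASSWORD,
  database: env.PGDATABASE || 'postgres',
});

console.log(configFromEnv({ PGHOST: 'db.internal', PGUSER: 'app' }).host); // db.internal
console.log(configFromEnv({}).host); // localhost
```

You would then write `new Client(configFromEnv(process.env))` and keep secrets out of the source file entirely.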
Optimizing Each Image
Our script needs to handle each image differently depending on its format. We'll set up a function, `optimizeImage`, that resizes and compresses images based on their file extension.
```javascript
import sharp from 'sharp';
import imagemin from 'imagemin';
import imageminMozjpeg from 'imagemin-mozjpeg';
import imageminPngquant from 'imagemin-pngquant';
import path from 'path';
import fs from 'fs';

const IMAGE_DIR = "/mnt/sites/site.com/images";
const MAX_WIDTH = 1920;
const MAX_HEIGHT = 1080;

const optimizeImage = async (filePath) => {
  try {
    const ext = path.extname(filePath).toLowerCase();
    let optimizedImage;

    // Resize the image
    const resizedImage = await sharp(filePath)
      .resize({
        width: MAX_WIDTH,
        height: MAX_HEIGHT,
        fit: 'inside',
      })
      .toBuffer();

    // Compress the resized image based on its format
    if (ext === '.jpg' || ext === '.jpeg') {
      optimizedImage = await imagemin.buffer(resizedImage, {
        plugins: [imageminMozjpeg({ quality: 85 })],
      });
    } else if (ext === '.png') {
      optimizedImage = await imagemin.buffer(resizedImage, {
        plugins: [imageminPngquant({ quality: [0.65, 0.8] })],
      });
    } else {
      console.log(`Skipping unsupported file format: ${filePath}`);
      return;
    }

    // Write the optimized image back to the file system
    fs.writeFileSync(filePath, optimizedImage);
    console.log(`Optimized and saved: ${filePath}`);
  } catch (error) {
    console.error(`Error optimizing ${filePath}:`, error);
  }
};
```
This function resizes each image to fit within a `1920x1080` frame and applies format-specific compression, reducing file size while preserving visual quality.
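To see what `fit: 'inside'` actually does to the dimensions, here's a hypothetical `fitInside` helper that mirrors the aspect-ratio math (one caveat: by default sharp will also scale smaller images up to the bounding box; pass `withoutEnlargement: true` to the resize options if you want small images left alone):

```javascript
// Sketch of the aspect-ratio math behind sharp's fit: 'inside'.
// Scales the image so it fits within maxW x maxH, preserving proportions.
const fitInside = (width, height, maxW, maxH) => {
  const scale = Math.min(maxW / width, maxH / height);
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
};

console.log(fitInside(3840, 2160, 1920, 1080)); // { width: 1920, height: 1080 }
console.log(fitInside(1000, 3000, 1920, 1080)); // { width: 360, height: 1080 }
```

A 4K landscape shot halves cleanly to the frame, while a tall portrait image ends up constrained by height rather than width.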
Querying and Updating the Database
Now we need the main function, `optimizeImages`, which queries the database, processes each image, and logs an SQL update for each processed record.
```javascript
const sqlFilePath = path.join(path.resolve(), 'optimize.sql');
fs.writeFileSync(sqlFilePath, '', 'utf8'); // Clear the SQL file at the start

const appendUpdateSQL = (imageId) => {
  const updateSQL = `UPDATE images SET is_optimized = true WHERE id = ${imageId};\n`;
  fs.appendFileSync(sqlFilePath, updateSQL, 'utf8');
};

const optimizeImages = async () => {
  try {
    const res = await client.query("SELECT id, name FROM images WHERE is_optimized = false");
    for (const row of res.rows) {
      const { id, name } = row;
      const filePath = path.join(IMAGE_DIR, name);
      if (fs.existsSync(filePath)) {
        await optimizeImage(filePath);
        appendUpdateSQL(id); // Log the update for later execution
        console.log(`Appended SQL update for image ID: ${id}`);
      } else {
        console.error(`File not found: ${filePath}`);
      }
    }
    console.log("Image optimization process completed. Update statements written to optimize.sql.");
  } catch (error) {
    console.error("Error during image optimization:", error);
  } finally {
    await client.end();
  }
};

optimizeImages();
```
This function checks the database for images with `is_optimized = false`, processes each one, and logs an SQL update statement to `optimize.sql`. Each processed image is now ready for production.
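One `UPDATE` per row is fine at small scale, but the generated file could also be collapsed into a single statement. Here's an optional variant sketched as a pure helper; `buildBatchUpdate` is our own function, not part of `pg`:

```javascript
// Sketch: collapse many per-row updates into one IN-list statement.
// Assumes ids are integers coming from our own SELECT, so no quoting
// or parameterization is needed in the generated SQL.
const buildBatchUpdate = (ids) => {
  if (ids.length === 0) return '';
  return `UPDATE images SET is_optimized = true WHERE id IN (${ids.join(', ')});\n`;
};

console.log(buildBatchUpdate([3, 7, 12]));
// UPDATE images SET is_optimized = true WHERE id IN (3, 7, 12);
```

You would collect the ids in the loop and write this single statement at the end, cutting the number of statements `psql` has to execute from N to one.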
The Bash Script and Cron Job: Keeping It Running Like Clockwork
The final piece is a Bash script, `optimize_images.sh`, that cron will launch every 30 minutes. It runs our Node.js script, then checks for a newly generated `optimize.sql` file and, if found, runs the SQL statements to update our database.
```bash
#!/bin/bash

OPTIMIZE_DIR="/mnt/image_optimization"
SQL_FILE="$OPTIMIZE_DIR/optimize.sql"
DB_HOST="HOST"
DB_USER="USER"
DB_NAME="DATABASE"
DB_PASSWORD="PASSWORD"

echo "Optimization run initiated at: $(date +'%Y-%m-%d %H:%M:%S')"

node "$OPTIMIZE_DIR/optimize_images.js"

if [ -f "$SQL_FILE" ]; then
  echo "Running generated SQL file"
  PGPASSWORD="$DB_PASSWORD" psql -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" -f "$SQL_FILE"
  if [ $? -eq 0 ]; then
    echo "SQL executed successfully."
  else
    echo "Error executing SQL file."
  fi
else
  echo "SQL file not found, script failed to generate the SQL."
fi
```
This script performs the database update and logs the execution status, ensuring no image is left unoptimized for long.
To keep this running like a well-oiled machine, set up a cron job to run the Bash script every 30 minutes:
```bash
*/30 * * * * /mnt/image_optimization/optimize_images.sh
```
Wrapping Up
With this setup, each image goes from raw upload to production-ready automatically. The database flag ensures no redundant processing, while the combination of Bash, Node.js, and cron keeps everything running efficiently. This workflow is a true testament to automation’s power, optimizing images and streamlining production without lifting a finger.
For more tips on web development, check out DailySandbox and sign up for our free newsletter to stay ahead of the curve!