Being able to manage incremental, reversible changes to a relational database schema, as well as enforce version control and easy deployments, is an integral part of any web application development project.
If you’re leaning towards using JavaScript (TypeScript) cross-stack for your next web application project, you may find the following db migration workflow useful.
Technology Stack
- PostgreSQL server and CLI tools (`psql`, `pg_dump`, `pg_restore`)
- Node.js with Yarn (or NPM), JavaScript for writing migration files
- Knex.js query builder and db migration tool
- Faker.js random data generator for use in seed files
- Optionally, VS Code editor (autocomplete, code snippets, etc.)
Hint: macOS users may prefer installing the PostgreSQL server (for local development) using Postgres.app, complemented with `brew install libpq` to bring in the CLI tools via Homebrew.
Directory Layout
```
├── migrations
│   ├── 001_initial.js
│   ├── 002_programs.js
│   ├── 003_indexes.js
│   └── ...
├── seeds
│   ├── 01_setup.js
│   ├── 02_users.js
│   ├── 02_users.json (auto-generated)
│   ├── 03_programs.js
│   ├── 03_programs.json (auto-generated)
│   └── ...
```
Normally, you create a separate folder in your project for all the db migration files, which will be used by the db migration scripts. Optionally, you can add a folder with seed files (more on that later).
Migration files need to be prefixed with version numbers such as `023` or `20200815T234500` (a timestamp). As long as the migration tool can determine the order of the migration files from this prefix, everything should work fine.
While you can use plain `.sql` files for migrations, using `.js` gives you more flexibility. For example, it's easier to rename tables and columns using the Knex.js Schema Builder than with plain SQL. Yet, you can always include plain SQL in your JavaScript-based migration files if you want to.
Migration Files
Adding a new migration is as easy as creating a new file in the `migrations` folder and writing down what needs to be changed in the schema (using the Knex.js Migrations API). Here is an example:
TIP: Use VS Code code snippets for that.
Migration Scripts
Optionally, you can add some global Yarn scripts that call the Knex.js CLI:
```bash
$ yarn db:version   # Shows the current database version
$ yarn db:migrate   # Applies pending migration (files)
$ yarn db:seed      # Seeds database with test dataset(s)
$ yarn db:rollback  # Rolls back the latest migration
$ yarn db:repl      # Opens a REPL shell for Knex.js
$ yarn db:psql      # Opens a REPL shell for PostgreSQL
```
You'll be able to execute these scripts from any Yarn workspace folder (assuming you're using Yarn v2).
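One way these script names could map onto Knex CLI commands (a sketch; the exact `package.json` location and command set are assumptions, not from the original post):

```json
{
  "scripts": {
    "db:version": "knex migrate:currentVersion",
    "db:migrate": "knex migrate:latest",
    "db:seed": "knex seed:run",
    "db:rollback": "knex migrate:rollback",
    "db:psql": "psql"
  }
}
```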
Adding an `--env` argument to any of the scripts above ensures that the db script connects to the selected instance of the PostgreSQL database (one of the pre-configured environments, e.g. `local`, `dev`, `test`, `prod`).
Seed Files
Seed files don't have to be prefixed with version numbers, but they will be executed in alphabetical order. A typical seed file may look like this:
```js
/**
 * Test (reference) user accounts.
 *
 * @typedef {import("knex")} Knex
 */

module.exports.seed = async (/** @type {Knex} */ db) => {
  const users = []; /* generate fake users using Faker.js */
  await db.table("users").delete();
  await db.table("users").insert(users);
};
```
Optionally, when a seed file is being executed, you may want to save the generated (fake) dataset into the corresponding .json
file. From there on, this file will be used for seeding / re-seeding the database. This can be useful if you'd like to ensure that the test (reference) dataset remains constant.
Q & A
- Why use `.js` files for migrations as opposed to `.ts` (TypeScript)?

  Running migrations is faster with `.js`-based migration files, since Node.js doesn't currently understand TypeScript natively. Yet, you get almost the same syntax highlighting and auto-complete experience as you would with TypeScript (by adding a couple of JSDoc annotations).
Feel free to contact me on Discord if you have more questions (koistya#2673).
Source Code
https://github.com/kriasoft/nodejs-api-starter - Yarn v2 based monorepo setup pre-configured with PostgreSQL and GraphQL API.