Yuri for Supabase

Posted on • Originally published at supabase.com


Declarative Schemas for Simpler Database Management

Today we’re releasing declarative schemas to simplify managing and maintaining complex database schemas. With declarative schemas, you can define your database structure in a clear, centralized, and version-controlled manner.

⚡️ More on Launch Week

What are declarative schemas?

Declarative schemas store the final desired state of the database in .sql files that can be saved and versioned alongside a project. For example, here is the declarative schema for a classic products table:

create table "products" (
  "id" serial primary key,
  "name" text not null,
  "description" text,
  "price" numeric(10,2) not null,
  "created_at" timestamp default now()
);

alter table "products"
enable row level security;


Declarative schemas offer numerous benefits over making changes to your database schema directly:

  • Single pane of glass. Maintain your entire database schema in one place, reducing redundancy and potential errors.
  • Versioned migrations. Automatically generate migration files, ensuring consistent schema updates across environments. Store your declarative schema files alongside your project files in your version control system.
  • Concise code reviews. Easily review changes to tables, views, and functions without manually repeating complex migration scripts.

Declarative schemas vs migrations

It's best practice to use Migrations to track and apply changes to databases. Every time you make a change, you create a new file with all the changes, keeping changes versioned and reproducible.

However, as the complexity of a database schema grows, it becomes increasingly difficult to develop using versioned migrations, as there isn't a single place to see the entire database schema.
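For contrast, here is what a hypothetical set of versioned migration files for the products table above might look like. Only the deltas are recorded, so the full shape of the table is never visible in any one file (the file names and timestamps are illustrative):

```sql
-- 20240101000000_create_products.sql
create table "products" (
  "id" serial primary key,
  "name" text not null,
  "price" numeric(10,2) not null
);

-- 20240215000000_add_description.sql
alter table "products" add column "description" text;

-- 20240301000000_add_created_at.sql
alter table "products" add column "created_at" timestamp default now();
```

To answer "what does products look like today?", you must mentally replay every file in order.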

For example, at Supabase we have a complex and frequently-updated projects table. Here's part of what it looks like with RLS enabled:

create table private.projects (
  id              bigint    not null,
  name            text      not null,
  organization_id bigint    not null,
  inserted_at     timestamp not null,
  updated_at      timestamp not null
);

alter table private.projects
enable row level security;

create policy projects_insert
  on private.projects
  for insert
  to authenticated
with check (auth.can_write(id));

create policy projects_select
  on private.projects
  for select
  to authenticated
using (auth.can_read(id));

-- Users can only view the projects that they have access to
create view public.projects as select
  projects.id,
  projects.name,
  projects.organization_id,
  projects.inserted_at,
  projects.updated_at
from private.projects
where auth.can_read(projects.id);


The projects table is created in a private schema, with a public view exposed for reads. Attribute-based access control (ABAC) is implemented on top of RLS policies to ensure queries only return projects that the user has access to.

Since Postgres views are not updatable by default, we have defined trigger functions to cascade writes to the underlying table when a Supabase user creates a new project. This makes development easier because inserts into the projects view can be made with regular PostgREST calls while still invoking the appropriate RLS policies on the underlying table.

-- Triggers to update views from PostgREST: select('projects').insert({ ... })
create function public.public_projects_on_insert() returns trigger
as $$
begin
  insert into private.projects(
    name,
    organization_id,
    inserted_at,
    updated_at
  ) values (
    NEW.name,
    NEW.organization_id,
    coalesce(NEW.inserted_at, now()),
    coalesce(NEW.updated_at, now())
  ) returning * into NEW;
  return NEW;
end
$$ language plpgsql;

create trigger public_projects_on_insert
  instead of insert
  on public.projects
  for each row
execute function public.public_projects_on_insert();

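With the trigger in place, an insert against the public view behaves like an insert into the private table. As a sketch (assuming a session with the authenticated role and an organization the user can write to), the PostgREST call in the comment above is equivalent to:

```sql
-- Routed through the instead-of trigger into private.projects;
-- the insert succeeds only if the RLS with-check policy passes
insert into public.projects (name, organization_id)
values ('my-new-project', 42);
```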

This complexity slows development, as changes to the table might break other views or functions. Back in early 2022, a simple change to add a new column involved the following steps.

  1. Find the latest schema for the projects table in our migration files or by querying our database.
  2. Write the alter table statement in a new migration file.
  3. Copy and update the projects view definition to include the new column.
  4. Copy and update the trigger function definition to include the new column.
  5. Add new pgTAP tests and verify that existing tests pass.
  6. Submit the new migration file for review, which would be at least a few hundred lines.

This process was tedious, and it became frustrating when multiple engineers worked on the projects table concurrently: merging PRs would result in a merge conflict that had to be resolved by repeating steps 1-5.
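Concretely, the migration file from steps 2-4 had to recreate every dependent object, not just alter the table. A heavily abridged sketch, adding a hypothetical status column (note that create or replace view can only append columns at the end of the select list):

```sql
alter table private.projects add column status text;

-- The full view definition must be repeated to expose the new column
create or replace view public.projects as select
  projects.id,
  projects.name,
  projects.organization_id,
  projects.inserted_at,
  projects.updated_at,
  projects.status
from private.projects
where auth.can_read(projects.id);

-- ...plus the full, updated body of every trigger function, and so on.
```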

Using declarative schemas in production

Adopting declarative schemas gave our engineers a single pane of glass when updating database schemas. Instead of manually duplicating affected Postgres entities in a migration file, we only need to change the schema definition in one place.

We then use a schema diff tool, like migra, to figure out the necessary updates to views and functions when generating the migration file.

For example, adding a new metadata column to the projects table now becomes a single line diff.

--- a/supabase/schemas/projects.sql
+++ b/supabase/schemas/projects.sql
@@ -2,6 +2,7 @@ create table private.projects (
   id              bigint    not null,
   name            text      not null,
   organization_id bigint    not null,
+  metadata        jsonb,
   inserted_at     timestamp not null,
   updated_at      timestamp not null
 );


The same process also applies to views, database functions, RLS policies, role grants, custom types, and constraints. While manual review of the generated migration file is still required, it has cut our development time from hours to minutes. It's also much easier to rebase on merge conflicts introduced by other PRs.
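Under the hood, the diff tool expands that one-line schema change into the migration statements it implies. For this change the generated file is short (the exact output depends on the diff tool, so this is illustrative):

```sql
alter table "private"."projects" add column "metadata" jsonb;
```

Had the new column also been exposed in public.projects, the generated migration would additionally contain the recreated view definition, without anyone writing it by hand.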

Get started with declarative schemas

Declarative schemas are available today on Supabase.

We have added the same set of tools we used internally for the last two years to the Supabase CLI. Whether you are just getting started with migrations or already fed up with managing hundreds of migration files, give declarative schemas a try: they will likely simplify your development process.
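As a sketch of the workflow with the Supabase CLI (the commands are from the CLI; the file paths are the defaults assumed by declarative schemas):

```shell
# Stop the local development stack before diffing against schema files
supabase stop

# Edit the desired end state in supabase/schemas/*.sql, then
# generate a versioned migration file from the diff
supabase db diff -f add_metadata_column

# Review the generated file under supabase/migrations/, then apply it
supabase db push
```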

Check out our blog post on Postgres Language Server for better tooling and IDE integration when developing with declarative schemas.

Launch Week 14

Main Stage

Day 1 - Supabase UI Library
Day 2 - Supabase Edge Functions: Deploy from the Dashboard + Deno 2.1
Day 3 - Realtime: Broadcast from Database
Day 4 - Declarative Schemas for Simpler Database Management

Build Stage

Community Meetups

