Persistent data stores are a key component when building microservices. Narrowing them down to relational databases specifically, learning the SQL dialect used by the database engine lets us take full advantage of its resources, and the same applies when choosing a way to programmatically access that dialect from our programs.
This is the first of a series of posts covering a few different ways to access PostgreSQL databases.
The full code example mentioned in this post is available on GitHub; please make sure to read the README for specifics.
Using Plain SQL packages for accessing PostgreSQL databases
Using plain-old SQL statements is usually the first approach when interacting with databases, and it makes sense: SQL is literally the language to use when talking to most relational database engines.
I will be covering the standard packages that support any database engine, emphasizing their PostgreSQL nuances, as well as the packages that are specific to the engine we are discussing today:
- database/sql and jmoiron/sqlx: both work with any database engine through a driver that implements the database/sql/driver interfaces, and
- lib/pq and jackc/pgx: which are PostgreSQL-specific.
The examples below use a database table with the following structure:
-- db/migrations/20210126023417_create_names_table.up.sql
CREATE TABLE names (
    nconst varchar(255),
    primary_name varchar(255),
    birth_year varchar(4),
    death_year varchar(4) DEFAULT '',
    primary_professions varchar[],
    known_for_titles varchar[]
);
The goal is to map the results coming from that table into a struct type Name defined as:
// server.go
type Name struct {
	NConst    string
	Name      string
	BirthYear string
	DeathYear string
}
database/sql
database/sql is the entry point for anyone using Go to interact with relational databases. The API is really simple to understand and, compared to all the packages covered in this post, it provides the highest flexibility, but it is also the one that requires the most boilerplate.
The database/sql instructions to map our table record to our struct look something like this:
// postgresql_sql.go
query := `SELECT nconst, primary_name, birth_year, death_year FROM "names" WHERE nconst = $1`

var res Name
if err := p.pool.QueryRowContext(context.Background(), query, nconst).
	Scan(&res.NConst, &res.Name, &res.BirthYear, &res.DeathYear); err != nil {
	return Name{}, err
}
return res, nil
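For completeness, here is a minimal sketch of how the *sql.DB behind p.pool might be opened, assuming lib/pq (covered below) as the registered driver; the connection string is illustrative:

// A minimal sketch, assuming lib/pq as the driver; sql.Open only validates
// its arguments, so Ping is used to verify the connection actually works.
import (
	"database/sql"

	_ "github.com/lib/pq" // registers the "postgres" driver
)

func open() (*sql.DB, error) {
	db, err := sql.Open("postgres", "postgres://user:pass@localhost:5432/dbname?sslmode=disable")
	if err != nil {
		return nil, err
	}
	if err := db.Ping(); err != nil {
		return nil, err
	}
	return db, nil
}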
As you can imagine, repeating similar steps for every query gets tedious, and it is not uncommon to implement your own small helpers around SQL commands; still, mapping the database values to struct types is done manually.
Specific to this SELECT command, the important bit to call out is the mapping that happens between the database columns and the struct fields:
- It is something to be done manually for each selected column, and
- It is a tedious process that could lead to application errors, for example when trying to Scan database column types that do not map cleanly to Go types (see the sketch after this list).
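To make that second point concrete, here is a minimal, hypothetical sketch: our migration defaults death_year to '', but if the column could come back as NULL, scanning it into a plain string would make Scan return a conversion error, while sql.NullString handles the case explicitly:

// Hypothetical sketch: scanning a NULL death_year into a plain string fails;
// sql.NullString records whether the column was NULL.
var deathYear sql.NullString
if err := p.pool.QueryRowContext(context.Background(),
	`SELECT death_year FROM "names" WHERE nconst = $1`, nconst).
	Scan(&deathYear); err != nil {
	return Name{}, err
}
res.DeathYear = ""
if deathYear.Valid {
	res.DeathYear = deathYear.String
}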
The Go database/sql tutorial is the de facto resource for understanding how to work with this package, and it includes examples using MySQL and PostgreSQL.
jmoiron/sqlx
jmoiron/sqlx is an extension to database/sql that adds some extra features for mapping arguments and results much more easily, for example:
query := `SELECT nconst, primary_name, birth_year, death_year FROM "names" WHERE nconst = $1`

var result struct {
	NConst    string `db:"nconst"`
	Name      string `db:"primary_name"`
	BirthYear string `db:"birth_year"`
	DeathYear string `db:"death_year"`
}
if err := p.db.QueryRowx(query, nconst).StructScan(&result); err != nil {
	return Name{}, err
}

return Name{
	NConst:    result.NConst,
	Name:      result.Name,
	BirthYear: result.BirthYear,
	DeathYear: result.DeathYear,
}, nil
Compared to the previous database/sql implementation:
- The records returned from the database are mapped automatically thanks to the struct tags defined on the anonymous struct held in the result variable, and
- The result variable itself is populated using the StructScan method; there is a similar one defined at the package level, sqlx.StructScan, for populating slices from result sets (see the sketch after this list).
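To illustrate that second point, here is a minimal sketch of the slice variant, assuming the same p.db (*sqlx.DB) as above; the query and struct tags mirror the single-row example:

// A minimal sketch: populating a slice with the package-level sqlx.StructScan.
rows, err := p.db.Queryx(`SELECT nconst, primary_name, birth_year, death_year FROM "names"`)
if err != nil {
	return nil, err
}
defer rows.Close()

var results []struct {
	NConst    string `db:"nconst"`
	Name      string `db:"primary_name"`
	BirthYear string `db:"birth_year"`
	DeathYear string `db:"death_year"`
}
if err := sqlx.StructScan(rows, &results); err != nil {
	return nil, err
}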
Similarly to database/sql, there is the Illustrated guide to SQLX, which should help you navigate the usage of this package; it also includes examples using MySQL and PostgreSQL.
lib/pq
lib/pq itself cannot be used directly except in specific cases:
- When using the COPY command via the pq.CopyIn function, or
- When referring to PostgreSQL-specific types, like when calling pq.Array or hstore.Hstore (see the sketch after this list).
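As an example of the second case, here is a minimal sketch using pq.Array to read the two varchar[] columns our migration defines but the Name type ignores, assuming the same database/sql connection as before:

// A minimal sketch, assuming the same *sql.DB as before: pq.Array wraps the
// destination slices so Scan can decode the PostgreSQL varchar[] columns.
var professions, titles []string
if err := p.pool.QueryRowContext(context.Background(),
	`SELECT primary_professions, known_for_titles FROM "names" WHERE nconst = $1`,
	nconst,
).Scan(pq.Array(&professions), pq.Array(&titles)); err != nil {
	return nil, err
}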
So in practice the code is the same one you saw in the database/sql section; lib/pq simply acts as the registered driver.
The important thing to mention about lib/pq is the disclaimer on their README (emphasis mine):
This package is effectively in maintenance mode and is not actively developed. Small patches and features are only rarely reviewed and merged. We recommend using pgx which is actively maintained.
Which brings us to our last package to cover today.
jackc/pgx
jackc/pgx is, as mentioned above, the recommended package to use when working with PostgreSQL databases. It has a lot of features specific to this engine, and not only can it be used via database/sql, it can also be used directly through its own API; according to the benchmarks, there are some performance benefits when taking that route.
The API is close to database/sql's, with the exception of having context support from the beginning:
query := `SELECT nconst, primary_name, birth_year, death_year FROM "names" WHERE nconst = $1`

var res Name
if err := p.pool.QueryRow(context.Background(), query, nconst).
	Scan(&res.NConst, &res.Name, &res.BirthYear, &res.DeathYear); err != nil {
	return Name{}, err
}
return res, nil
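For reference, here is a minimal sketch of how the pool used above might be created, assuming pgx v4's pgxpool package; the connection string is illustrative:

// A minimal sketch, assuming pgx v4: pgxpool.Connect parses the connection
// string and returns a concurrency-safe connection pool.
import (
	"context"

	"github.com/jackc/pgx/v4/pgxpool"
)

func newPool(ctx context.Context) (*pgxpool.Pool, error) {
	return pgxpool.Connect(ctx, "postgres://user:pass@localhost:5432/dbname")
}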
Final thoughts
Using persistent data stores like a relational database is considered a given when building microservices; however, there are a lot of things to think about when making a choice. Selecting the engine to use is one of them, but so is selecting the packages to use for programming the database calls.
The four packages I mentioned in this post are the beginning of this series. Please keep an eye out, because in the end I will share with you the setup I use. Spoiler: it is not only one package.
Talk to you later.
Keep it up. Don't give up.