
Postgres locally and in CI

It all started with a little Node.js library called pg-mem. It promised an in-memory Postgres database that would be a great lightweight option for testing in CI. It was advertised as experimental, but I figured it would be worth it just for its sheer convenience.
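For context, here's roughly what the pg-mem happy path looks like. This is a minimal sketch; the table and queries are just for illustration:

```js
const { newDb } = require('pg-mem');

// Create an in-memory database and a throwaway schema.
const db = newDb();
db.public.none(`CREATE TABLE users (id serial PRIMARY KEY, name text)`);

// pg-mem can hand back drop-in replacements for pg's Pool/Client,
// so application code doesn't need to know it isn't a real server.
const { Pool } = db.adapters.createPg();
const pool = new Pool();

(async () => {
  await pool.query(`INSERT INTO users (name) VALUES ('ada')`);
  const { rows } = await pool.query(`SELECT * FROM users`);
  console.log(rows); // [ { id: 1, name: 'ada' } ]
})();
```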

Getting it up and running was easy, but I did run into two small issues, certainly not deal breakers.

  1. Some queries return slightly incorrect results (the primary key of joined data comes back a bit mangled when using GROUP BY clauses).

  2. Not every function available in Postgres is implemented, so depending on your queries you may have to implement some yourself (e.g. jsonb_build_object(...)); see the sketch after this list.
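For that second issue, pg-mem exposes a registerFunction hook for plugging the gaps. Here's a rough sketch of a simplified jsonb_build_object stand-in; the variadic signature and DataType handling may need adjusting for your pg-mem version:

```js
const { newDb, DataType } = require('pg-mem');

const db = newDb();

// Register a simplified stand-in for the missing builtin.
db.public.registerFunction({
  name: 'jsonb_build_object',
  argsVariadic: DataType.text, // the real builtin accepts any type; text keeps the sketch simple
  returns: DataType.jsonb,
  implementation: (...args) => {
    // Pair up alternating key/value arguments, like the real function does.
    const obj = {};
    for (let i = 0; i < args.length; i += 2) {
      obj[args[i]] = args[i + 1];
    }
    return obj;
  },
});
```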

A quick pop onto their GitHub page revealed at least a few pages of unresolved issues. In any case, tests were running smoothly both locally and in CI, so I was happy.

Sadly, as queries became more complex, pg-mem became less able to cope, and over time it became clear that we had... outgrown each other. To be clear, this isn't a knock on pg-mem, not even a bit. Quite the opposite: it's a great library, and I even attempted to fix the bugs I encountered. Finding enough time to work for free is just the reality of open source development, and I have no doubt that with more resources it could truly realize its potential. The allure of an in-memory Postgres database for testing remains strong, but I needed proper query execution, and so along came Docker.

Docker is an easy technology to work with. It's well documented, ubiquitous, and there are libraries for it in almost every programming language you can imagine. The only real stumbling block was realizing that a volume needs to be mounted using an absolute path on a Windows system. Once ChatGPT helped me with that, we had a running database in a Docker container. I refactored the tests that relied on weird pg-mem results, and we were successfully back to square one. Integrating the Node.js library dockerode to start and stop the container from Jest's global setup/teardown functions further automated the solution.
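Here's the shape of that setup. It's a sketch rather than the exact code: the image name, credentials, and port are placeholders, and in practice you'd also want to wait for Postgres to accept connections before the tests start.

```js
// jest.global-setup.js
const Docker = require('dockerode');

module.exports = async function globalSetup() {
  const docker = new Docker(); // default connection; Windows may need the named pipe
  const container = await docker.createContainer({
    Image: 'postgres:16', // assumes the image has already been pulled
    Env: ['POSTGRES_PASSWORD=test', 'POSTGRES_DB=test'],
    HostConfig: { PortBindings: { '5432/tcp': [{ HostPort: '5433' }] } },
  });
  await container.start();

  // globalSetup and globalTeardown run in the same process,
  // so the container handle can be passed along on globalThis.
  globalThis.__PG_CONTAINER__ = container;
};
```

```js
// jest.global-teardown.js
module.exports = async function globalTeardown() {
  const container = globalThis.__PG_CONTAINER__;
  if (container) {
    await container.stop();
    await container.remove();
  }
};
```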

One weird issue worth mentioning: sometimes when Jest exited it would leave a nasty error in my terminal, and the Docker container wouldn't spin down. After some research, the issue turned out to be the result of not releasing all the client connections back to the database pool. Ensuring client.release() is called in the finally block of the try wrapping each client.query(...) resolved the issue completely.
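In other words, every query path needs to look something like this (a minimal sketch; the pool config is illustrative):

```js
const { Pool } = require('pg');

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function runQuery(text, params) {
  const client = await pool.connect();
  try {
    return await client.query(text, params);
  } finally {
    // Release even when the query throws; an unreleased client keeps the
    // pool's connection open, which is what stopped Jest from exiting cleanly.
    client.release();
  }
}
```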

At this point I was fairly impressed by just how well this worked, thinking little about how CI would dash all my hopes and dreams. After naively assuming dockerode would just work in GitLab CI, I had to go completely back to the drawing board. After some research, Postgres turned out to be pretty simple to set up as a service in the .gitlab-ci.yml file. Unfortunately, that approach doesn't support seeding the database with a schema in any capacity, and apparently I'm not the only person to bemoan this lack of functionality. My local solution mounted my schema file at docker-entrypoint-initdb.d. After some thought, I decided to refactor the logic in Jest's globalSetup function to simply start the container, and to use the pg library directly to insert the schema in the beforeAll method of my test suite. Importantly, the code to start the container only runs if the CI env var injected by GitLab is not present.
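Roughly, the split looks like this. Again a sketch: DATABASE_URL and the schema path are illustrative, and the container-start code stands in for the dockerode setup shown earlier.

```js
// jest.global-setup.js -- only manage the container outside CI
module.exports = async function globalSetup() {
  // GitLab injects CI="true" into every job; when it's present, the
  // postgres service is already running, so there's nothing to start.
  if (process.env.CI) return;

  // ...start the Docker container with dockerode, as shown earlier...
};
```

```js
// In the test suite itself:
const fs = require('fs');
const { Pool } = require('pg');

let pool;

beforeAll(async () => {
  pool = new Pool({ connectionString: process.env.DATABASE_URL });
  // Seed the schema with pg directly; this works both against the
  // local container and against GitLab's postgres service.
  const schema = fs.readFileSync('./schema.sql', 'utf8');
  await pool.query(schema);
});

afterAll(async () => {
  await pool.end();
});
```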

After separating container logic from SQL inserts, we finally had our holy grail: tests ran both locally and in CI. It wasn't particularly easy to accomplish, and documentation on this entire effort was oddly lacking. Hopefully this post will make it easier for others to achieve similar outcomes in the future.

Cheers 🍻!
