
Sam Roberts

CMS Sync between different machines

Currently I use a single laptop that I take between work and home, depending on where I'm working each day. Especially in the recent hot weather, carting it back and forth is becoming uncomfortable.

Ideally I'd like a machine at work and one at home, but syncing our CMS builds between machines each day is time-consuming and unreliable.

We of course use Git for all files, and luckily the CMS we build in (CraftCMS) uses config files that are pretty Git-friendly these days.

CMS uploads go to AWS S3 buckets, so I don't need to worry about syncing those; the issue lies with syncing the DB. Any text or asset fields need to be synced between machines so I can continue work each day.

I suppose I could manually take a DB backup at the end of each day and commit it to Git so it's ready for the next day, but this isn't foolproof: if I forget one day, the next day I need to redo a load of work to get back to where I was.
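For what it's worth, the end-of-day step I have in mind would only be a few lines if I scripted it. A rough sketch in Python, assuming a MySQL-backed Craft install with credentials in ~/.my.cnf; the repo path, dump location and DB name are made up, and you'd swap mysqldump for pg_dump on Postgres:

```python
#!/usr/bin/env python3
"""End-of-day snapshot: dump the CMS database and commit it to the repo.

Rough sketch only -- repo path, dump file and DB name are placeholders,
and it assumes mysqldump and git are on PATH with credentials in ~/.my.cnf.
"""
import subprocess
from pathlib import Path

REPO = Path.home() / "projects" / "my-site"    # hypothetical repo path
DUMP = REPO / "backups" / "db-snapshot.sql"    # file that gets committed


def main() -> None:
    DUMP.parent.mkdir(parents=True, exist_ok=True)

    # Dump the whole database to one SQL file; --skip-dump-date keeps the
    # diff quiet when nothing has actually changed.
    with DUMP.open("w") as out:
        subprocess.run(
            ["mysqldump", "--single-transaction", "--skip-dump-date", "craft_db"],
            stdout=out,
            check=True,
        )

    # Commit and push so the other machine can pull it in the morning.
    subprocess.run(["git", "-C", str(REPO), "add", str(DUMP)], check=True)
    commit = subprocess.run(
        ["git", "-C", str(REPO), "commit", "-m", "chore: end-of-day DB snapshot"]
    )
    if commit.returncode == 0:  # nothing committed means nothing to push
        subprocess.run(["git", "-C", str(REPO), "push"], check=True)


if __name__ == "__main__":
    main()
```

Run from cron or a scheduled task at the end of the day, forgetting wouldn't be a factor, but it still feels like a workaround rather than a proper sync.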

Does anyone else have this issue and any potential solutions?

Top comments (2)

JoelBonetR πŸ₯‡ • Edited

From your post I'm assuming that either you have the DB on physical hardware at the office, or the DB is online but with restricted access.

If it's the first one and you're the one making the decisions, it might be a good idea to host it online.
If it's the second one, simply whitelist your external home IP or, if you want/need to move around and connect from different places, set up an identity provider in between (between the server and the DB) instead.

Does that suit your needs? Sorry if I didn't catch the question properly.

Bad Request 400 • Edited

Hey, I was more or less in the same boat, though my motivation was a bit different. I needed it so I could continue coding when I'm home because I'm obsessed with my project (it's my baby), but it wasn't "needed" (more wanting than needing to).

In my case I have a couple of Postgres DBs which are clustered and running in containers.

I have a separate container which basically backs up my Postgres DB and saves it into its volume. This I can then upload, or have automatically committed and pushed to my GH repo.

So when I'm home and the itch to continue working on, let's say, the frontend or middleware comes up and I need to validate it with "real data", I can use the backup, which I can then load into my local Postgres instance.
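The "load it locally" step is nothing fancy. A rough Python sketch of what I mean, with placeholder paths and DB names, assuming the dumps land as gzipped plain SQL and createdb/dropdb/psql are on PATH:

```python
#!/usr/bin/env python3
"""Load the newest committed dump into the local Postgres instance.

Rough sketch -- backup directory and DB name are placeholders, and it
assumes gzipped plain-SQL dumps plus createdb/dropdb/psql on PATH.
"""
import gzip
import subprocess
from pathlib import Path

BACKUP_DIR = Path.home() / "projects" / "my-app" / "backups"  # hypothetical
LOCAL_DB = "myapp_dev"                                         # hypothetical


def main() -> None:
    # Pick the most recent dump that was pulled down via Git.
    latest = max(BACKUP_DIR.glob("*.sql.gz"), key=lambda p: p.stat().st_mtime)

    # Recreate the local DB so the restore starts from a clean slate.
    subprocess.run(["dropdb", "--if-exists", LOCAL_DB], check=True)
    subprocess.run(["createdb", LOCAL_DB], check=True)

    # Decompress and feed the SQL straight into psql (fine for smallish
    # dumps; stream it instead if the file is huge).
    with gzip.open(latest, "rt") as dump:
        sql = dump.read()
    subprocess.run(["psql", "-d", LOCAL_DB], input=sql, text=True, check=True)


if __name__ == "__main__":
    main()
```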

As you've said, it's a "sub-optimal" solution, but it's a solution that works for me. I'm also OK with having a week-old dataset to work with, so it really depends on your needs.

If something like that could work out, it could help you cut down the manual process quite a bit.

Just in case (fingers crossed) it's a Postgres DB, the image I'm using is:

hub.docker.com/r/prodrigestivill/p...

The far better solution, of course, would be if you could talk your company/boss into letting you VPN into the company network so you don't have to hold two datasets.

Especially since, speaking for myself here, my company gets really butt-hurt about endpoint security, so I had to fight really, really hard for them to say "yes, OK".

so long

EDIT: Just a quick addition. The data I'm exporting is metadata only, so from a security point of view you should think about what data you want to export and pump over the net.