Until recently, I had a very simple git workflow: I worked in a local repository, and then pushed my changes to a single remote, which lived at GitHub. In the case of the code for this site, pushing to GitHub would automatically trigger a rebuild of the website and publish the changes live to the Internet. (Thanks, Netlify!)
This setup had the virtue of simplicity, but it had three important drawbacks:
- I had no backups of any changes I'd made but not published. For example, if I had a draft blog post, it lived only on my laptop, and thus could easily be lost. This isn't that big of a deal with my current workflow—unlike some bloggers, I don't tend to keep a ton of posts as drafts. Still, though, I'd like to not lose what I do have, and I might keep more drafts in the future.
- I was relying heavily on GitHub to publish updates to my site. If GitHub were down or locked me out of my account or something, I would have no easy way to make posts to my blog. I could migrate to a different git host or manually update Netlify, but it wouldn't be seamless.
- I had only one backup of the site, and that was on GitHub. If anything happened to my computer, I'd be 100% reliant on GitHub for all of my site content.
So, over the past week, I decided to fix all of these issues. With my new setup, I have:
- A git server on a Raspberry Pi on my home LAN.
- A git server on a cheap shared host.
- A GitHub-hosted repository.
- A GitLab-hosted repository.
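
A setup like this can be wired up with one named remote per host. Here's a minimal sketch in a throwaway repository—the hostnames, usernames, and repository paths below are hypothetical stand-ins, not my actual servers:

```shell
set -e

# Work in a scratch repository so this is safe to run anywhere.
repo=$(mktemp -d)
cd "$repo"
git init -q

# One remote per host (names and URLs are illustrative placeholders),
# so each can be pushed to or fetched from independently.
git remote add pi     pi@raspberrypi.local:site.git
git remote add shared user@sharedhost.example:site.git
git remote add github git@github.com:user/site.git
git remote add gitlab git@gitlab.com:user/site.git

# List the configured remotes.
git remote
```

With separate remotes, `git push pi` or `git push github` targets one host at a time; a single `git push --all` to each remote keeps every copy current even if one host is unreachable.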