[Part 2] Cutting Costs: Automating Backups
In the previous article, I talked about how an expiring CMS hosting subscription pushed me to embark on a cost-cutting journey. Transitioning from an expensive setup to a Raspberry Pi-based solution, I learned more about Docker, WordPress, ngrok, and the Raspberry Pi ecosystem. That article details the migration, emphasizing strategic cost reduction and maximizing project impact. Read more about it here.
This new setup promises resilience and flexibility for future endeavors. Initially, the plan was to run backups whenever the CMS was updated. After publishing the last article and doing a second manual backup, I realized how much friction those manual steps added, so I set off on a side quest to automate them.
The Outgoing Setup
For the record, I think Flywheel is a great hosting provider. The developer experience they offer has been one of the best I have worked with. Unfortunately, the ongoing expense became untenable for me personally.
The Manual Setup
Transitioning to a Raspberry Pi, I came up with a manual backup process for the CMS, involving the following steps:
- Exporting the database to a SQL file.
- Copying the contents of the CMS to an export folder.
- Compressing the folder by date and saving it to a USB flash drive from a separate device.
Detailed instructions for the first two steps can be found in a comprehensive YouTube tutorial. Given the new hosting setup, files had to be transferred either via Secure Shell (SSH) or a Graphical User Interface (GUI); a rough sketch of the SSH route follows.
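For the curious, the transfer-and-compress step looked roughly like the commands below when done over SSH. This is a sketch, assuming the Pi is reachable at pi@raspberrypi.local, the export lives in ~/cms-export, and the separate device is a Mac with the flash drive mounted under /Volumes/BACKUP_USB; all of those names are placeholders rather than my actual setup.

```bash
# Run from the separate device: pull the export folder off the Pi over SSH,
# zip it with today's date, and copy the archive to the flash drive.
# Hostname, paths, and volume name are placeholders.
scp -r pi@raspberrypi.local:~/cms-export ./cms-export
zip -r "cms-backup-$(date +%Y-%m-%d).zip" ./cms-export
cp "cms-backup-$(date +%Y-%m-%d).zip" /Volumes/BACKUP_USB/
```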
The Automated Setup
Upon researching methods to dump a MySQL database running in a Docker container via the CLI, I discovered that it is supported. Given my familiarity with zipping and copying files through the command line, I realized these tasks could be automated. Further exploration led me to the conclusion that Shell Scripts were the key to automation.
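As a rough illustration, dumping the database from outside the container can be done in a single command. The container name, user, and database below are placeholders rather than my actual configuration, and passing the password inline like this is exactly the kind of thing that triggers the exposure warning mentioned in the next section.

```bash
# Dump the WordPress database from a running MySQL container to a .sql file on the host.
# "mysql", "wordpress", and the password variable are placeholder names.
docker exec mysql mysqldump -u wordpress -p"$MYSQL_PASSWORD" wordpress > wordpress-backup.sql
```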
I am not an expert at writing Shell Scripts, but with the help of Codeium, I managed to develop a script to streamline the backup process.
Writing the Script
The initial script iteration focused on dumping the database, but a warning about potential password exposure prompted enhancements. With Codeium’s guidance, I refined the script to do the following (a sketch of the resulting script appears below):
- Check for existing backup files.
- Create a temporary directory for CMS content and database dumps with the current date.
- Dump the MySQL database into this directory.
- Copy the CMS content.
- Zip the directory with the current date label.
- Remove the temporary directory.
Though the script only took a few seconds to run, the lack of progress indicators prompted me to add print statements for clarity.
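Putting those steps together, the script ended up along the lines of the sketch below. The paths, container name, and credentials are placeholders, and it assumes the MySQL container already has a MYSQL_PASSWORD environment variable set (as in a typical docker-compose setup) so the password never has to appear in the host’s command history.

```bash
#!/bin/bash
# Sketch of the backup script; paths, container name, and credentials are placeholders.
set -e

BACKUP_ROOT="$HOME/backups"
CMS_DIR="$HOME/wordpress"
DATE=$(date +%Y-%m-%d)
TMP_DIR="$BACKUP_ROOT/cms-backup-$DATE"
ARCHIVE="$BACKUP_ROOT/cms-backup-$DATE.zip"

# 1. Check for an existing backup file for today.
if [ -f "$ARCHIVE" ]; then
  echo "Backup for $DATE already exists, exiting."
  exit 0
fi

# 2. Create a temporary directory labelled with the current date.
echo "Creating temporary directory $TMP_DIR"
mkdir -p "$TMP_DIR"

# 3. Dump the MySQL database into the temporary directory.
#    Assumes the container (here called "mysql") has MYSQL_PASSWORD set in its environment.
echo "Dumping database"
docker exec mysql sh -c 'mysqldump -u wordpress -p"$MYSQL_PASSWORD" wordpress' > "$TMP_DIR/wordpress.sql"

# 4. Copy the CMS content.
echo "Copying CMS files"
cp -r "$CMS_DIR" "$TMP_DIR/"

# 5. Zip the directory with the current date label.
echo "Creating archive $ARCHIVE"
cd "$BACKUP_ROOT"
zip -rq "cms-backup-$DATE.zip" "cms-backup-$DATE"

# 6. Remove the temporary directory.
echo "Cleaning up"
rm -rf "$TMP_DIR"

echo "Backup complete: $ARCHIVE"
```

The echo lines are the print statements mentioned above; they make it obvious which step the script is on when it is run by hand.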
Troubleshooting the USB
At this point, I had a zipped file waiting to be either pushed to the cloud or saved somewhere other than the Raspberry Pi itself. The initial plan was to push to GitHub, but the archives were too large for that. Then I considered Azure, but the feature I needed (Blob Storage) is only free for 12 months. That is when saving to a flash drive came to mind.
Issues arose when attempting to use the USB drive on both macOS and the Raspberry Pi. After formatting the drive to exFAT and configuring permissions, it became readable and writeable on both devices.
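For reference, the Raspberry Pi side of that fix looked roughly like the commands below. The device name and mount point are assumptions, and depending on the OS release the exFAT driver may need to be installed separately.

```bash
# Identify the flash drive (e.g. /dev/sda1); the device name here is an assumption.
lsblk

# Install exFAT support if the OS does not already ship it (package names vary by release).
sudo apt install exfat-fuse exfatprogs

# Mount the drive so the "pi" user can read and write it.
# exFAT has no Unix permissions, so ownership is set with mount options instead.
sudo mkdir -p /mnt/usb
sudo mount -t exfat -o uid=pi,gid=pi /dev/sda1 /mnt/usb
```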
Setting up a Cron Job
To automate the process, a cron job was necessary. As Codeium summarized for me, cron jobs let scheduled commands run automatically at set times. Configuring the backup to run at a specified interval ensures regular data protection.
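As an example, an entry like the one below (added with crontab -e on the Pi) would run the backup script every night at 2 a.m.; the script path is a placeholder.

```bash
# minute hour day-of-month month day-of-week  command
0 2 * * * /home/pi/scripts/backup.sh
```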
Finalizing the Script
After adding the required commands to copy the zipped file to the USB drive and setting up the cron job, testing showed that the automated cron runs were not saving any logs. This meant that if the automated job failed, there was no way of knowing whether it failed partway through or never ran at all. That is when I added a logger function that outputs logs to the terminal when the script is run manually and saves them to a log file that can then be referenced for troubleshooting or debugging.
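The logger ended up as a small helper along these lines; the log path is a placeholder, and tee -a is what sends each message to the terminal and appends it to the log file at the same time.

```bash
# Log to both the terminal and a log file so cron runs leave a trace.
LOG_FILE="/home/pi/logs/backup.log"

log() {
  # Prefix each message with a timestamp, print it, and append it to the log file.
  echo "$(date '+%Y-%m-%d %H:%M:%S') $1" | tee -a "$LOG_FILE"
}

log "Starting backup"
```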
Conclusion
This cost-cutting journey has been a rich learning experience, expanding my knowledge in diverse areas. From containerizing a CMS to operate on my Raspberry Pi within my local network to leveraging ngrok tunnels for frontend access to GraphQL endpoints, each step brought new insights.
However, my greatest satisfaction stems from overcoming minor challenges and investing effort in automating backups. This endeavor deepened my understanding of Linux permissions, cron jobs, shell scripting, and file logging, culminating in streamlined backups to a USB. These newfound skills not only bolstered efficiency but also empowered me to navigate future projects with confidence and agility.