Docker volumes are an essential part of containerized architectures, commonly used to persist data across container lifecycles. To safeguard against data loss, it's vital to back up these volumes regularly. This article provides a shell script that automates daily backups of Docker volumes, uploads them to AWS S3, and cleans up old backups.
Prerequisites
- Docker installed and running.
- AWS Command Line Interface (CLI) installed and configured with appropriate permissions.
- The `jq` tool installed to process JSON output (used below to parse AWS CLI responses when pruning old S3 backups; it is also handy for parsing Docker command output).
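If you want to confirm the CLI setup before wiring up any automation, a quick sanity check might look like the following (the bucket name is a placeholder):

```bash
# Verify that credentials are picked up and that the target bucket is reachable.
aws sts get-caller-identity
aws s3 ls s3://your-backup-bucket
```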
Script Overview
The script will do the following:
- Loop over each Docker volume.
- Create a backup of each volume with `docker cp`, via a temporary container that mounts the volume (since `docker cp` operates on containers, not volumes directly).
- Compress the backup.
- Upload the compressed backup to an S3 bucket.
- Remove local backups older than 30 days.
- Remove S3 backups older than 30 days.
The Script
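Below is a sketch of such a script. It assumes a Linux host with GNU `date`; the staging directory, bucket name, key prefix, and retention period are placeholders to adapt. Each volume is attached to a throwaway container so `docker cp` can copy its contents out; the copy is then compressed, uploaded with the AWS CLI, and backups older than 30 days are pruned both locally and in S3 (using `aws s3api` and `jq` on the S3 side).

```bash
#!/usr/bin/env bash
set -euo pipefail

BACKUP_DIR="/var/backups/docker-volumes"   # local staging directory (placeholder)
S3_BUCKET="your-backup-bucket"             # S3 bucket name (placeholder)
S3_PREFIX="docker-volumes"                 # key prefix inside the bucket
RETENTION_DAYS=30
DATE=$(date +%F)

mkdir -p "$BACKUP_DIR"

# Loop over each Docker volume.
for volume in $(docker volume ls -q); do
    echo "Backing up volume: $volume"

    # Attach the volume to a throwaway container so docker cp can reach its contents.
    container_id=$(docker container create -v "$volume":/volume alpine)
    docker cp "$container_id":/volume "$BACKUP_DIR/${volume}_${DATE}"
    docker rm "$container_id" > /dev/null

    # Compress the backup and remove the uncompressed copy.
    tar -czf "$BACKUP_DIR/${volume}_${DATE}.tar.gz" -C "$BACKUP_DIR" "${volume}_${DATE}"
    rm -rf "$BACKUP_DIR/${volume}_${DATE}"

    # Upload the compressed backup to S3.
    aws s3 cp "$BACKUP_DIR/${volume}_${DATE}.tar.gz" \
        "s3://${S3_BUCKET}/${S3_PREFIX}/${volume}_${DATE}.tar.gz"
done

# Remove local backups older than RETENTION_DAYS days.
find "$BACKUP_DIR" -name '*.tar.gz' -mtime +"$RETENTION_DAYS" -delete

# Remove S3 backups older than RETENTION_DAYS days.
cutoff=$(date -d "-${RETENTION_DAYS} days" +%s)
aws s3api list-objects-v2 --bucket "$S3_BUCKET" --prefix "$S3_PREFIX/" --output json |
    jq -r '.Contents[]? | "\(.Key)\t\(.LastModified)"' |
    while IFS=$'\t' read -r key modified; do
        if [ "$(date -d "$modified" +%s)" -lt "$cutoff" ]; then
            echo "Deleting old S3 backup: $key"
            aws s3 rm "s3://${S3_BUCKET}/${key}"
        fi
    done
```

Depending on the workload, you may want to stop or pause containers that write to a volume while it is being copied, so the backup is internally consistent.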
Execution
- Save the script to a file, for example `docker_vol_backup.sh`.
- Make the script executable:

```bash
chmod +x docker_vol_backup.sh
```
- Schedule the script to run daily using cron:
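For example, an entry added via `crontab -e` (the script path and logfile location are placeholders):

```
0 2 * * * /path/to/docker_vol_backup.sh >> /var/log/docker_vol_backup.log 2>&1
```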
This cron configuration will run the script every day at 2 AM and log the output to a specified logfile.
Conclusion
Automating Docker volume backups protects against data loss and minimizes manual intervention. By leveraging AWS S3, data can be stored in a scalable, secure, and accessible environment. Periodically cleaning up old backups, both locally and in S3, helps manage storage costs and prevent unnecessary clutter. Always test backup and restore procedures in a safe environment before relying on them for production data.
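As a starting point for such a test, here is a minimal restore sketch that assumes the archive layout produced by the script above; the volume name, archive name, and paths are placeholders:

```bash
# Recreate a volume from a compressed backup (names and paths are placeholders).
docker volume create my_volume
docker run --rm \
    -v my_volume:/volume \
    -v /var/backups/docker-volumes:/backup:ro \
    alpine sh -c "tar -xzf /backup/my_volume_2024-01-01.tar.gz -C /volume --strip-components=1"
```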