---
title: >-
  Self Hosting: Backup with Rsync
date: 2025-04-24
tags:
  - Tech
  - Linux
coverImage: /img/content/rsync-backups.png
description: >-
  .tar.gz or rsync? Cronjobs, writing to a log and notifications.
---

Setting up a server can be so exciting! But it's only truly useful if, in the case of a drive failure, your important data isn't lost. Personally, I have a hard drive dedicated to important personal data, the root drive with docker data among other things, and finally **an external drive ideal for backups**. Let's go through how I've set up nightly backups to that very external drive.

## First, some goals

1. An easy-to-call script that we can point at a directory to back up
2. Write a log of backup results
3. Set up a cron job to run backups automatically

Note: _Much of this is applicable to personal devices, but I've written this from the perspective of backing up a **Linux server**._

## Script

### Thinking through it

Note: _The full script is in the next section._

First we'll create a new shell script and mark it as executable.

```shell
echo '#!/bin/bash' > backup.sh
chmod +x backup.sh
```

#### Logging

Next it makes sense to think about logging. This will help us verify the script is running as expected, much like test-driven development practices teach.
`backup.sh`

```shell
backupPath=$1            # first argument: the directory to back up
backupName=${2:-manual}  # second argument: the name of the backup, or "manual" if not specified
currentDate=$(date +"%m_%d_%Y")  # current date as month, day and full year
currentTime=$(date +"%T")
backupLocation="/where/your/backups/go"  # for example, I mount my external drive at /disks/external
logFile="/var/log/server-backups.log"    # can be anywhere, any plain-text filetype - a .txt would do as well

echo "$currentDate $currentTime Backing up $backupName data from $backupPath" >> "$logFile"
```

Now when we run the script it should write the current date, time, backup name and the directory being backed up to a log file. We can run the script and check the log like so:

```shell
./backup.sh directory-to-backup example-backup
tail /var/log/server-backups.log
```

`tail` will print the final few lines of the file specified. If nothing appears, verify the log file exists and your user has permission to write to it. (If you follow the log live with `tail -f`, press `Ctrl+C` to stop.)

#### The backup itself

Depending on your needs you'll want to decide how to run your backup. One option is compressing your backup into an archive such as a `.tgz`. That's easy enough with a command like `tar -zcf output.tgz directory-to-backup` - try it sometime, it's cool! However, I've found it can be difficult to verify backups are working when they're hidden away inside a compressed file...

Instead, I opt for a folder-syncing tool like `rsync`, which comes pre-installed on many Linux distributions. It compares two directories and copies any changes from one to the other. If you're careful, you can also tell it to delete files from the backup that you've intentionally removed from the source directory.

**Tip**: I strongly suggest running rsync with the `--dry-run` parameter to ensure it's doing exactly what you expect and not the opposite.
Lastly, I'll suggest a number of default Linux system folders you _likely_ don't want to back up, in case you wish to back up your root drive `/` like I do.

`backup.sh`

```shell
cd "$backupLocation" || exit 1  # bail out if the backup drive isn't mounted
mkdir -p "$backupName"

# remember to remove --dry-run when you're ready to actually back up your files for real
RESULT=$(rsync -a --stats --human-readable --delete --dry-run \
  --exclude=/snap \
  --exclude=/dev \
  --exclude=/mnt \
  --exclude=/proc \
  --exclude=/sys \
  --exclude=/tmp \
  --exclude=/var/tmp \
  --exclude=/media \
  --exclude=/disks \
  --exclude=/usr/lib \
  --exclude=/usr/src \
  --exclude=/lost+found \
  "$backupPath" "$backupName")
EXIT_CODE=$?

if [[ $EXIT_CODE -eq 0 ]]; then
  echo -e "$currentDate $currentTime Backup results for $backupName data:\n\n$RESULT" >> "$logFile"
  echo "$currentDate $currentTime Backed up $backupName data from $backupPath" >> "$logFile"
else
  echo "$currentDate $currentTime Error backing up $backupName in $backupPath! rsync error code: $EXIT_CODE" >> "$logFile"
fi
```

I recommend making some test folders with example files in them you don't mind losing while testing this script. You can never be too careful - measure twice, cut once! Once again, run your script and tail the results.
### The full script

`backup.sh`

```shell
#!/bin/bash
backupPath=$1            # first argument: the directory to back up
backupName=${2:-manual}  # second argument: the name of the backup, or "manual" if not specified
currentDate=$(date +"%m_%d_%Y")  # current date as month, day and full year
currentTime=$(date +"%T")
backupLocation="/where/your/backups/go"  # for example, I mount my external drive at /disks/external
logFile="/var/log/server-backups.log"    # can be anywhere, any plain-text filetype - a .txt would do as well

echo "$currentDate $currentTime Backing up $backupName data from $backupPath" >> "$logFile"

cd "$backupLocation" || exit 1  # bail out if the backup drive isn't mounted
mkdir -p "$backupName"

# remember to remove --dry-run when you're ready to actually back up your files for real
RESULT=$(rsync -a --stats --human-readable --delete --dry-run \
  --exclude=/snap \
  --exclude=/dev \
  --exclude=/mnt \
  --exclude=/proc \
  --exclude=/sys \
  --exclude=/tmp \
  --exclude=/var/tmp \
  --exclude=/media \
  --exclude=/disks \
  --exclude=/usr/lib \
  --exclude=/usr/src \
  --exclude=/lost+found \
  "$backupPath" "$backupName")
EXIT_CODE=$?

if [[ $EXIT_CODE -eq 0 ]]; then
  echo -e "$currentDate $currentTime Backup results for $backupName data:\n\n$RESULT" >> "$logFile"
  echo "$currentDate $currentTime Backed up $backupName data from $backupPath" >> "$logFile"
else
  echo "$currentDate $currentTime Error backing up $backupName in $backupPath! rsync error code: $EXIT_CODE" >> "$logFile"
fi
```

### Adding as a cron job

For this you'll need super user access. Note, the command below will open the file in your default editor, often `vim`.

```shell
sudo crontab -e
```

Most cron files have helpful comment text by default. Give it a read, it's quite useful. Determine how [often you wish to run the script in crontab time](https://crontab.guru/), then write a command similar to the one you've been testing with. **Try with dry-run enabled before trusting your cron job not to harm your precious files!** Below are some example backups you can run; customize them for your own situation.

`crontab`

```shell
@daily /path/to/my/backup.sh /disks/my-awesome-files my-awesome-backedup-files
@weekly /path/to/my/backup.sh / system
```

Save your crontab changes and you're done.
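If the `@daily` and `@weekly` shorthands don't fit your schedule, the standard five-field syntax works too. As a sketch (reusing the same hypothetical paths), these entries would run the files backup at 02:30 every night and the full-system backup at 03:00 on Sundays:

```shell
# m  h  dom mon dow  command
30   2  *   *   *    /path/to/my/backup.sh /disks/my-awesome-files my-awesome-backedup-files
0    3  *   *   0    /path/to/my/backup.sh / system
```

Staggering the times like this keeps the two jobs from competing for the same disk.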
Keep checking your logs, test, test, test, and be happy you're now running backups like a pro.