Backup Files from Ubuntu to Azure

Hey there, how are you doing? I have been busy setting up a VPS (not in Azure) with PostgreSQL and wanted a recovery mechanism in place for it. Because the VPS is not in Azure, I have no built-in option to take a snapshot or backup of the server. So, I decided to write a simple script that takes backups of files and uploads them to Azure Blob Storage using the Azure CLI. If you want to use something other than Azure, feel free to do so. I already use Azure on my account, and the charges are low. You can check the pricing here.

Alright then, let’s start by taking a backup of the database using the pg_dump command:

#$USERNAME is the admin username that you have set
#$DATABASE_NAME is the Database name
#$BACKUP_FILE is the path where the file will be dumped
pg_dump -U "$USERNAME" -d "$DATABASE_NAME" > "$BACKUP_FILE"
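
For completeness, a restore is just the reverse operation. A minimal sketch, assuming the target database already exists and is empty:

# Restore a plain-format dump into an existing (ideally empty) database
psql -U "$USERNAME" -d "$DATABASE_NAME" < "$BACKUP_FILE"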

Now that we have a way to export a dump of the database, let’s identify the Access Key for the Storage Account that holds the Container where I want to store the files. It can be found under Access keys in the Security + networking section of the Storage Account. If you want to learn more about Azure Blob Storage, please visit the documentation.

Here, either the key1 or key2 value can be used.
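
Once the Azure CLI is installed (next step) and you are logged in with az login, you can also fetch the key from the terminal instead of the portal. A quick sketch, where mybackup and my-resource-group are placeholders for your own Storage Account and resource group:

# Print the first access key of the storage account
az storage account keys list --account-name mybackup --resource-group my-resource-group --query "[0].value" --output tsv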

Next, to upload the files, we need to install the Azure CLI. Let’s execute this command to install it:

curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
az --version

A successful installation would result in az --version giving the version info.
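
Before wiring everything up, it is worth making sure the target container exists. A short sketch, assuming the placeholder account name, key, and container name used in the script below; az storage container create is safe to run even if the container is already there:

# Create the container if it does not exist yet
az storage container create --name backups --account-name mybackup --account-key STORAGE_ACCESS_KEY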

Now that the dependencies are sorted out, we can look into the backup script. Note that this script can be modified to back up any files. Save it as backup_script.sh in the /home/vps_backup/scripts directory.

#!/bin/bash

# Set the password for PostgreSQL user
export PGPASSWORD='YOUR_PASSWORD'

# Replace these values with your actual Azure Blob Storage account details
AZURE_STORAGE_ACCOUNT="mybackup" # storage account names are all lowercase
AZURE_STORAGE_ACCESS_KEY="STORAGE_ACCESS_KEY"
CONTAINER_NAME="backups"

# Define variables for the backup
USERNAME=postgres
DATABASE_NAME=mydatabase
# Get the current date and time
today=$(date +"%Y-%m-%d_%H-%M-%S")
todayDate=$(date +"%Y-%m-%d")

# Set the filenames to today's date.
BACKUP_FILE=/home/vps_backup/backups/backup_$today.sql
BACKUP_ZIPFILE=/home/vps_backup/backups/backup_$today.tar.gz

# Perform the backup using pg_dump (create the backup directory first if needed)
mkdir -p /home/vps_backup/backups
pg_dump -U "$USERNAME" -d "$DATABASE_NAME" > "$BACKUP_FILE"

# Unset the password to clear it from the environment
unset PGPASSWORD

# Generate a compressed file using tar; -C switches into the backup
# directory so the archive doesn't store the full absolute path
tar -czvf "$BACKUP_ZIPFILE" -C /home/vps_backup/backups "backup_$today.sql"

# Upload the backup files to Azure Blob Storage using Azure CLI
# Prefixing the blob name with $todayDate groups the uploads into a
# virtual directory per date inside the container
az storage blob upload --account-name "$AZURE_STORAGE_ACCOUNT" --account-key "$AZURE_STORAGE_ACCESS_KEY" --container-name "$CONTAINER_NAME" --file "$BACKUP_ZIPFILE" --name "$todayDate/backup_$today.tar.gz"
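
Make the script executable and give it a manual dry run before handing it over to cron (paths as assumed above):

chmod +x /home/vps_backup/scripts/backup_script.sh
/home/vps_backup/scripts/backup_script.sh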

The final step is to set up a cron job so that our script gets executed every hour. Open the crontab with crontab -e and add this line:

0 * * * * /home/vps_backup/scripts/backup_script.sh
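
Cron runs silently, so failures are easy to miss. One simple option is to redirect the script’s output to a log file; a sketch, assuming a /home/vps_backup/logs directory exists (the log path is my own choice, not part of the setup above):

0 * * * * /home/vps_backup/scripts/backup_script.sh >> /home/vps_backup/logs/backup.log 2>&1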

Although I have used the Azure CLI to upload to Azure, you can use any storage destination for your files. Likewise, the script is not limited to PostgreSQL: it can ship a dump from MongoDB, MySQL, SQL Server, etc., or any other file.

Here is a snapshot of the Azure Storage Browser showing the Container with a specific date directory:
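
If you prefer the terminal over the portal, you can verify the uploads with a blob listing as well. A sketch using the same placeholder account details and today’s date as the prefix:

# List today's uploads in the backups container
az storage blob list --account-name mybackup --account-key STORAGE_ACCESS_KEY --container-name backups --prefix "$(date +%Y-%m-%d)/" --output table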