The Ultimate Guide to Automating VPS Backups to Google Drive (WHM & CLI)
Learn how to automatically back up your Linux VPS or cPanel/WHM server directly to Google Drive. Complete guide covering rclone scripts, cron jobs, and database dumps.
Have you ever experienced that creeping anxiety when you realise you haven't backed up your server in three months? You are not alone.
Managing a Virtual Private Server (VPS) gives you ultimate freedom, but it also means you are completely responsible for your own disaster recovery. If your server crashes today, or you accidentally run a destructive command, your data is gone.
The good news? If you have a Google One subscription or a Google Workspace account, you likely have plenty of gigabytes (or terabytes) of Google Drive storage sitting idle.
In this guide, I'm going to show you how to automatically back up your entire VPS to Google Drive. We will cover two approaches:
- The WHM/cPanel Route: For those managing their servers via Web Host Manager.
- The Command Line (CLI) Route: For pure Linux servers using a powerful tool called rclone.
Let's dive in.
Method 1: The WHM Route (The Easy Way)
If your VPS runs WHM and cPanel, you are in luck. WHM has native support for Google Drive as a remote backup destination. It handles packaging your files, databases (with passwords intact), emails, and DNS records automatically.
Step 1: Configure the Backup in WHM
- Log into your WHM dashboard (usually https://your-server-ip:2087).
- Search for Backup Configuration in the left sidebar.
- Under Backup Status, check Enable Backups.
- Set the Backup Type to Uncompressed.
Why Uncompressed? Compressed backups save everything into opaque .tar.gz files. Uncompressed backups store your files as normal, browsable folders in Google Drive. You can open a PHP file or an image directly in your browser without needing to download and extract a massive archive. It uses more space, but Drive space is generally cheap.
- Under Scheduling and Retention, select how often you want this to run (e.g., check Monthly and set the day to the 1st).
- Under the Files and Databases sections, make sure you select everything. Crucially, set the Databases selection to Per Account and Entire Data Directory to ensure all users and passwords are saved.
Step 2: Create Google OAuth Credentials
WHM needs permission to talk to your Google Drive safely. To do this, you need to create an OAuth app securely in the Google Cloud Console. This is the part that trips most people up, so follow closely:
- Go to the Google Cloud Console and sign in.
- Create a new project and name it WHM Backup.
- Navigate to APIs & Services > Library, search for Google Drive API, and click Enable.
- Go to APIs & Services > OAuth consent screen.
- Choose External and fill in the required names.
- Crucial Step: Scroll down to Test users, click + Add Users, and add the exact email address of the Google account you want to save the backups to. If you skip this, Google will block the authorization later!
- Go to APIs & Services > Credentials.
- Click Create Credentials > OAuth Client ID.
- Set the Application type to Web application.
- Under Authorized redirect URIs, add your exact WHM callback URL. It will look something like this (replace with your actual server hostname or IP):
https://your-server-hostname:2087/googledriveauth/callback
- Click Create. Copy the Client ID and Client Secret that appear.
Step 3: Link WHM to Google Drive
- Back in your WHM Backup Configuration page, scroll down to Additional Destinations.
- Select Google Drive from the dropdown and click Create New Destination.
- Name it (e.g., "Monthly Google Drive Backup").
- Paste the Client ID and Client Secret from the previous step.
- Click Save and Validate Destination.
- A Google popup will appear. Sign in with the Google account you added as a test user and grant the requested permissions.
You should see a green success banner in WHM. Your automated backups are now fully configured!
Method 2: The Command Line Route (For Pure Linux VPS)
If you are running a plain Ubuntu or Debian VPS without cPanel, you can achieve the exact same result using a free, incredibly powerful command-line tool called rclone.
We are going to use the rclone sync command. Instead of re-uploading your entire server every month, sync only uploads new or changed files, and leaves identical files alone. This saves massive amounts of bandwidth and time.
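Before scheduling anything, it's worth previewing what a sync would actually do. Here's a minimal sketch: the remote name gdrive and the VPS-Backups folder are assumptions taken from the setup steps later in this guide, and the whole thing is guarded so it does nothing on a machine where that remote isn't configured yet.

```shell
# The remote name "gdrive" and the destination folder are assumptions; adjust to your setup.
SRC="/var/www"
DEST="gdrive:VPS-Backups/var/www"

# --dry-run lists what rclone WOULD transfer or delete, without uploading anything.
# The guard makes this a no-op until rclone is installed and "gdrive" is configured.
if command -v rclone >/dev/null 2>&1 && rclone listremotes 2>/dev/null | grep -q '^gdrive:$'; then
  rclone sync "$SRC" "$DEST" --dry-run
fi
```

Once the dry-run output looks right, drop --dry-run and the same command performs the real incremental sync.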
Step 1: Install and Configure rclone
- Connect to your VPS via SSH.
- Install rclone using their official script:
curl https://rclone.org/install.sh | sudo bash
- Configure your remote connection:
rclone config
- Press n for a new remote and name it gdrive.
- Select drive (Google Drive) from the list of storage options.
- Follow the prompts to authorise rclone via your browser.
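Once rclone config finishes, a couple of quick checks will confirm the remote actually works. This is a sketch assuming you named the remote gdrive as above; the guard makes it a no-op on machines where that remote isn't set up.

```shell
REMOTE="gdrive"   # the remote name chosen during `rclone config`

# Only run the checks if rclone exists and the remote is actually configured.
if command -v rclone >/dev/null 2>&1 && rclone listremotes 2>/dev/null | grep -q "^${REMOTE}:$"; then
  rclone lsd "${REMOTE}:"     # list the top-level folders in your Drive
  rclone about "${REMOTE}:"   # show used and free quota for the account
fi
```

If rclone lsd prints your Drive folders without prompting for anything, the OAuth token is stored and cron jobs will be able to use the remote unattended.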
Step 2: The "Backup Everything" Script
A common mistake when manually backing up a Linux web server is just copying the /var/www folder. That misses your databases, system configurations, SSL certificates, and cron jobs.
Here is a robust bash script that securely grabs everything you need to revive a server.
Create the script:
sudo nano /usr/local/bin/vps-backup.sh
Paste the following code (make sure to replace the EMAIL variable with your own email address):
#!/bin/bash
# ── Config ─────────────
GDRIVE_DEST="gdrive:VPS-Backups"
EMAIL="your-email@example.com"
LOG="/var/log/vps-backup.log"
DATE=$(date +%Y-%m-%d)
ERRORS=0
echo "===== Backup started: $DATE =====" >> $LOG
# 1. Website & App Files
rclone sync /var/www "$GDRIVE_DEST/var/www" --log-file=$LOG
[ $? -ne 0 ] && ERRORS=$((ERRORS+1))
# 2. User Home Folders
rclone sync /home "$GDRIVE_DEST/home" --log-file=$LOG
[ $? -ne 0 ] && ERRORS=$((ERRORS+1))
# 3. System Config Files
rclone sync /etc "$GDRIVE_DEST/etc" --log-file=$LOG
[ $? -ne 0 ] && ERRORS=$((ERRORS+1))
# 4. Dump MySQL/MariaDB Databases (one .sql file per DB;
#    assumes root access via unix_socket auth or credentials in ~/.my.cnf)
mkdir -p /tmp/vps-backup-mysql
for DB in $(mysql -e "SHOW DATABASES;" | grep -Ev "^(Database|information_schema|performance_schema|sys)$"); do
mysqldump "$DB" > "/tmp/vps-backup-mysql/$DB.sql"
done
rclone sync /tmp/vps-backup-mysql "$GDRIVE_DEST/databases/mysql" --log-file=$LOG
[ $? -ne 0 ] && ERRORS=$((ERRORS+1))
rm -rf /tmp/vps-backup-mysql
# 5. Dump PostgreSQL Databases (if applicable)
mkdir -p /tmp/vps-backup-pgsql
for DB in $(sudo -u postgres psql -t -c "SELECT datname FROM pg_database WHERE datistemplate = false;"); do
sudo -u postgres pg_dump "$DB" > "/tmp/vps-backup-pgsql/$DB.sql"
done
rclone sync /tmp/vps-backup-pgsql "$GDRIVE_DEST/databases/postgresql" --log-file=$LOG
[ $? -ne 0 ] && ERRORS=$((ERRORS+1))
rm -rf /tmp/vps-backup-pgsql
# ── Email Notification ──
if [ $ERRORS -eq 0 ]; then
STATUS="✅ SUCCESS"
MSG="Your VPS backup completed successfully on $DATE. Check your Google Drive."
else
STATUS="⚠️ ERRORS OCCURRED"
MSG="Backup finished on $DATE but errors occurred. Check /var/log/vps-backup.log."
fi
# Send email
echo "$MSG" | mail -s "VPS Backup $STATUS" "$EMAIL"
Make the script executable:
sudo chmod +x /usr/local/bin/vps-backup.sh
(Note: To receive the email notification cleanly, ensure your server has a mail utility installed. On Ubuntu/Debian, you can install it via: sudo apt install mailutils)
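The database loop in that script hinges on one grep -Ev filter that drops the SHOW DATABASES header row and MySQL's system schemas. Here's a standalone sketch of that filter running against simulated output (wordpress and shop are made-up example names):

```shell
# Simulated `mysql -e "SHOW DATABASES;"` output, piped through the same
# filter the backup script uses. Only user-created databases survive.
printf '%s\n' Database information_schema wordpress performance_schema shop sys |
  grep -Ev "^(Database|information_schema|performance_schema|sys)$"
```

Only wordpress and shop survive the filter. Note that the built-in mysql schema also passes it, so its dump (which holds user accounts and grants) ends up in your backups as well.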
Step 3: Schedule with Cron
Finally, tell your server to run this script automatically in the background. If you want it to run on the 1st of every month at 2:00 AM, open your crontab editor:
crontab -e
Add this line to the bottom of the file (the five fields mean minute, hour, day-of-month, month, and day-of-week, so this entry reads "at 02:00 on day 1 of every month"):
0 2 1 * * /usr/local/bin/vps-backup.sh
The Peace of Mind
Whether you construct an automated pipeline via WHM's GUI or use a custom rclone script, the end result is the same: your Google Drive will now contain a perfectly browsable, up-to-date snapshot of your entire server.
If disaster strikes, or if you simply decide to migrate to a newer, faster hosting provider, you have everything you need to recreate your environment from scratch securely tucked away in the cloud. No downloads, no extraction, no panic.
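And if you ever do need those files back, the same tooling runs in reverse. A minimal restore sketch, assuming the gdrive remote and folder layout used above; the wordpress database name is purely illustrative, and the guard keeps it inert on machines where the remote doesn't exist.

```shell
RESTORE_SRC="gdrive:VPS-Backups"   # the remote layout created by the backup script

if command -v rclone >/dev/null 2>&1 && rclone listremotes 2>/dev/null | grep -q '^gdrive:$'; then
  # Pull website files back down (copy, not sync, so nothing local is deleted).
  rclone copy "$RESTORE_SRC/var/www" /var/www
  # Fetch one database dump and load it ("wordpress" is an example name;
  # create the database first if it does not exist yet).
  rclone copy "$RESTORE_SRC/databases/mysql/wordpress.sql" /tmp/
  mysql wordpress < /tmp/wordpress.sql
fi
```

rclone copy is used instead of sync for restores, so nothing already sitting on the local disk gets deleted by accident.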