Automated Encrypted Backups from UNAS Pro to Azure Using Rclone

If you are running a UniFi UNAS Pro as your primary NAS, you have probably noticed that native cloud backup options are limited — and Azure Blob Storage is not supported out of the box. If you have used a Synology before, you may have relied on Hyper Backup for offsite backups. On the UNAS Pro, we need to take a different approach.

In this post we will walk through how to set up rclone in a Proxmox LXC container to automatically back up your UNAS Pro to Azure Blob Storage, with client-side encryption so your data is protected before it ever leaves your network.

Why rclone?
rclone is an open-source tool that supports dozens of cloud storage providers, including Azure Blob Storage. What makes it a good fit for this use case:

  • Client-side encryption — data is encrypted on your end before it is uploaded. Azure only ever sees encrypted blobs.
  • Incremental sync — only new or changed files are transferred after the initial backup.
  • SFTP support — rclone connects to your UNAS Pro over SSH, so nothing needs to be installed on the NAS itself.
  • Lightweight — runs comfortably in a small Proxmox LXC container.

The architecture looks like this:

UNAS Pro → (SSH/SFTP) → rclone (Proxmox LXC) → (encrypted) → Azure Blob Storage

The rclone container does all the work. It connects to the UNAS over SSH/SFTP, encrypts files on the fly, and uploads them to Azure. Nothing is installed on the NAS, so firmware updates will not break anything.

Prerequisites
  • Proxmox host with an existing LXC container — in this setup we reused an existing container named backup running Debian 13 (2 cores, 1GB RAM, 8GB disk, 512MB swap), so nothing new had to be created.
  • Azure Storage Account set up and ready — this is the main thing to have in place before starting. rclone will automatically create the Blob container named backup on first sync.
  • Azure Storage Account name and access key
  • SSH enabled on the UNAS Pro (Settings → Control Plane → Console → SSH)

Step 1: Prepare the Container and Azure Storage Account
Create a new LXC container in Proxmox or use an existing one. The main thing to have in place on the Azure side is the Storage Account itself. rclone will automatically create a Blob container called backup inside it on the first sync.
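
If you do need a fresh container, something along these lines from the Proxmox host shell works — note that the container ID, template filename, and storage names below are examples and will differ on your host; the resources match the container used in this setup:

# list/download templates with pveam if you do not have one yet, then:
pct create 200 local:vztmpl/debian-13-standard_13.x_amd64.tar.zst \
  --hostname backup --cores 2 --memory 1024 --swap 512 \
  --rootfs local-lvm:8 --unprivileged 1 \
  --net0 name=eth0,bridge=vmbr0,ip=dhcp
pct start 200
pct enter 200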

Once inside the container, run a full update before proceeding:

apt-get update && apt-get upgrade -y && apt-get dist-upgrade -y && apt-get autoremove -y

By default, the SSH configuration in Debian LXC templates does not allow root to log in with a password. To change that:

sed -i 's/#PermitRootLogin prohibit-password/PermitRootLogin yes/' /etc/ssh/sshd_config
sed -i 's/#PasswordAuthentication yes/PasswordAuthentication yes/' /etc/ssh/sshd_config
systemctl restart ssh

Step 2: Install rclone
apt-get install -y curl unzip
curl https://rclone.org/install.sh | bash
rclone version

Step 3: Enable SSH on the UNAS Pro
In the UniFi UI go to Settings → Control Plane → Console → SSH and enable it. Note your root credentials — you will need them for the rclone SFTP configuration.

Test the connection from the rclone container first:

ssh root@<your-unas-ip>

If you can log in, you are ready to configure rclone.

Step 4: Configure rclone
Run the interactive config:

rclone config

You will set up three remotes:

Remote 1 — Azure Blob Storage
  • Type: azureblob
  • Account: your Azure Storage Account name
  • Key: your Azure Storage Account access key
  • Leave all other options as default
Name this remote something like myazurestorage.
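
Before moving on, you can sanity-check the Azure remote. If the credentials are correct, this lists any existing Blob containers in the Storage Account (an empty result is fine on a new account):

rclone lsd myazurestorage: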

Remote 2 — SSH/SFTP (UNAS Pro)
  • Type: sftp (SSH/SFTP)
  • Host: your UNAS IP address (use an SFP+ port IP for faster transfers if available)
  • User: root
  • Password: your UNAS SSH password
  • Leave all other options as default
Name this remote something like unas-sftp.

Remote 3 — Crypt (Encryption Layer)
This remote wraps the Azure Blob remote and handles encryption transparently.
  • Type: crypt
  • Remote: myazurestorage:backup (your Azure remote name + blob container name)
  • Filename encryption: off (adds a .bin extension to indicate encrypted content but keeps filenames readable)
  • Directory name encryption: false (keeps folder structure readable in Azure)
  • Password: set a strong password — save this to a password manager immediately
  • Password 2 (salt): set a second password — save this as well
Name this remote azure-crypt.

These two passwords are the keys to your encrypted data. If you lose them, your backup is unrecoverable. Store the entire rclone config file (/root/.config/rclone/rclone.conf) in a secure location such as a password manager secure note.
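
For reference, the finished /root/.config/rclone/rclone.conf should look roughly like the sketch below. The values shown are placeholders; rclone stores the SFTP and crypt passwords in an obscured form rather than plain text:

[myazurestorage]
type = azureblob
account = <storage-account-name>
key = <storage-account-access-key>

[unas-sftp]
type = sftp
host = <unas-ip>
user = root
pass = <obscured-by-rclone>

[azure-crypt]
type = crypt
remote = myazurestorage:backup
filename_encryption = off
directory_name_encryption = false
password = <obscured-by-rclone>
password2 = <obscured-by-rclone>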

Step 5: Find Your UNAS Data Path
SSH into the UNAS and find your volume UUID:

ls /volume

You will see a UUID folder such as /volume/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx. Your shares live under:

/volume/<uuid>/.srv/.unifi-drive/

List them to confirm:

ls /volume/<uuid>/.srv/.unifi-drive/

You should see all of your shares listed. Exit back to the rclone container when done.
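
Since the unas-sftp remote is already configured, you can also find the UUID from the rclone container without SSHing into the NAS at all:

rclone lsd "unas-sftp":/volume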

Step 6: Test the Connection
Before running a full sync, verify rclone can see your UNAS shares:

rclone lsd "unas-sftp":/volume/<uuid>/.srv/.unifi-drive/

Then do a small test sync with a low-volume folder to confirm the full pipeline works end to end:

rclone sync "unas-sftp":/volume/<uuid>/.srv/.unifi-drive/SmallFolder azure-crypt:backup/SmallFolder --progress

Check the Azure portal — you should see the folder containing .bin files. The data is encrypted at rest in Azure.
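
Optionally, you can have rclone verify the encrypted copy against the source. cryptcheck compares checksums between the unencrypted source and the crypt remote:

rclone cryptcheck "unas-sftp":/volume/<uuid>/.srv/.unifi-drive/SmallFolder azure-crypt:backup/SmallFolder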

Step 7: Set Up the Automated Cron Job
We use flock to prevent multiple instances from running at the same time. This is important because the initial sync of a large NAS can take many hours.

crontab -e

Add these two lines:

*/15 * * * * flock -n /tmp/rclone-backup.lock rclone sync "unas-sftp":/volume/<uuid>/.srv/.unifi-drive/ azure-crypt:backup >> /var/log/rclone-backup.log 2>&1
@reboot flock -n /tmp/rclone-backup.lock rclone sync "unas-sftp":/volume/<uuid>/.srv/.unifi-drive/ azure-crypt:backup >> /var/log/rclone-backup.log 2>&1

This runs every 15 minutes and also triggers on container reboot. If a sync is already running when the cron fires, that run is skipped and rclone tries again at the next interval.
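
One housekeeping note: these cron lines append to /var/log/rclone-backup.log indefinitely, so it is worth rotating the log. A minimal sketch, saved as /etc/logrotate.d/rclone-backup (the retention settings are just a suggestion):

/var/log/rclone-backup.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}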

Back Up Your rclone Configuration
Once everything is set up and working, the most important thing you can do is back up your rclone config file. This single file contains all of your remote configurations including the Azure credentials and both encryption passwords. Without it, your encrypted backups are unrecoverable.

The config file is located at:

/root/.config/rclone/rclone.conf

Save the contents of this file somewhere secure, such as a secure note in a password manager. If you ever need to rebuild the container from scratch, restoring this file is all that is needed to get back up and running.

Restoring Files
To restore a single file from Azure — rclone decrypts on the fly and removes the .bin extension automatically:

rclone copy azure-crypt:backup/FolderName/filename.pdf /tmp/restore/

To restore everything to a new or replacement NAS:

rclone sync azure-crypt:backup "unas-sftp":/volume/<uuid>/.srv/.unifi-drive/
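
If the destination NAS is not empty, consider previewing the restore first. A full restore sync will overwrite or delete files on the NAS to make it match Azure, and --dry-run shows what would happen without changing anything:

rclone sync azure-crypt:backup "unas-sftp":/volume/<uuid>/.srv/.unifi-drive/ --dry-run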

Disaster Recovery
If you ever need to rebuild the rclone container from scratch:

  1. Install rclone fresh (Steps 1 and 2 above)
  2. Restore your rclone.conf from your password manager to /root/.config/rclone/rclone.conf
  3. Re-add the cron jobs

All credentials and encryption keys are stored in that single config file. No manual reconfiguration is needed.
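
A minimal sketch of step 2, plus a quick verification, assuming the config contents are saved in your password manager:

mkdir -p /root/.config/rclone
nano /root/.config/rclone/rclone.conf   # paste the saved config and save
chmod 600 /root/.config/rclone/rclone.conf
rclone listremotes                      # should list myazurestorage:, unas-sftp:, azure-crypt:
rclone lsd azure-crypt:backup           # confirm the encrypted backup is readable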

A Note on Sync Behavior
rclone sync is a mirror operation — it makes the destination match the source. Files deleted on the UNAS will be deleted from Azure on the next sync, and there is no versioning by default. If you want protection against accidental deletion, add the --backup-dir flag to archive deleted files rather than remove them. For most home and small business use cases the mirror behavior is fine.
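
For example, the 15-minute cron line could become something like the following, which moves deleted or overwritten files into a dated archive path on the same crypt remote instead of removing them (% characters must be escaped as \% inside crontab entries):

*/15 * * * * flock -n /tmp/rclone-backup.lock rclone sync "unas-sftp":/volume/<uuid>/.srv/.unifi-drive/ azure-crypt:backup --backup-dir azure-crypt:archive/$(date +\%Y-\%m-\%d) >> /var/log/rclone-backup.log 2>&1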

Summary
This setup provides encrypted, incremental, fully automated offsite backups of your UNAS Pro to Azure Blob Storage — with nothing running on the NAS itself and full disaster recovery from a single config file. The table below summarizes the key components:

Component         Details
Container         Proxmox LXC, Debian 13, unprivileged
Resources         2 cores, 1GB RAM, 8GB disk, 512MB swap
Connection        SFTP over SSH to UNAS Pro
Destination       Azure Blob Storage
Encryption        rclone crypt (client-side, XSalsa20-Poly1305)
Schedule          Every 15 minutes, incremental
Install on NAS    None required
