FileMaker Server Off-Site Backup & Retention Policy with rclone


I have published an article “FileMaker Server Off-Site Backup and Retention Policy with rclone” on how to use rclone with FileMaker Server on Ubuntu to create off-site backups and maintain a backup data retention policy in an automated manner.

The rclone script is published as an open source project on GitHub:


Philosophical view on backups: the ONLY purpose of an off-site backup is to recover from a major disaster: theft, fire, flood, etc. See below:

Most backups - except for the last couple days - should be on site.

The reason is that IF you have to recover a database, the LAN is always an order of magnitude or more faster than your WAN connection, and local disk is another order of magnitude faster yet. Recovering from off-site storage can be time consuming and expensive for the business.

As such, the mandatory off-site storage should be the last 2 days. 1 day is insufficient, as you may have one of these major catastrophic disasters in the middle of a backup, which would compromise both in-house and off-site data integrity.

Yes, there might be a reason to ship other intervals off site - say, month-end - but in general, shipping data off site that will never have value is a waste of resources.
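The tiering idea above (every backup is a daily, with the occasional weekly or month-end copy also kept) can be sketched as a small shell function. The tier names and rules here are illustrative only, not the article's actual script, and the snippet assumes GNU date (the `-d` flag), as found on Ubuntu:

```shell
#!/bin/sh
# Illustrative retention tiering: every backup is "daily"; a Sunday backup
# also qualifies as "weekly"; a first-of-month backup also as "monthly".
classify() {
  d=$1                          # backup date, YYYY-MM-DD
  dow=$(date -d "$d" +%u)       # day of week: 1=Mon .. 7=Sun
  dom=$(date -d "$d" +%d)       # day of month
  tiers="daily"
  [ "$dow" = "7" ] && tiers="$tiers weekly"
  [ "$dom" = "01" ] && tiers="$tiers monthly"
  echo "$tiers"
}

classify 2023-12-03   # a Sunday, so: daily weekly
```

A real script would then copy the backup to one remote prefix per tier and prune each prefix on its own schedule.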

Backing up to S3 has advantages, and one of them should be applied to in-house backups as well: back up to an operating system other than the one the server runs on. It is not foolproof, but almost every ransomware attack works by encrypting the disk at the OS level, so having the backups on a different OS provides a "potential" insulation factor that may protect your data.

There is no electronic data storage on the planet that comes close to the resiliency of S3, but that does not mean that the data shipped up to S3 is, in itself, valid. Running a disaster recovery exercise would be prudent, to ensure that the stored data can be effectively recovered with its integrity intact.

It would also be best if the local backup was on a WORM device (Write Once, Read Many), so that once backed up, it is impervious to malware/ransomware attacks. Tape is a good, cheap solution to that issue.

Point-in-time backups are essential (your retention policy). I had a $20B client come to me about a backup issue. They ran daily backups, but with no retention of point-in-time historical copies. A creeping data corruption issue was discovered (not FM) that went back to some indeterminate time in the past. Millions were spent attempting to piece past data back together. Very painful...



There is a small mistake in your "Download & make executable" section.

The filename in the chmod command does not match the name of the downloaded file.

Thank you and nice job!

Hi again,

Would it be easy to make it work for Google Drive instead of AWS?
I gave it a try and here is what I got (I added some 'echo' statements to make it clearer):

/opt/FileMaker/FileMaker Server/Data/Backups/Daily for a week_2023-12-03_0000
fms-backup:fms-backup1/daily/Daily for a week_2023-12-03_0000
2023/12/03 07:36:37 Failed to create file system for "fms-backup:fms-backup1/daily/Daily for a week_2023-12-03_0000": didn't find section in config file
2023/12/03 07:36:37 Failed to create file system for "fms-backup:fms-backup1/weekly/Daily for a week_2023-12-03_0000": didn't find section in config file
2023/12/03 07:36:37 Failed to create file system for "fms-backup:fms-backup1/daily": didn't find section in config file
2023/12/03 07:36:37 Failed to create file system for "fms-backup:fms-backup1/weekly": didn't find section in config file
2023/12/03 07:36:37 Failed to create file system for "fms-backup:fms-backup1/monthly": didn't find section in config file
! END !

Thank you

Hi Kirk

Really appreciate your feedback.

Regarding the location of the off-site backups: rclone supports many storage backends, so you can use a NAS for your off-site backups as well.

I used S3 as it's the most familiar and affordable way to get multi-zone, highly available storage for your off-site backups.

In our case the off-site backup retention policy, or point-in-time copies as you called them, is required for legal archiving reasons.

I completely agree with you that you should always validate your backups and perform disaster recovery exercises from time to time.

Your point about using a WORM device is excellent and something I haven't looked into before. I just did some research, and it appears you can do this with AWS S3 as well, which I plan to investigate and maybe update the guide for later:
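For reference, the S3 feature that provides WORM behavior is Object Lock, which must be enabled when the bucket is created. A default-retention rule in the bucket's Object Lock configuration looks like this (the mode and retention period below are just example values):

```json
{
  "ObjectLockEnabled": "Enabled",
  "Rule": {
    "DefaultRetention": {
      "Mode": "COMPLIANCE",
      "Days": 30
    }
  }
}
```

In COMPLIANCE mode, objects cannot be overwritten or deleted by any user, including the root account, until the retention period expires.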

Hi f.i

Thank you for catching this, I have corrected it.

Yes, you can use it with Google Drive; you just need to configure an rclone Google Drive remote first. Take a look here:

The script should not need any changes to work with Google Drive.
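For reference, a Google Drive remote section in rclone.conf, as produced by `rclone config`, looks roughly like the sketch below. The remote name matches the one in the script's log output, and the credential values are placeholders:

```
[fms-backup]
type = drive
client_id = <your-oauth-client-id>
client_secret = <your-client-secret>
scope = drive
token = {"access_token":"...","refresh_token":"...","expiry":"..."}
```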

Please note however that Google Drive might not be ideal for this as it has some limitations and applies rate limits.

OK, but what are all those errors I receive from your script?

I assume you ran "rclone config" and configured the Google Drive remote correctly.

Do you have a folder "fms-backup1" in the root of your Google Drive? If not, please create this folder and try again.

If you are still getting an error let me know and I will do a test with Google Drive to check if there are other issues.
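The "didn't find section in config file" errors mean rclone could not find a `[fms-backup]` section in the config file it is reading. A quick check can be sketched as follows; the remote name comes from the log output above, and the default config path is an assumption (rclone honors the `RCLONE_CONFIG` environment variable when set):

```shell
#!/bin/sh
# Check that the remote name used by the backup script has a matching
# [section] in rclone.conf. Honors RCLONE_CONFIG if set; otherwise falls
# back to rclone's usual per-user default location.
has_remote() {
  conf="${RCLONE_CONFIG:-$HOME/.config/rclone/rclone.conf}"
  grep -q "^\[$1\]" "$conf" 2>/dev/null
}

if has_remote fms-backup; then
  echo "remote 'fms-backup' is configured"
else
  echo "remote 'fms-backup' is missing - run 'rclone config' and use that exact name"
fi
```

A common cause of this error is running the script as a different user (e.g. via cron or the fmserver account) than the one that ran `rclone config`, so the config file is in another home directory.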

A big advantage of S3 over any other storage medium is the way it stores the data. The durability calculations work out to something crazy like one byte error every billion years.

Most storage services - including local storage - store files as files in a directory structure, subject to all the potential corruption issues that can plague files at the file level.

Yes, the config was done, and tested as well (rclone test memory).
The directory exists, but I am getting the same error.

Thank you.

@f.i.SCIENCES, welcome to The Soup!

Thank you, Gilles!

Note: 360Works has a free off-site backup to S3 utility for FileMaker.