Should I be using progressive backup?

I have 8 customers who updated to FMS 18 in the last month, and 3 of them have had their server crash since. In the latest crash some data was lost because the database was corrupted and would not recover. It was not the end of the world, but it has me considering progressive backups.

My server hosts 46 databases, ranging from very small to just under 50GB for the largest one. We have 8 users logged in and making changes often, so progressive backup seems like a good idea. But I have some questions about how it works and whether I should use it.

  • Do people use progressive backup, and do they like it?
  • How does it work if you have auto backup turned off and use scheduled backups? I'm just not sure how to restore them.
  • If you have to use auto backup, can you store the backups on a NAS?
  • If you can't store them on a NAS, can you change the number of backups kept to fewer than 7?

Other information

  • FileMaker 18 for both workstations and server
  • Windows Server 2019 Standard
  • All databases add up to 255GB
  • 32GB RAM
  • Database drive is 1.79TB
  • All backups are moved to a NAS, keeping 7 days of backups
  • Currently all backups are managed with backup schedules
  • Auto backup is turned off because I could not find a way to limit the number of backups kept or to store them on the NAS
  • I also do a weekly online backup using a combination of a backup schedule, Iperius Backup, and Crashplan.

Welcome to the fmsoup, @Susurrus!

  • Progressive backup is used both on my dev server and on clients' production servers. So far, no issues or problems to report (no server crashes either).

  • Auto backup files and progressive backup files can be stored on a NAS; they are regular fmp12 files.

  • For restoration of PBs, refer to the server documentation

  • To my knowledge, auto backup can only be switched on or off. There is no option to control the number of backups.

I’ve seen quite a lot of reports that the new ‘startup restoration’ feature, which is enabled by default in Server 18, has been causing crashes and corrupting databases.

Many people are disabling it as per details given here: https://fmhelp.filemaker.com/help/18/fms/en/index.html#page/FMS_Help/hostdb-startup-restoration.html

The command line is:

fmsadmin set serverprefs StartupRestorationEnabled=false

You will need to restart the server after disabling it.
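If memory serves, you can check the current preferences, make the change, and restart the Database Server all from an elevated command prompt on the server machine (fmsadmin will prompt for your Admin Console credentials). A rough sketch of the sequence:

fmsadmin get serverprefs
fmsadmin set serverprefs StartupRestorationEnabled=false
fmsadmin restart server -y

The restart command restarts the Database Server and will disconnect any connected clients, so run it outside working hours.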


Thanks for this, I had not heard about it. I will look into it some more.

In the case you describe, it may well have helped with the data loss.

That said, on a personal level we don’t use it and haven’t yet had anyone give us a reason to change our opinion.

First, you cannot back up to a NAS, only to the internal system drive, another volume created on the same disk, or a volume on a directly attached disk, such as an external USB drive. Third-party software or a system task has to be used to transfer the data to other storage over the network.
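To give a concrete idea of what that system task can look like on Windows (the share name and paths below are only placeholders, not anything from a real setup), a scheduled task running Robocopy after the nightly backup finishes is one common way to push the backup folder to a NAS:

robocopy "E:\FMS_Backups\Nightly" "\\NAS01\FMS_Backups\Nightly" /MIR /FFT /Z /R:2 /W:5 /LOG:"E:\Logs\fms_backup_copy.log"

Note that /MIR mirrors the source, so anything removed from the source folder is also removed from the NAS copy; if you want the NAS to keep more history than the server does, use /E instead and handle retention separately.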

Scheduled backups are essential regardless of the progressive settings, and progressive backup runs perfectly well alongside them.

An initial backup is made, and after that, changes are written to a progressive folder recording everything that has changed since then. The key thing is that the main backup files are updated from the progressive files at a pre-set time interval, which is 5 minutes by default but can be anywhere between 1 and 99 minutes.

Our problem is that there is only a small window of opportunity to retrieve the files before they match the live hosted ones. That window is the time interval set above.

We haven’t found a scenario where we can justify using this. If we accidentally delete records, we probably won’t know, or be able to stop the server, before the update takes place. If the files corrupt, it is the same scenario. In your case, as long as the server did actually crash and all services stopped, this may have helped you. If the files crashed but backups continued, maybe not. To our knowledge you can’t go back to an earlier incremental version.

We have some clients who back up every 2 hours, but we usually run a lunchtime and a nightly standard backup (which gets pushed automatically to AWS S3). We also copy each client’s system weekly to offline hard disks, so we have weekly versions we can go back to.
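For anyone wondering what that push to S3 can look like, a minimal sketch with the AWS CLI (bucket and folder names are placeholders, and this isn’t necessarily the exact tooling we use) would be:

aws s3 sync "E:\FMS_Backups\Nightly" s3://example-fms-backups/nightly/ --storage-class STANDARD_IA

aws s3 sync only uploads files that are new or changed since the previous run, which keeps the transfer small once the first full upload has completed.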

A useful link: FileMakerPKB

I hope this helps

Andy

Thanks for the welcome. I hope this will be my new home for all FileMaker-related topics.

I am glad to hear you are happy with progressive backups. Based on what it promises, it sounds almost too good to be true. I read the restoration instructions, and I think I will just have to try it on a test server to see if it works the way I think they are describing.

I did try turning it on for our main server, and the results confused me. I configured the progressive backups to be stored on the E drive, and that looked like it was working, but it also made a full copy of all the databases, totaling 255GB, on the C drive. The C drive only has 243GB free out of 278GB. I am glad I noticed before it caused the server to crash. Is this normal?


The server should handle a lack of disk space more gracefully than by just crashing. However, I would never count on it and would always check that enough disk space remains. Using a 1TB drive for 255GB of DBs would put you on the safe side 🙂.

I was hoping to continue using the same scheduled backups and only use the progressive backup for server crashes. It sounds like I will be able to do that. We also have a weekly one that is uploaded online with Crashplan. One of the things I like about Crashplan is that storage space does not add to the cost, so we can keep every weekly backup forever. Very nice.

We have a 1.79TB drive we use for the databases; the 278GB drive is OS only and should never have any databases stored on it. When I turned progressive backup on, I set the folder to the 1.79TB drive, and it did make a progressive folder and started copying files to it. But it also made a progressive folder on the C drive and proceeded to make a full copy of all the databases there.

Can you use different drives for databases and backups?

If the drive has to be local, not easily. We have 4 drive slots on the server, and the OS and storage are both RAID 1. I noted that they recommend the progressive backups not be stored on the same drive as the databases, but it is not required, so for now I was hoping to keep them on the same drive. That would be the data drive, though, not the OS drive. There is room on the data drive, but the OS drive was always intended to hold only the OS and software.

If there is a RAID 1 mirror of the data disk and you have external backups in addition, having data and backup folders on the same disk should be no cause for concern.

Specify a progressive backup folder on a different hard drive from the one where the hosted databases reside. Since the progressive changes may be written to the progressive backup folder at nearly the same time that changes are written to the hosted databases, using two different hard drives may improve server performance. Source

From this I assume it is not required but is recommended. But the automatic copying of a full backup to the C drive, even though I set it to the E drive, is an issue.
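As a side note, the backup folder paths that the Admin Console accepts use FileMaker's own path syntax; on Windows the progressive backup folder is entered along these lines (drive letter and folder name here are only examples):

filewin:/E:/FMS_Progressive/

The path has to end with a slash, and the folder has to be writable by the account the FileMaker Server service runs under, otherwise the console will not validate it.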

Fully understand and glad you are using something like Crashplan.

However, if you are not backing up to either a separate internal disk or an external disk daily, I’d be running the offsite backup daily. In fact, regardless of the disks, I’d still run it.

Andy

We have a monthly data cap of 2TB. Running the weekly backup along with our other online backups, we come close to using it all. For this reason, daily is not an easy option. The daily backup to the NAS will have to do.

That being said, any more insights on progressive backups? I plan on testing it in the near future, and I am sure that will tell me most of what I need to know, but it would still be good to hear others' stories or suggestions.

The copy of the files on the C drive worries me. Unfortunately, all our Windows servers are virtual and don’t have separate drives, but I’d love to test this.

If I get time and can knock this infection that is ruining my productivity at the moment out of the ballpark, I’ll set up a test system.

Unless someone else has experience and can advise?

Torsten, would you do me a favour and explain the problem you are solving using the progressive backup?

Many thanks

Andy

Hi @AndyHibbs,
Progressive backups happen every 5 minutes and are created in addition to 2-hourly scheduled backups. In the event of a server crash, we can restore with a loss of only about five minutes of work.
If the progressive backup fails, the 2-hourly backup will serve.
It is quite a simple setup.


@Torsten

Do you use the same drive for progressive backups as for your hosted data? If it is a different drive than the C drive, do you notice the C drive still being used for progressive backups?

If it has to use the C drive, that really limits my ability to use it without upgrading my C drives. They are high-RPM drives and would be expensive to expand.

Backups, progressive included, are stored on a separate drive. A daily copy is stored on a NAS, which in turn is backed up daily.