This is the same behaviour as on a macOS system. You first need to change the security settings for your directory (/home/centos/S3/) on the Linux system. FileMaker Server is installed as a member of the group fmsadmin, so your directory has to be set to read and write for that group.
Since I'm no Linux command line wizard, I installed GNOME and xRDP on the CentOS instance. Now I can take the remote route via the MS Remote Desktop tools from any computer, navigate the filesystem of the Linux server to /home/holger/FMSBackup, right-click on the folder's icon and change the settings accordingly.
Afterwards the backup schedule can write to that directory.
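For reference, the same change can be made from the terminal in two commands. This is a minimal sketch assuming the /home/centos/S3/ path mentioned above (the group also needs execute permission so it can enter the directory):

    sudo chgrp fmsadmin /home/centos/S3
    sudo chmod g+rwx /home/centos/S3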
I'm with @AndyHibbs on this; I would use the AWS CLI to copy a backup directly from the FMS backup folder to S3.
Without knowing all the details, it feels like @wizardconsulting's setup involves an unnecessary detour through an S3 staging folder in the home directory...
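A single command can do the copy. As a sketch, assuming the default Linux backup path, a hypothetical timestamped backup folder name, and a placeholder bucket:

    aws s3 cp "/opt/FileMaker/FileMaker Server/Data/Backups/FMS_2021-05-01_0200" s3://your-bucket/fms-backups/ --recursive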
While that obviously keeps it within your comfort zone, I would strongly urge you to learn the basics of the Linux CLI. It really isn't hard, and we're here to help get you through that learning curve.
(and with a good text snippet tool you can store and recall 99% of the commands you'll ever need; you may find it's actually faster and easier than wading through a UI).
When you go to deploy Linux FMS for organizations with lots of Linux, you will risk alienating those IT departments if you insist on adding a desktop and remote access packages. So don't do it automatically; consider it carefully.
I cannot upload only the files in the directories whose names start with "S3". As I said, I've reviewed the documentation for the AWS CLI, specifically the cp (copy) command. While I can filter the uploaded files by a partial file name, I cannot filter them by a partial directory name (and I have tried).
If you know how to do this, I would be eager to hear how you did it.
For that reason, I've put the files into a separate directory outside of the normal backup folder for FileMaker Server. As I mentioned, I did this successfully on two Windows Server systems (2016 I think) and it worked without making any changes to the privileges of the folders.
My next foray into this would be to try changing permissions on the custom folder.
Thanks,
Michael Frankel
President & CEO, Wizard Consulting Group, Inc.
Mobile: (310) 291-3419
The normal and easy solution for this is to create an FMS backup schedule with the number of sets to keep set to 0, which will dump the backup into a statically named folder so that you don't have to worry about the folder name.
The alternative is to use the OS scripting tools to identify the full name of the latest S3 folder and feed that full name to the S3 command.
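For the scripted alternative, something like this works in bash. It's a sketch assuming the default Linux backup path and a schedule named "S3" (so the folders come out as S3_<timestamp>):

    # Grab the newest S3_* backup folder, then copy it up
    latest=$(ls -td "/opt/FileMaker/FileMaker Server/Data/Backups/S3_"*/ | head -n 1)
    aws s3 cp "$latest" s3://your-bucket/fms-backups/ --recursive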
Your answer was clear and to the point, and it steered me in the right direction. With that, plus a dive into cron jobs, I now have a working upload from FM Server 19 on CentOS to Wasabi, which is similar to Amazon S3.
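In case it helps anyone else, the crontab entry can look something like this. A sketch assuming a statically named backup folder (per the zero-sets advice above), a placeholder bucket, and Wasabi's us-east-1 endpoint:

    # Run nightly at 2:30 AM, pointing the AWS CLI at Wasabi instead of Amazon
    30 2 * * * aws s3 cp "/opt/FileMaker/FileMaker Server/Data/Backups/S3" s3://your-bucket/fms-backups/ --recursive --endpoint-url=https://s3.wasabisys.com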