I have to transfer small dumps of records on a daily basis. There are three tables with several thousand records; as CSV files they are less than 3 KB.
I had hoped to place the data into a container field and collect it at the other end, where it could be saved to the Documents folder and imported. That method is failing silently (it would be nice to get an error).
All suggestions appreciated.
At which point does it fail?
The data from the external data source is not there. For instance, a date field contains the date on which the container field was modified. The script grabs that date and modifies it using this formula:
Year ( $date ) & "-" & Right ( "00" & Month ( $date ) ; 2 ) & "-" & Right ( "00" & Day ( $date ) ; 2 )
When I run the script through the debugger I get output such as "2021-01-11", but when it is run via PSOS the output is "-00-00".
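Worth noting: that calc produces exactly "-00-00" whenever $date is empty, because Year/Month/Day of an empty date return nothing while the Right ( "00" & … ; 2 ) padding still yields "00". A minimal Python sketch of the same logic (the empty-date behavior on the server side is my assumption about what PSOS is seeing, not something I can confirm):

```python
from datetime import date

def fm_style_format(d):
    """Mimic: Year($date) & "-" & Right("00" & Month($date); 2) & "-" & Right("00" & Day($date); 2)."""
    # Assumption: an empty FileMaker date makes Year/Month/Day return empty strings.
    y, m, dd = ("", "", "") if d is None else (str(d.year), str(d.month), str(d.day))
    return y + "-" + ("00" + m)[-2:] + "-" + ("00" + dd)[-2:]

print(fm_style_format(date(2021, 1, 11)))  # 2021-01-11
print(fm_style_format(None))               # -00-00
```

If that holds, the real question is why $date ends up empty in the PSOS session rather than why the formula misbehaves.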
I have been reluctant to chime in on this one because I don't even have a handle on whether or not this type of arrangement is officially supported.
One thing keeps coming to mind, however, as I read the posts:
I once worked on a system where there were two solutions on separate on-premise servers (no Cloud involved). It was possible for a server-side script running on one server to successfully use a TO whose base table was defined in the solution on the other server. However, what we learned was that the very first attempt (within the server-side session) to access the external data would fail silently and return no data. Subsequent attempts to utilize the TO would work, and the cross-server data would be accessible. This kind of behavior leads me to believe that this might be an unsupported feature -- but I haven't done my homework to confirm or refute that.
EDIT: This was also quite a while back. FMS 15, I believe...
I had wanted to automate this process from the Cloud end, and I've just noticed that the Claris Cloud Admin Console does not provide an option to schedule scripts.
(The Admin API for Cloud does allow script schedules to be created.)
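As a sketch of that route: the schedule can be created with a plain HTTP POST. The host, endpoint path, and payload field names below are from memory of the Admin API and should all be treated as assumptions to verify against the current Claris Cloud Admin API documentation; the bearer token would come from the Cloud sign-in flow, which differs from on-premise authentication.

```python
import json
import urllib.request

BASE = "https://example.claris.com/fmi/admin/api/v2"  # hypothetical host

def build_script_schedule(name, database, script, freq_minutes):
    """Assemble a JSON body for creating a FileMaker script schedule.
    Field names here are assumptions, not confirmed against the docs."""
    return {
        "name": name,
        "filemakerScriptType": {
            "resource": database,       # e.g. the hosted file name
            "fmScriptName": script,
        },
        "repeatTask": {"repeatFrequency": freq_minutes},
    }

def create_schedule(token, payload):
    """POST the schedule with a bearer token from the Cloud sign-in flow."""
    req = urllib.request.Request(
        BASE + "/schedules/filemakerscript",  # assumed endpoint path
        data=json.dumps(payload).encode(),
        headers={"Authorization": "Bearer " + token,
                 "Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)  # not executed here

payload = build_script_schedule("Daily transfer", "Transfers", "Export CSVs", 1440)
```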
There are very distinct authentication processes for Cloud and On-Premise so that is an issue in itself.
It is possible to link to a Cloud app using external data sources, but the user has to authenticate twice: once for the local DB and a second time for the Cloud app. The two use different UIs for the process, and a local on-premise account is not the same as a cloud account, so there isn't the one-to-one seamless login that we can configure between on-premise databases.
The Data API or OData approaches are now looking best, with automation being provided from the on-premise installation.
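If the Data API route wins out, the on-premise side can drive it with plain HTTP: POST to the sessions endpoint with Basic auth to get a token, then GET records with that token as a bearer. A minimal sketch (host, file, account, and layout names are placeholders; check the endpoint paths against your server's Data API version):

```python
import base64
import urllib.request

HOST = "https://fms.example.com"   # placeholder host
DB = "Transfers"                   # placeholder file name

def login_request(user, password):
    """Build the Data API session request: POST .../sessions with Basic auth."""
    cred = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{HOST}/fmi/data/v1/databases/{DB}/sessions",
        data=b"{}",
        headers={"Authorization": "Basic " + cred,
                 "Content-Type": "application/json"},
        method="POST",
    )

def records_request(token, layout, limit=100):
    """Build a GET for records on a layout, using the session's bearer token."""
    return urllib.request.Request(
        f"{HOST}/fmi/data/v1/databases/{DB}/layouts/{layout}/records?_limit={limit}",
        headers={"Authorization": "Bearer " + token},
    )

req = login_request("api_user", "secret")
# urllib.request.urlopen(req) would return a response whose JSON body carries the session token
```

Remember to DELETE the session when done, since Data API sessions count against the server's connection limits.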