Hi. My goal is to get a lot of data from a remote FM Server. Ideally, I would create a layout and put on it only the fields I'm interested in. Then I would loop through the found set, adding each record's info to a big variable, then bring that variable down to a local FM file and create records from the big JSON variable.
Is there a custom function or tutorial somewhere about doing something like this? I'm confused about what the right approach would be, if it's possible at all. For example, should I write each field to the variable separately, or would it be possible to do all the fields on the layout in one step? I didn't see any native FM functions for this.
For each record, you could build a JSON array with all the relevant fields' data, then combine these per-record arrays into one JSON that covers all records. That JSON can be saved as a text file for further processing or passed on as a script parameter.
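As a rough sketch of what that gathering loop could look like (table, field and variable names here are made up), JSONSetElement is the native function you were looking for:

    # Sketch: gather the found set into one JSON variable
    Go to Record/Request/Page [ First ]
    Set Variable [ $all ; Value: "{}" ]
    Set Variable [ $i ; Value: 0 ]
    Loop
      # Build one small JSON object per record from the fields you care about
      Set Variable [ $rec ; Value:
        JSONSetElement ( "{}" ;
          [ "id"    ; Contacts::ID    ; JSONString ] ;
          [ "name"  ; Contacts::Name  ; JSONString ] ;
          [ "phone" ; Contacts::Phone ; JSONString ]
        ) ]
      # Append it as element $i of a "records" array in the big variable
      Set Variable [ $all ; Value: JSONSetElement ( $all ; "records[" & $i & "]" ; $rec ; JSONObject ) ]
      Set Variable [ $i ; Value: $i + 1 ]
      Go to Record/Request/Page [ Next ; Exit after last: On ]
    End Loop
    # $all now holds { "records" : [ {…}, {…}, … ] } and can be saved to a
    # text file or passed on as a script parameter.

Note that the fields have to be named explicitly in the calculation; I'm not aware of a single native function that serializes every field on the layout in one step.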
It sure can be done… however, I suggest you instead create JSON records: a field containing a JSON version of a record's field contents. This field could be calculated or auto-entered (the latter can give better performance). You would then find the wanted records and return a JSON array of those JSON records for the found set.
There are a number of advantages to this approach, including better performance than calculating JSON values on the fly from a layout-based set of fields for a given found set of records.
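To sketch it (field and table-occurrence names are invented here), the JSON field could carry an auto-enter calculation along these lines:

    // Auto-enter calculation for a text field, e.g. Contacts::JSONRecord
    JSONSetElement ( "{}" ;
      [ "id"    ; Contacts::ID    ; JSONString ] ;
      [ "name"  ; Contacts::Name  ; JSONString ] ;
      [ "phone" ; Contacts::Phone ; JSONString ]
    )

and the wanted records can then be wrapped into one array in a single calculation, for example:

    // List ( related::JSONRecord ) over a relationship, or a "List of" summary
    // field on JSONRecord for the found set, gives one JSON object per line.
    // JSONSetElement escapes any return characters inside field data, so the
    // only ¶ in the list are the separators between records.
    "[" & Substitute ( List ( ContactsByFind::JSONRecord ) ; "¶" ; "," ) & "]"

Because the per-record JSON is stored, the expensive part is done once at data entry rather than every time you export.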
Thanks Torsten. So, if I understand correctly, you recommend adding the record's fields one at a time to a JSON record, and then combining all the JSON records into one JSON blob (for lack of a better term)?
Thanks bdbd. You recommend an extra field that holds the whole JSON content of the record's fields I'm interested in, and then combining all of those JSON fields in one operation, like with a List ( related::values ) function on the found set? So that each JSON record is separated by a CR? I'm not sure where the array function comes from.
Hi: I do have relationships to the Server. However, getting the data for 10,000 records at 50 fields per record is very, very slow, like an hour or more. I'm trying to speed up the process. Thanks!
@kerplop Are you using FM19? Have you looked at the Execute FileMaker Data API script step?
(10k records can take some time, but if all your data is stored and indexed, I would not expect that to take one hour. If you have fields that are unstored calcs, then time is related to how complex your calculations are and how many extra records those calcs need to pull)
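Roughly, the idea is to build a JSON request on the client and the step returns the whole found set as JSON in one round trip, which tends to behave much better over a slow WAN than touching fields record by record. A sketch, with placeholder layout and field names (check the request format against the documentation):

    # Sketch of an Execute FileMaker Data API call
    Set Variable [ $request ; Value:
      JSONSetElement ( "{}" ;
        [ "layouts" ; "API_Contacts" ; JSONString ] ;
        [ "limit"   ; 10000          ; JSONNumber ] ;
        [ "query"   ; JSONSetElement ( "[]" ; "[0].Status" ; "Active" ; JSONString ) ; JSONArray ]
      ) ]
    Execute FileMaker Data API [ Target: $result ; $request ]
    # The records arrive as a JSON array under response.data, e.g.
    # Set Variable [ $records ; Value: JSONGetElement ( $result ; "response.data" ) ]

The layout named in the request controls which fields come back, so a slim layout with only the fields you need keeps the payload small.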
Things become even simpler if you use relationships to point to found sets of records. In this case, the JSON field of a parent record can include JSON arrays of JSON field(s) of child records. It then becomes trivial to get a JSON representation of a collection of records that have parent / children relationships. In my experience, this executes quickly for moderately large record sets.
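For instance (made-up names again), if each child row already has its own JSONRecord field as described above, the parent's JSON field can fold them in as an array:

    // Calculation for the parent's JSON field
    JSONSetElement ( "{}" ;
      [ "invoiceID" ; Invoices::ID ; JSONString ] ;
      [ "lineItems" ;
          "[" & Substitute ( List ( LineItems::JSONRecord ) ; "¶" ; "," ) & "]" ;
          JSONArray ]
    )

Grabbing the parent field then gives you the parent plus all of its children in one value.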
Another variant uses a variable with repetitions. This is possible if all processing is handled within the script that does the data gathering, or in a subscript.
Each repetition of the variable (e.g. $json[n]) holds a JSON array of the field values of one record (see the sketch after the list below).
Advantages:
data source records are processed in a single loop
fields from related records can be retrieved, too
the json in each repetition is small in size, which benefits speed when extracting data in a loop with JSONGetElement ($json[n]; "field")
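A minimal sketch of that pattern, with hypothetical names (each repetition here is keyed by field name so it can be read back with JSONGetElement, as in the last point above):

    # Gather one small JSON per record into repetitions of $json
    Go to Record/Request/Page [ First ]
    Set Variable [ $n ; Value: 0 ]
    Loop
      Set Variable [ $n ; Value: $n + 1 ]
      Set Variable [ $json[$n] ; Value:
        JSONSetElement ( "{}" ;
          [ "id"   ; Contacts::ID   ; JSONString ] ;
          [ "name" ; Contacts::Name ; JSONString ]
        ) ]
      Go to Record/Request/Page [ Next ; Exit after last: On ]
    End Loop
    # Later, in the same script or a subscript:
    # JSONGetElement ( $json[2] ; "name" )   // the "name" value of record 2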
Malcolm, when the server was on the local network, it took a minute or two. Now that the server is across the country and I have to use their vpn, it does take more than an hour. The table I'm importing from has a couple million records and a lot of unstored calcs and fields. Exporting data and importing it locally might be better, but my tests show that's pretty slow too. The server's internet connection speed is not fast.
I was thinking maybe I could have a "shadow" table with only the data records and fields I need for any particular import, but I don't know if that would make a significant difference. Thanks!
Interesting, Torsten, thanks. Are you saying the repetition variable could have 10,000 repetitions, one per record, and that each repetition contains all the json for one record?
Whoa! A couple million records on a periodic basis? I'm not sure my technique, or any other serialization technique for that matter, will help you. Your thoughts about optimizing your data and found set seem more on the button.
Thanks, Andy. I'm going to check it out, but can you tell me how much setup is needed? I don't control the server, so do I need extra things besides the client (latest v19)?