JSON from only the fields on a layout?

Hi. My goal is to get a lot of data from a remote FM Server. Ideally, I would create a layout and put on it only the fields I'm interested in. Then I would loop through the found set, adding each record's info to a big variable, download that variable to a local FM file, and create records there from the big JSON variable.

Is there a custom function or tutorial somewhere about doing something like this? I'm confused about what the right approach would be, if it's possible at all. For example, should I write each field to the variable separately, or would it be possible to do all the fields on the layout in one step? I didn't see any native FM functions for this.

Thanks in advance for any advice.

Hi @kerplop, welcome to the soup!

For each record, you could build a JSON array with all the relevant fields’ data, then build one JSON document that contains these arrays for all records. That JSON can be saved as a text file for further processing or passed on as a script parameter.
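A minimal sketch of that loop, assuming FM 18+ for the While function; table and field names are placeholders. I'm keying each record's JSON by field name rather than using a positional array, because it also answers your "one step" question: FieldNames grabs every field on the current layout at once.

    # Build one JSON array for the whole found set,
    # one JSON object per record, keyed by field name.
    Set Variable [ $fields ; Value: FieldNames ( Get ( FileName ) ; Get ( LayoutName ) ) ]
    Set Variable [ $all ; Value: "[]" ]
    Set Variable [ $r ; Value: 0 ]
    Go to Record/Request/Page [ First ]
    Loop
        Set Variable [ $rec ; Value:
            While (
                [ i = 1 ; json = "{}" ] ;
                i ≤ ValueCount ( $fields ) ;
                [ f = GetValue ( $fields ; i ) ;
                  json = JSONSetElement ( json ; f ; GetField ( f ) ; JSONString ) ;
                  i = i + 1 ] ;
                json
            ) ]
        Set Variable [ $all ; Value: JSONSetElement ( $all ; $r ; $rec ; JSONObject ) ]
        Set Variable [ $r ; Value: $r + 1 ]
        Go to Record/Request/Page [ Next ; Exit after last: On ]
    End Loop
    # $all now holds the array; write it out or pass it as a parameter.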

It sure can be done… however, I suggest you instead create JSON records: a field containing a JSON version of the record's field contents. This field can be calculated or auto-entered (the latter usually performs better). You would then find the wanted records and return one JSON array containing the found set's JSON records.
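For example, the auto-enter calculation for that field might look like this (field names are hypothetical):

    // One compact JSON object per record, maintained by auto-enter.
    JSONSetElement ( "{}" ;
        [ "id" ; Contacts::ID ; JSONString ] ;
        [ "name" ; Contacts::Name ; JSONString ] ;
        [ "email" ; Contacts::Email ; JSONString ]
    )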

There are a number of advantages to this approach, including better performance than calculating JSON values on the fly from a layout-based set of fields for a given found set of records.

Thanks Torsten. So, to make sure I understand: you recommend adding the record's fields one at a time to a JSON record, then combining all the JSON records into one JSON blob (for lack of a better term)?

Thanks bdbd. You recommend an extra field that holds the whole JSON content of the record's fields I'm interested in, and then combining all of those JSON fields in one operation, like with a List ( related::values ) function on the found set? So that each JSON record is separated by a CR? I'm not sure where the array comes from. Something like the sketch below, maybe?
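Guessing at it, with a hypothetical self-join MatchingContacts pointing at the wanted records and a JSONRecord field:

    // Wrap the return-separated list of per-record JSON into one array.
    // Safe as long as JSONRecord holds compact JSON: JSONSetElement
    // escapes returns inside values, so the field contains no stray ¶.
    "[" & Substitute ( List ( MatchingContacts::JSONRecord ) ; ¶ ; "," ) & "]"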

Hi @kerplop, nice to see you here.

Why don't you use a local file that has a reference to the remote database to do this job?

Yes, it is building a JSON array: you add each record's field data to its own array, then collect those arrays into one JSON document, as described here.

Depending on the number of records and fields, @bdbd’s approach may be more advantageous.

Hi: I do have relationships to the Server. However, getting the data for 10,000 records at 50 fields per record is very, very slow, like an hour or more. I'm trying to speed up the process. Thanks!

Gotcha, thanks!

@kerplop Are you using FM19? Have you looked at the Execute FileMaker Data API script step?

(10k records can take some time, but if all your data is stored and indexed, I would not expect that to take one hour. If you have fields that are unstored calcs, then time is related to how complex your calculations are and how many extra records those calcs need to pull)

https://help.claris.com/en/pro-help/content/execute-filemaker-data-api.html
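If it helps, here's a rough sketch of a request; the layout and field names are placeholders, and the step hands back the same JSON shape the Data API returns over HTTP:

    # Fetch up to 15000 records from a narrow, API-only layout.
    Set Variable [ $request ; Value:
        JSONSetElement ( "{}" ;
            [ "layouts" ; "ContactsAPI" ; JSONString ] ;
            [ "limit" ; 15000 ; JSONNumber ] ;
            [ "query[0].Status" ; "Active" ; JSONString ]
        ) ]
    Execute FileMaker Data API [ Select ; Target: $result ; $request ]
    # Each record arrives under response.data[n].fieldData
    Set Variable [ $data ; Value: JSONGetElement ( $result ; "response.data" ) ]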

There must be reasons for the slowness. Those numbers should be transferred in seconds or minutes, not hours.

Have you tried exporting the data set that you want? That's usually efficient.
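If you script it, roughly like this (the path is just an example; the field export order is set in the step's options):

    # Export the found set from a narrow layout for local re-import.
    Set Variable [ $path ; Value: Get ( TemporaryPath ) & "export.csv" ]
    Export Records [ With dialog: Off ; "$path" ; Unicode (UTF-8) ]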

Things become even simpler if you use relationships to point to found sets of records. In this case, the JSON field of a parent record can include JSON arrays of the JSON field(s) of its child records. It then becomes trivial to get a JSON representation of a collection of records with parent/child relationships. In my experience, this executes quickly for moderately large record sets.
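For instance, a parent's JSON field could nest its children like this (Invoices and InvoiceLines are placeholders; JSONRecord is the per-record JSON field described earlier):

    // Parent record as JSON, with an embedded array of child records.
    JSONSetElement ( "{}" ;
        [ "invoiceID" ; Invoices::ID ; JSONString ] ;
        [ "total" ; Invoices::Total ; JSONNumber ] ;
        [ "lines" ;
          "[" & Substitute ( List ( InvoiceLines::JSONRecord ) ; ¶ ; "," ) & "]" ;
          JSONArray ]
    )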

Another variant uses a variable with repetitions. This is possible if all processing is handled within the script that does the data gathering or in a subscript.

Each repetition of the variable (i.e. $json[n]) holds a JSON array of field values for one record; see the sketch after the list below.
Advantages:

  • data source records are processed in a single loop
  • fields from related records can be retrieved, too
  • the JSON in each repetition is small, which keeps extraction fast when pulling values in a loop with JSONGetElement ( $json[n] ; index )
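A minimal sketch of the gathering loop (table and field names are placeholders):

    # One repetition per record; each holds a small JSON array.
    Go to Record/Request/Page [ First ]
    Set Variable [ $n ; Value: 0 ]
    Loop
        Set Variable [ $n ; Value: $n + 1 ]
        Set Variable [ $json[$n] ; Value:
            JSONSetElement ( "[]" ;
                [ 0 ; Items::ID ; JSONString ] ;
                [ 1 ; Items::Name ; JSONString ] ;
                [ 2 ; Items::Qty ; JSONNumber ]
            ) ]
        Go to Record/Request/Page [ Next ; Exit after last: On ]
    End Loop
    # Later: JSONGetElement ( $json[42] ; 2 ) returns record 42's Qty.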

Malcolm, when the server was on the local network, it took a minute or two. Now that the server is across the country and I have to use their VPN, it does take more than an hour. The table I'm importing from has a couple million records and a lot of unstored calcs and fields. Exporting the data and importing it locally might be better, but my tests show that's pretty slow too. The server's internet connection is not fast.

I was thinking maybe I could have a "shadow" table with only the data records and fields I need for any particular import, but I don't know if that would make a significant difference. Thanks!

Thanks for the Data API tip, Bobino. Never tried it before but will look into it.

Interesting, Torsten, thanks. Are you saying the repetition variable could have 10,000 repetitions, one per record, and that each repetition contains all the json for one record?

Whoa! A couple million records on a periodic basis? I'm not sure my technique, or any other serialization technique for that matter, will help you. Your idea of optimizing your data and found set seems more on the money.

Your description of your setup would appear to be perfect for using the Data API.

That's the total records in the table. I only retrieve 5k-15k at a time. Thanks!

Thanks, Andy. I'm going to check it out, but can you tell me how much setup is needed? I don't control the server, so do I need extra things besides the client (latest v19)?