JSONFormatElements Performance

Has anyone carried out a speed comparison when building JSON in looping scripts, comparing JSONFormatElements being called on each iteration against not calling it at all?

I appreciate this is very useful for error trapping, but as yet I don’t know what the performance hit is.

A previous thread (I will find its reference later) noted that FileMaker's JSON parser steps through the JSON string from the start during each call. The longer the JSON string, the slower the performance. You can improve performance for a structure by writing multiple keys in a single operation. You can improve performance for sub-structures by creating the sub-structure separately and then incorporating the sub-structure into the parent JSON string in one operation.
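For example, a minimal sketch of that pattern (the keys and values here are just placeholders): build the sub-structure on its own first, then write it and its sibling keys into the parent in a single JSONSetElement call.

Let (
	// build the sub-structure separately first
	address = JSONSetElement ( "{}" ;
		[ "street" ; "1 Main St" ; JSONString ] ;
		[ "city" ; "Springfield" ; JSONString ]
	) ;
	// then write several keys plus the finished sub-structure into the parent in one operation
	JSONSetElement ( "{}" ;
		[ "name" ; "Jane" ; JSONString ] ;
		[ "age" ; 42 ; JSONNumber ] ;
		[ "address" ; address ; JSONObject ]
	)
)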

1 Like

Thanks, we are carrying out a lot of those recommendations. I’m really looking for any practical examples of with/without using JSONFormatElements.

I guess by the time we’ve finished this we’ll be able to do our own tests.

What do you need JSONFormatElements() for? To display a JSON object to users? I can't think of any other reason to use it: no script parameter passing, API call, or JavaScript hand-off needs the elements to be formatted.

I don't have time to go into too much detail (but I want to soon), but I did some research on the JSON functions recently and got a much better understanding of FM's JSON caching behavior.

Basically, what other commenters said above is accurate, mainly about working with small chunks of JSON.

The biggest thing I discovered is that FM does cache parsed JSON behind the scenes, but it only caches one JSON object at a time. So jumping back and forth between objects forces FM to re-parse the JSON text each time it touches a different object from the one it touched before.

So only work on one JSON object at a time if possible: get and set everything you need in that object, then move on to the next.

Also note, the way FM checks if a JSON object is the same as the last one it "touched" is using some kind of string hash comparison. Therefore, if you do something as simple as append a space to the end of the JSON variable content, it will have to re-parse the JSON on the next run.
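To illustrate, here's a rough sketch in script steps (the variable and key names are made up); the second pattern only parses each object once.

# Slower: alternating between two objects defeats the single-object cache
Set Variable [ $idA ; Value: JSONGetElement ( $jsonA ; "id" ) ]
Set Variable [ $idB ; Value: JSONGetElement ( $jsonB ; "id" ) ]
Set Variable [ $nameA ; Value: JSONGetElement ( $jsonA ; "name" ) ]
Set Variable [ $nameB ; Value: JSONGetElement ( $jsonB ; "name" ) ]

# Faster: finish everything on one object before touching the next
Set Variable [ $idA ; Value: JSONGetElement ( $jsonA ; "id" ) ]
Set Variable [ $nameA ; Value: JSONGetElement ( $jsonA ; "name" ) ]
Set Variable [ $idB ; Value: JSONGetElement ( $jsonB ; "id" ) ]
Set Variable [ $nameB ; Value: JSONGetElement ( $jsonB ; "name" ) ]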

I didn't test explicitly, but I'd wager the only reason to use JSONFormatElements is to make it pretty. It does not change the caching behavior of FM, which happens automatically.

In general I've found the performance of the native JSON functions pretty slow once you get past objects of a certain size (say, longer than 10K characters), and I've had to resort to workarounds to improve performance. For this reason I never introduce the JSONFormatElements function into a process that is likely to become performance critical, especially considering it's really just a tool to present a developer with an easier-to-read version of the JSON, which can easily enough be done in the Data Viewer when debugging.

1 Like

Thanks everyone. My main reason for asking was a roundabout way of asking ‘is anyone doing any error checking when passing a bunch of JSON from one script to another?’. Using JSONFormatElements seemed to be the most logical way of doing this, hence my performance concerns.

It seems to be unanimous here that the JSON isn’t being checked during operation, but only during development, which makes sense.

Many thanks for your input.

Andy

Would somebody please post an objective benchmark to run?

Below is a sample data point for this discussion. The Google "gson" library quickly formats JSON. I've used it with JSON documents tens of thousands of lines long. It's rocket fast!

For example... Using the 10K+ JSON Array, here:
https://api.covidtracking.com/v1/states/daily.json

Time to Pretty Print Huge JSON Array: 0.25216 seconds


With other libraries similar to "gson", you can specify indent levels and other options.

These libraries work with JSONArrays or JSONObjects equally well.

I’ve worked with 2-million-character JSON, and it’s remained fast. So it all depends on what you force FileMaker to process or do, as @jwilling mentioned above.

If I was doing complex transformations, I would probably handle it outside of FileMaker.

As for JSONFormatElements, I would only use it when I need to look at the JSON myself. I’m all for avoiding premature optimization, but when I know something will be slower, I don’t include it. Formatted JSON has more characters, so at the very least FileMaker has to handle the extra white space. For me, whether it’s an extra 10% or an extra 1%, I’m not going to include it until I need it.

1 Like

Ah - actually Andy - you're entirely correct. I do (and had forgotten) use JSONFormatElements to validate JSON, so I should be a little more wary of the performance implications of that.

A better way to validate JSON, IMHO, would be to use an actual JSON API.

Below is an example.

Method: if you can create the JSON object, cool; otherwise, the code's local error handling throws an exception and you know you have a problem.


Note that the actual error returned from the JSON API is displayed.

This technique is free and easy.
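For the FileMaker side, the call would just be an ordinary Insert from URL. This is only a rough sketch; the endpoint URL, port, and response shape below are entirely hypothetical:

# Hypothetical validation endpoint – URL, port, and response format are made up for illustration
# $json holds the JSON to check
Insert from URL [ Select ; With dialog: Off ; Target: $response ;
	"http://localhost:8080/validate" ;
	cURL options: "-X POST --header \"Content-Type: application/json\" --data @$json" ]
# $response would carry the parser's error message (or an OK/empty body) for your script to inspect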

I'd be happy to share more info on this technique if it sounds like something you'd consider implementing.

Happy Computing :slight_smile:

To validate that you have valid JSON, there is an easy way to do it. Just ask for anything from the JSON, even a key that isn't in the JSON. So something like this:

Let ( [
	// "text" holds the JSON to check; the path can be any key, even one that doesn't exist
	getElement = JSONGetElement ( text ; "anyTextWorksHere" ) ;
	// parse errors come back as text starting with "? *"
	leftThree = Left ( getElement ; 3 ) ;
	result = leftThree <> "? *"
] ;
	result
)

As long as you don't get an error result, it is valid JSON.

That's pretty un-intuitive logic.

Using your approach (I might not be using your "Let" correctly, or I may have other FMP-required JSON formatting wrong), the intentionally invalid JSON below gave me a result of zero in the Data Viewer:

I removed the example since this forum was stripping the "\" characters! I had to double them up just to post here, further complicating the JSON formatting.


In any case, my approach using a real JSON API takes the guesswork out of what's wrong -- not just giving an error, but an error applicable to the problem at hand.

    {"name" "applicant Name",
   "Age":"26",
   "academics":{  
      "college":"80",
      "inter":"67",
      "matriculation":"89"
   },
   "skill":{  
      "computer":"c,c++,java",
      "maths":"limit,permutation,statistics"
   },
   "dateOfBirth":"09-07-1988"
}

In this case:
"JSON Object Validation Failed. Expected a ':' after a key at 9 [character 10 line 1]"

HTH

@anon45965781 I think the rest of the string after the first 3 characters will give you more details about the error itself. But @jormond's calculation can let your code know whether there is an error or not. Once you know an error is present, your code can then implement error handling any way you prefer.
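For example, something along these lines (a rough sketch, with illustrative names) returns the parser's message when there is one and an empty string otherwise:

Let ( [
	// any path works here; parse errors come back as text starting with "? "
	check = JSONGetElement ( $json ; "anyKey" ) ;
	isError = Left ( check ; 1 ) = "?"
] ;
	If ( isError ; Middle ( check ; 3 ; Length ( check ) ) ; "" )	// drop the "? " prefix, keep the detail
)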

If you rely on an API for something like this, can you be sure the API will always be available? I like the idea if it provides something extra, but for the initial check of whether there is an error or not, my preference would be to keep things as close to native as possible.

1 Like

Interesting, as validation was the original reason for this post.

As per Bobino, we prefer to keep things as native as possible, but I do have sympathy for what fmdude is saying: it is possible to have well-formed JSON and still get your checks wrong.

Josh’s example seems OK, but we’ve experimented with other variations and it is possible to receive false positive or negative results depending on how the check is being made.

I guess we could put a $$DEBUG flag in to call JSONFormatElements during development, which would be skipped during live execution.
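Something like this, maybe (the $$DEBUG flag and variable names are just illustrative), so we only pay for JSONFormatElements while the flag is on:

# Validate only while the global debug flag is on; skip the expensive call in production
If [ $$DEBUG = 1 ]
	Set Variable [ $check ; Value: JSONFormatElements ( $json ) ]
	If [ Left ( $check ; 1 ) = "?" ]
		# malformed JSON – log or report the error here
	End If
End If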

1 Like

Sure, the API is running in the micro-service -- it's always available as long as AWS doesn't crash, or, if you're running the service locally, your computer doesn't crash.

The service approach is available to any HTTP client also, not just FMP. :slight_smile:

The service approach also works the same way for validating XML.

The service code is "YOUR" code, not a third party's. So you'd better not like plug-ins, either. :slight_smile: LOL

I get that. We go to plugins, micro services, JavaScript when we have to. However, if we can keep it within FileMaker we will :+1: :grinning:

1 Like

Thanks Andy,

Here's a quick video (10 seconds) showing how fast it is to validate the huge COVID JSONArray (642,098 lines, 54 fields):

(also shows validating a regular JSON object)

The other issue is code readability and performance.
JSONValidityPeformance.mp4.zip (1.1 MB)

Most of our frameworks (at D-Cogit) pass parameters using JSON strings. Our error-handling framework uses JSONFormatElements as one of its parameter validation techniques. We made a decision, when we created this framework, to accept the performance hit in favour of error-checking rigour, for two reasons. First, developers can skip the validation calls when performance matters for a script or function. Second, we normally separate recursive scripts and functions into two parts: the first is public, non-recursive, and handles validation and initialization; the second is private, recursive, and avoids validation and initialization.
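As a rough illustration of that split (the function names are hypothetical, and the sketch assumes the parameter is a JSON array): the public custom function validates once, then hands off to a private recursive worker that never re-validates.

// Public entry point – validate the parameter once, then delegate
// BuildList ( json ) =
If ( Left ( JSONFormatElements ( json ) ; 1 ) = "?" ;
	"?" ;	// invalid JSON: report the problem to the caller
	BuildList_private ( json ; 0 ; ValueCount ( JSONListKeys ( json ; "" ) ) )
)

// Private recursive worker – no validation on any iteration
// BuildList_private ( json ; i ; count ) =
If ( i ≥ count ;
	"" ;
	JSONGetElement ( json ; "[" & i & "]" ) & ¶ &
	BuildList_private ( json ; i + 1 ; count )
)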

1 Like