Need Help with Optimizing FileMaker Script Performance

Hello everyone! :wave:

I'm currently working on a FileMaker solution, and I'm running into performance issues with one of my scripts. The script is responsible for importing large sets of data and performing a series of calculations, but it's running quite slowly, especially when dealing with larger datasets.

I’ve already made sure to use efficient relationships and indexes where possible, but the performance still isn’t where it needs to be. I’m wondering if there are any specific strategies or best practices for optimizing script performance in FileMaker, especially when handling bulk data processing.

I also checked this thread: https://the.fmsoup.org/t/how-filemaker-performscript-triggers-scripdevopscertification but I didn't find a solution there. Could anyone guide me on this? If anyone has insights or tips, or could point me toward any resources, I'd really appreciate it!

Thanks in advance for your help!

Respected community member! :blush:

There are a LOT of possibilities as to what might be causing the performance issues. Some more information would be really helpful.

Imports are one of the very fastest processes in FileMaker. BUT a lot can happen after the fact that can degrade performance.

A few things to look at:
Are there a number of calculations that occur as part of the import (or post-import)? If those calcs are unstored, AND you are displaying them on screen, that will really drag down performance (there are some great techniques for eliminating unstored calcs). Calculations typically become unstored because they use a global field or reference data in a related table. An FM relationship graph is, effectively, a query, so every relationship in the table occurrence group (TOG) gets evaluated, which can be a real drag.
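To illustrate (the table and field names below are invented for the example), an unstored calc that reaches across a relationship re-evaluates every time it draws on screen, whereas a plain stored field set once by the import script only gets written a single time:

```
// Unstored calculation field on Invoices — walks the relationship on every screen draw:
Sum ( InvoiceLines::LineTotal )

// Faster pattern: a regular (stored, indexable) number field,
// set once by a script step at the end of the import:
Set Field [ Invoices::LineTotalCache ; Sum ( InvoiceLines::LineTotal ) ]
```

After that, layouts and finds use the stored `LineTotalCache` field, and the relationship is only evaluated during the import itself.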

EVERY record processed with client-side code must first be downloaded to the client, along with ALL the fields in that table, and then processed locally.

If this is where the performance hit is, you might (as a test) move the entire DB locally and run it, and see if the performance hit remains. If NOT, then consider moving the work into server-side scripts, since they do NOT require the network traffic of moving the data from the server to the client and back.
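A minimal sketch of handing that work to the server (the script name "Process Imported Records" is a placeholder): the heavy loop runs next to the data, and only the result comes back:

```
# Client side: run the heavy routine on the server instead of locally
Perform Script on Server [ Wait for completion: On ; "Process Imported Records" ]
Set Variable [ $result ; Value: Get ( ScriptResult ) ]
```

With "Wait for completion: On" the client blocks until the server script finishes and its result is available via Get ( ScriptResult ); with it off, the call is fire-and-forget.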

Also, are you running any SQL post import? SQL in FM has an issue where an open record will pause the SQL until that record lock is resolved.
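For example, a post-import query like the hypothetical one below can stall while an uncommitted record in the target table holds a lock, so it can help to commit your own open records first:

```
Commit Records/Requests [ With dialog: Off ]
Set Variable [ $unpaidCount ; Value:
	ExecuteSQL ( "SELECT COUNT(*) FROM Invoices WHERE Status = ?" ; "" ; "" ; "Unpaid" ) ]
```

The table and field names are invented; the point is to commit before the ExecuteSQL call rather than running it mid-transaction.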

There are a zillion other places that performance can be degraded, but more info is needed to be of further assistance.

Oh, and Welcome to the Soup !!!


IDK how many steps are in your script, but here's my standard way to keep an eye on speed:

I have a table called :stopwatch:SpeedTests with the following fields:

TestName - usually the script name plus the step in the script, e.g. DataImport_Step1
TestTime
Avg_s - an average of TestTime
Notes - rarely used, but in your case it could be very useful. You could (for example) add a count of found records here if you think that might be part of the problem, or add a username. You could then ask a tech-savvy member of your team to send you a screenshot whenever they get a beachball; you'll see the clock on their screen, can cross-reference the speed logs, and see what's different.
ApplicationVersion - to see if (for example) staff having problems are using an old version of FileMaker, or in some cases a newer version. I've definitely had cases where the older version is faster.

Then I have a script called :stopwatch:SpeedTest_RecordTime_JSON which looks like this:

# "TestName=Startup;TestTime="&Get(CurrentTimeUTCMilliseconds)-$StartTime
# JSONSetElement (
""
; ["TestTime" ; GetAsText ( Get(CurrentTimeUTCMilliseconds)-$StartTime ) ; JSONString]
; ["Notes" ; $Notes ; JSONString]
; ["TestName" ; "Optimize 2024 Detail" ; JSONString]
)

Set Variable [ $error ; Value: JSON_Create_Variables ( Get(ScriptParameter) ; "" ) ]
Set Variable [ $TestTime ; Value: GetAsTime ( $TestTime ) ]
New Window [ Style: Document ; Using layout: “⏱️SpeedTests” (⏱️SpeedTests) ]
New Record/Request
Set Field [ ⏱️SpeedTests::TestName ; $TestName ]
Set Field [ ⏱️SpeedTests::TestTime ; $TestTime ]
Set Field [ ⏱️SpeedTests::Notes ; $Notes ]

Enter Find Mode [ Pause: Off ]
Set Field [ ⏱️SpeedTests::TestName ; $TestName ]
Perform Find []
Sort Records [ Restore ; With dialog: Off ]
Go to Record/Request/Page [ First ]

Then in any scripts I'm dealing with I grab the time in a few places:
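In practice that looks something like this (the pattern: stamp $StartTime, do the work, then hand the delta to the logging script from above; the "DataImport_Step1" name is just an example):

```
Set Variable [ $StartTime ; Value: Get ( CurrentTimeUTCMilliseconds ) ]
# ... the import / calculation steps being measured ...
Perform Script [ "⏱️SpeedTest_RecordTime_JSON" ;
	Parameter: JSONSetElement ( "" ;
		[ "TestName" ; "DataImport_Step1" ; JSONString ] ;
		[ "TestTime" ; GetAsText ( Get ( CurrentTimeUTCMilliseconds ) - $StartTime ) ; JSONString ] ) ]
```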

When in Debug mode I then have a link on the home page which lets me see the results:

The other thing I do is test whether anything on the screen itself is causing problems.

I have a button hidden so only people with $$DebugMode set can see it.

I duplicate the layout and then click on the button. The button finds a predetermined subset of records (in this case "currently active students") and then goes to record 100, and then goes to next record 10 times, refreshing the window and committing records in between and logs the time it takes to do this using the script above.

I then remove half of whatever is on the screen and run it again and see what changes.

Usually, if a screen is slow to load I can quickly find 1-3 fields on my screen which make up about 30-50% of my load time and I can find a way to fix that. Often I'll run an overnight script which manually caches joined data, or calculations or summaries.
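A rough sketch of that overnight caching idea (all names here are invented): a server-scheduled script that writes the slow summary into a plain stored field, so the layout reads the cache instead of evaluating the live calc:

```
# Scheduled on FileMaker Server, e.g. nightly
Go to Layout [ "Students_Admin" ]
Show All Records
Replace Field Contents [ With dialog: Off ;
	Students::GPA_Cached ; Replace with calculation: Average ( Grades::Score ) ]
```

The trade-off is that the cached value is up to a day stale, which is usually fine for dashboard-style summaries.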

Another thing I've done in the past, although I don't have the code handy for it: I set up a ping test for users, basically my own speed test, and recorded the results alongside the speed log. I was having a problem at one particular school with a horrible IT guy who throttled each WiFi hotspot to 40 Mbps, so when all the students got onto their Chromebooks the teachers were seeing something like 2 Mbps and complaining about beachballs. The problem only went away when the IT director finally got fired and the new person listened to me and fixed the configuration of the WiFi hotspots. There was literally NOTHING I could do about it on the FileMaker side, but the ping tests helped me document when the problem was happening.
