Welcome @apjcs!
Unfortunately there is not a single bit of vendor documentation about the scope of sort.
We can only make educated guesses and ponder workarounds. Not ideal.
The testing session showed that sort has no discernible scope or limit: either the found set is sorted and looping over a large found set slows to a crawl, or it is not sorted at all.
Torsten,
Thanks for your reply. If I were faced with this problem, I would eliminate the layout switching to establish whether it is a factor. I would almost always avoid switching back and forth between layouts if possible. Of course I don't know what your actual scenario is, but here's what I would try...
Instead of actually going to the target layout and creating a record for each row, I suggest adding the data to a variable array instead.**
Insert Calculated Result [ Target: $arrayJSON ; JSONSetElement ( "[]"
; [ "0" ; $Text ; JSONString ]
; [ "1" ; $Order ; JSONString ]
) & ¶ ]
I would then run with the TimeLog script inactive and compare the overall time (or put the TimeLog data into a similar array list).
After I had gathered all the data I would switch to the target layout and run a second loop to create all the records from the variable array (perhaps collecting times and adding them to a second TimeLog array).
Finally switch to the TimeLog layout and loop again to create the TimeLog records using the two log arrays.
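Putting those pieces together, here is a minimal sketch of the two-loop approach. The layout and field names (Source, Target, Text, Order) are placeholders, not from the attached test file, and the TimeLog steps are omitted for brevity:

```
# Loop 1: stay on the sorted source layout; collect everything into a variable
Go to Record/Request/Page [ First ]
Loop
  Set Variable [ $arrayJSON ; Value:
    $arrayJSON & JSONSetElement ( "[]" ;
      [ "[0]" ; Source::Text ; JSONString ] ;
      [ "[1]" ; Source::Order ; JSONString ]
    ) & ¶ ]
  Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop

# Loop 2: switch context once, then create all target records in one batch
Go to Layout [ "Target" ]
Set Variable [ $i ; Value: 0 ]
Loop
  Exit Loop If [ Let ( $i = $i + 1 ; $i > ValueCount ( $arrayJSON ) ) ]
  Set Variable [ $row ; Value: GetValue ( $arrayJSON ; $i ) ]
  New Record/Request
  Set Field [ Target::Text ; JSONGetElement ( $row ; "[0]" ) ]
  Set Field [ Target::Order ; JSONGetElement ( $row ; "[1]" ) ]
End Loop
```

The key point is that the sorted found set is only ever walked in place; all context/layout changes happen exactly once, between the two loops.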
**See this post at FileMaker Hacks for more info on this kind of array: Virtual List Simplified – FileMakerHacks
@apjcs, I tested the impact of layout switches on performance. It appears that it is negligible, compared to the effect of sort.
@apjcs, there was discussion about whether switching layouts and/or record creation was responsible.
Your guess about the need to test for sort state is interesting. It could be involved - there is certainly something going on. I haven't tested this using a second window and won't have time to look at it today.
I also wonder whether forcing an unsorted state, then processing, then sorting after the looping is finished would help. I've done a lot of sorting-then-looping and have never seen this issue.
Off-topic for this thread, but I've just discovered that the purpose of the down arrow in quotations is to reveal the original post in its full state, including graphics.
What a great feature.
There are many cases where a loop over sorted records is mandatory. Looping over unsorted records wouldn't make sense.
Comes in handy 
Did you test with the sample file attached to the OP or the one @Malcolm provided in a post here?
Right, but we are talking about diagnosis and not use-case. In testing and demolition, nothing is off limits.
That's where the issue was raised (see posts above). Coincidentally, the order of creation is identical to the sort order in this particular case.
Regarding really pinpointing the cause of the issue (e.g. sort state, context changes, etc.):
- We have seen that a sorted found set in conjunction with context/layout changes exhibits the slowdown.
- We have seen that an unsorted found set in conjunction with context/layout changes does not exhibit the slowdown.
- I, personally, have not tested whether or not a sorted found set on its own (without any context/layout changes) exhibits the slowdown.
As such:
- I am content with concluding that the combination of the sorted state along with the context/layout change can cause a dramatic slowdown.
- I'd need to see results from a test of a sorted state without any context/layout change before I would assert that the sort state alone is responsible for the slowdown.
In hindsight:
In an earlier post I attributed the slowdown to "the sorted state of the found set" without any additional clarification. I wish I had been more precise and attributed it to "the combination of the sorted state along with the layout/context changes". That is what I feel I really observed; my words at the time were, unfortunately, lacking in precision. (Sorry!)
As more data comes in documenting the behavior of a loop over a sorted found set with no context/layout changes, I expect we can be clearer and more certain about whether the slowdown is caused by the combination of factors or by the sort state alone.
Guessing:
My gut guess has been that the slowdown is as @apjcs suggested, i.e. that there is overhead in FMP checking whether its large found set is still in proper sorted order, and moreover that this check may be triggered by a context change.
One of the reasons I feel this way, and something that led me to home in on the sort step and sort state, is that while testing I once saw a dialog flash briefly on screen indicating that FMP needed to finish sorting some records. It appeared for only a split second, and it's the type of thing I would normally ignore if it doesn't get in the way, but at the time it felt strange because I saw no reason for FMP to be sorting anything, having already sorted the found set earlier in the script. Additionally, the fact that FMP opted to show a dialog about the sort in progress at all (again, it was extremely brief) suggested to me that whatever sorting it was doing was not light work.
HTH & sincerely,
-steve
Hi @jormond,
Indeed, I believe that, when feasible, doing as you described would help a lot.
One of the reasons this topic really caught my eye is because I think many/most of us have done plenty of scripting of loops over sorted found sets, and have generally found the performance adequate for the particular task at hand.
Seeing what was happening here was a real stunner for me.
My hope is that we can collectively define a specific and accurate recipe for how such a scenario occurs, so that we can all keep that in the back of our mind as we architect things.
Some take-aways for me so far:
- Even though I'm not usually a "process in new window" type of developer, I'd like to find out if that would help.
- Also, @apjcs's idea of collecting all the data in a variable first, and then creating records in a batch, is something I'd like to file away as a possible option.
More that I'd like to know:
Something that would be nice to know is whether this type of degradation only happens when the found count reaches some threshold number of records, or if it always exists from the get-go and increases (linearly? non-linearly?) as the found count grows...
Thanks, Josh,
-steve
So I modified the script to collect all the data in a single variable array and then use a second loop to create all the target records in one batch.
- Just looping over the records but not creating the target records: 8 seconds
- add in building the data array variable: 9 seconds
- second loop to create all 10,000 target records: 28 seconds
The times quoted above are the total time for the whole script.
- Original script: ~900 seconds
I think this is another pointer to the fact that returning to a sorted layout many, many times is a significant factor. All tests were run locally.
Seeing it this way feels like what I would expect. As another fun test, you could use auto-enter calculations to parse the values, so that when you create the record it automatically picks up the value without any Set Field steps.
I would expect that to be even a little faster. Even if the use-case is a little sparse.
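For reference, the auto-enter variant described above could be sketched roughly like this. The `$$row` global variable and the field names are assumptions for illustration, not taken from the test file; each target field would carry an auto-enter calculation (with "Do not replace existing value" unchecked) such as:

```
// Auto-enter calculation on Target::Text
JSONGetElement ( $$row ; "[0]" )

// Auto-enter calculation on Target::Order
JSONGetElement ( $$row ; "[1]" )
```

The creation loop then only needs to set `$$row` to the next array element and call New Record/Request; the fields populate themselves on record creation.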
Thank you @apjcs, that's an excellent example of how to solve the same problem in a different way.
Your solution has the advantage of honouring the need for a sorted source record set.
I am grateful for the inspiring exchange of ideas that is happening here.
Using auto-enter instead of Set Field does not seem to make any noticeable difference in this case. It is sometimes a second faster and sometimes a second slower; I guess it depends on what else is running on my Mac at the time... (and whether I get bored and read a couple of emails while I am waiting)
Perhaps if there were significantly more fields...
Great info. Thanks for testing that so quickly. I'm old, I went to bed. LOL
