I have a join table (SUBSCRIPTIONS) between CUSTOMERS and SERIES, and a card window with a portal as a picker for the Series. Currently I'm using Series::g_title with an OnObjectModify script trigger that performs a find after each character is entered. After the user selects a series, I run a script to create a record in the join table.
This works, but I have 10K records in the Series table. Therefore, I’d like to experiment with a portal filter rather than a find to see how that affects performance. However, the card window is based on the SERIES table occurrence and therefore I can’t filter the portal.
Suggestions for next steps to create a SERIES picker that allows a portal filter?
+1 to @jwilling's approach of performing the find only after the user has paused typing for a fraction of a second. I'd suggest adapting your current setup to try this first, before moving toward a filtered portal. I've found that configuration (i.e. card window, found-set portal, debounce, Perform Find) to be a good one.
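In rough FileMaker script steps, that debounce pattern might look like this (the object and script names are placeholders, not from your file):

```
# OnObjectModify trigger on the global search field:
Install OnTimer Script [ "Run Search" ; Interval: .5 ]
# Each keystroke re-installs the timer, so "Run Search" only fires
# once the user has paused for half a second.

# "Run Search" script:
Install OnTimer Script [ ]                  # no script/interval: cancels the timer
Enter Find Mode [ Pause: Off ]
Set Field [ SERIES::title ; SERIES::g_title & "*" ]
Perform Find [ ]
Go to Object [ Object Name: "search_field" ]
```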
I don't know if this has changed in more recent versions of FMP, but it used to be that using a portal filter came with the cost of downloading all of the portal's record data from the server to the client in order to evaluate the filter. I always avoided this, since that record download can add a lot of overhead. The exception might be if you know that all of that record data has already been downloaded to the client cache and doesn't need to be re-pulled from the server, but I'd consider that at least somewhat unlikely.
I like @steve_ssh's idea to try adapting this to Perform Find instead of jumping straight to portal filter.
One extra thing you'll have to do with Perform Find is record the search field's current cursor position and selection size, so that you can properly restore the user's cursor after the find. But you may end up with a more performant result.
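As a sketch, that cursor bookkeeping can use Get ( ActiveSelectionStart ) and Get ( ActiveSelectionSize ) around the find (the object name is a placeholder):

```
# Before the find: capture the cursor state while the search field is active
Set Variable [ $pos ; Value: Get ( ActiveSelectionStart ) ]
Set Variable [ $len ; Value: Get ( ActiveSelectionSize ) ]

# ... Perform Find steps here ...

# After the find: return focus to the field and restore the selection
Go to Object [ Object Name: "search_field" ]
Set Selection [ Start Position: $pos ; End Position: $pos + $len - 1 ]
```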
Get the best of both worlds. Use a portal based on the current table. You are able to use a scripted quick find, or a standard find. The portal contents are automatically modified to display the current found set and it provides an automatic picking mechanism, in that a click on a portal row takes you to that record without needing any scripting on your part.
Depending on your needs, this action may be beneficial or simply a side-effect. I can imagine lots of ways in which a few script triggers could be utilised, for instance to switch layouts or to provide the UI the user needs to continue their work.
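For instance, an OnObjectEnter trigger on the portal row could both create the join record and close the picker. This is just a sketch; the script, field, and variable names are assumptions:

```
# Hypothetical "Pick Series" script, run by an OnObjectEnter trigger on the portal row.
# The clicked row is already the active record, so its fields are directly available.
Set Variable [ $seriesID ; Value: SERIES::id ]
Go to Layout [ "Subscriptions" ]
New Record/Request
Set Field [ SUBSCRIPTIONS::customer_id ; $$customerID ]   # assumes the customer ID was stashed in a global variable
Set Field [ SUBSCRIPTIONS::series_id ; $seriesID ]
Commit Records/Requests [ With dialog: Off ]
Close Window [ Current Window ]
```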
I’ve experimented with both a scripted find and a portal filter. For now, I’ll stick with the filter. Next step is to delete the extraneous records in the SERIES table (I’m only using about 500 of the 10K records) to reduce the load. Then, I’ll create an On Timer script as suggested by @jwilling. I found an old blog post from @weetbicks discussing that technique.
I’ve discovered an issue: on the filter field, I’ve set an OnObjectModify trigger to run an OnTimer script that calls the Search script. The Search script has the steps: (1) Commit Records/Requests, (2) Refresh Window [Flush cached join results], and (3) Go to Object (to place the cursor back in the search field). However, the Search script runs repeatedly even though the search field is not being modified, causing the search field above the portal to flash. Has anybody run across this before? (Edit: I’m using Win 10.)
To make the OnTimer run only once, the script that the timer installs must itself call Install OnTimer Script with an empty interval; doing this turns off the timer. See my example file.
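In other words, the first step of the installed script cancels its own timer (sketch):

```
# "Search" script, installed by the OnTimer:
Install OnTimer Script [ ]    # empty script and interval: clears the timer, so this runs once
# ... the actual search steps follow ...
```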
I think your Search script sounds a bit "heavy". I doubt you need the commit or the Refresh Window (especially with Flush cached join results). Please take a look at the second file I added, ~~which performs a find and restores the cursor~~. It uses Refresh Portal to refilter.
EDIT: I didn't read your last post closely enough about using portal filter instead of find. I crossed out the irrelevant part above.
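A slimmed-down version of the Search script, assuming the portal filter calculation reads the global search field directly (object names are placeholders):

```
# "Search" script for the portal-filter approach:
Install OnTimer Script [ ]                        # cancel the debounce timer
Refresh Portal [ Object Name: "series_portal" ]   # re-evaluates the filter calc; no commit or cache flush needed
Go to Object [ Object Name: "search_field" ]      # put the cursor back in the search field
```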
I'd especially discourage using Refresh Window with the flush cache option enabled. I'd reach for Refresh Portal first.
Refresh Window with a flush cache will have the desired effect of getting the portal to refresh, but clearing the cache comes with a side effect that can really hurt performance: as the user works within the solution, they may now have to re-download information from the server that was previously available to them in the local client cache.
(I am making the assumption here that the solution is hosted. If the solution is not hosted, i.e. the file just lives on the local machine, then I can't really say whether or not flushing the cache would have such a negative effect.)
I've implemented a few card picker windows using portal filtering. Initially, to try to have a single, multi-purpose, reusable layout, I created only a Cartesian relationship and then relied 100% on the filters. As already pointed out, this ended up causing massive performance issues beyond a few thousand records (and crashes for Windows users).
I ended up needing to add more criteria, either hard-coded in the relationship or by setting more globals, to make them usable. Portal filtering can be awesome, but only if it's limited by a good relationship that preemptively narrows the related records.
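As an illustration of that combination (all field names here are hypothetical): let the relationship pre-filter on an indexed match field, and let the portal filter only refine what's left:

```
# Relationship: narrows the related set server-side via an indexed match field
#   CUSTOMERS::g_series_type  =  SERIES_picker::type

# Portal filter calculation: refines within the already-narrowed related set
PatternCount ( SERIES_picker::title ; CUSTOMERS::g_search_text ) > 0
```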