I'm continuing on my optimization quest. I have a database hosted by a provider, and I want to go into the screens users hit most often and narrow down which parts of each screen significantly affect page load time. So I made two layouts: one called "Control", which starts as an exact copy of my original layout, and one called "Streamlined".
I then took a bunch of fields out of the "Streamlined" version of the interface and wrote a script that cycled through 50 records on both Streamlined and Control, timing each.
I pretty quickly found a single field that accounted for about 40% of the load time, and removed it.
Cool beans.
But the problem is caching. The server caches, so the second run is always faster than the first. Because of that, I actually run through everything three times: the first pass just to make sure the data is cached, and the next two to actually gather timings.
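For what it's worth, discarding the first pass as a warm-up is the standard way to handle caching in benchmarks. Here's a minimal sketch of that idea outside FileMaker, in Python, with a hypothetical `load` function standing in for whatever actually opens each record on a layout:

```python
import time

def benchmark(load, record_ids, passes=3):
    """Time load(record_id) over several passes, discarding the first
    pass as a cache warm-up. Returns average seconds per measured pass."""
    timings = []
    for p in range(passes):
        start = time.perf_counter()
        for rid in record_ids:
            load(rid)  # hypothetical stand-in for loading the record/layout
        elapsed = time.perf_counter() - start
        if p > 0:  # first pass only warms the cache; don't count it
            timings.append(elapsed)
    return sum(timings) / len(timings)
```

Running this once against each layout and comparing the two averages is the same Control-vs-Streamlined comparison, just with the warm-up baked in.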
Is there some way I could run these tests with caching turned off on the server, to basically force a fresh load each time (other than restarting the server before each test)? Relatedly, I want to make sure the first person to open each record doesn't get a super long load time if I can prevent it.
Another idea I had: since we only have around 250 "high priority" records which (along with their related data) account for around 95% of our data calls, I wondered whether there's value in a script that runs on the server every morning, or even a few times a day, and basically searches for and displays those 250 records and their most-viewed related data, to force FileMaker to cache that data in advance?
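If the warm-up route is worth it, one way to drive it from outside FileMaker would be a small scheduled job that fetches each high-priority record so the server reads it into cache. A rough sketch, with the fetch step abstracted out; the commented-out fetch shows roughly how it might hit the FileMaker Data API, but the server name, database, layout, and token handling are all placeholders, not tested details:

```python
def warm_cache(record_ids, fetch):
    """Fetch each record once so the server caches it, collecting any
    ids that failed so a scheduled job can log or retry them."""
    failed = []
    for rid in record_ids:
        try:
            fetch(rid)
        except Exception:
            failed.append(rid)
    return failed

# Hypothetical fetch against the FileMaker Data API (paths/names are assumptions):
# import requests
# def fetch(rid):
#     url = (f"https://myserver/fmi/data/vLatest/databases/MyDB"
#            f"/layouts/Streamlined/records/{rid}")
#     requests.get(url, headers={"Authorization": f"Bearer {token}"},
#                  timeout=30).raise_for_status()
```

A server-side FileMaker script scheduled in the admin console would accomplish the same thing without any external tooling; the point either way is just "touch the 250 records before users do."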
Thoughts on either of these?