Let's take a deep dive and explore Performance

I think it is important to speak up. The best example: when the new SortValues function was introduced in FM 16v1, it was slower than a naive home-brewed bubble-sort custom function. The community's complaints got it addressed in 16v2. And finally, 19.2.2 (silently) fixed the preservation of value formatting...

2 Likes

I think FileMaker is really incredible at some things and less so at others. Like any product, there are compromises. However, with FileMaker you can extend it so many ways -- a plug-in, REST service, etc., that there is usually a way to overcome a given limitation of the product out of the box, so to speak.
Yet extending FileMaker, which sounds great, comes with another cost, since you have to either pay for plug-ins and learn to use them or pay for a developer to extend FileMaker in other ways. Extending FileMaker adds a dependency, but it may fix a product hole or performance issue.
In the end, it all comes down to matching the product to the need and the requirements (that includes, of course, non-functional requirements like performance).
FileMaker is incredible at what it does!
I would be lost without it!!!! :slight_smile:

2 Likes

The problem is that, instead of making this a valuable conversation about achieving better performance, @Vincent_L is going to make it about mud-slinging, and we will lose the value of the conversation.

This is the reason why @WimDecorte left this forum... And that didn't help with the goal that was the original vision of this space.

3 Likes

Let's focus on the positive/constructive comments. :slight_smile:

4 Likes

Absolutely, as I said:

If the product has some extra weight that is slowing us down and can easily be shaved off, let's lose it, by all means.

I see value in what @Vincent_L brings to the discussion, and I am sure things can be improved. But there is something about the tone that did not sound right. That's all I had to say. Trying to echo @OliverBarrett.

2 Likes

No, it's not dishonest.

I'm saying that FileMaker performance is mostly terrible.
Then Nick wants to show that it's not the case by showing his gallery.
People are astonished by the performance he gets. That astonishment is precisely the point: if FileMaker performance weren't mostly terrible, and if it were mostly good enough, people wouldn't be amazed at the "good enough" performance Nick got.
My comparison with other tools wasn't to diss Nick's solution, but to put into perspective the superlatives "so fast" and "mind-blowing" that were expressed.

So Nick demonstrated that to get "good enough" performance, you have to pull off quite an unusual and astonishing feat.

So "good enough" performance is not the result of using FileMaker the FileMaker way; on the contrary, you have to optimize it to death, so much so that it astonishes FileMaker pundits.

And that's precisely the issue: good enough performance, not even speed-demon performance, should be obtainable using the default way of working with a RAD. If not, the RAD is just inefficient.

This reminds me of when I was a kid and got Sorcery, a new game for my Amstrad CPC 464. It had "amazing" graphics, 320x160 pixels, and instead of the normal 32 colors the dev managed to get 512 colors. It was a beauty. I showed it to my mother, and she told me nicely that it wasn't that good. And in retrospect, you can google it, it looks like crap. But because I was passionate, and fully aware of my computer's limitations, I raved about something that was absolutely crappy but relatively awesome.
But in the end, what counts is the absolute result, not the relative one.

The FileMaker community, and Claris, are engulfed in that relativity, in that distortion field fueled by our passion, but external people don't see things that way, and neither does the job to be done.

It's time for a reality check

Diversity is not just a hyped word. We all must live it. That means that dissenters have a place, too. @WimDecorte made his decision, fair enough. Story of the past.

I've temporarily set this thread to slow mode. It's amazing, and there are tons of contributions, and I want everyone to be able to take the time to appreciate each point brought forth. If possible, as suggested by @steve_ssh, let's draw up a list of sub-topics to create various working groups about the various aspects of performance, which is definitely a topic dear to our hearts!!!

2 Likes

The discussion about the speed of the images was not about "good enough". The discussion was about decent to great performance across 4,000 miles. Light only travels so fast. So speed typically comes from compressing that data and getting more across the wire in a shorter time.

Speed in this instance is slowed down by the number of round trips a data call needs to make. The fewer the round trips, the less you are limited by the travel distance.

The astonishment is because, typically, you can only achieve speeds like Nick is demonstrating by having a CDN in the region of the client machine.

2 Likes

Several interesting points to be discussed. Now please, let us focus on how each of these can be addressed. Please engage in the mission of this thread rather than argue your opinion about Claris's failures in these areas. Failure or not, in the here and now, we are all FileMaker developers trying to use the tool provided in the best way to serve our clients. Constructive conversation about Claris policies and their product development strategies can be held in the appropriate channel. This thread is for working out how WE can help orient development by identifying the key points. Kindly, Cécile

3 Likes

P.S. to my previous post:

I agree that @Vincent_L's emphasis on telling what Claris did wrong does not forward the topic we are actually discussing.

3 Likes

I note that @Vincent_L wants to use the performance that I have demonstrated as a means of arguing that FileMaker performs poorly. I understand the disappointment that he is suffering as a result of the poor performance of his spider-graph solution.

I have been there too.

When I started in 1997 I made all the mistakes it is possible to make. Seduced by the ease of getting started and a complete lack of rules, I too created a spider's web of relationships. I experienced slowness, great slowness: it took 5 minutes to close down a 5-file solution. Eventually I paid Ray Cologon to explain what was going on, and he showed me how the relationships between tables in different files were creating webs of complexity, so that a closing file would open another file which had just been closed, etc. This sort of thing is easily done.

I looked around for a better alternative; that was part of the reason I travelled in the US during 2011 and 2012. I visited and assessed every possible option. I eventually concluded that FileMaker was the best option; I just had to learn how to use it more efficiently.

My background is in racing sailing boats and in the law.

Both require rigour, discipline and lots of hard work to be successful.

So I started to apply that sort of approach to FileMaker.

FileMaker became, for me, a brilliant means of creating tools to solve problems.

With experience I have learned what works well and what does not work so well. I have extended that experience by experimenting and testing to find the best means of getting what I want.

My preference as I have grown older is for greater simplicity. So I will make something work and then iteratively try to find ways of improving it. Sometimes I end up deleting stuff I spent days on when I have found a better way. It's a bit like tidying your home or your office, or even your desk.

Every FileMaker solution that has speed challenges can be improved, and there are only two basic methods:

  1. take Wim's advice - which is generally the best advice for a well-resourced business or organisation - and buy lots of horsepower, and/or
  2. improve your design in order to ask the machine to do less.

If neither suits you then maybe rebuilding your system in another data management environment may be your best option? However, I would bet that it would generally be significantly easier to spend your time improving your existing FileMaker system, and/or maybe paying someone to help you, than to rebuild in a new environment.

I coach people on performance. I recently started doing this with a European business owner whose successful company relied entirely on what he has built over the past 20 years. I think it has 65 files. He has a great deal of knowledge having spent so much time on this over the years.

I am helping him to start to look at what he is doing in a different way, to think about what he is asking the machine to do and to see where some of the newer FileMaker features of which he is unaware can help him make his system simpler.

I am not doing anything directly for him; he knows his system best, and it would take a great deal of time for me to understand it. But with me helping him to understand more about what FileMaker can do, what a modern solution should include, and examples of other ways of doing things, he can then start to develop a realistic method of updating his systems, getting some guidance when he needs it.

The most difficult thing is looking at a problem on your own. Often, just being able to talk it through will result in new routes to a solution forming.

It is worth noting that in the US it seems to be becoming easier for FileMaker Developers to recruit young Computer Science Grads because of the way that integration with other technologies is now being actively encouraged under Brad Freitag's leadership. Some developers in the US are apparently finding their market is expanding, or so I am told. Joining one or two user groups in the US for their online meeting can provide a different perspective to the European one.

So I think that the future is bright. We have the immense benefit of a method of problem solving that we all - or most of us - love, simply because we know it does the job and also know that there is really no viable alternative. It is rare, I suspect, to find a software platform whose users are as passionate about it as FileMaker's are. There has to be a reason for that, I would suggest.

Claris has now moved from a waterfall to an agile approach to development. So everything has changed, fundamentally. The big opportunity this gives us now is an ability to persuade Claris to take on and solve important issues in months rather than 3 years. So anyone who really has an urgent issue that they can justify as being commercially important and that they can explain clearly in a reproducible manner should reach out to Claris and ask for a solution. I am confident that we will be seeing the results in the near future and I think that we can already see such results in FileMaker Server 19.

So thanks again to @Vincent_L for ensuring that we all think carefully about our views on Performance and I confirm that I will use his brilliant summary list of performance methods as the basis for future discussions here and elsewhere.

Best regards to all

Nick

6 Likes

For the record, I couldn't care less about my solution. It's not the topic, and none of the issues I describe are due to bad design. They are actual FileMaker limitations.

Each time I stumbled on a performance problem, I made a very small, very simple, very nimble sample file (so no legacy overhead) that perfectly demonstrated the issue in a reproducible manner and submitted it to Claris's issue-reporting process. None of those were addressed by FMI/Claris.

Claris has tons of properly documented files and threads with everything needed to capture the problems in a reproducible manner. They just don't care.

During their latest webinar I asked precisely why all performance improvements are targeted at multi-user scenarios, and none at the actual engine, such as single-user use, but also single server-side scripts.
They replied that if we have bottlenecks we should submit them.

As if they didn't know anything about any performance issues. My jaw dropped at this either naivety or blatant side-stepping. The community website is full of examples, highly detailed ones.

Claris lives in another world. My role, and the real community's role, should be to wake them up.

1 Like

FILEMAKER PERFORMANCE: CORE PRINCIPLES

How can I understand more about how FileMaker works and how to get the best out of it, performance- and user-experience-wise?

  1. UNSTORED CALCULATIONS

When are unstored calcs useful and when should they be avoided?

  2. INDEXING

What effect does indexing have on performance? Why would you want to reduce it? How can you design to reduce indexing?

  3. RELATIONSHIPS

What effect do relationships have on performance and what are the alternatives? What are examples of performant and non-performant relationship graph designs?

  4. TABLE WIDTHS

What effect do wide tables (tables with many fields) have on performance? How can you reduce table widths? Where do repeating fields fit in?

  5. PREDICATES

What are the performance implications of how you create relationship predicates, how many you use and their order?

  6. STARTUP

How can you improve the user experience by designing a fast startup into your solution?

  7. SCREEN REDRAWING

What is the effect on performance of windows, lists and portals redrawing, and what is the effect of using Freeze Window? When is it useful to use a blank layout within a scripted process, and how does it affect performance?

  8. EXECUTESQL

What are the performance implications of using ExecuteSQL? When should you use it and how should you use it properly? (A minimal sketch follows this list.)

  9. DATA IMPORTS

How do you ensure that data imports run as fast as possible?

  10. SCRIPTING

How do I write fast, easy-to-maintain scripts to automate my solution?

  11. TRANSACTIONS

What are Transactions and how and when should I consider using them? What is good practice in designing a Transactions feature?
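To make the ExecuteSQL item above a little more concrete, here is a minimal sketch in FileMaker calculation syntax. The table and field names (Invoices, CustomerID, Status, Customers::ID) are hypothetical, and this is just one common pattern: fetching an aggregate in a single query instead of through an unstored calculation across a relationship.

```
// Hypothetical example: count a customer's open invoices in one query.
// Signature: ExecuteSQL ( sqlQuery ; fieldSeparator ; rowSeparator { ; arguments... } )
ExecuteSQL (
	"SELECT COUNT(*) FROM Invoices WHERE CustomerID = ? AND Status = ?" ;
	"" ; "" ;
	Customers::ID ; "Open"
)
```

Whether something like this is actually faster than a relationship depends on indexing and on how much data has to travel to the client, which is exactly the sort of trade-off the question is meant to surface.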

5 Likes

@nicklightbody I'm really impressed by what you're doing. Using simple methods of observation and measurement to test the behaviour of your software has allowed you to get great outcomes.

I remember looking into the RG of a database that you'd provided as an example and being a little envious :grinning:. At the time I was working with a broader team. Everything that we did had to be shared and understood, so clarity and simplicity were more important than raw power. What I did take away from that early example was the importance of testing.

6 Likes

I'm backing you on this @jormond. I'm not sure what the total download was for those tests. The fact is that I was on a really slow connection on, literally, the other side of the world. I'm sure that a web page delivering the same payload would have been quicker, but that is a false comparison. If all I wanted to do was view the images at this end, then a zipped tarball delivered via SCP would have been even faster. Once I got behind a slightly faster connection, the application behaviour was completely functional. And @nicklightbody will have the stats recorded to show exactly what those speeds were.

2 Likes

Add:

  1. TRANSACTIONS

What are Transactions and how and when should I consider using them? What is good practice in designing a Transactions feature?

1 Like

@nicklightbody I just want to point out that you can edit your past post if you want. In this case, adding transactions to the original list of items, so everything is listed neatly under a single post.

But maybe you already know. Just a suggestion.

2 Likes

I'm situated within the world's largest (?) natural demonstration of the venturi effect: Wellington.

Thanks for the helpful comments.

On the image load - the key to the optimisation, beyond the extreme simplicity of the RG, is the auto-entry of 99px, 320px & 640px versions of the original image on import - so that the work is done upfront as an investment in the user experience later on.
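For anyone who wants to experiment with the same idea, here is a minimal sketch of how such a rendition could be auto-entered, assuming a container field Image_Original and a second container field Image_320 defined with an auto-enter calculation (the field names are illustrative, not Nick's actual schema):

```
// Hypothetical auto-enter calculation on the container field "Image_320".
// At import time it stores a rendition bounded to 320 x 320 px, so the UI
// can later show the small version instead of pulling the full-resolution
// original across the wire.
GetThumbnail ( Image_Original ; 320 ; 320 )
```

The same function could presumably produce the 1667px export size on the fly, matching the create-it-at-that-point-without-storing-it approach described further down.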

So in the Edit Ui, for example, you have 99px images in the list, 320px images in the popovers and 640px images when you open the record.

In the View Ui you have 320px in the list and 640px in the record.

When you open the 640px image, we convert the full-resolution original image into Base64 and display that in a responsive web page in the web viewer, hence you get zooming and resizing as appropriate on iOS and Mac/PC.
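A rough sketch of what that web viewer calculation might look like, assuming the original is a JPEG held in a container field named Image_Original (the field name and the minimal HTML are illustrative, not Nick's actual code):

```
// Hypothetical web viewer address calculation: wrap the full-resolution
// image as a Base64 data URI inside a tiny HTML page, so the web viewer
// handles zooming and resizing.
"data:text/html," &
	"<html><body style=\"margin:0\">" &
	"<img style=\"width:100%\" src=\"data:image/jpeg;base64," &
	Base64Encode ( Image_Original ) &
	"\"></body></html>"
```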

When we export an image to share it in the 1667px size we create the required resolution image at that point without storing it.

FileMaker seems to handle container fields differently from other types of fields, in that there appears to be no or very little load where a container is not displayed in the Ui. If you place the four stored image containers on the layout and adjust the right edge of the viewable area, the speed of the Ui reduces as the viewable container load increases. So we optimise performance by only displaying the necessary size of image at each stage.

With other types of field, the FileMaker client apparently receives the entire content of the record whether it is viewable or not, although there is a speed difference: the more that is viewable and hence requires rendering, the slower the display, and correspondingly, the less you need to render, the faster it is. This difference is of course far less than that created by displaying container fields containing large files.