If you use ‘Set Next Step’ in the Script Debugger to jump back to an earlier ‘Else If’ script step, execution skips straight to the ‘End If’, ignoring all the ‘Else If’ branches in between. It also ignores all blue breakpoints.
This has been driving me crazy today while trying to debug a 500-plus-line script in v19.3.
Malcolm, I haven’t carried out much testing beyond what was causing me problems today, but I suspect you can replicate it. It seems that if you set the next step to an earlier ‘Else If’, it skips to the ‘End If’; however, I did go back to the top of my script and could then step all the way through it again.
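For anyone wanting to try this themselves, here is a minimal sketch of the kind of structure involved (the variable and values are just placeholders, not from my actual script):

```
If [ $state = 1 ]
    # Step through to here, then use ‘Set Next Step’ on the Else If below
Else If [ $state = 2 ]
    # Expected: debugger resumes here and evaluates this branch
Else If [ $state = 3 ]
    # Blue breakpoint set here – expected to pause
End If
# Actual in v19.3: execution jumps straight here, skipping both Else Ifs and the breakpoint
```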
I’m glad to report that Claris have confirmed that they can replicate the problem and have passed it on to the development team.
Although I’m a big proponent of agile (fragile?) development, I am somewhat concerned about the lack of backward compatibility that has been coming with these updates. We’ve had to update some plug-ins with some of the releases, and the phrase ‘breaking change’ seems to be acknowledged as an acceptable approach, but it means fully testing every client’s system multiple times per year now. Our primary SBA server is crashing roughly every two weeks, and the most recent update appears to address what could be causing it. However, we can’t afford to upgrade, as we already know that WebDirect will break if we do, and if Claris can break their own Claris Connect links, then we must check our own APIs too.
I guess it is a case of ‘be careful what you wish for’. LOL!
Whatever the development approach, we need to test the systems we are responsible for each time we introduce an updated or upgraded component. The frequency of this cycle is limited by the money and resources available.
This regression is in the debugging tool, probably one of the worst places it could happen. I wonder whether there are enough testers for each new release. And even with the best will in the world, who outside Claris would think to run regression tests?
Thinking about all FileMaker does for the developer, the code base is certainly huge. Is it possible to submit new builds to automated tests? On top of that, beta tests usually come with guides that point testers to specific areas to test.
Does Claris do regression testing? Most IDEs automate regression testing using testing frameworks like JUnit (as one example). These tests run automatically when you build a new "app" or whatever, and help ensure you haven't broken something that used to work.
I knew someone would reply with this as I was typing, @Torsten.
I fully appreciate your comment, but my point, after rereading the v19.3.1 release notes, is that there is no direct reference to, for instance, the breaking change in the Data API that prevents downloading files via URL into an outside system, the integration problems with Claris Connect workflows, or web viewer code that has worked for five years no longer working (yes, this is an Edge-related issue, but it is still within FileMaker).
To take our situation: we currently have a frequently crashing server that shares some of our SBA solutions across a number of customers. The problem, which has been identified by the Claris dev team, is mentioned in the latest FM Server release notes. However, we’ve never had the problem on our development server with Server v19.0, 19.1 or 19.2, hence this appears to be a server load issue (the specification of the server is fine and has been confirmed as such by the Claris tech team).
Our only option is to ‘suck it and see’, but we’ve still not finished rolling out v19.2, due to plug-in updates for a large number of users, and now we have FMP v19.3 with these additional undocumented changes. This server runs 12 separate systems of varying ages, so each update requires a complete test of every system, all of which must be carried out on our development server. Yet we cannot replicate the server load, most of our clients do not have the time to do this testing for us, and we don’t fully understand each customer’s exact workflow, as these change frequently without any communication with us.
As we integrate more and more with external systems, the chances of each update breaking something are far higher than when everything ran within the FileMaker software.
At least we are in full control of our servers and are not having updates forced upon us, so we can control when we move to the next version; otherwise this situation would be unmanageable (it is hard enough as it is).
I still love the direction FileMaker is moving in, but I believe we need a lot more information about the changes happening under the hood than we’ve had in the past.
All the best
Andy (stuck between a rock and a hard place)
Understood. I was just mentioning industry-standard approaches to avoiding these problems in the first place. Nothing is 100%, of course, so for all I know they do use standard testing practices.
Hi @AndyHibbs
I get your point. The workload becomes huge when multiple versions of a system need to be maintained and updated.
In the enterprise environment, the whole ‘agile’ thing becomes moot when regression control is not mastered. For good reason, critical infrastructure software such as server OSes ships in a ‘leading edge’ flavour and is also offered in ‘long term support’ versions, even in the age of agile.