Can you Trust your Custom Function Library?

I've just discovered that an issue I've been troubleshooting was caused by a custom function producing the wrong result.

I do a lot of error logging and use scripts for in-line error logging, but custom functions don't return errors; they simply return the wrong result.

It's now apparent that I really need a test suite for all my custom functions. I'm thinking they need to be tested against every new environment as it arises, including incremental version updates.
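A minimal sketch of what such a suite could look like, illustrated in Python since FileMaker calculations can't run here. `trim_all` is a hypothetical stand-in for one of your CFs, and the cases table is the sort of data you'd keep in a test file:

```python
# Sketch of a CF test suite, in Python as a stand-in for FileMaker calc.
# "trim_all" is a hypothetical example function, not from the thread.

def trim_all(text):
    """Hypothetical CF: collapse runs of whitespace to single spaces."""
    return " ".join(text.split())

# Each case: (input, expected). Add a new row whenever a bug surfaces.
CASES = [
    ("  a   b ", "a b"),
    ("", ""),
    ("one", "one"),
]

def run_suite(fn, cases):
    """Run every case; return the failures (empty list means all pass)."""
    failures = []
    for given, expected in cases:
        got = fn(given)
        if got != expected:
            failures.append((given, expected, got))
    return failures

print(run_suite(trim_all, CASES))  # → [] when every case passes
```

Re-running the whole suite after each FileMaker version update is then a single script call rather than a manual check of each function.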


Karbon has a utility file that can be used to design test scenarios for Custom Functions.

Look for the file "ErrorCF.fmp12" in here (link to github).


I believe even native functions do not return errors (unlike plugin functions, which may use a separate plugin call to read a plugin error).

Some will return "?". I think the functions for SQL & JSON are more verbose concerning "errors".

Functions do what they have to do, but cannot do much to prevent you from giving the wrong input. There are situations where there is no "correct" result if the input is faulty.

I know some people will mimic FileMaker's behavior and output "?". When expecting something, I prefer outputting an empty string as I feel it is easier to test for. Some may also prefer to output a specific error tag or json object.

Perhaps the best approach is to have a custom function that identifies whether a given string matches one of the existing "patterns" that we want to recognize as an error.
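That pattern-recognizing idea could be sketched like this, in Python as a stand-in for FileMaker calc. The patterns listed are illustrative; you'd extend them with whatever conventions your own library uses ("?", empty string, an error tag, a JSON error object):

```python
# Sketch of a "does this result look like an error?" detector.
# The pattern list is illustrative, not a standard.
import json
import re

ERROR_PATTERNS = [
    r"^\?$",     # native "?" result
    r"^$",       # empty-string convention
    r"^ERROR:",  # explicit error-tag convention
]

def looks_like_error(result):
    if any(re.search(p, result) for p in ERROR_PATTERNS):
        return True
    # JSON error-object convention: {"error": ...}
    try:
        parsed = json.loads(result)
        return isinstance(parsed, dict) and "error" in parsed
    except ValueError:
        return False

print(looks_like_error("?"))                # True
print(looks_like_error('{"error": 1200}'))  # True
print(looks_like_error("42 Main St"))       # False
```

The weakness, of course, is that a legitimate result could coincidentally match a pattern, which is why agreeing on one convention across the whole library matters more than the detector itself.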


A CF should take care of validating its result before returning, and return an error code or nothing in case of an invalid result.
If validation is not possible within the CF, the calling script should take care of it.

Re-validating custom functions after each FM update happens in the broader context of regression testing. It is one of the downsides of agile development that regression testing has to happen more often. Having a sort of automation would be beneficial.

Having a suite of tests is 100% necessary. It allows you to test any changes to see if you break any tests. So your tests can span versions of FileMaker, and any known use cases. When you run into something that fails, you can add it to your test cases.

Todd talks about this here. And Lance is also a big proponent of testing. You can do something similar with scripts, though it's a bit more complex.


The gotcha, for a custom function returning an error code, is that we can't throw an error signal. Without the error signal, we're just passing back numbers or text.

I think that CFs should just work. Like their built-in friends, we should be able to trust them to return the right result. In this case I got caught because I hadn't tested my CF in Claris FMC, and the Get(HostApplicationVersion) output broke the logic in my CF. So, my fault, and the script kept forking down the wrong path as a result.


The gotcha, for a custom function returning an error code, is that we can't throw an error signal. Without the error signal, we're just passing back numbers or text.

That is, unfortunately, the case. Error capture for CFs has to be a ‘home brew’.

Returns Pro 19.0.1 when the host computer is running FileMaker Pro version 19.0.1.
Returns Server 19.0.1 when the host computer is running FileMaker Server version 19.0.1.
Returns Cloud Server when the host computer is running FileMaker Cloud for AWS version 1.18.1 build 42.
Returns Cloud Server when the host computer is running FileMaker Cloud version 2.19.0 build 57.

The slightly inconsistent return format requires specific handling for on-premise and cloud versions. I would have been trapped by that one, too :no_mouth:
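The inconsistency is easy to see when you try to normalize those documented return strings. A sketch in Python (the `(product, version)` return shape is my own convention, not anything Claris defines):

```python
# Sketch: normalizing the Get(HostApplicationVersion) formats quoted above.
# Python stands in for FileMaker calc; the tuple shape is illustrative.

def parse_host_version(raw):
    if raw.startswith("Cloud Server"):
        # Cloud hosts return no version number in this string at all.
        return ("Cloud Server", None)
    parts = raw.split()
    if len(parts) == 2:
        return (parts[0], parts[1])  # e.g. ("Pro", "19.0.1")
    return (raw, None)               # unknown format: don't guess

print(parse_host_version("Pro 19.0.1"))     # ('Pro', '19.0.1')
print(parse_host_version("Server 19.0.1"))  # ('Server', '19.0.1')
print(parse_host_version("Cloud Server"))   # ('Cloud Server', None)
```

The explicit "unknown format" fallback is the point: it's exactly the branch that would have caught the Claris FMC case instead of silently forking down the wrong path.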

I think that CFs should just work. Like their built-in friends, we should be able to trust them to return the right result.

It would be interesting to develop a rationale for when the ‘trust’ concept and when the ‘know’ concept should be employed in FM-based development.

In aeronautics engineering the ‘does not work as expected’ result bears the unfriendly, yet revealing ‘failure’ tag. For each failure mode of an aircraft part, the outcome has to be analysed and classified. The outcome can range from ‘no consequences’ to ‘catastrophic’. The severity of the outcome dictates if the part can be ‘trusted’ in its current incarnation or not. If the latter is the case, engineers have to get back to work.
In SW development, I’d use a similar approach, asking ‘what are the consequences of failure?’ The answer will tell me how much effort I must put in error capture and regression testing.

Aircraft parts have a dedicated purpose, so a failure's consequences can be anticipated more easily. Code and custom functions tend to be built for reuse in contexts you may not have anticipated. A catastrophic outcome is always possible, depending on how the code relies on what was custom-built (CF or script).

Not intended to be sassy, but it's going to sound that way; more a funny truth... irony, if you will.

The function IS returning the right answer. Often, the problem is us asking the wrong question. LOL


yes, asking the right question is important. Accurate naming helps too.

As I'm working through my CF library and building a suite of unit tests for them, I bumped into one CF that is named for what I wanted to achieve that morning rather than what it actually does.


Yes, very familiar with said problem. :crazy_face:

I have a suite of custom functions whose job is to manage results, consisting of a result value and an error value. Advantage is your functions will always be able to return both a result value and an error value. Disadvantage is the suite adds overhead and your function results must be parsed. Let me know if this is of interest.
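The general shape of such a result-managing suite can be sketched in Python. Note this is my illustration of the pattern, not @bdbd's actual functions; the `{"result": ..., "error": ...}` JSON shape is an assumption:

```python
# Sketch of a "result value + error value" wrapper suite.
# The JSON shape is illustrative, not @bdbd's actual design.
import json

def make_result(value=None, error=None):
    """Pack a result and an error into one returnable string."""
    return json.dumps({"result": value, "error": error})

def get_result(packed):
    return json.loads(packed)["result"]

def get_error(packed):
    return json.loads(packed)["error"]

# A function built on the suite can always return both channels:
def safe_divide(a, b):
    if b == 0:
        return make_result(error="division by zero")
    return make_result(value=a / b)

ok = safe_divide(10, 4)
print(get_result(ok), get_error(ok))  # 2.5 None
bad = safe_divide(1, 0)
print(get_error(bad))                 # division by zero
```

This makes the trade-off mentioned above concrete: every caller gets a reliable error channel, but every caller must also unpack the result instead of using it directly.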


Yes, it is. Thanks for the offer @bdbd. I'm always curious to see how other people solve problems.


Sounds like an interesting idea. Thanks @bdbd

Following the paradigm of functional programming, custom functions should never reference or set local or global variables.
So I would not trust any CF using $vars or $$vars.

Isn't that a restrictive understanding of custom functions?

Applying the functional programming paradigm as a hard rule to gauge the trustworthiness of custom functions seems, to me, the same as trusting wood joinery only to screws. It offers no guarantee of quality or reliability and is sometimes simply inappropriate.

Scripts and functions, custom or not, differ. Unlike scripts, functions: are available everywhere; adopt the scope of the calling code; are limited to calculations, one of which is setting variables.

The ability to set variables when paired with the ability to adopt the caller's scope is valuable. It allows for two capabilities: pseudo-modularity and pseudo-encapsulation (everything seems pseudo in FileMaker).

As an example of the former, I use a custom function to transform a JSON structure into $vars in every script that takes a JSON parameter. I do this in the spirit of DRY, don't repeat yourself. It is impossible to do this with a sub-script.
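The idea is easier to see outside FileMaker. In Python a dict has to stand in for the script's local variables, since nothing here can inject `$vars` into a caller's scope the way a CF can; the function name and JSON keys are illustrative:

```python
# Sketch of the "JSON parameter -> variables" idea. A dict stands in for
# the calling script's $vars; names and keys are illustrative.
import json

def json_to_vars(param_json):
    """Return a name -> value mapping for every top-level key."""
    return {f"${key}": value for key, value in json.loads(param_json).items()}

params = json_to_vars('{"customerID": 42, "mode": "edit"}')
print(params["$customerID"], params["$mode"])  # 42 edit
```

The DRY payoff is that every script's first line is the same one-call unpacking step, rather than a per-script block of JSONGetElement calls.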

As an example of the latter, I sometimes use custom functions to manage $$vars. I am a believer that $$vars should be accessed in as few places as possible to enhance code maintainability. $$vars sometimes need setting in places scripts can't run.

I learned that I could guarantee the existence of errors, not the absence of errors. I trust that all code can produce errors, especially considering the constant changes of dependencies.

Hope this helps.

There is a lot to be said about the potential problems with custom functions that reference, or rely on, local and global variables, from a dependency perspective. Hiding the need for specific data to exist inside the function is a tough move to make, and it requires a lot of inside knowledge.

The expectation for functions that rely on data that can't be calculated at runtime is to pass it in as a parameter. That allows the same use of variables, but they are then explicitly passed as parameters. So I'm with you 100% on this aspect.

As for Custom Functions that SET variables, I don't feel the same. In fact, it is one of the most powerful features of Custom Functions. Using functions to handle data gathering/setting, error trapping, and to handle a lot of the heavy lifting is essential to high level development. Most of this can be done with very little overhead, and is optional to actually use. Error trapping is a good example of this. You may read a lot more Debugging data than you actually log.

Because FileMaker's mechanism for passing data around involves variables, that is partly what we have to use. That, in my opinion, still allows functions to be treated as first-class citizens, and to carry a high level of trust.

There is no need to use $ or $$ vars in custom functions; it is bad design (it can't be multi-threaded, etc.). You can transform a CF with local or global vars into a side-effect-free custom function by unification (as utilized in so-called higher-order programming languages).
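The refactor being described can be sketched in Python: a function that reads and writes a global becomes pure by taking the state as a parameter and returning the new state alongside its result. The `next_id` example is my own illustration, not from the thread:

```python
# Sketch: making a stateful function side-effect free by passing the
# state explicitly. "next_id" is an illustrative example.

counter = 0  # the "$$var" the impure version depends on

def next_id_impure():
    """Impure: reads and mutates hidden global state."""
    global counter
    counter += 1
    return f"ID-{counter}"

def next_id_pure(state):
    """Pure: same behavior, but the caller owns and threads the state."""
    new_state = state + 1
    return (f"ID-{new_state}", new_state)

ident, state = next_id_pure(0)
ident2, state = next_id_pure(state)
print(ident, ident2)  # ID-1 ID-2
```

The pure version is trivially testable and thread-safe, which is the poster's point; the cost, as the replies note, is that every caller must now carry the state around explicitly.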

That seems like a kludge. Try to be schoolbook-functional and return one value, which in your example could be JSON embedding all errors and everything the custom function accomplished.

For the most part, all my functions do return only a single result. There are definitely a few functions that return a JSON object or array. But that doesn’t make it non-functional programming. And it’s done intentionally, and is the expected result of the function.

Setting of variables doesn’t need to be returned in the result. Error trapping this way isn’t a kludge, it’s simply using the tools we have. Every language has tools used this way. It’s very common in JavaScript, Python, Java, C languages, etc. The mechanism is really the only difference.

On a side note, I make no attempt to be "textbook" anything. This is about solving problems, making them maintainable, as easy to build as possible, and performant to users' expectations. There is little value to me in calling myself a "functional", "object-oriented", or whatever developer. I solve problems for businesses and users. That is my identity.