Scripting creation of a record in an unrelated table

This is inspired by the discussion here: https://the.fmsoup.org/t/most-efficient-way-to-navigate-to-the-new-layout/

I have a "Log" table used for general auditing and debugging, and a Log() script which takes parameters (severity, message, ...) and creates a record in the Log table. This is a general-purpose script, and as such it has to work from any context (window, layout, mode, etc.).

Right now, the script

  • keeps track of which window is frontmost
  • parses the script parameters to extract severity, message...
  • creates a new window (or shows the Log window if it's already open)
  • creates the new record
  • commits it
  • goes back to the original window.

This works, but is pretty slow, and does occasionally have side effects such as changing window order, etc.
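
For reference, here is a simplified sketch of what that script does today, assuming a JSON script parameter (layout, field, and window names are illustrative):

    # Log ( severity ; message ) -- current window-switching approach, simplified
    Set Variable [ $originalWindow ; Value: Get ( WindowName ) ]
    Set Variable [ $severity ; Value: JSONGetElement ( Get ( ScriptParameter ) ; "severity" ) ]
    Set Variable [ $message ; Value: JSONGetElement ( Get ( ScriptParameter ) ; "message" ) ]
    # Open a window on the Log layout (the real script re-uses the Log window if it is already open)
    New Window [ Style: Document ; Name: "Log" ]
    Go to Layout [ "Log" (Log) ]
    New Record/Request
    Set Field [ Log::Severity ; $severity ]
    Set Field [ Log::Message ; $message ]
    Commit Records/Requests [ With dialog: Off ]
    Close Window [ Name: "Log" ]
    # Return to the original context
    Select Window [ Name: $originalWindow ]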

Is there a better way?

Common solutions:

  • instead of creating a new document window, create a new Card window (since that's apparently lower cost and doesn't affect the current layout)
  • FileMaker's ExecuteSQL would be great, but it doesn't work, since it's read-only (SELECT only, no INSERT)
  • Using a plugin such as MBS, which can do a SQL insert: Monkeybread Software - MBS FileMaker Plugin: FM.InsertRecord (see the sketch just below this list)
  • using the "Magic Key" technique - unfortunately, this requires the current TO to have a relationship to the Log table, so it's not a good general-purpose solution.
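
For the plugin option, the MBS call would look something like this (table and field names are placeholders; check the MBS documentation for the exact parameter list of FM.InsertRecord):

    # Insert directly into the Log table; no window or layout change on the client
    Set Variable [ $result ; Value:
        MBS ( "FM.InsertRecord" ; Get ( FileName ) ; "Log" ;
              "Severity" ; $severity ;
              "Message" ; $message ) ]
    # Check $result for an error message returned by the plugin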

Here's a couple of new ideas I had:

Use PSoS
Write a Log() script that uses Perform Script on Server. Pass it a JSON string with the data, and the actual record creation happens server-side (see the sketch after the pros and cons below).

Pros:

  • Since the record creation takes place in an entirely separate server-side session, absolutely zero window or layout changes are required on the client.
  • if the timing of record creation is not critical, you can set Wait for completion OFF

Cons:

  • I'm guessing the overhead for this is substantial, since each call has to spin up its own session on the server
  • if Wait for Completion is OFF, you could have issues with record creation happening out of order
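
A minimal sketch of the PSoS idea (script, layout, and field names are illustrative; the client-side timestamp is there to address the ordering concern):

    # Client side: Log ( severity ; message )
    Set Variable [ $json ; Value:
        JSONSetElement ( "{}" ;
            [ "severity" ; $severity ; JSONString ] ;
            [ "message" ; $message ; JSONString ] ;
            [ "utc_ms" ; Get ( CurrentTimeUTCMilliseconds ) ; JSONNumber ] ) ]
    Perform Script on Server [ "Log on Server" ; Parameter: $json ; Wait for completion: Off ]

    # Server side: "Log on Server" (runs in its own session, so no client UI is touched)
    Go to Layout [ "Log" (Log) ]
    New Record/Request
    Set Field [ Log::Severity ; JSONGetElement ( Get ( ScriptParameter ) ; "severity" ) ]
    Set Field [ Log::Message ; JSONGetElement ( Get ( ScriptParameter ) ; "message" ) ]
    Set Field [ Log::ClientUTCms ; JSONGetElement ( Get ( ScriptParameter ) ; "utc_ms" ) ]
    Commit Records/Requests [ With dialog: Off ]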

Use Data API
Use the FileMaker Data API, triggered by the Insert from URL script step (sketch after the pros and cons below).
Pros:

  • may be lighter-weight than using PSoS?

Cons:

  • requires special Data API setup
  • only works when the file is hosted on FileMaker Server (e.g. it wouldn't work when running the file locally in FileMaker Pro)
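
A sketch of the record-creation call, assuming a Data API session token has already been obtained from the /sessions endpoint and stored in $$dataApiToken (server, database, layout, field, and variable names are placeholders):

    Set Variable [ $body ; Value:
        JSONSetElement ( "{}" ;
            [ "fieldData.Severity" ; $severity ; JSONString ] ;
            [ "fieldData.Message" ; $message ; JSONString ] ) ]
    Insert from URL [ With dialog: Off ; Target: $result ;
        "https://myserver.example.com/fmi/data/v1/databases/MyFile/layouts/Log/records" ;
        cURL options: "-X POST" &
            " -H \"Content-Type: application/json\"" &
            " -H \"Authorization: Bearer " & $$dataApiToken & "\"" &
            " --data @$body" ]
    # $result contains the Data API's JSON response, including the new recordId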

Any other clever ideas?

1 Like

Great topic:

  1. Regarding using PSoS for every log message: I always used to lobby against this out of concern about overwhelming the server. But I have colleagues who have done it and have never had an issue with it, so I no longer assert that it can't or shouldn't be done. I think it requires careful consideration of how frequently it will be called and how ready the server is for all of the extra PSoS calls. These considerations are mostly beyond my expertise, so I still avoid it in my own solutions.

  2. Also regarding PSoS: if all of your log calls include the output of Get ( CurrentTimeUTCMilliseconds ) as one of the parameters, then out-of-order inserts can still be dealt with without much or any issue.

  3. Instead of calling a script at every invocation, I used to build logging systems that cached a number of log entries in a global variable and only pushed the data into log records periodically. How frequently the push happened was configurable. The benefit of this is that you don't have to hit the server as frequently; the downside is that a client crash means you lose any un-logged statements in the cache. (See the sketch after this list.)

  4. Another approach leverages something I mentioned in the other thread: if you were to split out your logging functionality into a separate FMP file, this would allow you to call a log script (in the logging file) from any UI context, and the logging file would be able to insert the record without any disturbance to your UI window/context/found set, etc. This would be feasible if the authentication method for your solution uses some kind of external authentication. Otherwise, it can be cumbersome to ensure that the credentials in the main solution are also propagated to the logging file, unless there are just a few users.

  5. Regarding the Data API approach: I used to do logging via the XML API (prior to the Data API), and it worked great. Because I did not want to open up the XML API port on the server to the outside world, what I did was use a PSoS script, and the PSoS script would make the call to the API while operating on the server. I never really thought about this before, but it does kind of make it seem like I broke my own rule about never using PSoS for logging. I guess the difference I should mention is that the logging I did with the XML API was just for occasional INFO and ERROR messages. The super-frequent logging that I avoided was performance logging, where every script call would be logged in order to determine script durations.
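
A rough sketch of the caching idea from point 3 (names and the flush threshold are illustrative):

    # Log_Cache ( severity ; message ) -- append to a global variable instead of creating a record
    Set Variable [ $cache ; Value: If ( IsEmpty ( $$LOG_CACHE ) ; "[]" ; $$LOG_CACHE ) ]
    Set Variable [ $i ; Value: ValueCount ( JSONListKeys ( $cache ; "" ) ) ]
    Set Variable [ $$LOG_CACHE ; Value:
        JSONSetElement ( $cache ;
            [ "[" & $i & "].severity" ; $severity ; JSONString ] ;
            [ "[" & $i & "].message" ; $message ; JSONString ] ;
            [ "[" & $i & "].utc_ms" ; Get ( CurrentTimeUTCMilliseconds ) ; JSONNumber ] ) ]

    # Log_Flush -- called periodically (e.g. from an OnTimer script or at the end of key scripts)
    If [ not IsEmpty ( $$LOG_CACHE ) and ValueCount ( JSONListKeys ( $$LOG_CACHE ; "" ) ) ≥ 50 ]
        Perform Script on Server [ "Log Batch on Server" ; Parameter: $$LOG_CACHE ; Wait for completion: Off ]
        Set Variable [ $$LOG_CACHE ; Value: "" ]
    End If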

2 Likes

Maybe this approach will help you: I log record changes first in the respective record as a JSON array in an AuditLog field. The user can manually trigger the transfer to the log table. Otherwise, a script runs on the server at regular intervals, which goes through all tables and records and transfers the changes to the log table. Afterwards, the log fields in the records can be overwritten.

I use a system from Arnold Kegebein called K3 AuditLog. No plugins required!

This way the communication with the server is kept low.
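
The details differ in K3 AuditLog, but the general shape of the idea is roughly this (table and field names are placeholders):

    # On the client, when a change is committed: append one entry to the record's own AuditLog field
    Set Variable [ $log ; Value: If ( IsEmpty ( Contacts::AuditLog ) ; "[]" ; Contacts::AuditLog ) ]
    Set Variable [ $i ; Value: ValueCount ( JSONListKeys ( $log ; "" ) ) ]
    Set Field [ Contacts::AuditLog ;
        JSONSetElement ( $log ;
            [ "[" & $i & "].field" ; "LastName" ; JSONString ] ;
            [ "[" & $i & "].value" ; Contacts::LastName ; JSONString ] ;
            [ "[" & $i & "].utc_ms" ; Get ( CurrentTimeUTCMilliseconds ) ; JSONNumber ] ) ]
    # A server-side schedule later walks the tables, copies non-empty AuditLog entries
    # into the log table, and then clears/overwrites the field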


[Edit:] Here you can find a demo file of the module: https://filemaker-magazin.de/forum/beitrag/185490 The post in the forum is in German but the documentation and explanations in the file are in English.

2 Likes

Interesting. Does this actually work? I'm familiar with having a Table Occurrence from another FileMaker file, but I don't think I've ever tried calling a script in a separate file. I would assume that this would require the other file to be open in its own window, and that this would naturally change the context.

Yes, the separate file does have to be open, and it does have its own window, so yes, the context changes when the script that adds the record runs. But as context changes go, it is a pretty non-meddlesome one: it does nothing to change the context or found set of the main solution window, and unless some script step causes otherwise, upon termination of the CRUD script the context returns right back to where it was prior to the call to insert the record.
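
A minimal sketch of what such a cross-file log script could look like (file, script, layout, and field names are illustrative):

    # In the separate data/log file, a script "Create Log Record":
    Go to Layout [ "Log" (Log) ]
    New Record/Request
    Set Field [ Log::Severity ; JSONGetElement ( Get ( ScriptParameter ) ; "severity" ) ]
    Set Field [ Log::Message ; JSONGetElement ( Get ( ScriptParameter ) ; "message" ) ]
    Commit Records/Requests [ With dialog: Off ]

    # In the main UI file, the Log script simply calls:
    Perform Script [ "Create Log Record" from file: "LogData" ; Parameter: $json ]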

Attached is a super simple illustration: A table of Person records.

The illustration is in the scripting to add a child -- the New Record / Commit is performed in a file which is separate from the UI file. As long as you start by just opening the UI file, the Data file remains hidden, and the CRUD operation happens out of sight of the presentation in the UI file.

Separation_Record_Creation.zip (139.6 KB)

3 Likes

I did not look at the sample file that was shared, but simply wanted to add that Karbon uses transactions in this way. Transactions are not a requirement for the technique to give you what you are after, but if you are going to have a separate file, having scripts that handle only business logic and that you can test against can be a very good thing. Wrapping the whole thing in a transaction makes it even better in my opinion, but that comes with its own set of specifics.
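
As a sketch only: with FileMaker 19.6 or later, the insert could be wrapped in the native transaction script steps (Karbon itself predates these and uses its own transaction technique), roughly like this:

    # In the data/business-logic file
    Open Transaction
        Go to Layout [ "Log" (Log) ]
        New Record/Request
        Set Field [ Log::Severity ; JSONGetElement ( Get ( ScriptParameter ) ; "severity" ) ]
        Set Field [ Log::Message ; JSONGetElement ( Get ( ScriptParameter ) ; "message" ) ]
        If [ Get ( LastError ) ≠ 0 ]
            # Roll everything back if any step failed
            Revert Transaction
        End If
    Commit Transaction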

2 Likes