According to a 2011 white paper by David Kachel, “every field, calculation, script, layout, table, graphic, custom function or anything else you add after that first field, slows down your database.” I’m using Base Elements to identify unreferenced fields, etc., in the database I inherited. I understand the benefits of deleting cruft and obsolete scripts. How much attention should I pay to some of the other barnacles?
So the barnacles other than fields? Asking because you said:

“I’m using Base Elements to identify unreferenced fields, etc.”

The “etc.” includes an unknown number of other schema elements, so I don’t know what you’re already focusing on.
And is the main focus to solve performance issues? Or just an effort to prune deadwood and technical debt?
If you have limited time to clean up the solution, then I would suggest using the top call stats log as the basis for your focus. Map out what is consistently listed as an expensive operation and work through your top 5 or top 10.
No point in cleaning something up that has no effect, while leaving something in place that has a big impact.
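To make that triage concrete, here is a minimal sketch of how you might rank the most expensive operations from a FileMaker Server top call stats log in Python. This assumes the log is tab-delimited; the column positions used below (`OP_COL`, `ELAPSED_COL`) are assumptions, not the documented layout, so adjust them to match the columns in your server's log before relying on the output.

```python
# Hedged sketch: rank operations in a top call stats log by total elapsed time.
# The log is assumed to be tab-delimited; the column indexes below are
# placeholders -- check your actual log header and adjust them.
import csv
from collections import defaultdict

ELAPSED_COL = 5   # assumed index of the elapsed-time column (e.g., microseconds)
OP_COL = 6        # assumed index of the operation/target column

def top_operations(log_path, n=10):
    """Return the n operations with the largest summed elapsed time."""
    totals = defaultdict(int)
    with open(log_path, newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            try:
                totals[row[OP_COL]] += int(row[ELAPSED_COL])
            except (IndexError, ValueError):
                continue  # skip header lines and malformed rows
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

if __name__ == "__main__":
    for op, elapsed in top_operations("TopCallStats.log"):
        print(f"{elapsed:>12}  {op}")
```

Summing per operation (rather than looking at single slow calls) surfaces the cheap-but-frequent calls that often dominate total load, which is usually where cleanup pays off first.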
If you have unlimited time, then do start by clearing out the deadwood, since it will make it easier to see the forest for the trees. Removing unneeded fields is a good tactic: the wider your tables are, the more of a drag they become on performance, since FM typically works with the whole record. The impact, of course, depends on whether there is actually data in those stale fields. If not, then the performance penalty for FM is small.
The other thing to focus on, which is typically not something you can identify quickly through BaseElements, is the complexity of the relationships graph. A giant spider, as opposed to distinct, unconnected Table Occurrence groups, can have a big performance impact. And it can be quite a bit of work to untangle.
My primary focus is removing deadwood, although I’m also reviewing a few scripts (with multiple subscripts) that take longer to run than I would like. I’ll take a look at the top call stats log. I appreciate your assistance.