Moving data from ODBC to FileMaker via script

Do you have a SQL data source and would like to get JSON from it inside FileMaker, to pass that JSON on to a web service or the Data API? Or would you like to easily move records from a SQL source to FileMaker? Or in the other direction?

Since MBS Plugin supports CubeSQL, Centura SQLBase, DB2, DuckDB, Firebird, Informix, InterBase, MariaDB, Microsoft Access, Microsoft SQL Server, MySQL, ODBC, Oracle Database Server, PostgreSQL, SQL Anywhere, SQLite, SQLCipher and Sybase as data sources, we can help here. ODBC itself can of course be used to reach many more databases.

Let us show you a script that connects to a data source by the name registered in the system. That means you define an ODBC data source in the ODBC control panel on macOS, Windows or Linux. You can use the GUI on each operating system to enter and store the credentials, install the required driver, or set whatever parameters are needed. Alternatively, check our help page for SQL.Connect to see how to put these parameters into the connection string.

Once you are connected, you can run a SQL command with SQL.NewCommand and SQL.Execute and grab some records. You may list the field names in the query if you want to define their order or pick only a few fields. If a field name contains special characters like spaces, please quote it with ", e.g. "My Field". You may rename a column with AS inside the SQL, e.g. CityName AS City. And you can always include static data as text, date or number, or even use SQL functions, e.g. concatenating two texts: (FirstName || ' ' || LastName) AS FullName. Of course these things depend on which database you talk to; for SQL Server you would use the CONCAT() function like this: CONCAT(FirstName, ' ', LastName) AS FullName.
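Putting these pieces together, a query might look like the following sketch. MyTable and the field names are placeholder assumptions; the || operator works e.g. in PostgreSQL and SQLite, while the second variant is for SQL Server:

SELECT "My Field", CityName AS City, 123 AS ImportID, (FirstName || ' ' || LastName) AS FullName FROM MyTable

-- same query for Microsoft SQL Server
SELECT "My Field", CityName AS City, 123 AS ImportID, CONCAT(FirstName, ' ', LastName) AS FullName FROM MyTable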

Here is a sample script to quickly connect to a data source named "Test" via ODBC, run a query and grab the result as a JSON array of records:

# get json in file ODBC Query
 
# connect to ODBC data source named "Test"
Set Variable [ $Connection ; Value: MBS("SQL.NewConnection") ] 
Set Variable [ $r ; Value: MBS("SQL.Connect"; $Connection; "test"; ""; ""; "ODBC") ] 
# 
# run SQL query
Set Variable [ $Command ; Value: MBS("SQL.NewCommand"; $Connection; "SELECT * FROM MyTable") ] 
Set Variable [ $r ; Value: MBS("SQL.Execute"; $Command) ] 
# 
# get JSON records
Set Variable [ $$json ; Value: MBS( "SQL.GetRecordsAsJSON"; $Command ) ] 
# 
# cleanup
Set Variable [ $r ; Value: MBS("SQL.FreeCommand"; $Command) ] 
Set Variable [ $r ; Value: MBS("SQL.FreeConnection"; $Connection) ] 

The script above has no error handling, so eventually it will fail at one of the steps. You could run it as-is and simply check at the end whether $$json contains valid JSON. But you should do error checking, and maybe log errors to an error table or email them to yourself. Let's do the same with error handling:

# get json with error handling in file ODBC Query
 
# connect to ODBC data source named "Test"
Set Variable [ $Connection ; Value: MBS("SQL.NewConnection") ] 
Set Variable [ $r ; Value: MBS("SQL.Connect"; $Connection; "test"; ""; ""; "ODBC") ] 
If [ MBS("IsError") ≠ 0 ] 
	Show Custom Dialog [ "Failed to connect" ; $r ] 
Else
	# run SQL query
	Set Variable [ $Command ; Value: MBS("SQL.NewCommand"; $Connection; "SELECT * FROM MyTable") ] 
	If [ MBS("IsError") ≠ 0 ] 
		Show Custom Dialog [ "Failed to parse SQL" ; $Command ] 
	Else
		Set Variable [ $r ; Value: MBS("SQL.Execute"; $Command) ] 
		# 
		If [ MBS("IsError") ≠ 0 ] 
			Show Custom Dialog [ "Failed to execute SQL" ; $r ] 
		Else
			# get JSON records
			Set Variable [ $$json ; Value: MBS( "SQL.GetRecordsAsJSON"; $Command ) ] 
		End If
		# cleanup
		Set Variable [ $r ; Value: MBS("SQL.FreeCommand"; $Command) ] 
	End If
End If
Set Variable [ $r ; Value: MBS("SQL.FreeConnection"; $Connection) ] 

The plugin loops over the record set, so you may prefer to put a limit in the SQL to only process maybe 100 records at a time. Pick whatever batch size works best for you.
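For example, a batched query could look like this sketch (table, column and the ID-based paging are assumptions; LIMIT/OFFSET works e.g. for MySQL, PostgreSQL and SQLite, while SQL Server uses TOP):

-- MySQL, PostgreSQL, SQLite
SELECT * FROM MyTable ORDER BY ID LIMIT 100 OFFSET 200

-- Microsoft SQL Server: next 100 records after the last seen ID
SELECT TOP 100 * FROM MyTable WHERE ID > 200 ORDER BY ID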

Instead of getting JSON, we can also import records into a FileMaker database with our SQL.InsertRecords function. It takes a list of field names, so the field names in SQL and FileMaker don't need to match. You also specify the file name, so you can use a separate import file holding the table to import into. Our plugin loops over the records from SQL and inserts them into FileMaker. Field types must match: if you have an integer field in ODBC, you need a number field in FileMaker. Matching field types can sometimes be tricky, since it depends on the data types of the values in the SQL data set, and sometimes people store numbers or times as text. You can use the CAST() function in SQL to change the data type, e.g. CAST(curtime AS VARCHAR).
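As a sketch, a query that casts mismatched columns before importing could look like this (the table and column names are assumptions; the exact type names accepted by CAST depend on the database):

SELECT FirstName, CAST(Age AS VARCHAR) AS AgeText, CAST(curtime AS VARCHAR) AS TimeText FROM MyTable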

You may decide to store the last ID you saw from the previous import. Then your next query may include a condition like WHERE ID > :ID, with a named parameter "ID" which you fill with SQL.SetParamAsNumber or SQL.SetParamValue.
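A sketch of what that could look like in a script ($$LastID, the table and the field names here are assumptions):

# query only records newer than the last imported ID
Set Variable [ $Command ; Value: MBS("SQL.NewCommand"; $Connection; "SELECT FirstName, LastName FROM MyTable WHERE ID > :ID") ] 
# fill the named parameter "ID" with the stored value
Set Variable [ $r ; Value: MBS("SQL.SetParamAsNumber"; $Command; "ID"; $$LastID) ] 
Set Variable [ $r ; Value: MBS("SQL.Execute"; $Command) ] 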

Here is the script with no error handling to import some records:

# copy records from SQL Server to FileMaker in file ODBC Query
 
# connect to ODBC data source named "Test"
Set Variable [ $Connection ; Value: MBS("SQL.NewConnection") ] 
Set Variable [ $r ; Value: MBS("SQL.Connect"; $Connection; "test"; ""; ""; "ODBC") ] 
# 
# run SQL query
Set Variable [ $Command ; Value: MBS("SQL.NewCommand"; $Connection; "SELECT FirstName, LastName, CityName, 123 FROM MyTable") ] 
Set Variable [ $r ; Value: MBS("SQL.Execute"; $Command) ] 
# 
# insert records to FileMaker
Set Variable [ $FieldNames ; Value: "FirstName¶LastName¶City¶ImportID" ] 
Set Variable [ $r ; Value: MBS( "SQL.InsertRecords"; $Command; Get(FileName); "Import" /* target table in FileMaker */; $FieldNames) ] 
# 
# cleanup
Set Variable [ $r ; Value: MBS("SQL.FreeCommand"; $Command) ] 
Set Variable [ $r ; Value: MBS("SQL.FreeConnection"; $Connection) ] 

You can run this server-side if you want to avoid transferring all the data from server to client and back from client to server. But before you install an ODBC driver, you may check whether we can make the connection directly, e.g. for MySQL or PostgreSQL. Don't forget to use a condition in the SQL, e.g. to only grab records newer than the last one you saw.
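As a sketch, a direct MySQL connection without an ODBC driver could look like this; the host, port, database name and credentials are placeholders, and the exact connection string format is documented on our SQL.Connect help page:

# connect directly to MySQL, no ODBC driver needed
Set Variable [ $Connection ; Value: MBS("SQL.NewConnection") ] 
Set Variable [ $r ; Value: MBS("SQL.Connect"; $Connection; "myserver.example.com,3306@MyDatabase"; "user"; "password"; "MySQL") ] 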

To go the other way around, you can run a query in FileMaker with the FM.SQL.Execute function and get records from FileMaker. Again, field names with spaces or other special characters may need quotes, escaped with backslashes. The field names don't need to match between FileMaker and the external database, since you pass the list of field names in SQL to the FM.SQL.InsertRecordsToSQL function, as well as the target table name. Field types should match, but you can use CAST() in the SQL to change data types as needed, put in constant values or include SQL function calls.

Here is again a sample script without error checking:

# copy records from FileMaker to SQL Server in file ODBC Query
 
# connect to ODBC data source named "Test"
Set Variable [ $Connection ; Value: MBS("SQL.NewConnection") ] 
Set Variable [ $r ; Value: MBS("SQL.Connect"; $Connection; "test"; ""; ""; "ODBC") ] 
# 
# run SQL query in FileMaker
Set Variable [ $Records ; Value: MBS( "FM.SQL.Execute"; Get(FileName); "SELECT FirstName, LastName, \"City Name\" FROM Import") ] 
# 
# insert records to SQL database
Set Variable [ $FieldNames ; Value: "FirstName¶LastName¶City" ] 
Set Variable [ $r ; Value: MBS( "FM.SQL.InsertRecordsToSQL"; $Records; $Connection; "Export" /* target table in SQL */; $FieldNames) ] 
# 
# cleanup
Set Variable [ $r ; Value: MBS("FM.SQL.Release"; $Records) ] 
Set Variable [ $r ; Value: MBS("SQL.FreeConnection"; $Connection) ] 

If you do several imports, you may keep the $Connection alive and reuse it. But in server scripts, please always make a new connection, to avoid two scripts using the same connection at the same time.

Please try it and let us know if you have questions. We'll include these scripts in the ODBC Query example file for v13.1, so you can copy & paste them.
