Import not working when running on server

I have the following script:

Freeze Window

Go to Layout [ “Settings” (Settings) ; Animation: None ]

Set Error Logging [ On ]

Set Error Capture [ On ]

Set Variable [ $Path ; Value: Get ( DocumentsPath ) & "StudentsFromPowerSchool.csv" ]

Set Variable [ $host ; Value: "###:##" ]

Set Variable [ $creds ; Value: "--user ###:### --connect-timeout 10 --max-time 600" ]

Set Variable [ $fname ; Value: "CompleteStudentData.csv" ]

Set Variable [ $URL ; Value: "sftp://" & $host & "/~/Powerschool/" & $fname ]

Insert from URL [ With dialog: Off ; Target: Settings::LatestSchoolBrainsImport ; $URL ; cURL options: $creds ]

Pause/Resume Script [ Duration (seconds): 5 ]

Set Variable [ $LastError ; Value: Get ( LastError ) & " " & Get ( LastErrorDetail ) ]

Create Data File [ “$Path” ; Create folders: Off ]

Open Data File [ “$Path” ; Target: $fid ]

Pause/Resume Script [ Duration (seconds): 5 ]

Write to Data File [ File ID: $fid ; Data source: Settings::LatestSchoolBrainsImport ; Write as: UTF-8 ]

Pause/Resume Script [ Duration (seconds): 5 ]

Close Data File [ File ID: $fid ]

Set Variable [ $fid ; Value: "" ]

# FileMaker does funky stuff with container data stored by reference in variables, so it's best to clear this global _after_ exporting the data above.

Set Variable [ $! ; Value: Evaluate ( "Let ([ " & $containerDataVarName & " = \"\" ]; \"\" )" ) ]

Go to Layout [ “Import_Student” (Import_Student) ; Animation: None ]

Show All Records

Delete All Records [ With dialog: Off ]

Import Records [ With dialog: Off ; Table: Import_Student ; “$Path” ; Add; UTF-8 ]

Go to Layout [ “Import_Student” (Import_Student) ; Animation: None ]

Show All Records

When I run it locally, it has no problem: it downloads the 20 MB file to a container, writes it to the hard drive, and imports all 16k records. When I run it on the server, it only imports between 2,000 and 2,300 records (a different number every time).

I tried putting the pauses in and increasing the max-time from 60 to 600, and got the same result.

Any thoughts on what’s going on? My desktop client is 22.0.2, but the server is on 21.0.2.202, since I didn’t want to put 22 into production before seeing how it behaves.

How big is your data file? Is it larger than 64 MB?

20 MB

Is the data file being created correctly? Could you add some error-trapping steps after the Import Records step to see whether it throws an error? You might also see something in the FMS log.
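
For example, a minimal sketch of that trapping (Settings::LastImportLog is a hypothetical log field; since server-side scripts have no dialogs, the error has to be persisted somewhere you can read after the schedule runs):

Import Records [ With dialog: Off ; Table: Import_Student ; “$Path” ; Add; UTF-8 ]

Set Variable [ $importError ; Value: Get ( LastError ) ]

If [ $importError ≠ 0 ]

Set Field [ Settings::LastImportLog ; $importError & " " & Get ( LastErrorDetail ) & " / imported: " & Get ( FoundCount ) ]

End If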

I don’t know the actual issue here, but if I were debugging, the first thing I would verify is that the downloaded CSV in Target: Settings::LatestSchoolBrainsImport is entirely there and correct. If the file is not all present, that would explain the missing records and help pinpoint the exact issue.
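
A quick way to check that is to log the container’s actual size and row count right after the download; a sketch, assuming the file is UTF-8 with LF line endings:

Set Variable [ $bytes ; Value: GetContainerAttribute ( Settings::LatestSchoolBrainsImport ; "fileSize" ) ]

Set Variable [ $rows ; Value: ValueCount ( Substitute ( TextDecode ( Settings::LatestSchoolBrainsImport ; "utf-8" ) ; Char ( 10 ) ; ¶ ) ) ]

If $bytes comes up well short of ~20 MB on the server, the download itself is truncating.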

If the file is present in its entirety, the next step, after writing it to disk, is to check the exported file, again to verify its integrity and ensure it is 100% correct. I’d even go as far as diffing it against the original to ensure no differences have been introduced.
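
One FileMaker-native way to do that diff is to compare MD5 hashes. A sketch ($diskCopy is just an illustrative variable, and file: URLs in Insert from URL are restricted to the Documents and temporary folders, so the exact path syntax may need adjusting):

Set Variable [ $sourceMD5 ; Value: GetContainerAttribute ( Settings::LatestSchoolBrainsImport ; "MD5" ) ]

Insert from URL [ With dialog: Off ; Target: $diskCopy ; "file:///StudentsFromPowerSchool.csv" ]

Set Variable [ $diskMD5 ; Value: GetContainerAttribute ( $diskCopy ; "MD5" ) ]

If the hashes differ, the Write to Data File step is changing the bytes.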

For example, when you write the file to disk, you’re specifying UTF-8 as the character encoding. If that differs from the original file’s encoding, you might be breaking the file during this export.
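
A cheap sanity check on that theory: decode the container as UTF-8 in a calculation and eyeball the first chunk for mangled characters, e.g.:

Set Variable [ $preview ; Value: Left ( TextDecode ( Settings::LatestSchoolBrainsImport ; "utf-8" ) ; 500 ) ]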

I’d then likely try an export method other than Write to Data File. I’d look at the BaseElements plugin and its BE_ExportFieldContents function (https://github.com/GoyaPtyLtd/BaseElements-Plugin/blob/9a45eed071e571978df6b329ca31cebeba701f87/docs/Functions/BE_ExportFieldContents.md).
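
Per the linked docs, the call is just the field plus a destination path, roughly (check the current signature before relying on it):

Set Variable [ $result ; Value: BE_ExportFieldContents ( Settings::LatestSchoolBrainsImport ; $Path ) ]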

Lastly, if all of the above looks fine, it will at least help you pinpoint the issue specifically to the Import Records script step. Could it be a bug in the server itself when importing such a large file? At least it will narrow the issue down.

Side note: version 21.1.1 will support the Export Field Contents script step server-side when it is released.

21.1.1 came out last year… I’m guessing you mean 22.1.1? Hmm… I wonder if I should just muddle through for the next 5-6 months with a manual button and automate it then…

What if you open a transaction before the Import Records step?
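
A minimal sketch of that, using the transaction script steps introduced in FileMaker 21 (check the Claris docs to confirm Import Records is transaction-compatible in your version):

Open Transaction [ ]

Delete All Records [ With dialog: Off ]

Import Records [ With dialog: Off ; Table: Import_Student ; “$Path” ; Add; UTF-8 ]

If [ Get ( LastError ) ≠ 0 ]

Revert Transaction [ ]

End If

Commit Transaction [ ]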

There’s a weird cURL bug which affects transfers, where a transfer may fail if it’s above a certain size, depending also on the content of the transfer. Personally I’ve only seen it when sending over SFTP, but I wonder if it affects receiving as well? The buggy versions are the early 8.x curl releases; it was fixed in version 8.9. It may also be present in late 7.x versions?

I wonder if, in fact, your data file is not downloading fully, and this is because the curl version differs across machines?

For background, see:
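
In the meantime, one way to see what the server-side transfer actually did is to capture cURL’s trace output into a variable (I believe FileMaker lets you point --trace-ascii at a $variable, but double-check the supported cURL options list for your version):

Set Variable [ $creds ; Value: "--user ###:### --connect-timeout 10 --max-time 600 --trace-ascii $curlTrace" ]

After the Insert from URL step runs, $curlTrace should show how many bytes were actually received and whether the transfer ended cleanly.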

Sorry, yes, I meant 22.1.1.