Import from FTP location/Variable

I have a client who has a nightly data dump and I'm setting up automatic nightly data imports.

I start by grabbing the text from the web:

Insert from URL [ Select ; With dialog: Off ; Target: $text ; "https://filesamples.com/samples/document/csv/sample4.csv" ; Verify SSL Certificates ] 

This works fine. The text comes in, and when I display it in a global variable it looks right. But when I try to import it:

Import Records [ With dialog: Off ; Table: ImportTrackerBanks ; "$text" ; Add ; UTF-8 ]

it tells me it can't find a file named after the first line of the data, so I tried encoding it:

Set Variable [ $text ; Value: Base64Encode ( $text ) ] 

And when I run it, I get an error saying:

The file “MS8xLzIwMjEgLSA3LzI2LzIwMjEgDQpTdHVkZW50LFNBU0lELFRhc2ssQ291cnNlICxEYXRlLEFj” could not be found and is required to complete this operation.
Import_Tracker_Banks_Remote in file ###

So I re-encoded it and put it back in the field and then pulled it back out:

Set Field [ Globals::Import ; Base64Decode ( $text ; "import.csv" ) ] 
Set Variable [ $Import ; Value: Globals::Import ]

And now when I run it, it says it "can't find the file import.csv".

What am I missing?

Hey @JasonMark

Apologies for the brevity -- just saw this on a quick break:

Looks as though you expect Import Records to import data straight from a variable, but that's a mistaken notion (barring a new feature I haven't heard of). Import Records expects a path to a file on disk -- not the actual data itself.

So, after getting the data downloaded into the $text variable, you have the option of writing that data out to disk at a known path, and then telling Import Records to import from that path. If you are dealing with a whole set of CSV rows, that seems like the best option to me. If, on the other hand, you only need to extract a handful of values from the file, you could parse them directly out of the $text variable instead. But for a full set of CSV rows, I think it better to write to disk, then import from disk.
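A common pattern for the write-to-disk step is to round-trip the text through Base64 into a container field, then export that container to a known path. A rough sketch (Globals::Import is assumed here to be a container field; names and path handling are placeholders you may need to adjust):

# Turn the downloaded text into a file held in a container field
# (assumes Globals::Import is a container field)
Set Field [ Globals::Import ; Base64Decode ( Base64Encode ( $text ) ; "import.csv" ) ]
# Write the container's file to a known path on disk
Set Variable [ $path ; Value: Get ( TemporaryPath ) & "import.csv" ]
Export Field Contents [ Globals::Import ; "$path" ]
# Now import from that path
Import Records [ With dialog: Off ; Table: ImportTrackerBanks ; "$path" ; Add ; UTF-8 ]

On FileMaker 18 or later, the Create Data File / Open Data File / Write to Data File script steps are another way to write $text straight to disk without the container detour.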

HTH!


As an option, a separate 'import table' (a staging table) can be used, from which data are copied to the target table. This breaks the import into two steps, allowing imported data to be verified and, if necessary, dropped in whole or in part when validation fails, without touching the target table.


+1. This is what I usually do, whether I'm importing from a csv or tab file, or posting a web form response. A little more work, but worth it, IMO.


Then you just have a relationship to the original and copy in every field manually? Or is there some sort of routine that simplifies this? @Torsten @bhamm

Do you need a separate import table for each import this way?

Each time, before importing a batch of records from a file, the import table is cleared (-> Truncate). Then the file is imported into the table.
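In script-step form, the clear-and-import might look roughly like this (table name is a placeholder; $path is assumed to hold the path of the downloaded file already written to disk):

# Clear the staging table before each run
Truncate Table [ With dialog: Off ; ImportTable ]
# Import the new file into the staging table
Import Records [ With dialog: Off ; Table: ImportTable ; "$path" ; Add ; UTF-8 ]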

Then, a validation routine checks each record of the new import for mandatory requirements (e.g. is field x non-empty? does the content of field y have the right format? etc.).
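Such a check could be sketched as a single calculation per record, for example (field names here are guesses based on the sample data and purely illustrative):

# Record the first validation failure, or leave empty if the record passes
Set Field [ ImportTable::ValidationError ;
  Case (
    IsEmpty ( ImportTable::SASID ) ; "SASID is empty" ;
    IsEmpty ( ImportTable::Date ) ; "Date is empty" ;
    // add further format checks here
    "" )
]

Records with a non-empty ValidationError can then be excluded from the copy step, or the whole batch rejected.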

Then, all records that have passed the test are copied to the target table. This can be done in a loop:

# We start on a layout with the context of ImportTable, on the first record
Go to Record/Request/Page [ First ]
Loop
  # Build a JSON object with all fields to be copied from this ImportTable record
  Set Variable [ $json ; Value: JSONSetElement ( "{}" ; /* one [key ; value ; type] triple per field */ ) ]
  # Switch context and create the record in the target table
  Go to Layout [ TargetTable ]
  New Record/Request
  # Copy values from the JSON into the TargetTable fields (FieldX is a placeholder)
  Set Field [ TargetTable::FieldX ; JSONGetElement ( $json ; "FieldX" ) ]
  # ... one Set Field per field to copy ...
  Go to Layout [ ImportTable ]
  # Go to the next record of ImportTable; leave the loop after the last one
  Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop
# Done

I'll add to this the ability to pre-process data. For example: the source data uses a US date format but you need an ISO date format; you want to calculate compound values; etc. This table has no index to update, so I say take advantage of it.
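For instance, the date reformatting could be a single Set Field (or Replace Field Contents) run against the staging table. A sketch, assuming US-style source dates and placeholder field names:

# Convert a US-format date like 7/26/2021 to ISO 8601 (2021-07-26)
# (GetAsDate assumes the file's date format is US-style month/day/year)
Set Field [ ImportTable::DateISO ;
  Let ( d = GetAsDate ( ImportTable::DateUS ) ;
    Year ( d ) & "-" & Right ( "0" & Month ( d ) ; 2 ) & "-" & Right ( "0" & Day ( d ) ; 2 )
  )
]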


This is all super helpful. Thank you @Torsten and others. Sorry about the slow reply.