The file “MS8xLzIwMjEgLSA3LzI2LzIwMjEgDQpTdHVkZW50LFNBU0lELFRhc2ssQ291cnNlICxEYXRlLEFj” could not be found and is required to complete this operation.
Import_Tracker_Banks_Remote in file ###
So I re-encoded it and put it back in the field and then pulled it back out:
Set Field [ Globals::Import ; Base64Decode ( $text ; "import.csv" ) ]
Set Variable [ $Import ; Value: Globals::Import ]
And now when I run it, it says "can't find the file import.csv".
Apologies for the brevity -- just saw this on a quick break:
Looks as though you're expecting the Import Records script step to import data from a variable, but that's a mistaken notion (barring a new feature I haven't heard of). Instead, Import Records expects a path to a file on disk -- not the actual data itself.
So, after downloading the data into the $text variable, one option is to write that data out to disk at a known path and then tell Import Records to import from that path. If you are dealing with a whole bunch of CSV data, that seems like a good option to me. If, on the other hand, you only need to extract a small number of values from the file, you could also parse them directly from the $text variable. But again, for a whole set of rows of CSV data, I think it better to write to disk and then import from disk.
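If it helps, here's a rough sketch of the write-to-disk route, reusing the $text variable and the Globals::Import container field from your script; the temporary path and the import options are just placeholders you'd adjust:

Set Variable [ $path ; Value: Get ( TemporaryPath ) & "import.csv" ]
Set Field [ Globals::Import ; Base64Decode ( $text ; "import.csv" ) ]
Commit Records/Requests [ With dialog: Off ]
#write the decoded container contents to disk, then import from that path
Export Field Contents [ Globals::Import ; "$path" ]
Import Records [ With dialog: Off ; "$path" ; Add ; UTF-8 ]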
As an option, a separate 'import table' can be used, from which the data are then copied to the target table. This breaks the import into two steps, allowing the imported data to be verified and, if necessary, dropped in whole or in part when a test fails, without touching the target table.
Then do you just have a relationship to the original and copy every field in manually? Or is there some sort of routine that simplifies this? @Torsten @bhamm
Do you need a separate import table for each import this way?
Each time, before importing a batch of records from a file, the import table is cleared (via Truncate Table). Then the file is imported into that table.
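In script terms, that first step might look something like this (layout and table names are placeholders, $path points at the file to load, and Truncate Table needs FileMaker 15 or later):

Go to Layout [ "Import" (ImportTable) ]
Truncate Table [ With dialog: Off ; Table: ImportTable ]
Import Records [ With dialog: Off ; "$path" ; Add ; UTF-8 ]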
Then, a validation routine checks each record of the new import for mandatory requirements (e.g. is field x not empty? does the content of field y have the right format? etc.).
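One simple way to do this is to flag each record and then find only the ones that passed; the field names below (Student, Date, ValidFlag) are only examples:

Go to Layout [ "Import" (ImportTable) ]
Show All Records
#1 = record passes the mandatory-field checks, 0 = it fails
Replace Field Contents [ With dialog: Off ; ImportTable::ValidFlag ;
    If ( IsEmpty ( ImportTable::Student ) or IsEmpty ( ImportTable::Date ) ; 0 ; 1 ) ]
#keep only the records that passed, ready for the copy loop
Enter Find Mode [ Pause: Off ]
Set Field [ ImportTable::ValidFlag ; 1 ]
Perform Find [ ]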
Then, all records that have passed the test are copied to the target table. This can be done in a loop:
#we start on a layout with the context of ImportTable
Loop
    #build a JSON with all fields to be copied from this record of ImportTable
    #go to a layout with TargetTable context
    #create a new record in TargetTable
    #copy values to the fields in TargetTable from the JSON
    #go back to a layout with the context of ImportTable
    #go to the next record of ImportTable, leave the loop after the last record
End Loop
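Spelled out as script steps, that loop could look roughly like this (field and layout names are placeholders; the found set on the Import layout is assumed to contain only the validated records):

Go to Record/Request/Page [ First ]
Loop
    #gather the fields of the current ImportTable record into a JSON variable
    Set Variable [ $json ; Value: JSONSetElement ( "{}" ;
        [ "student" ; ImportTable::Student ; JSONString ] ;
        [ "date" ; ImportTable::Date ; JSONString ] ) ]
    #switch context, create the target record and fill it from the JSON
    Go to Layout [ "Target" (TargetTable) ]
    New Record/Request
    Set Field [ TargetTable::Student ; JSONGetElement ( $json ; "student" ) ]
    Set Field [ TargetTable::Date ; JSONGetElement ( $json ; "date" ) ]
    Commit Records/Requests [ With dialog: Off ]
    #returning to the Import layout in the same window restores its found set
    #and active record, so Next continues where we left off
    Go to Layout [ "Import" (ImportTable) ]
    Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop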
I'll add to this the ability to pre-process data. For example: the source data uses a US date format but you need an ISO date format; you want to calculate compound values; etc. This table has no index to update, so I say take advantage of it.
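For instance, a US-format text date could be converted to an ISO-style date right in the staging table before the copy loop runs (DateISO is a made-up example field, and GetAsDate assumes the file itself uses US date settings):

Go to Layout [ "Import" (ImportTable) ]
Show All Records
Replace Field Contents [ With dialog: Off ; ImportTable::DateISO ;
    Let ( d = GetAsDate ( ImportTable::Date ) ;
        Year ( d ) & "-" & Right ( "0" & Month ( d ) ; 2 ) & "-" & Right ( "0" & Day ( d ) ; 2 ) ) ]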