Cannot export large table

I have a table with 400 fields and 30,000 records.

Trying to export to MER or XLSX, both seemingly do nothing. The dialog seems hung.

I get the export dialog, but after 15 minutes it hasn't updated anything. The progress bar is all the way to the right, but nothing has happened. It never says how many records are left to export.

Database is local. No other users.

Suggestions?

Thanks

Let it run overnight.

Sounds like something in that table isn't well optimized.


OK. Will do.

Thanks.

Another option I often use is exporting to a single FileMaker file first and moving on from there...
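For reference, that's just a plain Export Records step targeting an .fmp12 file; the path below is a placeholder:

```
# Sketch: export the current found set to a new FileMaker file (path is a placeholder)
Export Records [ With dialog: Off ; "subset.fmp12" ]
```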


I've noticed in newer versions of FMP (20, 21...) the export progress dialog is sometimes missing or stuck. If you are scripting this export, adding some "Freeze Window", "Refresh Window", or "Pause/Resume Script" steps before the export can sometimes help with this behavior, though I don't know the exact sequence that works.
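Something like this has seemed to settle it for me; treat it as a sketch, not a verified fix (the output path is a placeholder):

```
# Sketch: nudge the window/redraw state before a scripted export (unverified sequence)
Freeze Window
Refresh Window [ Flush cached join results: On ]
Pause/Resume Script [ Duration (seconds): .5 ]
Export Records [ With dialog: Off ; "export.xlsx" ]
Refresh Window
```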

The export is c-r-a-w-l-i-n-g along.

I think Christian's advice is good. It'll probably take all night; it's been running for several hours so far.

Thanks.

Thanks! I may try that if this export doesn't complete by tomorrow... :frowning:

You have something seriously wrong in this table. You mentioned in another thread that a SQL query fetching these records is also painfully slow (which, on a local file, it should not be), and now your export of supposedly the same table is also taking forever.

I'd be seriously investigating the records in this table. Do you have logging of some sort in fields? How large is the database file overall? I wonder if you have field(s) with huge amounts of text in them that are causing this issue.

A good test would be to time the export of a subset of the records, e.g. pick 5-10 and see how long that takes (a timing sketch follows below). If every record takes a long time to export, then check the size of the records/fields being exported. Do you have a very expensive/inefficient calculation you are exporting?

If all else fails, I'd run a Recover on the file and check for corruption.
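To put numbers on that subset test, here's a minimal timing sketch; file name is a placeholder:

```
# Sketch: time an export of the first 10 records (file name is a placeholder)
Show All Records
Go to Record/Request/Page [ With dialog: Off ; 11 ]
Omit Multiple Records [ With dialog: Off ; Get ( FoundCount ) - 10 ]  # keep only the first 10
Set Variable [ $start ; Value: Get ( CurrentTimeUTCMilliseconds ) ]
Export Records [ With dialog: Off ; "test_export.mer" ]
Show Custom Dialog [ "Exported " & Get ( FoundCount ) & " records in " & ( Get ( CurrentTimeUTCMilliseconds ) - $start ) & " ms" ]
```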


Recovery says all good. The table has millions of words in fields.

Seems likely that's part of the problem.


Splitting the job into parts can improve performance. Here's a discussion of the issue: Gone in 60 Seconds: Running Long Scripts at Max Speed - Google Docs
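Along those lines, here's a rough batching sketch for the export itself. It assumes a contiguous numeric serial field (here called `id`); the table name, paths, and batch size are all made up:

```
# Sketch: export in batches of 5,000 by serial-number range (names hypothetical)
Set Error Capture [ On ]                     # suppress the "no records found" dialog
Set Variable [ $batch ; Value: 5000 ]
Set Variable [ $i ; Value: 0 ]
Loop
  Enter Find Mode [ Pause: Off ]
  Set Field [ MyTable::id ; ( $i * $batch + 1 ) & "..." & ( ( $i + 1 ) * $batch ) ]
  Perform Find [ ]
  Exit Loop If [ Get ( FoundCount ) = 0 ]    # assumes the serials have no big gaps
  Set Variable [ $path ; Value: Get ( DocumentsPath ) & "export_" & $i & ".mer" ]
  Export Records [ With dialog: Off ; "$path" ]
  Set Variable [ $i ; Value: $i + 1 ]
End Loop
```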


How 'big' are the text fields? There is a limit (I believe about 10 MB) per field. FileMaker will not give any warnings, but gets really slow...

We had to read and visualize XML files. The normal size was below 200 KB, but people began sending XML with embedded scanned documents... After we realized that, we had to stop storing the XML in a field; the database was no longer 'workable' with XML texts over 10 MB. We had to delete those field contents (which was also very slow) on a local copy.
(The storage was only temporary.)
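If it helps, here's roughly how you could flag the oversized records before deciding what to delete. Table and field names are placeholders, and note that Length() counts characters, so it's only a rough proxy for bytes:

```
# Sketch: collect internal IDs of records whose text field exceeds ~10 million characters
Show All Records
Go to Record/Request/Page [ First ]
Loop
  If [ Length ( MyTable::xmlField ) > 10000000 ]
    Set Variable [ $offenders ; Value: $offenders & Get ( RecordID ) & ¶ ]
  End If
  Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop
Show Custom Dialog [ "Oversized records" ; $offenders ]
```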
