dBase import of large files

One of my early uses of GS-Base was to query super-large .dbf files (created and managed with a Clipper-compiled program that used Advantage Database Server to break the 2 GB file-size barrier).

I don’t quite remember now what had to be done to allow GS-Base to import files from this environment, but I’ve just needed to go back to one of these files and it has stopped working. After a few minutes of furiously importing data, GS-Base reports that the source file can’t be read.

This also happens if I constrain the read to just 500,000 records instead of the full 29,000,000, and also if I start from record 500,000 and import from there, so I’m guessing it’s not a corrupt record in the early part of the dbf.

Unfortunately, I no longer have access to the original environment, so doing anything with the data in anything other than GS-Base is going to be a challenge…

Any ideas of anything I can try?

If a record has only several fields or so, a few minutes for this number of records is far too long. The dbf file header may be corrupt. Do the records contain memo fields (and if so, what’s the total size of both files, the .dbf and the memo file)?
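
If you can run a short script against the file, one quick check is to read the fixed 32-byte header at the start of the .dbf and compare what it claims against the actual size on disk. Below is a minimal Python sketch assuming a standard xBase header layout; the path is just a placeholder.

    import os
    import struct

    def inspect_dbf_header(path):
        """Read the fixed 32-byte DBF header and compare the size it
        implies with the actual size of the file on disk."""
        file_size = os.path.getsize(path)
        with open(path, "rb") as f:
            header = f.read(32)
        if len(header) < 32:
            raise ValueError("file is shorter than a DBF header")

        version = header[0]
        num_records, header_len, record_len = struct.unpack_from("<IHH", header, 4)

        # Bit 0x80 of the version byte normally signals an attached memo
        # (.dbt/.fpt) file; FoxPro variants (0x30, 0x31, 0xF5) can also carry memos.
        has_memo = bool(version & 0x80) or version in (0x30, 0x31, 0xF5)

        # A well-formed file is header + records, usually followed by one 0x1A EOF byte.
        expected = header_len + num_records * record_len
        print(f"version byte     : 0x{version:02X} (memo: {has_memo})")
        print(f"record count     : {num_records:,}")
        print(f"header length    : {header_len}")
        print(f"record length    : {record_len}")
        print(f"expected size    : {expected:,} (+1 for the EOF marker)")
        print(f"actual file size : {file_size:,}")
        if file_size < expected:
            print("file is SHORTER than the header claims -- truncated data or a bad record count")

    inspect_dbf_header(r"C:\data\bigfile.dbf")  # placeholder path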

Please try doing the following:

  1. Load just the first record. If this works, keep increasing the count 10x until it fails (see the sketch after this list for narrowing down where the failure occurs).
  2. Open the Task Manager, set it to display the current memory usage, then start opening the dbf again and watch for any unexpectedly large allocations.
  3. Check the “Options > Maximum number of records” parameter.
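
If the header looks sane, a script can also scan the record area directly to find roughly where reading breaks down. The sketch below (again Python, again assuming a standard xBase layout; the path and scanned range are placeholders) reports the first record whose deletion-flag byte isn’t a space or an asterisk, which is often where a damaged file starts to go wrong.

    import struct

    def scan_records(path, start=0, stop=None, chunk=100_000):
        """Walk the record area of a DBF file and report the first record
        whose deletion-flag byte is neither ' ' (live) nor '*' (deleted)."""
        with open(path, "rb") as f:
            header = f.read(32)
            num_records, header_len, record_len = struct.unpack_from("<IHH", header, 4)
            stop = num_records if stop is None else min(stop, num_records)

            for first in range(start, stop, chunk):
                count = min(chunk, stop - first)
                f.seek(header_len + first * record_len)
                data = f.read(count * record_len)
                if len(data) < count * record_len:
                    print(f"short read at record {first + len(data) // record_len:,}")
                    return
                for i in range(count):
                    flag = data[i * record_len]
                    if flag not in (0x20, 0x2A):  # ' ' = live, '*' = deleted
                        print(f"suspect record {first + i:,} (flag byte 0x{flag:02X})")
                        return
        print("no obviously bad records in the scanned range")

    scan_records(r"C:\data\bigfile.dbf", start=0, stop=1_000_000)  # placeholder path/range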