Hi :>
I just tried to use GS-Base to work with a 12.8 GB .DBF file. I know the maximum file size is limited by available memory, but I couldn’t remember whether that means RAM only or RAM + virtual memory. I hoped for the latter (I only have 8 GB of RAM on this machine), but it failed.
The failure produced just an “I can’t open this file” sort of error. Would it be possible to make it a more specific “I’ve run out of memory” message, to distinguish it from “this file is broken or corrupt” problems?
(I solved my immediate problem with an ancient copy of Advantage Data Architect, once I found an installer for it.)
Hi,
If the message is “Can’t open the file.”, then either more memory is needed or the file is corrupted. If the current maximum record limit set in “Settings > Options” (64K to 256 million) were too small, the message would be different.
You should try increasing the paging file size in Windows, as the default one is probably quite small. Unless each dbf record consists of a large number of very small/short fields, this should be sufficient to open the file. You can verify this by comparing the “File Open Progress” dialog with the Windows “Task Manager > Performance > Memory” data.
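If you want a rough pre-flight check of whether RAM plus the page file can plausibly hold the loaded file, a minimal sketch in Python (assuming the third-party psutil package, and treating the on-disk size as a lower bound for the in-memory footprint) might look like this:

import os
import psutil

def can_plausibly_load(path: str) -> bool:
    # The on-disk size is only a lower bound; the real in-memory
    # footprint can be larger due to per-record/per-field overhead.
    file_size = os.path.getsize(path)
    available = psutil.virtual_memory().available  # free physical RAM
    swap_free = psutil.swap_memory().free          # unused page file space
    return file_size < available + swap_free

print(can_plausibly_load(r"C:\data\big.dbf"))  # hypothetical path

This only tells you it *might* fit; the definitive numbers are the ones you see in Task Manager while the file is actually loading.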
There are also some differences between the xBase *.dbf variants, so make sure you’re specifying the correct file format.
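If you’re unsure which variant you have, the first byte of the DBF header identifies it. A small sketch (the version-byte mapping below is a well-known but partial list; other values exist):

import struct

VERSION_BYTES = {
    0x02: "FoxBASE",
    0x03: "dBASE III+ / FoxPro (no memo)",
    0x30: "Visual FoxPro",
    0x83: "dBASE III+ with memo",
    0x8B: "dBASE IV with memo",
    0xF5: "FoxPro 2.x with memo",
}

def inspect_dbf(path: str) -> None:
    with open(path, "rb") as f:
        header = f.read(32)  # fixed 32-byte file header
    version = header[0]
    # Bytes 4-7: record count; 8-9: header length; 10-11: record length
    records, header_len, record_len = struct.unpack("<IHH", header[4:12])
    print("variant:", VERSION_BYTES.get(version, f"unknown (0x{version:02X})"))
    print(f"{records} records x {record_len} bytes, header {header_len} bytes")

inspect_dbf(r"C:\data\big.dbf")  # hypothetical path

The record count times the record length also gives you a quick estimate of how much data the program has to hold.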
Some programs may just scroll through the database, loading only a portion of records at a time. In GS-Base you can use the “load records from/to” option, which decreases memory usage proportionally. If you edit and save records within such a range, they’ll be saved correctly within the entire database.
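To illustrate why range loading scales memory with the range rather than the whole file (this is a sketch of the general technique, not GS-Base’s actual implementation): xBase records are fixed-length, so a reader can seek directly to any record number.

import struct

def read_record_range(path: str, first: int, count: int) -> list[bytes]:
    with open(path, "rb") as f:
        header = f.read(12)
        # Bytes 8-9: header length; 10-11: record length
        _, header_len, record_len = struct.unpack("<IHH", header[4:12])
        f.seek(header_len + first * record_len)  # jump straight to `first`
        return [f.read(record_len) for _ in range(count)]

# Load records 1,000,000..1,009,999 only; the memory cost is
# 10,000 records regardless of the total file size.
rows = read_record_range(r"C:\data\big.dbf", 1_000_000, 10_000)  # hypothetical path

Writing an edited record back is the same arithmetic in reverse: seek to the same offset and overwrite the fixed-length slot, which is why edits within a loaded range land correctly in the full file.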