Hi, 

I'm trying to build a database with around 1B rows. I'm testing on a 64-bit AMD Opteron 2200 dual-core x 2 CPU system with 16 GB of RAM and 16 GB of swap, connected to external fiber storage.

I've got a set of datafiles, each around 8 GB with 100M records.
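For what it's worth, I sanity-checked the record counts before loading with a quick shell one-liner (assuming one newline-terminated record per line; the demo file below is just a stand-in so the snippet runs anywhere):

```shell
# Create a tiny stand-in CSV so this example is self-contained.
printf 'a,1\nb,2\nc,3\n' > /tmp/datafile-demo.csv

# Count records (lines); each real datafile should report 100000000.
wc -l < /tmp/datafile-demo.csv
```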

I'm running the following commands:

copy 300000000 records into test_table from 'datafile-1.csv','datafile-2.csv','datafile-3.csv';
copy 300000000 records into test_table from 'datafile-4.csv','datafile-5.csv','datafile-6.csv';
copy 300000000 records into test_table from 'datafile-7.csv','datafile-8.csv','datafile-9.csv';

The first command completes in around 15 minutes, and so does the second. However, the third command forces the system to swap and at some point causes it to crash.

I've also tried NOT giving the record count, i.e. copy into test_table from 'datafile-X.csv', which was not only slower but also ended in the same swap-exhaustion crash.

I need to load around 3 billion records. Any thoughts on what might be the problem here?

Thanks in advance,
Regards,
Ukyo