[MonetDB-users] Parallel COPY INTO
22 Sep 2011, 5:41 p.m.
I have a large table with about 50 billion records. The files to be loaded are not in CSV format. The files are processed in Java and the data is uploaded to the database using COPY INTO through the MapiSocket interface. I do not have the record count readily available.
My question is this: If I open multiple MapiSockets and upload multiple streams of data at the same time to the same table, is this supported and stable?
I am asking because the database appears to become unresponsive after the bulk load (mclient seems to hang on connect). If I stop and then start the database after the bulk load, I get: "failed to fork mserver: database 'test_db' has inconsistent state (sabaoth administration reports running, but process seems gone)"
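For context, the per-connection upload described above amounts to sending a COPY INTO ... FROM STDIN statement and then streaming delimited records over the MAPI connection. The following is a minimal sketch of the client-side string handling only; the class name `CopyIntoBuilder` is illustrative and not part of any MonetDB API, and the actual socket writing via `MapiSocket` is omitted.

```java
// Sketch: build the COPY INTO statement and the delimited records that
// would be streamed over a MAPI connection. Illustrative only; not a
// MonetDB API.
public class CopyIntoBuilder {

    // Without a record count, MonetDB reads records until end of input.
    static String statement(String table) {
        return "COPY INTO " + table
             + " FROM STDIN USING DELIMITERS ',', '\\n', '\"';";
    }

    // Quote a field, escaping embedded backslashes and double quotes.
    static String field(String v) {
        return "\"" + v.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }

    // Join quoted fields into one newline-terminated record.
    static String record(String... fields) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) sb.append(',');
            sb.append(field(fields[i]));
        }
        return sb.append('\n').toString();
    }

    public static void main(String[] args) {
        System.out.println(statement("test_tbl"));
        System.out.print(record("a\"b", "c"));
    }
}
```

Each parallel MapiSocket connection in the scenario above would issue its own such statement against the same table, which is exactly the concurrency pattern the question asks about.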
participants (1)
- zack_pcd