On Tue, Jul 29, 2008 at 10:47:56PM +0300, uriel katz wrote:
I have an application with constant bulk inserts (about 50,000 rows of 1 KB each) every minute, and I am wondering which
I.e., 50 MB per minute?
is the fastest and most efficient way to insert data. I am currently using the COPY command, but it is somewhat awkward since I need to dump my data as CSV and then execute COPY to insert it. Is there a streaming method or a special API for bulk inserts?
COPY INTO is the fastest route, without special hacks, so I assume this is what you want. Dumping to CSV and then reading from a file isn't needed: you could simply create a single statement
COPY 50000 RECORDS INTO tablex FROM STDIN USING DELIMITERS ',', '\n';
with the 50,000 data lines following directly after it, then at least one extra empty line.
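The approach above can be sketched in a few lines of Python. This is a minimal illustration, not MonetDB client code: the table name "tablex" comes from the example above, and the row data is made up. The resulting payload can be piped into a client such as mclient on stdin.

```python
def build_copy_payload(table, rows):
    """Build a single COPY ... FROM STDIN statement with the data
    lines inlined after it and a trailing empty line, as described
    in the post above. Rows are tuples of values; columns are
    joined with ',' and records with '\n' to match the DELIMITERS
    clause."""
    header = ("COPY %d RECORDS INTO %s FROM STDIN "
              "USING DELIMITERS ',', '\\n';" % (len(rows), table))
    body = "\n".join(",".join(str(v) for v in row) for row in rows)
    # One extra empty line terminates the record stream.
    return header + "\n" + body + "\n\n"

# Hypothetical sample data standing in for the real 50,000 rows.
rows = [(1, "alpha"), (2, "beta")]
payload = build_copy_payload("tablex", rows)
print(payload)
```

This avoids the CSV-file round trip: the data is streamed to the server as part of the statement instead of being written to disk first.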
Also, when I issue a COPY command it uses only a little CPU, around 2% (it is a quad-core setup on Windows, so I guess this means 8% of one core), and it keeps allocating and releasing memory (I
Using only 2% CPU indicates the process is I/O-bound (either reading your CSV input or writing the output BAT files).
Does the table have any keys which may slow down the inserts/COPY INTOs?
have 8 GB of RAM), even though it never gets near 2 GB. Is this OK? What is it actually doing? P.S.: Are inserts/selects multi-threaded?
In the latest code, COPY INTO is multi-threaded.
Niels
Thanks for this awesome piece of software! -Uriel
_______________________________________________
MonetDB-users mailing list
MonetDB-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/monetdb-users