Hello all,
I'm working on a project and we are planning to use MonetDB.
Our Data: We have around 15 million users and 6,000 columns.
Our Problem: We need to be able to query specific log data. We use Hadoop to massage the data and create an SQL file with SQL inserts (around 15 million inserts, each with 3,000 columns).
Our Question: Is it efficient to run a command like the one below? By efficient I mean: between 1 and 6 hours is fine; anything beyond that is a big problem.
mclient -u ....... < my_sql_insert.sql
with 15 million inserts and between 1,000 and 3,000 columns? (I'm not always going to create inserts for all 6,000 columns.)
Could MonetDB handle multiple of those commands if I split the file?
mclient -u ....... < my_sql_insert_1.sql
mclient -u ....... < my_sql_insert_2.sql
mclient -u ....... < my_sql_insert_3.sql
...
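The splitting itself could be done with the standard `split` tool; a minimal sketch (file names, dummy statements, and the tiny chunk size are just for illustration):

```shell
# Illustrative only: chop a big SQL script into fixed-size chunks.
# Real chunks would be millions of lines, not 2.
printf 'INSERT 1;\nINSERT 2;\nINSERT 3;\nINSERT 4;\n' > my_sql_insert.sql
split -l 2 my_sql_insert.sql my_sql_insert_part_
# split names the chunks my_sql_insert_part_aa, my_sql_insert_part_ab, ...
ls my_sql_insert_part_*
```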
Is SQL COPY more efficient than an SQL script file with inserts?
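For reference, the bulk-load form I'm asking about is MonetDB's COPY INTO; a sketch of what I have in mind (table name, record count, file path, and delimiters are made up, and exact syntax may vary by version):

```sql
-- Hypothetical sketch: load a delimited file in one statement
-- instead of millions of individual INSERTs.
COPY 15000000 RECORDS INTO user_logs
FROM '/tmp/user_logs.csv'
USING DELIMITERS ',', '\n', '"'
NULL AS '';
```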
Are BAT files significantly faster? I'm trying to avoid this approach since it seems a little bit obscure.
Any other suggestions?
I'm not able to run all these tests and get the info myself because I will not have a MonetDB instance ready soon.
Thank you very much; I would be glad to send back an email with the results I get and contribute in that way.
Federico