Hi,

I am testing the performance of MonetDB/XQuery on large data.
I am using an HP xw8600 workstation with an Intel(R) Xeon(R) X5482 CPU at 3.20 GHz and 32 GB of RAM.
I generated about 50 GB of XMark data with the standard XMark generator, using a scaling factor of 500.
When I tried to load it into the database using "pf:add-doc", the following error message was returned.

ERROR = !ERROR: [shred_url]: 1 times inserted nil due to errors at tuples 0@0.
        !ERROR: [shred_url]: first error was:
        !ERROR: strPut: string heaps gets larger than 15GB.
        !ERROR: shredBAT_append_str: APPEND-STR[_prop_text](
        !ERROR: catch fain deserves liking knee katharine easy hand helms daughter corn hire amaz plainness curtains staying excuse since tom walls thirty goodman twice corrections flesh behold woundless cargo royally zwagger nuncle season repair govern piercing sighs ounce arrest ganymede balthasar perjury
        !ERROR: ), BUNappend fails
        !ERROR: CMDshred_url: operation failed.
Timer 2633290.282 msec
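
For completeness, the load was issued with a pf:add-doc() call roughly along these lines; the file URI and document name below are placeholders, not my exact values:

    (: sketch of the load call; URI and document name are placeholders :)
    pf:add-doc("file:///data/xmark-f500.xml", "xmark-f500")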


Can you help me? How can I fix it?
Thanks a lot. 

--
Xianmin Liu