Error id: scidb::SCIDB_SE_NO_MEMORY::SCIDB_LE_ARENA_EXHAUSTED

Hi

I have an 86 GB file that I am trying to upload to SciDB, but I get the following error:

iquery -aq "create array R<val1:int32>[i=0:18661890031];"
Query was executed successfully

iquery -anq "load(R, '/home/scidb/us_aster.csv', -2, 'CSV');"

SystemException in file: src/util/arena/Arena.cpp function: exhausted line: 504
Error id: scidb::SCIDB_SE_NO_MEMORY::SCIDB_LE_ARENA_EXHAUSTED
Error description: Not enough memory. An attempt was made to allocate more memory than is available.
   arena  : {name="root",available=unlimited,allocated=2.1GiB,peakusage=2.2GiB,allocations=26030,features="FCT"}
   request: 64MiB.

Please let me know how to resolve this. I am using SciDB 16.9, available through the AWS Community AMI.
The following is my config.ini file:

[mydb]
server-0=127.0.0.1,15
security=trust
db_user=dbuser
install_root=/opt/scidb/16.9
pluginsdir=/opt/scidb/16.9/lib/scidb/plugins
logconf=/opt/scidb/16.9/share/scidb/log4cxx.properties
base-port=1239
base-path=/home/scidb/data

execution-threads=9
result-prefetch-threads=8
result-prefetch-queue-size=2
operator-threads=2

mem-array-threshold=16384
smgr-cache-size=16384
merge-sort-buffer=1024
sg-receive-queue-size=32
sg-send-queue-size=16

@Samriddhi

  1. Can you create a file with the first 1000 lines of the .csv file (maybe call it /home/scidb/us_aster_1_1000.csv) and see if that imports OK? If it does, proceed with progressively larger samples to find the line number at which the import fails.
  2. Please let me know the available memory on your system (run free -m in the terminal).
  3. Next, can you give me a preview of the first 10 lines of the .csv file (you can use dummy data if you need to)? That way I can try to run some tests on the data.
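Steps 1 and 2 can be scripted roughly like this (a sketch only: the test array name R_test and the 0:999 dimension range are assumptions, not from your schema):

```shell
# Step 1: take the first 1000 lines of the CSV as a small test file
head -n 1000 /home/scidb/us_aster.csv > /home/scidb/us_aster_1_1000.csv

# Load the sample into a throwaway array (R_test is a hypothetical name)
iquery -aq "create array R_test<val1:int32>[i=0:999];"
iquery -anq "load(R_test, '/home/scidb/us_aster_1_1000.csv', -2, 'CSV');"

# Step 2: report free and available memory in MB
free -m
```

If the 1000-line sample loads cleanly, doubling the sample size (head -n 2000, 4000, ...) should narrow down roughly where the load starts to fail.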

PS: I’m going to test on more recent versions of SciDB (18.1 or 19.3). It’s unlikely that the problem has anything to do with the SciDB version, though.