redimension_store and memory error


#1

When I try the command below I get a memory allocation error. Here are the schema and the data; the array has only one row. Please let me know what I am doing incorrectly.

[code]CREATE EMPTY ARRAY abc_load
<
  id : int64,
  d1 : int64,
  a1 : int64,
  d2 : int64,
  d3 : datetime,
  a2 : string,
  a3 : string,
  a4 : double,
  a5 : double,
  a6 : double,
  a7 : double,
  a8 : double,
  a9 : double,
  a10 : double,
  a11 : string,
  a12 : string,
  a13 : datetime
>

CREATE EMPTY ARRAY abc
<
  a1 : int64,
  a2 : string,
  a3 : string,
  a4 : double,
  a5 : double,
  a6 : double,
  a7 : double,
  a8 : double,
  a9 : double,
  a10 : double,
  a11 : string,
  a12 : string,
  a13 : datetime
>
[ d1=0:*,100000,0, d3(datetime)=*,3660,1, d2=0:*,100000,0 ][/code]

[code]AFL% redimension_store(abc_load,abc);
SystemException in file: src/query/executor/SciDBExecutor.cpp function: executeQuery line: 232
Error id: scidb::SCIDB_SE_NO_MEMORY::SCIDB_LE_MEMORY_ALLOCATION_ERROR
Error description: Not enough memory. Error 'std::bad_alloc' during memory allocation.
Failed query id: 1100963861036
AFL%[/code]


#2

Hmmm. This is another example of a chunk-size problem.

Your dimensions are:

[ d1=0:*,100000,0, d3(datetime)=*,3660,1, d2=0:*,100000,0 ]

Now, each chunk in your array spans 100,000 x 100,000 positions along d1 and d2, which (if your chunks are dense) yields 10,000,000,000 values per chunk. At 8 bytes per value, that is about 80 GB per chunk, and the 3,660-long d3 chunk interval multiplies that further. As your array has 13 attributes, you're going to be filling 13 chunks at about that size.

We recommend that you size your chunks so that each of them contains about 1,000,000 values. If your array is dense, set the chunk lengths to 1,000 x 1,000. At 8 bytes per value, that yields chunks of about 8 MB each:

[ d1=0:*,1000,0, d3(datetime)=*,3660,1, d2=0:*,1000,0 ]
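To make the arithmetic behind this recommendation concrete, here is a small Python sketch. The helper name `chunk_bytes` is made up for illustration, and it assumes dense chunks at 8 bytes per value (int64/double); string attributes would make the real footprint larger.

```python
# Rough chunk-footprint arithmetic for the two dimension specs above.
# Assumption: dense chunks, 8 bytes per value (int64/double attributes);
# string attributes are variable-width, so treat these as lower bounds.

def chunk_bytes(chunk_lengths, bytes_per_value=8):
    """Approximate in-memory size of one dense chunk for one attribute."""
    cells = 1
    for length in chunk_lengths:
        cells *= length
    return cells * bytes_per_value

# Original spec: d1 and d2 both use a chunk length of 100,000.
original = chunk_bytes([100_000, 100_000])
print(f"original:  {original:,} bytes per chunk per attribute")   # 80 GB

# Suggested spec: chunk lengths of 1,000 along d1 and d2.
suggested = chunk_bytes([1_000, 1_000])
print(f"suggested: {suggested:,} bytes per chunk per attribute")  # 8 MB
```

With 13 attributes being filled at once, the original spec asks for roughly 13 x 80 GB of chunk space, which explains the `std::bad_alloc` even for a one-row load.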

#3

I reduced the chunk sizes and it worked. The array had only the one row posted above, yet with the large chunk lengths it still failed. I am using version 12.03.

Thank you.