I now understand and can run svd() in R, but when I tried tsvd it doesn't seem to work.
I also managed to load the dense linear algebra library, but I only found gesvd, which operates on dense arrays. It also looks like the array coordinates must start at zero. Is there a function that can handle sparse arrays?
Is there a way to shift or redefine the array coordinates once the data have been loaded into the template array? The approach I'm using now is subarray, which may not be practical in every case.
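For reference, this is roughly what I have been doing with subarray, which re-bases the result's coordinates at zero as a side effect (the array, attribute, and dimension names here are made up for illustration):

```
-- Hypothetical array whose dimensions do not start at zero:
--   A<val:double>[i=5:86,82,0, j=10:39,30,0]
-- subarray() selects a box and re-bases the result's dimensions at zero:
subarray(A, 5, 10, 86, 39)
-- result schema: <val:double>[i=0:81,82,0, j=0:29,30,0]
```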
However, when I tried gesvd, I got the following error:
Error in scidbquery(query, afl) :
UserException in file: src/linear_algebra/scalapackUtil/ScaLAPACKLogical.cpp function: checkScaLAPACKInputs line: 110
Error id: DLA::SCIDB_SE_INFER_SCHEMA::DLA_ERROR41
Error description: Error during schema inferring. ChunkInterval is too small.
Failed query id: 1103046567470
My array looks like this:
  No name    start length chunk_interval chunk_overlap low high type
1  0 Gn1        0     82             82             0   0   81 int64
2  1 timeint    0     30             30             0   0   29 int64
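My guess is that the complaint is about the timeint dimension's chunk interval of 30. If the ScaLAPACK-based operators need a larger chunk interval, then perhaps re-chunking with repart() before gesvd would fix it; the attribute name val and the 32x32 chunking below are assumptions on my part, not something I have confirmed:

```
-- Re-chunk both dimensions to 32 (an assumed minimum block size),
-- then ask gesvd for just the singular values:
gesvd(
  repart(A, <val:double>[Gn1=0:81,32,0, timeint=0:29,32,0]),
  'values'
)
```

Is something along these lines the intended usage, or is there a documented minimum chunk interval for these operators?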
I look forward to getting some answers.
In addition, svd() in R works, but I find it takes much longer than PCA, e.g. prcomp(), on the same R array.