I am creating a very similar database with data at about 2 kHz. There are 8 sensors, each outputting data at 2 kHz. I set up a 2-D array with time on one dimension and sensor number on the other. I am having some difficulty importing the data, or rather the imports are taking a long time. I would like to optimise the import time, especially since we have more than 2 years' worth of data to upload.
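For reference, the schema looks roughly like the following (value:double is just a stand-in for my actual attribute name and type):

    iquery -q "CREATE ARRAY dbName <value:double> [time=0:*,62000,0, sensor=0:7,8,0]"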
Currently I am writing the data into a ~1 GB file in this 2-D format; 1 GB is almost 30 minutes of data. Loading the whole file takes about 10 minutes, i.e. roughly 1.7 MB/s, which seems much slower than I would expect. Does anyone have ideas about what I may be doing wrong, or what alternatives I might try?
I am using this command: iquery -aq "load(dbName, 'filename')" > /dev/null
The reason for dumping the output is that an import seems to have 3 phases: first it parses the file, second it loads the data, and third it prints the data back out. Printing 1 GB of data to stdout takes a very long time, so if I could disable the printing altogether, that might save some time as well.
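If I am reading the iquery options correctly, the -n (no-fetch) flag is supposed to suppress fetching and printing the query result, so something like this might skip the printing phase entirely instead of redirecting it:

    iquery -naq "load(dbName, 'filename')"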
I am also loading the data in chunks of size (62000,8) with 0 overlap. I imagine there is an optimal chunk size for imports. Might it be faster to use chunks of (62000,1) or (124000,8)?
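In schema terms (same placeholder attribute as above), those alternatives would look something like:

    [time=0:*,62000,0, sensor=0:7,1,0]    -- chunks of (62000,1)
    [time=0:*,124000,0, sensor=0:7,8,0]   -- chunks of (124000,8)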
Please let me know what you think!