It is not immediately apparent from this error message that I had run out of memory in R:
Error: negative length vectors are not allowed
Execution halted
Basically, I loaded read depth information per genomic position (3 billion data points) into R, hoping it would work out. After googling around, it turns out that:
- R works in RAM
- The maximum length of an R object is 2^31 - 1 (about 2.1 billion elements), so 3 billion positions can't fit in a single vector
Not sure if there's a solution out there ... still pretty noob in R ..
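A quick sketch of why the message looks the way it does (lengths were 32-bit signed integers internally before R 3.0.0):

# 2^31 - 1 is the classic vector length limit: the largest 32-bit signed integer
.Machine$integer.max           # 2147483647
3e9 > .Machine$integer.max     # TRUE: 3 billion positions exceed the limit
# When an internal length computation overflows past 2^31 - 1, it wraps around
# to a negative number, hence "negative length vectors are not allowed"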
These didn't seem to offer help with my simple problem:
http://www.r-bloggers.com/why-we-need-to-deal-with-big-data-in-r/
http://www.revolutionanalytics.com/products/enterprise-big-data.php
Have you looked at the bigmemory and related libraries? http://www.bigmemory.org/
They work with memory-mapped files, so you don't have to store everything in RAM. However, there are some limitations; for instance, at least in some cases your matrices can only contain numeric data.
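A minimal sketch of how that might look (the file names and dimensions here are illustrative, not from the actual analysis):

library(bigmemory)

# File-backed matrix: the data live in depth.bin on disk, not in RAM.
# bigmemory matrices hold a single numeric type (here "integer").
depth <- filebacked.big.matrix(nrow = 3e9, ncol = 1, type = "integer",
                               backingfile = "depth.bin",
                               descriptorfile = "depth.desc")

# Ordinary matrix indexing; only the touched pages are pulled into memory
depth[1:10, 1] <- 0L
depth[1:10, 1]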
Hiya, is this blog your sole portal, or do you personally have others?
Well, it does read like a portal, but it's more of a personal log than a public log.
You might be interested in the R package 'ff', which maps your oversized variable onto a disk-based file. That is of course slow, but it significantly raises your size limit.
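Roughly along these lines (using a toy length here; whether a single ff vector can hold all 3 billion values may depend on the ff version, so chunking could be needed):

library(ff)

# Disk-backed vector: values are stored in a file and paged in on access
depth <- ff(vmode = "integer", length = 1e6)

# Indexing works like a normal vector, but reads/writes go through the file
depth[1:5] <- c(10L, 12L, 9L, 14L, 11L)
depth[1:5]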
Also try using R's 64-bit version for really big data.
R 3.0 includes full 64-bit vector support ("long vectors", up to 2^52 elements on 64-bit builds).
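So on a 64-bit build of R 3.0.0 or later, the naive approach should just work, memory permitting; a sketch (assuming roughly 12 GB of free RAM, since integers take 4 bytes each):

# Long vectors: on 64-bit R >= 3.0.0, length can exceed 2^31 - 1
x <- integer(3e9)   # about 12 GB of RAM for 3 billion integers
length(x)           # 3000000000 (returned as a double for long vectors)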
Nice! Looking forward to it. The final release is scheduled for April 3, 2013.