How can I use the R packages zoo or xts with very large data sets (~100 GB)? I know there are packages such as bigrf, ff, and bigmemory that can deal with data of this size, but they have a limited set of commands: they don't provide the functions of zoo or xts, and I don't know how to make zoo or xts use them. How can I do that?
I've also seen other, database-related options such as sqldf, hadoopstreaming, RHadoop, and some of the tools used by Revolution R. What do you advise? Any others?
I just want to aggregate series, cleanse data, and perform some cointegration tests and plots. I'd rather not have to code and implement new functions for every command I need, working on small pieces of data every time.
Added: I'm on Windows
Have you looked at the mmap package, which was created by Jeff Ryan (author of xts)? – CHP

mmap uses mmap on unix-alikes and MapViewOfFile on Windows. You don't need to know any of that to use the package, which is why I asked if you actually looked at (i.e. tried) the package. There's a vignette with examples and Jeff has several presentations floating around on the web. – Joshua Ulrich
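To illustrate the idea in the comments, here is a minimal sketch of reading a slice of a large on-disk file with the mmap package. It assumes the data is a flat binary file of doubles (the temp file here stands in for your real 100 GB file); only the pages you index are pulled into RAM, so the slice can then be wrapped in a zoo or xts object as usual.

```r
library(mmap)  # install.packages("mmap")

# Stand-in for a huge binary file of doubles; in practice this
# would already exist on disk and be far too big to read whole.
tmp <- tempfile()
writeBin(rnorm(1e6), tmp)

# Memory-map the file: nothing is read yet, the OS pages data
# in on demand (mmap on unix-alikes, MapViewOfFile on Windows).
m <- mmap(tmp, mode = real64())

# Extract only the slice you need into an ordinary R vector,
# e.g. to build a zoo/xts series from one chunk at a time.
x <- m[1:1000]

munmap(m)  # release the mapping when done
```

The mode argument (here `real64()` for doubles) must match how the file was written; the package vignette mentioned above covers other storage types and struct-like layouts.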