I am new to data.table, so apologies if this is a very basic question. I have heard that data.table significantly improves computation times when working with large amounts of data, and so I would like to see whether data.table can help speed up the rollapply function.
If we have some univariate data:
library(xts)

xts.obj <- xts(rnorm(1e6), order.by = as.POSIXct(Sys.time() - 1e6:1), tz = "GMT")
colnames(xts.obj) <- "rtns"
then a simple rolling quantile with a width of 100 and p = 0.75 takes a surprisingly long time. That is, the line of code

xts.obj$quant.75 <- rollapply(xts.obj$rtns, width = 100, FUN = quantile, p = 0.75)

seems to take forever.
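(For scale, timing a small slice first gives a rough estimate of the full run, since the cost should grow roughly linearly with the number of windows; this is just a back-of-the-envelope check of my own, nothing rigorous:

small <- xts.obj$rtns[1:10000]
system.time(rollapply(small, width = 100, FUN = quantile, p = 0.75))

)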
Is there anything that data.table can do to speed things up? That is, is there a generic rolling function that can be applied?
Perhaps a routine to convert the xts object to a data.table, carry out the computation in a sped-up manner, and then convert back to xts at the end? A minimal sketch of the round trip I have in mind is below.
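(The time and rtns column names here are just my own choices; the fast rolling computation in the middle is the part I'm missing:

library(data.table)

## xts -> data.table: keep the index as an explicit column
dt <- data.table(time = index(xts.obj), rtns = as.numeric(xts.obj$rtns))

## ... the sped-up rolling computation would go here ...

## data.table -> xts: rebuild the series from the stored index
xts.back <- xts(dt$rtns, order.by = dt$time)
colnames(xts.back) <- "rtns"

)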
Thanks in advance,
hlm
P.S. I didn't seem to be getting much of a response on the data.table mailing list, so I am posting here to see if I get a better response.
P.P.S. Having a quick go at another example using data frames, the data.table solution seems to take longer than rollapply, as shown below:
> x <- data.frame(x=rnorm(10000))
> x.dt <- data.table(x)
> system.time(l1 <- as.numeric(rollapply(x,width=10,FUN=quantile,p=0.75)))
user system elapsed
2.69 0.00 2.68
> system.time(l <- as.numeric(unlist(x.dt[,lapply(1:((nrow(x.dt))-10+1), function(i){ x.dt[i:(i+10-1),quantile(x,p=0.75)]})])))
user system elapsed
11.22 0.00 11.51
> identical(l,l1)
[1] TRUE
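My guess is that the per-window overhead of calling [.data.table inside lapply dominates here, rather than the quantile computation itself. Operating on the plain numeric vector (my own comparison, not a data.table feature) should isolate that overhead:

v <- x$x  # the raw vector underlying the data.frame above
system.time(
  l2 <- vapply(
    1:(length(v) - 10 + 1),
    function(i) quantile(v[i:(i + 9)], probs = 0.75, names = FALSE),
    numeric(1)
  )
)
identical(l2, l1)  # should still match the rollapply result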