I'm working with rasters in R. Extracting data from them at point coordinates, to be exact. However, my issue isn't actually with working with the rasters so much as with reading in the specific rasters I want to work with at any given time.
I have 35 years' worth of raster data, and each raster is named in a way that reflects its date. For example: "raster.01.01.1990.tif"
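To show the naming pattern concretely, here are a few made-up filenames (the exact dates are invented, but the layout matches my files); the four-digit year always sits in the same spot, right before the extension:

example_files <- c("raster.01.01.1990.tif",
                   "raster.01.02.1990.tif",
                   "raster.01.01.1991.tif",
                   "raster.01.01.2016.tif")
# The year can be pulled out of the name with a regular expression if needed
sub("^.*\\.([0-9]{4})\\.tif$", "\\1", example_files)
# "1990" "1990" "1991" "2016"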
All rasters are in one folder. Sometimes I need to extract data from all of them, and in that case it's straightforward: set the wd, create a list of the rasters, and read those rasters into R.
library(raster)

setwd("C:/Users/User/Folder/Rasters")
f <- list.files(getwd())      # every file in the folder
ras <- lapply(f, raster)      # read each file in as a RasterLayer
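(Side note: if files other than the rasters ever end up in that folder, the listing can be restricted to .tif files via the pattern argument of list.files; sketched here just in case it matters:)

f <- list.files(getwd(), pattern = "\\.tif$")   # only keep .tif files
ras <- lapply(f, raster)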
No problem. I can then do what I need to do. However, sometimes I only want to work with certain years. In that case, I've been excluding rasters from my file list based on whether their filenames contain the years I don't want.
setwd("C:/Users/User/Folder/Rasters")
f <- list.files(getwd())
# Choose years to exclude, e.g. 2010, 2015, and 2016
f <- f[lapply(f, function(x) length(grep("2010|2015|2016", x, value = FALSE))) == 0]
ras <- lapply(f, raster)
This approach works; however, I can't help feeling that there is probably a much more elegant solution. In particular, if I just want to work with 3 years of data, I have to manually exclude the other 32 years' worth. Sure, it doesn't take that long to type out... but it is inefficient.
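To make the annoyance concrete, here's roughly what keeping only three years looks like with the current approach (the 1990-2024 range and the chosen years are just placeholders for illustration):

# Keep only 1990, 1995 and 2000 (arbitrary examples) by excluding the
# other 32 years, all typed out by hand:
drop_pattern <- paste(
  "1991|1992|1993|1994|1996|1997|1998|1999",
  "2001|2002|2003|2004|2005|2006|2007|2008|2009|2010",
  "2011|2012|2013|2014|2015|2016|2017|2018|2019|2020",
  "2021|2022|2023|2024",
  sep = "|"
)
f <- f[lapply(f, function(x) length(grep(drop_pattern, x, value = FALSE))) == 0]
ras <- lapply(f, raster)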
Is there a more efficient way to exclude or include files based on file names than the method I'm using above?
Thank you!