
I have a land cover map in TIF format that I would like to use to calculate an area-weighted yearly mean temperature for Germany. I downloaded the land cover data from here (direct download link of the land cover map for Europe). In particular, I intend to extract land/soil cover data for cities, agricultural areas and so on. As a first step, I imported the land cover data with the raster package. Here is my R script:

library(raster)
library(R.utils)
library(maptools)

url = "https://cidportal.jrc.ec.europa.eu/ftp/jrc-opendata/LUISA/PrimaryOutput/Europe/REF-2014/JRC_LUISA_Input_Corine_land_cover_2006_r_ref_2014.zip"
download.file(url, basename(url))
unzip(basename(url))  # the file is a .zip archive, so unzip() rather than gunzip()
rname <- list.files(getwd(), "tif$")  # locate the extracted GeoTIFF

land_cover = raster::brick("~/LUISA_CLC_land_coverage/clc06_r.tif")

So far I am able to import the original land cover map as a RasterBrick object in R. Note that the original map covers the whole of Europe, so I have to crop it to the region I am interested in. To do so, I used the maptools and raster packages to clip the map. Here is the R script:

data(wrld_simpl)

# subset Germany and reproject it to the raster's CRS before cropping
germany <- wrld_simpl[wrld_simpl@data$NAME == "Germany",]
germany <- spTransform(germany, CRSobj = land_cover@crs)
germany_land <- crop(land_cover, germany)

However, I assume this cropped land cover RasterBrick would be better represented as a grid shapefile at very high resolution, but how can that be done? Any idea?
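For illustration only: something like raster::rasterToPolygons would in principle turn the raster cells into polygons, though I doubt it scales to a 100 m grid over all of Germany, so treat this as a sketch rather than a working approach:

# probably far too slow and memory-hungry at this resolution; shown only to
# illustrate what a "grid shapefile" of the raster would mean
germany_grid <- rasterToPolygons(germany_land)

# writeOGR(germany_grid, dsn = getwd(), layer = "germany_grid",
#          driver = "ESRI Shapefile")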

The main point of this question is that I need to retrieve the land/soil cover data for cities and agricultural areas and match this information to the respective Germany NUTS-3 level shapefile (download link of Germany level 3 shapefile).

I don't yet have a solid idea of how to use the data in this land cover map to calculate an area-weighted yearly mean temperature. A possible approach could be to retrieve the land/soil cover data for cities and agricultural areas, then match it to the Germany NUTS-3 level shapefile.

Here is how I get Germany's NUTS-3 shapefile in R:

library(maptools)
library(rgdal)
library(R.utils)

url = "http://ec.europa.eu/eurostat/cache/GISCO/distribution/v2/nuts/download/ref-nuts-2013-03m.shp.zip"
download.file(url, basename(url))
unzip(basename(url))  # again a .zip archive, so unzip()

getwd()
setwd("~/ref-nuts-2013-03m.shp/")
list.files(pattern = 'NUTS_RG_03M_2013.*.shp$')

eu <- readOGR(dsn = getwd(), layer ="NUTS_RG_03M_2013_4326_LEVL_2")
Germany_NUTS3 <- eu[eu@data$CNTR_CODE == "DE",]

So, using this Germany NUTS-3 shapefile, Germany_NUTS3, I want to extract the land/soil cover data for cities, agricultural areas and so on.

How can I extract these data from the land cover map (RasterBrick) in R? Is that doable? Is there an efficient way to get this done? Any idea?

What kind of information? The land cover data is just a raster with a class assigned to each pixel, so off the top of my head you could extract the pixels for each feature in your shapefile and calculate the percentage of each class. – Val

In the end it will depend on what your goal is ... I was just assuming, since you have NUTS regions and a land cover raster, that this would be a natural way to aggregate the information to each region. So let's say one of these regions is covered by 1000 pixels; if 100 of them are labelled "bare soil", you could argue that 10% of that region is covered by bare soil (of course this conclusion will depend on the quality of your land cover data, and the more pixels, the more robust the conclusion). – Val

In the raster, each pixel has a distinct class ... so you could extract all the pixels for each polygon (have a look at the raster package's extract function). That would give you an idea about the distribution of land cover classes within each polygon. Further, you could get an estimate of area through methods like pixel counting, which directly relates to the raster's resolution ... but this is turning into a theoretical discussion, and SO isn't really the place for that. I suggest you read some literature and do some googling (e.g. land cover area estimate from land cover classification). – Val

Like this or this ... If you have a clear idea of what you want to do, you can think about the programmatic implementation. Hope that helps! – Val

1 Answer


As we discussed in the comments and in chat, this is a quick and dirty method (the JRC approach, amongst others, would certainly be a better way to do it, but also much more effort).

So you have your land cover raster land_cover and your NUTS regions over Germany, Germany_NUTS3:

# since the file has only one layer, raster() is sufficient (no need for brick())

land_cover <- raster::raster("~/LUISA_CLC_land_coverage/clc06_r.tif")

eu <- readOGR(dsn = getwd(), layer ="NUTS_RG_03M_2013_4326_LEVL_2")
Germany_NUTS3 <- eu[eu@data$CNTR_CODE == "DE",]

# you can't use Germany_NUTS3 directly for cropping the land cover (different CRS), so we project it to the raster's coordinate system first

Germany_NUTS3_projected <- spTransform(Germany_NUTS3, CRSobj = land_cover@crs)

land_cover_germany <- crop(land_cover, Germany_NUTS3_projected)
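A quick visual check (optional, my addition) that the cropped raster and the reprojected polygons line up:

# plot the cropped raster with the NUTS boundaries overlaid
plot(land_cover_germany)
plot(Germany_NUTS3_projected, add = TRUE)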

Now you can extract the land cover pixels for the NUTS regions using extract from the raster package.

BE AWARE: This can take some time, especially if the area or the raster is big or you have many polygons. If you need to do this repeatedly, you might want to use a different approach.
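If speed becomes a problem, one alternative worth sketching (my suggestion, not part of the original workflow; it assumes you are willing to install the sf and exactextractr packages) is exact_extract, which is typically much faster on large polygon sets:

library(sf)
library(exactextractr)

# convert the sp object to sf, since exact_extract expects sf polygons
nuts_sf <- st_as_sf(Germany_NUTS3_projected)

# with the default fun = NULL this returns one data.frame per polygon,
# holding each pixel 'value' and its 'coverage_fraction'
pixel_extract_fast <- exact_extract(land_cover_germany, nuts_sf)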

As an example, I'll do it for one of the NUTS regions:

pixel_extract <- raster::extract(land_cover_germany,Germany_NUTS3_projected[2,])

If you extract over several polygons at once, pixel_extract will be a list of vectors of pixel values, with each element representing a different polygon.

In this example, there's only one element, showing the land cover class values of the pixels within this polygon:

> head(pixel_extract)
[[1]]
   [1] 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24
  [24] 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24
  [47] 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24
  [70] 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24
  [93] 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24
 [116] 24 24 24 24 24 24 24 24 24 24 24 24 24 24 23 23 24 24 24 24 24 24 24
 [139] 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24 24  4 ...
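A compact way to see the class distribution for that polygon (not part of the original answer, just a convenience) is table():

# pixel counts per land cover class within the polygon
table(pixel_extract[[1]])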

Now, to derive the area covered by your classes of interest, you need to count the pixels and multiply the count by the area of a single pixel. Since a single pixel is 100 by 100 metres, its area is 10,000 m².
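If you would rather read the pixel size from the raster than hard-code it (just a small sanity check), raster::res gives the cell resolution in map units:

# should be c(100, 100) for this dataset, i.e. 100 m by 100 m cells
raster::res(land_cover_germany)

# area of a single pixel in square metres
pixel_area <- prod(raster::res(land_cover_germany))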

To identify the raster values of your classes of interest, we can take a look at the LUISA_legend.xls file which comes with the raster (see the snippet after the table if you prefer to read it in R):

   GRID_CODE CLC_CODE LABEL
    1   111 Artificial surfaces
    2   112 Artificial surfaces
    3   121 Artificial surfaces
    4   122 Artificial surfaces
    5   123 Artificial surfaces
    6   124 Artificial surfaces
    7   131 Artificial surfaces
    8   132 Artificial surfaces
    9   133 Artificial surfaces
    10  141 Artificial surfaces
    11  142 Artificial surfaces
    12  211 Agricultural areas
    13  212 Agricultural areas
    14  213 Agricultural areas
    15  221 Agricultural areas
    16  222 Agricultural areas
    17  223 Agricultural areas
    18  231 Agricultural areas
    19  241 Agricultural areas
    20  242 Agricultural areas
    21  243 Agricultural areas
    22  244 Agricultural areas
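To pull the legend into R instead of opening the spreadsheet, here is a small sketch; the file name comes with the download, but the exact path and column names on your machine may differ:

library(readxl)

# assuming the legend file sits next to the extracted raster
legend <- readxl::read_excel("~/LUISA_CLC_land_coverage/LUISA_legend.xls")
head(legend)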

So to count the pixels, we just check which values are between 1 and 11 for artificial surfaces and between 12 and 22 for agricultural areas. To get an area "estimate" we multiply the number of pixels by the area of a single pixel.

areas <- data.frame(ARTIFICIAL_AREA  = sum(pixel_extract[[1]] >= 1  & pixel_extract[[1]] <= 11) * (100 * 100),
                    AGRICULTURE_AREA = sum(pixel_extract[[1]] >= 12 & pixel_extract[[1]] <= 22) * (100 * 100))

This gives you an area estimate in square meters:

> areas
  ARTIFICIAL_AREA AGRICULTURE_AREA
1       954030000       9282970000

If pixel_extract is a list with one element per polygon (i.e. you used the full shapefile), you can calculate the areas with lapply and use do.call(rbind, ...) to combine them into a single data frame:

areas <- lapply(pixel_extract, function(x) {
  data.frame(ARTIFICIAL_AREA  = sum(x >= 1  & x <= 11) * (100 * 100),
             AGRICULTURE_AREA = sum(x >= 12 & x <= 22) * (100 * 100))
})

areas_df <- do.call(rbind, areas)
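As a last step (my own extension of the answer, so adjust as needed), you could attach these areas to the NUTS attribute table and express them as a share of each region's area. extract and lapply preserve the polygon order, so a simple cbind works; gArea needs the rgeos package and a projected CRS in metres, which should be the case here (the raster uses an equal-area projection):

# attach the per-region class areas to the polygons' attribute table
Germany_NUTS3_projected@data <- cbind(Germany_NUTS3_projected@data, areas_df)

# optional: express them as a percentage of each region's total area
library(rgeos)
Germany_NUTS3_projected$REGION_AREA    <- gArea(Germany_NUTS3_projected, byid = TRUE)
Germany_NUTS3_projected$ARTIFICIAL_PCT <-
  100 * Germany_NUTS3_projected$ARTIFICIAL_AREA / Germany_NUTS3_projected$REGION_AREA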