15 votes

We are running R in a linux cluster environment. The head node has had a few hangs when a user has inadvertently taken all the memory using an R process. Is there a way to limit R memory usage under linux? I'd rather not suggest global ulimits, but that may be the only way forward.

2
I had problems with this before too (link), which might be related to your problem. The solution we ended up with was to disable memory overcommitting on the machine entirely. It is a blunt solution but has worked fine. – Backlin
If, by chance, you use RStudio Server, you can set per-user limits by adding a line like rsession-memory-limit-mb=4000 to /etc/rstudio/rserver.conf. – GSee
Is this unix.stackexchange.com/questions/44985/… useful? It is not an R-specific approach, but if you can come up with a generic per-process solution that works on your OS, you can set up an alias for R that imposes it. Something like github.com/pshved/timeout would be particularly useful. – Ben Bolker
ulimit works fine until you want to use all your cores. – otsaw

2 Answers

13 votes

There's unix::rlimit_as(), which sets a memory limit for the running R process using the same mechanism that ulimit uses in the shell. Windows and macOS are not supported.

In my .Rprofile I have

unix::rlimit_as(1.2e10, 1.2e10)

to limit memory usage to ~12 GB (rlimit_as() takes the limit in bytes).

Before that...

I had created a small R package, ulimit, with similar functionality.

Install it from GitHub using

devtools::install_github("krlmlr/ulimit")

To limit the memory available to R to 2000 MiB, call:

ulimit::memory_limit(2000)

Now:

> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb
8 votes

?"Memory-limits" suggests using ulimit or limit.

On Windows there is also a command-line flag, --max-mem-size, which sets the initial limit; the user can raise it during the session with memory.limit().