We are running R on a Linux cluster. The head node has hung a few times when a user inadvertently consumed all its memory with an R process. Is there a way to limit R's memory usage under Linux? I'd rather not impose global ulimits, but that may be the only way forward.
2 Answers
There's `unix::rlimit_as()`, which sets a memory limit for a running R process using the same mechanism as `ulimit` in the shell. Windows and macOS are not supported.
In my `.Rprofile` I have

```r
unix::rlimit_as(1e12, 1e12)
```

which caps the address space at 1e12 bytes (~1 TB); for a ~12 GB limit you would pass `12e9` instead.
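Because `rlimit_as()` uses the same address-space limit as the shell's `ulimit -v`, the cap can also be set from the shell before R starts, keeping it scoped to one session rather than global. A minimal sketch, assuming bash (where `ulimit -v` takes KiB; the 12 GiB figure is illustrative):

```shell
# Run R under a per-session address-space cap without a global ulimit.
# A subshell keeps the limit scoped to this one launch.
(
  ulimit -v 12582912   # 12 GiB expressed in KiB; inherited by child processes
  ulimit -v            # prints the active limit: 12582912
  # exec R --vanilla   # an R session started here could not allocate past ~12 GiB
)
```

Since the subshell exits after the launch, the user's login shell keeps its original limits.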
Before that, I had created a small R package, `ulimit`, with similar functionality. Install it from GitHub using

```r
devtools::install_github("krlmlr/ulimit")
```

To limit the memory available to R to 2000 MiB, call:

```r
ulimit::memory_limit(2000)
```
Now:

```r
> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb
```
For RStudio Server, you can add `rsession-memory-limit-mb=4000` to `/etc/rstudio/rserver.conf`. – GSee

`ulimit` works fine until you want to use all your cores. – otsaw
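One reason such limits interact awkwardly with parallelism is that the address-space limit is per process and inherited by children: every forked worker carries its own copy of the cap, so the cap bounds each worker individually but not the total across all of them. The inheritance is easy to observe from the shell (numbers are illustrative):

```shell
# A child process inherits the parent's address-space limit.
(
  ulimit -v 4194304     # 4 GiB cap (in KiB) set in the parent subshell
  sh -c 'ulimit -v'     # a forked child reports the same cap: 4194304
)
```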