How to increase RAM for R scripts run on a computer cluster

Hi all, this is probably not specific to FACETS, but I am struggling to run facets.R on an HPC (Linux).

My command:

Rscript --vanilla facets.R -i $PATIENT_ID -f $INPUT -o $OUTPUT -c $CVAL

Loading required package: pctGCdata
Taking input= as a system command ('gunzip -c /exports/igmm/eddie/WGS/variants/cnv/facets/pileups/E13.csv.gz') and a variable has been used in the expression passed to input=. Please use fread(cmd=...). There is a security concern if you are creating an app, and the app could have a malicious user, and the app is not running in a secure environment; e.g. the app is running as root. Please read item 5 in the NEWS file for v1.11.6 for more information and for the option to suppress this message.
Error: cannot allocate vector of size 2.3 Gb
Execution halted
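
As an aside, the fread() warning is separate from the memory error; a minimal sketch of the change it asks for, assuming the pileup is read near the top of facets.R (the object name is illustrative, the path is the one from the log):

library(data.table)

# data.table recommends cmd= rather than input= when a variable is interpolated
# into the shell command; this only silences the warning, it does not change memory use
pileup <- fread(cmd = "gunzip -c /exports/igmm/eddie/WGS/variants/cnv/facets/pileups/E13.csv.gz")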
  

I've given the job 32 GB of RAM through the cluster settings, but it keeps producing this error. Is there a way around it? How do you increase the RAM available to R scripts run through Rscript?

I've tried ulimit, but I got the same error.
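
For reference, the memory request itself is made in the submission script rather than through Rscript; a sketch, assuming an SGE/Grid Engine scheduler like Eddie's (the job name, parallel environment and values are illustrative; on SGE the h_vmem limit is typically enforced per slot, so a larger h_vmem or more slots is what raises the ceiling):

#!/bin/sh
#$ -N facets_E13
#$ -cwd
#$ -l h_vmem=32G          # memory limit, usually applied per slot on SGE
#$ -pe sharedmem 4        # hypothetical PE: 4 slots x 32G = 128G for the whole job

Rscript --vanilla facets.R -i "$PATIENT_ID" -f "$INPUT" -o "$OUTPUT" -c "$CVAL"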

A

R linux cluster ram

Please show what you are running, i.e. the code of that script.


https://stackoverflow.com/questions/12582793/limiting-memory-usage-in-r-under-linux

Apologies, I copy/pasted the wrong link from my bookmarks; TL;DR: set ulimit globally.
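
Something like this at the top of the job script, before the Rscript call (a sketch; the values are illustrative, and the scheduler's own limit, e.g. h_vmem, still applies on top and generally cannot be raised from inside the job):

# raise the shell's per-process virtual memory limit for everything started below
ulimit -v unlimited

Rscript --vanilla facets.R -i "$PATIENT_ID" -f "$INPUT" -o "$OUTPUT" -c "$CVAL"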


That's for Windows; I'm using Linux, i.e. a computer cluster.


