Question: How to increase RAM for R scripts run in computer cluster
4 months ago, alhafidzhamdan wrote:

Hi all, this is probably not specific to facets, but I am struggling to run facets.R on an HPC (Linux) cluster.

My command:

Rscript --vanilla facets.R -i $PATIENT_ID -f $INPUT -o $OUTPUT -c $CVAL

Loading required package: pctGCdata
Taking input= as a system command ('gunzip -c /exports/igmm/eddie/WGS/variants/cnv/facets/pileups/E13.csv.gz') and a variable has been used in the expression passed to input=. Please use fread(cmd=...). There is a security concern if you are creating an app, and the app could have a malicious user, and the app is not running in a secure environment; e.g. the app is running as root. Please read item 5 in the NEWS file for v1.11.6 for more information and for the option to suppress this message.
Error: cannot allocate vector of size 2.3 Gb
Execution halted

I've given the job 32 GB of RAM through the cluster settings, but it keeps producing this error. Is there a way around it? How do you increase the RAM available to R scripts run through Rscript?

I've tried ulimit, but I got the same error.
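For reference, a minimal sketch of the job-submission side. The /exports/igmm/eddie path suggests a Grid Engine (SGE) style cluster, but the directive names here (h_vmem, sharedmem) are assumptions and vary by site; on SGE, h_vmem is typically granted *per slot*, and the shell's own ulimit inside the job is what actually caps R's allocations:

```shell
#!/bin/bash
# Hypothetical SGE job script -- directive names (h_vmem, sharedmem)
# are assumptions; check your cluster's documentation.
#$ -l h_vmem=32G       # per-slot memory on most SGE setups
#$ -pe sharedmem 4     # 4 slots x 32G = 128G for the whole job
#$ -cwd

# Print the limit the job actually runs under: if the address-space
# limit is lower than what the scheduler granted, R cannot allocate
# large vectors regardless of how much RAM the node has.
limit=$(ulimit -v)
echo "address-space limit (KB): $limit"
if [ "$limit" != "unlimited" ] && [ "$limit" -lt $((32 * 1024 * 1024)) ]; then
    echo "warning: shell limit is below the 32 GB requested"
fi

# Rscript --vanilla facets.R -i "$PATIENT_ID" -f "$INPUT" -o "$OUTPUT" -c "$CVAL"
```

Comparing the printed limit against the scheduler request is usually the quickest way to tell whether the allocation failure comes from the scheduler or from a shell limit.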


linux ram R cluster

Please show what you are running, i.e. the code of that script.

written 4 months ago by ATpoint

Essentially this ->

written 4 months ago by alhafidzhamdan

Apologies, I copy/pasted the wrong link from my bookmarks; TL;DR: set ulimit globally.

modified 4 months ago • written 4 months ago by bruce.moran

That's for Windows; I'm using Linux, i.e. a compute cluster.

written 4 months ago by alhafidzhamdan
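On Linux, the "set ulimit globally" suggestion amounts to raising the shell's soft limits inside the job script itself, before launching R. A minimal sketch, assuming the cluster's hard limits permit it (a soft limit can never exceed the hard limit):

```shell
#!/bin/bash
# Sketch: raise the soft address-space limit up to the hard limit
# before starting R. If the hard limit itself is too low, only a
# bigger scheduler request (or the cluster admins) can change that.
hard=$(ulimit -H -v)
ulimit -S -v "$hard" 2>/dev/null || true
echo "soft address-space limit now: $(ulimit -S -v)"

# Rscript --vanilla facets.R -i "$PATIENT_ID" -f "$INPUT" -o "$OUTPUT" -c "$CVAL"
```

If this still prints a value well under the job's memory request, the limit is being imposed elsewhere (scheduler epilogue, PAM limits, or cgroups), which is a question for the cluster support team.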
Powered by Biostar version 2.3.0