Running out of memory in R
8.3 years ago

I'm running GENIE3 to construct a gene regulatory network from a 22,000 x 30 gene expression matrix (about 20 MB on disk). When I run it, the R session fills my 4 GB of RAM and aborts with an out-of-memory error. I've also tried it on a machine with 8 GB of RAM, but the problem persists. How can I deal with this? Is there an easy way to spread the computation across multiple computers to increase memory capacity? Or is there a way for R to use virtual memory on the hard drive as needed? If so, how do I set it up? I've heard of ff and bigmemory, but I don't know how to use them.

R GENIE3 Memory • 3.7k views
8.3 years ago

How memory is used depends on the implementation. If the software wasn't built for parallelization with distributed memory, there is nothing you can do to run it across multiple machines. You don't want to fall back on virtual memory on disk (i.e., swap space); it is far too slow. The bigmemory and ff packages are meant for writing your own code to handle data larger than RAM; I could be mistaken, but I don't think you can use them to wrap existing code. At this point your options are to try the Python implementation of GENIE3 or to get enough RAM.
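For reference, this is roughly how bigmemory is used in your own code: you put the matrix in a file-backed big.matrix on disk rather than in RAM. This is only a sketch (the example matrix and file names are made up for illustration), and it won't help with GENIE3 itself, which expects an ordinary in-memory matrix:

```r
library(bigmemory)

# A stand-in for an expression matrix of the size described in the question.
expr <- matrix(rnorm(22000 * 30), nrow = 22000, ncol = 30)

# Copy the data into a file-backed matrix; the values live on disk,
# and R keeps only a small descriptor object in memory.
bm <- as.big.matrix(expr,
                    backingfile    = "expr.bin",
                    descriptorfile = "expr.desc",
                    backingpath    = tempdir())

# Later (even in a fresh R session), re-attach without loading into RAM:
bm2 <- attach.big.matrix(file.path(tempdir(), "expr.desc"))
dim(bm2)  # 22000 30
```

Note that existing functions which call `as.matrix()` or do arithmetic on a plain matrix will still pull everything into RAM, which is why this approach requires writing your own code around the big.matrix object.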

Also check that you're running a 64-bit operating system and a 64-bit version of R.
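A quick way to check this from within R (base R only, no packages needed):

```r
# A 32-bit R process is capped at roughly 4 GB of address space
# no matter how much RAM the machine has.
.Machine$sizeof.pointer  # 8 on a 64-bit build of R, 4 on 32-bit
R.version$arch           # reports the architecture, e.g. "x86_64"
```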
