Question: Haploview problem over heap space
Ripeapple wrote (5.6 years ago, United States):

Hi, All,

I am having a problem with Haploview: Java keeps throwing errors like "Exception in thread "main" java.lang.OutOfMemoryError: Java heap space" and "Fatal Error: Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded".

My command line is: java -XX:+UseConcMarkSweepGC -Xmx40g -jar Haploview.jar -n -pedfile linkage.txt -info SNP_info.txt -dprime -blockoutput ALL (I tried allocating more memory; it still doesn't work.)

Can anyone help?

Much appreciated!


Well, that means it ran out of memory. In particular, "GC overhead limit exceeded" means the JVM is spending almost all of its time in garbage collection while reclaiming very little, so it is doing a lot of work. Haploview was designed for SNP panels; are you trying to use NGS data? Cut down the size of the input and see if that helps. The computation may require quadratic or exponential memory, so removing even a few SNPs or subjects can help a lot. I hope it's 64-bit Java.

written 5.6 years ago by karl.stamm
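If the 64-bit question is in doubt, a quick sanity check of the installed JVM looks something like this (a sketch; the exact version banner varies by vendor, but HotSpot prints "64-Bit" in it):

```shell
# A 32-bit JVM cannot address a 40 GB heap, so -Xmx40g would be
# rejected or silently capped. Check the version banner for "64-Bit".
if java -version 2>&1 | grep -q "64-Bit"; then
  echo "64-bit JVM detected"
else
  echo "WARNING: could not confirm a 64-bit JVM"
fi
```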

What exactly do you need to compute? PLINK 1.9's "--blocks" and "--r2 dprime" commands are very memory-efficient, and cover a lot (though not all) of the uses of that Haploview command.

written 5.6 years ago by chrchang523
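As a rough sketch of that suggestion (assuming PLINK 1.9 is on your PATH and your data is in a hypothetical mydata.ped/mydata.map pair; the guard just skips the commands if plink is not installed):

```shell
# --blocks performs Haploview-style haplotype block estimation
# (Gabriel et al. method); no-pheno-req lets it run without phenotypes.
# --r2 dprime reports pairwise r^2 together with D'.
if command -v plink >/dev/null; then
  plink --file mydata --blocks no-pheno-req --out blocks_out
  plink --file mydata --r2 dprime --out ld_out
else
  echo "plink not found on PATH"
fi
```

Both runs stream over the data rather than loading everything into one in-memory LD matrix, which is why they scale to inputs that exhaust Haploview's heap.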


Powered by Biostar version 2.3.0