Haploview problem with Java heap space
9.8 years ago
Ripeapple ▴ 40

Hi all,

I am having a problem with Haploview: Java keeps printing errors such as "Exception in thread "main" java.lang.OutOfMemoryError: Java heap space" and "Fatal Error: Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded".

My command line is: java -XX:+UseConcMarkSweepGC -Xmx40g -jar Haploview.jar -n -pedfile linkage.txt -info SNP_info.txt -dprime -blockoutput ALL (I have tried allocating even more memory, but it still does not work.)
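One thing worth checking first is that the JVM is actually 64-bit and that the -Xmx setting is being honoured; a 32-bit JVM silently caps the heap far below 40g. A minimal sketch of that check (the exact version strings and the grep on MaxHeapSize assume a standard HotSpot JVM):

# Confirm the JVM is 64-bit; the output should mention something like "64-Bit Server VM"
java -version

# Print the maximum heap size the JVM will actually use with these flags
java -XX:+UseConcMarkSweepGC -Xmx40g -XX:+PrintFlagsFinal -version | grep -i maxheapsize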

Can anyone help?

Very much appreciated!

haploview • 5.4k views

Well, that means it ran out of memory. In particular, the GC overhead limit error implies the JVM is spending most of its time in garbage collection. Haploview was designed for SNP panels; are you trying to feed it NGS data? Cut down the size of the input data and see if that helps. Its memory use may grow quadratically or worse with the number of markers, so removing a few SNPs or subjects can help a lot. Also make sure you are running 64-bit Java. If you want to try subsetting, one approach is sketched below.
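A rough sketch of subsetting with PLINK before handing the data to Haploview, assuming your pedfile can be read as a standard .ped/.map pair; the file prefixes, the chromosome 6 window, and the regenerated info file are all placeholders you would adapt to your data:

# Keep only markers in a 10 Mb window on chromosome 6 and write a smaller .ped/.map
plink --file linkage --chr 6 --from-mb 25 --to-mb 35 --recode --out linkage_subset

# Run Haploview on the reduced data set; the -info file (marker name and position)
# would need to be regenerated from linkage_subset.map to match the remaining markers
java -Xmx8g -jar Haploview.jar -n -pedfile linkage_subset.ped -info SNP_info_subset.txt -dprime -blockoutput ALL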

What exactly do you need to compute? PLINK 1.9's --blocks and --r2 dprime commands are very memory-efficient, and cover a lot (though not all) of the uses of that Haploview command.
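For reference, a minimal sketch of what those PLINK 1.9 commands look like, assuming the data have been converted to a .ped/.map pair; the "linkage" prefix and the LD window size are placeholders:

# Haplotype block estimation (Haploview-style confidence-interval method)
plink --file linkage --blocks no-pheno-req --out linkage_blocks

# Pairwise r^2 and D' for nearby marker pairs
plink --file linkage --r2 dprime --ld-window-kb 500 --out linkage_ld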
