Distribution of memory usage
7.1 years ago
boymin2020 ▴ 80

I want to use Picard's SortSam to sort individual WGS data sets whose sizes range from 40 to 100 GB, and I submit my script with qsub. I am not sure how much memory I should allocate to each job (one per individual) via the -l s_vmem parameter. I ran a few tests on one node of our cluster, and the memory usage was between 17.5 and 35 GB.
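For reference, here is a minimal sketch of the kind of job script I mean. The values s_vmem=40G and -Xmx32g are assumptions based on the usage I observed (peak plus some headroom), not tested settings, and the picard.jar path and SAMPLE argument are placeholders:

    #!/bin/bash
    #$ -S /bin/bash
    #$ -cwd
    # Assumed request: observed peak (~35G) plus headroom; adjust for your cluster.
    #$ -l s_vmem=40G

    SAMPLE=$1   # hypothetical sample prefix passed at submission time

    # Keep the JVM heap (-Xmx) a few GB below the s_vmem request so that
    # JVM overhead does not push the job over the scheduler's limit.
    java -Xmx32g -jar picard.jar SortSam \
        INPUT=${SAMPLE}.bam \
        OUTPUT=${SAMPLE}.sorted.bam \
        SORT_ORDER=coordinate \
        TMP_DIR=./tmp

The script would be submitted once per individual, e.g. qsub sort_sample.sh sample01.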

PS: Because I have a lot of WGS files, it would save a huge amount of time if I could use the cluster's memory efficiently. Of course, the prerequisite is that all of the jobs finish successfully.

Thanks for any advice.

picard sortsam • 1.2k views