14 months ago
bala • 0
I am working in bioinformatics with MetaPhlAn and HUMAnN. My system has 64 GB RAM, an Intel i9 processor, and Ubuntu 20.04 (2 GB swap space). The problem is that when I run the bioinformatics command, it takes a long time and crashes before producing results.
I have no idea what to do. Could someone kindly provide a solution?
'Ran the bioinformatics command' - you need to be much more specific about what you were doing, or there is no chance anyone will be able to help you. What software? What dataset? Did you use htop to profile the memory use of the application?
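To expand on the profiling suggestion: a quick way to see whether the job is exhausting RAM is to watch memory and swap from a second terminal while the pipeline runs. A minimal sketch (standard Linux tools only):

```shell
# Snapshot of memory and swap usage; re-run (or use `watch free -h`, or htop
# interactively) in a second terminal while the pipeline is running.
free -h

# The same swap figures straight from the kernel, convenient for logging:
grep -E 'SwapTotal|SwapFree' /proc/meminfo
```

If "available" memory in `free -h` drops toward zero and SwapFree starts shrinking rapidly just before the hang, that points at a memory shortage rather than a bug in the tools.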
Hi, I am running KneadData and HUMAnN3 to identify functional genes from a shotgun metagenome.
In both cases the system took a long time and hung.
Please correct the following first:
In general, variable names do not contain "/".
Thank you. Even with correct inputs (--input AST2R1.fastq --input AST2R2.fastq), it takes a long time and hangs at the end.
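For anyone reading later, a paired-end KneadData invocation with those inputs would look roughly like this. This is a sketch, not the poster's exact command: the `--input`-twice form matches older KneadData releases (newer ones use `--input1`/`--input2`), and the database path, output directory, and thread count are placeholders.

```shell
# Hedged sketch of a paired-end KneadData run.
# /path/to/human_genome_db, kneaddata_out, and --threads 8 are placeholders.
kneaddata --input AST2R1.fastq --input AST2R2.fastq \
          --reference-db /path/to/human_genome_db \
          --output kneaddata_out \
          --threads 8
```

Running it under `htop` (or logging `free -h`) would show whether the host-decontamination step is where memory use peaks.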
I realize this is not what you asked about, but I will offer you some unsolicited advice. Having 2 GB of swap with 64 GB of RAM is like not having swap at all. It seems like you have a relatively small disk, so I understand the impulse not to "waste" disk space, but there will come a time when you won't be able to do things without swap, and you basically don't have it.
Opinions vary as to what the swap size should be - anywhere from 0.5-2x the size of RAM. You have about 3%, which is tiny. I am guessing from your partition occupancy that you created this system recently, so you still have time to expand the swap.
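For reference, the usual way to add a swap file on Ubuntu 20.04 is the sequence below (requires root; `/swapfile` and the 32G size are just conventional choices, not anything specific to this system):

```shell
# Create and enable a 32 GB swap file (assumes enough free disk space).
sudo fallocate -l 32G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# Verify it is active:
swapon --show

# To keep it across reboots, add this line to /etc/fstab:
# /swapfile none swap sw 0 0
```

This is in addition to, not instead of, any existing swap partition; `swapon --show` lists all active swap areas.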
Thank you, Mensur Dlakic. If I increase the swap space to 32 GB, will it work?
I have no idea whether it will work in this particular case, because you are not telling us anything about the size of your data or the resource profile (see the suggestion about top/htop). Generally speaking, the swap size you have is tiny for ANY application.
Completely off topic, but in my experience, once my program exceeded the available RAM and started using swap, it was as good as dead, so I usually don't allocate any swap space at all. Admittedly, I haven't tried on super modern hardware with ultrafast SSDs, as I'd rather look for a more powerful server. So I'm a bit curious: have you really had good experiences with swapping applications?
Using swap is not fun, especially with slow disks. It also depends on how many times a given program needs to swap; I think swapping is most effective for programs that have a brief peak of memory usage but for the most part fit into RAM during the run. For example, I have been able to run a program that requires ~100 GB of memory on a 64 GB computer with a 64 GB SSD swap. So it is a way, however slow, to surmount memory shortcomings here and there, but I have since moved on to a 256 GB server (still with 90 GB SSD swap!).
Thank you for your reply. My file size is 14 GB (paired shotgun metagenome sample), and the reference database is 16.8 GB.
I have now increased my swap space to 32 GB, and I am running the same command again.
I'll update once the command finishes.