"Centrifuge-build" tool - usage
5.2 years ago
deepak.deucy

Hello, I am using the "centrifuge-build" tool for nanopore metagenomics sequencing. My system has 256 GB of RAM and 40 cores. When I start indexing with a given number of cores, memory usage gradually climbs over time until it reaches 100%, at which point the process hangs and is eventually killed. Please suggest a way to control memory usage. The tool has parameters that affect memory, such as "--bmax" and "--dcv", but I am confused about what they do and how to set them. Please help me out. Thanks in advance :)

centrifuge metagenomics core nanopore • 2.7k views
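For context on those flags: centrifuge-build shares its indexing options with bowtie2-build, where a smaller `--bmax` (or a larger `--bmaxdivn`) limits how many suffixes are sorted per block, and a larger `--dcv` period (or `--nodc`) further trades speed for memory. A minimal sketch, assuming 256 GB of RAM; the divisor and the `--dcv` value are illustrative assumptions, not documented recommendations:

```shell
# Heuristic sketch (not from the Centrifuge manual): derive a --bmax that
# keeps the suffix-sorting blocks well under physical RAM. The divisor 64
# is an illustrative safety factor chosen for this example.
RAM_GB=256
BMAX=$(( RAM_GB * 1024 * 1024 * 1024 / 64 ))
echo "suggested --bmax: ${BMAX}"

# A memory-conservative invocation might then look like this (file names
# follow the manual's examples; the values are assumptions to tune):
# centrifuge-build -p 8 --bmax "${BMAX}" --dcv 4096 \
#   --conversion-table gi_taxid_nucl.map \
#   --taxonomy-tree taxonomy/nodes.dmp \
#   --name-table taxonomy/names.dmp \
#   input.fa index_prefix
```

Lowering `-p` also helps, since each thread carries its own working buffers during block sorting.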

Please make it easier for us and add a link to the tool you are talking about. It is also more likely that the developers themselves can help you, for which a GitHub issue would be the most appropriate.


As you suggested in your reply, I have posted the same question on GitHub.

Here is the link for the "centrifuge-build" tool: https://ccb.jhu.edu/software/centrifuge/manual.shtml. Awaiting your response.

Thank you


Hello, I am trying to index the NCBI nt database with the "centrifuge-build" tool, but I am also running out of memory. I'm running it on a 16-core, 360 GB RAM system and getting an OUT_OF_MEMORY error.

I'm using:

    centrifuge-build -p 16 --bmax 1342177280 --conversion-table gi_taxid_nucl.map --taxonomy-tree taxonomy/nodes.dmp --name-table taxonomy/names.dmp nt.fa nt

Any suggestions?
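One variant to try on the command above (a sketch only; whether it fits nt into 360 GB is not guaranteed): drop the explicit `--bmax` so the builder sizes blocks from the reference length via `--bmaxdivn`, and raise the difference-cover period with `--dcv` to shrink the in-memory sample. The values 32 and 4096 below are illustrative assumptions, not tested on nt:

```shell
# Sketch only: assemble a lower-memory variant of the command above.
# --bmaxdivn 32 and --dcv 4096 are illustrative, untested values.
CMD="centrifuge-build -p 16 --bmaxdivn 32 --dcv 4096 \
--conversion-table gi_taxid_nucl.map \
--taxonomy-tree taxonomy/nodes.dmp --name-table taxonomy/names.dmp nt.fa nt"
echo "$CMD"
```

If that still exhausts memory, `--nodc` disables the difference-cover sample entirely, which the bowtie2-build documentation warns can make suffix sorting much slower but further reduces the memory footprint.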
