Can someone explain this Trinity/Chrysalis output warning?
2.6 years ago
nina.maryn ▴ 30

Hello all, I'm currently running Trinity on three replicates of paired-end (PE) leaf-tissue RNA from a species with no genome sequence, in order to assemble a transcriptome for sequence analysis.

This is my first try at running Trinity on our university's computing cluster.

Here is my Trinity input code, and I'm running on a compute node with 1.5 TB memory:

Trinity --seqType fq --SS_lib_type RF --max_memory 1500G \
    --left $raw/NM1_PG1_S64_R1.fastq,$raw/NM2_PG2_S65_R1.fastq,$raw/NM3_PG3_S66_R1.fastq \
    --right $raw/NM1_PG1_S64_R2.fastq,$raw/NM2_PG2_S65_R2.fastq,$raw/NM3_PG3_S66_R2.fastq \
    --CPU 6 --no_bowtie --output $trin

I've gotten through read normalization (~10% of reads kept, which I think is expected for a large-genome plant, since there are probably many very highly expressed, similar genes) and Inchworm (I think?), and the output file says Trinity has started working through Chrysalis. However, for many hours the output has kept printing something like this:

If it indicates bad_alloc(), then Inchworm ran out of memory. You'll need to either reduce the size of your data set or run Trinity on a server with more memory available.
The inchworm process failed.
warning, cmd: /software/sl-7.x86_64/modules/trinity/2.5.1/util/support_scripts/../../Trinity --single "//Trinity_PG/read_partitions/Fb_1/CBin_1253/c125330.trinity.reads.fa" --output "/Trinity_PG/read_partitions/Fb_1/CBin_1253/c125330.trinity.reads.fa.out" --CPU 1 --max_memory 1G --run_as_paired --SS_lib_type F --seqType fa --trinity_complete --full_cleanup --no_bowtie failed with ret: 256, going to retry.
succeeded(10076), failed(3324)   10.3695% completed.

Files are being produced in the "read_partitions" directory. Is this fine or should I be concerned about these warnings?
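In case it helps, here is how I've been spot-checking progress while it runs (just a sketch; the file-name patterns assume Trinity's default read_partitions layout, and $trin is my --output directory):

```shell
# Sketch: count partition mini-assemblies that finished vs. total partitions.
# With --full_cleanup, each partition's result is <partition>.out.Trinity.fasta,
# so the patterns below are assumptions based on that layout.
trin=Trinity_PG   # placeholder for my actual --output path

completed=$(find "$trin/read_partitions" -name "*.out.Trinity.fasta" | wc -l)
total=$(find "$trin/read_partitions" -name "*.trinity.reads.fa" | wc -l)
echo "completed: $completed of $total partitions"
```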

Trinity de-novo RNAseq

From what I recall, Trinity is a memory hog. Still, 1.5 TB of RAM is insanely high, and Trinity shouldn't complain. Check with your sysadmin whether the node you are running Trinity on does indeed have access to the entire 1.5 TB of RAM, and set the max memory to something more conservative, like 500G, so the program doesn't expect the machine to have the full 1.5 TB free entirely for itself.
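For example, something along these lines to verify what your job can actually see before re-running (a sketch; the SLURM check only applies if your cluster uses SLURM, and the --left/--right arguments are elided):

```shell
# How much memory does this node actually expose?
free -h

# If running under a SLURM allocation, the limit may be well below the
# node total — check the job's memory fields:
scontrol show job "$SLURM_JOB_ID" | grep -i mem

# Then re-run with a more conservative ceiling, e.g.:
Trinity --seqType fq --SS_lib_type RF --max_memory 500G \
    --left ... --right ... --CPU 6 --output "$trin"
```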


1.5TB of RAM is insanely high though

Not necessarily. The Trinity devs recommend roughly 1 GB of RAM per million paired-end reads, so depending on the input data size, that much may be needed.
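As a back-of-the-envelope check against that rule of thumb (a sketch; the read count below is made up — substitute your own):

```shell
# Rough peak-RAM estimate from the ~1 GB per million paired-end reads rule.
# Get your real count with: echo $(( $(wc -l < R1.fastq) / 4 ))
reads_r1=150000000                        # hypothetical number of read pairs
mem_gb=$(( (reads_r1 + 999999) / 1000000 ))   # round up to whole GB
echo "estimated peak RAM: ~${mem_gb}G"    # → estimated peak RAM: ~150G
```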

nina.maryn : If you only have access to 1.5 TB of RAM, then make sure you can actually use all of it, and/or normalize the data before providing it to Trinity to reduce the overall memory requirement. bbnorm.sh (GUIDE) is one option.
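A minimal BBNorm invocation would look something like this (a sketch; the file names are placeholders, and the target/min values are typical starting points, not values tuned for your dataset):

```shell
# Digital normalization with BBNorm before assembly: caps per-read kmer
# coverage at "target" and discards reads below "min" coverage.
bbnorm.sh in=NM1_R1.fastq in2=NM1_R2.fastq \
          out=NM1_R1.norm.fastq out2=NM1_R2.norm.fastq \
          target=100 min=5
```

You would then pass the normalized files to Trinity's --left/--right instead of the raw reads.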


Trinity also ships with its own in-silico normalization you could play around with; see this earlier answer: exact command for read normalization using trinity software
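For reference, the standalone normalization script bundled with Trinity can be run like this (a sketch; $TRINITY_HOME, the file names, and the --JM/--max_cov values are assumptions to adapt to your setup):

```shell
# In-silico read normalization using the utility shipped with Trinity.
# --max_cov caps kmer coverage; --JM is jellyfish memory for kmer counting.
$TRINITY_HOME/util/insilico_read_normalization.pl \
    --seqType fq --JM 100G --max_cov 50 \
    --left reads_R1.fastq --right reads_R2.fastq \
    --pairs_together --output norm_out
```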

