Question: Tophat2 Reporting output tracks failed
mbio.kyle340 wrote 4.8 years ago (United States):

I am working on analysing some RNA sequence data using TopHat2. The data come from a variety of sources (SRA, CGHub, etc.). I have both paired and unpaired data. I have consistently run into the following error:

[2015-02-13 16:13:23] Beginning TopHat run (v2.0.13)
[2015-02-13 16:13:23] Checking for Bowtie
          Bowtie version:
[2015-02-13 16:13:23] Checking for Bowtie index files (genome)..
[2015-02-13 16:13:23] Checking for reference FASTA file
[2015-02-13 16:13:23] Generating SAM header for #############3
[2015-02-13 16:13:23] Reading known junctions from GTF file
[2015-02-13 16:13:23] Preparing reads
     left reads: min. length=58, max. length=58, 32687348 kept reads (12356 discarded)
[2015-02-13 16:20:48] Building transcriptome data files #######
[2015-02-13 16:20:48] Building Bowtie index from ########.fa
[2015-02-13 16:20:49] Mapping left_kept_reads to transcriptome ########### with Bowtie2
[2015-02-13 16:27:48] Resuming TopHat pipeline with unmapped reads
[2015-02-13 16:27:48] Mapping left_kept_reads.m2g_um to genome ######## with Bowtie2
[2015-02-13 16:49:05] Mapping left_kept_reads.m2g_um_seg1 to genome ###### with Bowtie2 (1/2)
[2015-02-13 16:52:27] Mapping left_kept_reads.m2g_um_seg2 to genome ######## with Bowtie2 (2/2)
[2015-02-13 16:57:39] Searching for junctions via segment mapping
[2015-02-13 17:17:24] Retrieving sequences for splices
[2015-02-13 17:17:24] Indexing splices
[2015-02-13 17:17:25] Mapping left_kept_reads.m2g_um_seg1 to genome segment_juncs with Bowtie2 (1/2)
[2015-02-13 17:18:53] Mapping left_kept_reads.m2g_um_seg2 to genome segment_juncs with Bowtie2 (2/2)
[2015-02-13 17:20:46] Joining segment hits
[2015-02-13 17:20:46] Reporting output tracks
Error running /usr/bin/tophat_reports (.......)

I am running TopHat with all default parameters on a Debian Linux workstation with more than adequate RAM, processing power, and hard drive space. I am running TopHat2 version 2.0.13, which I reinstalled and built from source. I have found many mentions of this error on various forums, but they are old and seem to have been resolved by a past update.

I have encountered the case where one sample from a group causes the error, while another completes without any error using the exact same parameters.

I am looking for any and all suggestions, tips, or advice.





modified 3.7 years ago by kanika.15180 • written 4.8 years ago by mbio.kyle340

I had the same problem, and increasing memory (RAM) solved the issue. How can you be sure you have more than adequate RAM?

written 4.8 years ago by Parham1.4k

I have 32 GB on my workstation, and I have been monitoring the run via htop; the RAM usage does not seem to go over ~1 GB. I am hoping 32 GB is enough...
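In case it helps anyone double-check, here is a quick sketch for confirming available memory and spot-checking the reporting step's footprint (the process name `tophat_reports` is taken from the error message above; `ps -C` is Linux-specific):

```shell
# Total and available memory on the box
free -h

# Resident memory of the reporting step while it runs
# (prints nothing if the process isn't currently running)
ps -o rss=,comm= -C tophat_reports || true
```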


written 4.8 years ago by mbio.kyle340

The only thing you can do is look in the run log and execute the last command manually. Perhaps that will produce a more meaningful error message.
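For example, something along these lines usually works (the `tophat_out/logs/run.log` path assumes the default output directory; adjust it for your `-o` setting):

```shell
# Pull the last tophat_reports invocation out of the run log
cmd=$(grep 'tophat_reports' tophat_out/logs/run.log | tail -n 1)
echo "$cmd"

# Re-run it by hand; stderr often contains the real error that the
# top-level "Error running /usr/bin/tophat_reports" message swallows
bash -c "$cmd"
```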

written 4.8 years ago by Devon Ryan93k

I had a problem with TopHat2 when I aligned 3' UTR-derived reads and no junctions were found at all. How many junctions are reported by TopHat2 in your runs?

written 3.8 years ago by michael.ante3.5k
michael.superdock0 wrote 4.2 years ago (United States):

I encountered the same error using TopHat 2.0.13.

I experienced this error when trying to map reads to a custom gene sequence. When I looked in the run.log file produced by TopHat, I noticed that some of the parameters in the call to tophat_reports were missing. If you find the same thing, it is very likely you have run into the same problem as me. Note that even if you add those parameters manually, tophat_reports will still fail to report output tracks.

This error can occur when none of your reads map. When that is the case, as it was in my situation, adding another gene sequence that reads do map to was an easy workaround.
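If you suspect the same cause, a quick sanity check is to count the mapped reads before the reporting step. A sketch, assuming TopHat's default `tophat_out` layout and that `samtools` is installed:

```shell
# "0 + 0 mapped" here means tophat_reports has nothing to report
samtools flagstat tophat_out/accepted_hits.bam

# Extract just the mapped count, e.g. for use in a pipeline script
mapped=$(samtools flagstat tophat_out/accepted_hits.bam \
  | awk '/ mapped \(/ {print $1; exit}')
echo "mapped reads: $mapped"
```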


written 4.2 years ago by michael.superdock0
kanika.15180 wrote 3.7 years ago:

I got this error when my disk was 97% full, so you might have to free up some space. Also make sure there are no other jobs running that use a lot of RAM.
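For anyone hitting this, it is easy to check up front. A sketch, assuming GNU coreutils `df` and with `tophat_out` standing in for your output directory:

```shell
# Free space on the filesystem holding the output directory;
# tophat_reports writes sizeable temporary files under tophat_out/tmp
df -h tophat_out

# Fail fast in a script if usage is already above 90%
usage=$(df --output=pcent tophat_out | tail -n 1 | tr -dc '0-9')
[ "$usage" -lt 90 ] || echo "warning: disk is ${usage}% full"
```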

written 3.7 years ago by kanika.15180
Powered by Biostar version 2.3.0