Question: PE reads uneven after cutadapt
blur wrote, 2.5 years ago:

Hi! I ran cutadapt on my PE reads, and then FASTX to collapse duplicate reads. When I then proceeded to bowtie2, it turned out that R1 and R2 contained different numbers of reads, so bowtie2 could not run at all. I assume this was due not to FASTX but to cutadapt (I told it to discard reads that became too short after trimming). So my questions are:

  • Am I right in my assumption regarding cutadapt?
  • Is there any way to make the reads appear even again, without hurting the data? (I was not sure if just adding reads would be OK or not).

Thanks for the help!

Tags: bowtie • fastx

If you are happy with the end result then you could use repair.sh from BBMap to re-sync your files (C: Calculating number of reads for paired end reads? ).
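What repair.sh does can be sketched in a few lines of Python: keep only the read names present in both files, in R1 order, and set aside the singletons. The 4-lines-per-record layout is standard FASTQ; the "/1" and "/2" pair-suffix handling is an assumption about the naming scheme, and the read names are invented for illustration.

```python
# A minimal sketch of what BBMap's repair.sh does: re-sync two FASTQ
# files by read name. Not a replacement for the real tool, which
# streams data and handles many more naming conventions.

def fastq_records(lines):
    """Yield (name, record) from a list of FASTQ lines, 4 lines per read."""
    for i in range(0, len(lines), 4):
        name = lines[i].split()[0][1:]        # drop the leading '@'
        if name.endswith(("/1", "/2")):       # drop the pair suffix, if any
            name = name[:-2]
        yield name, lines[i:i + 4]

def repair(r1_lines, r2_lines):
    """Return (synced_r1, synced_r2, singletons) as flat line lists."""
    r1 = dict(fastq_records(r1_lines))
    r2 = dict(fastq_records(r2_lines))
    shared = [n for n in r1 if n in r2]       # preserves R1 order
    synced1 = [ln for n in shared for ln in r1[n]]
    synced2 = [ln for n in shared for ln in r2[n]]
    singles = [ln for n, rec in r1.items() if n not in r2 for ln in rec]
    singles += [ln for n, rec in r2.items() if n not in r1 for ln in rec]
    return synced1, synced2, singles
```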

— genomax, 2.5 years ago

Errr... why would it not be fastx_collapser? If one read of a pair is a duplicate and the other isn't, you'll lose a read from one file but not the other.

Cutadapt has a paired end mode. As far as I can tell fastx_collapser doesn't.

— i.sudbery, 2.5 years ago
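The desynchronization described above can be shown with a toy sketch (made-up reads, not the poster's data): collapsing each file independently removes different records, so the files end up with different lengths.

```python
# Toy illustration of why per-file duplicate collapsing desynchronizes
# paired files: R1 contains a duplicate sequence where R2 does not.

def collapse(seqs):
    """Keep the first occurrence of each sequence, as a per-file
    collapser would, with no knowledge of the mate file."""
    seen, kept = set(), []
    for s in seqs:
        if s not in seen:
            seen.add(s)
            kept.append(s)
    return kept

r1 = ["ACGT", "ACGT", "GGGG"]   # second read duplicates the first in R1...
r2 = ["TTTT", "CCCC", "AAAA"]   # ...but its mate in R2 is unique

c1, c2 = collapse(r1), collapse(r2)
# c1 keeps 2 reads while c2 keeps 3 -> the aligner sees uneven pair files
```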

While fastx_collapser doesn't seem to support paired-end processing, the tally tool apparently does the same thing and does accept paired data.

— i.sudbery, 2.5 years ago

If we are talking about other read de-duplicators, then dedupe.sh from BBMap would be a great candidate as well.

— genomax, 2.5 years ago
dariober wrote, 2.5 years ago:

Recent versions of cutadapt support paired end trimming while keeping the two files in sync. For example:

cutadapt -a AGATCGGAAGAGC -A AGATCGGAAGAGC -o trimmed.R1.fq -p trimmed.R2.fq read.R1.fq read.R2.fq
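What paired-end mode buys you can be sketched as follows (hypothetical reads and a made-up 20 bp cutoff): the too-short test is applied to the pair as a unit, so when a filter triggers, both mates are discarded together and the output files never drift apart.

```python
# Sketch of paired-aware length filtering: the pair is kept only if
# both trimmed mates pass the cutoff, so R1 and R2 stay the same length.
# Reads and the 20 bp threshold are invented for illustration.

def filter_pairs(pairs, min_len=20):
    """Discard the whole pair if either trimmed mate is too short."""
    return [(a, b) for a, b in pairs if len(a) >= min_len and len(b) >= min_len]

pairs = [("A" * 30, "C" * 30),   # both long enough -> pair kept
         ("A" * 10, "C" * 30)]   # R1 too short -> both mates dropped

kept = filter_pairs(pairs)
r1_out = [a for a, _ in kept]
r2_out = [b for _, b in kept]
# len(r1_out) == len(r2_out): the files remain in sync
```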

How did you run cutadapt exactly?

About your second point: it may be possible to rescue the out-of-sync files, but I strongly doubt it is worth the effort (maybe as an exercise...). Just start again once you find out what went wrong.

— dariober, 2.5 years ago