ChIP-Seq alignment: random site assignment for multi-mapping reads
9.0 years ago ▴ 20

Dear all,

I want to align my ChIP-Seq samples but with two strategies:

  1. Align the fastq.gz files and remove reads that map to more than one site.
  2. Align the fastq.gz files and, for reads that map to more than one site (multi-mappers), assign one of those sites at random.

Can someone tell me how to do this?

As far as I know, I can use bowtie2 for the second strategy (randomly assigning multi-mapping reads):

  • Use the -a argument so that bowtie2 reports all the "best" sites for a read, then pick one of those "best" sites at random.

But I don't know how to handle the first strategy (discarding multi-mapping reads).

ChIP-Seq alignment bowtie bwa • 2.1k views
9.0 years ago
Sander ▴ 20

Hi, I can recommend Bowtie1 for this purpose. It has an option that does exactly what you want:

-m <int>           suppress all alignments if > <int> exist (def: no limit)

So in your case, you want to set -m 1.

I don't have experience with Bowtie2 (I've looked in the help but couldn't find a similar option); maybe this option is hidden behind a different parameter, or it is not possible.

For your 2nd strategy, in Bowtie1 you can use:

-M <int>           like -m, but reports 1 random hit (MAPQ=0); requires --best

Again, for your purpose -M 1
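Putting the two options together, the Bowtie1 calls might look like the sketch below. The index name (hg19_index), read file, and output names are placeholders, not files from this thread; adjust them to your data.

```shell
# Strategy 1: -m 1 suppresses a read entirely if more than one
# alignment exists, so only uniquely mapping reads are reported.
bowtie -q -S -m 1 --best --strata hg19_index reads.fastq.gz unique.sam

# Strategy 2: -M 1 reports one randomly chosen hit (MAPQ=0) for
# multi-mapping reads; note that -M requires --best.
bowtie -q -S -M 1 --best hg19_index reads.fastq.gz random_hit.sam
```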

9.0 years ago
Fidel ★ 2.0k

For both cases you can use bowtie2: it already places multi-mapping reads at a randomly chosen best site, and you can filter them out afterwards. Be sure to use an assembly that contains all unassigned regions (the _random scaffolds). Otherwise some regions will appear unique when they are not.

To remove multi-mapping reads, filter the alignment file by mapping quality. bowtie2 usually gives multi-mappers a low MAPQ (0 or 1). As far as I understand, this is the best way to remove multi-reads.
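A minimal sketch of that MAPQ filter. With samtools installed, `samtools view -b -q 2 aligned.bam > unique.bam` keeps only alignments with MAPQ >= 2 (the file names are placeholders). The same idea applied to SAM text with awk, demonstrated on a tiny made-up two-read example where column 5 is MAPQ:

```shell
# Two fake alignment records: r1 maps uniquely (MAPQ 42),
# r2 is a multi-mapper (MAPQ 1).
printf 'r1\t0\tchr1\t100\t42\t5M\t*\t0\t0\tACGTA\tIIIII\n' >  toy.sam
printf 'r2\t0\tchr1\t200\t1\t5M\t*\t0\t0\tACGTA\tIIIII\n'  >> toy.sam

# Keep only records with MAPQ >= 2: r1 survives, r2 is dropped.
awk -F'\t' '$5 >= 2' toy.sam
```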

I would discourage the use of bowtie1 if your reads are longer than 36 bp.

