Question: Calculating Depth For Each Amplicon From A Mapped Bam
Robert Sicko, 7.3 years ago, United States, wrote:

I have a mapped BAM file that contains 32 samples and 20 genes from a targeted re-sequencing experiment run on a 316 Ion Torrent chip. Is there a way I can calculate each individual sample's sequencing depth at each gene, instead of all of the samples together? I used IGV to generate coverage data, but the result combines all samples. I'm not sure a graphical view is best, since depth seems to vary even from exon to exon within the same sample. Or maybe a min, max, and mean depth for each sample at each gene is the right way to look at it?

Also, I've been searching Biostar to get a feel for the standard depth required for SNV calling on a targeted re-sequencing project; while I couldn't find a definitive standard, I get the feeling ~30x depth is a good place to start... any thoughts?


Update: After doing some more searching, maybe the approach needed is a per-amplicon depth analysis: at each amplicon, generate the mean, standard deviation, min, and max depth across all samples.

Update 2: I used coverageBed in BEDtools to count how many reads in my BAM map to a BED of each amplicon, and to produce a base-by-base depth report using the '-d' option. But this still leaves me doing manual mean, min, max, and stddev calculations on ~720 amplicons. There must be some tool to look at the data this way, right? Or perhaps I should be looking at depth data another way for a project of this type.
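One way to avoid the manual calculations is to aggregate the per-base `-d` output with a short script. A minimal Python sketch, assuming the `coverageBed -d` output carries the amplicon name in BED column 4 and the per-base depth in the last column (the toy lines below are made-up data, not from the actual experiment):

```python
import statistics
from collections import defaultdict

def amplicon_stats(lines):
    """Aggregate per-base depths into per-amplicon summary statistics.

    Each line is expected to look like coverageBed -d output:
    chrom  start  end  name  ...  position  depth
    """
    depths = defaultdict(list)
    for line in lines:
        fields = line.split("\t")
        name = fields[3]          # amplicon name (BED column 4)
        depth = int(fields[-1])   # per-base depth (last column of -d output)
        depths[name].append(depth)

    return {
        name: {
            "mean": statistics.mean(d),
            "stdev": statistics.stdev(d) if len(d) > 1 else 0.0,
            "min": min(d),
            "max": max(d),
        }
        for name, d in depths.items()
    }

# Toy example: two amplicons, three covered bases each
example = [
    "chr1\t100\t103\tAMP1\t1\t30",
    "chr1\t100\t103\tAMP1\t2\t40",
    "chr1\t100\t103\tAMP1\t3\t50",
    "chr1\t200\t203\tAMP2\t1\t10",
    "chr1\t200\t203\tAMP2\t2\t10",
    "chr1\t200\t203\tAMP2\t3\t10",
]
result = amplicon_stats(example)
print(result["AMP1"])  # mean 40, stdev 10, min 30, max 50
```

BEDtools also ships a `groupby` utility that can compute similar column-wise summaries directly on the command line, which may spare you the scripting entirely.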

Tags: coverage • ngs • ion-torrent • igv
Zev.Kronenberg, 7.3 years ago, United States, wrote:

Watch out for hard depth filtering! We have found, validated, and published causal alleles in low coverage areas.

Here is another fun way to think about it. Setting NGS error aside, think about heterozygous sites:

1) How many reads need to be sampled so that both strands are represented?

2) What is the probability that a site will be called homozygous when it is actually heterozygous?

The binomial distribution can be used to answer both questions.
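As a rough worked sketch of how the binomial comes in (my own illustrative numbers, not Zev's): at a true het, each read samples either allele with probability 0.5, so the chance that all n reads come from the same allele (making the site look homozygous) is 2 × 0.5^n. The same logic applies to question 1 if reads land on each strand with probability 0.5.

```python
def p_het_looks_hom(n):
    """P(all n reads sample the same allele at a true het site),
    assuming each read picks either allele with probability 0.5.
    Identical in form to P(all reads on one strand)."""
    return 2 * 0.5 ** n

def min_depth(p_target):
    """Smallest depth n at which the miscall probability
    above drops below p_target."""
    n = 1
    while p_het_looks_hom(n) >= p_target:
        n += 1
    return n

for n in (4, 8, 16):
    print(n, p_het_looks_hom(n))
print(min_depth(0.01))  # depth for <1% chance of missing the het -> 8
```

Under this simple model, even ~8 reads already push the het-miscall probability below 1%, which is part of why hard depth cutoffs can throw away real signal.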


Interesting... when you say hard depth filtering, do you mean throwing out called variants that do not meet a certain depth requirement? We will certainly look at all variants called; I'm just trying to get a feel for what depth I need for optimal variant calling.

— Robert Sicko

See revised comments above.

— Zev.Kronenberg