2.2 years ago
Zain Arifin
Hi!
I am performing CUT&Tag data analysis, and I am currently generating coverage files using bamCoverage from the deepTools suite. As a general best practice, what would be the best normalization method? I am currently using CPM, but I am unsure whether this is the best choice.
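For reference, this is a minimal sketch of the CPM run described above (file names are hypothetical, and the `command -v` guard simply skips the call if deepTools is not installed):

```shell
# CPM rescales each bin by 1e6 / (total mapped reads); e.g. with 25M reads:
awk 'BEGIN {printf "scale factor: %.3f\n", 1000000/25000000}'

# Hypothetical input/output names; --extendReads counts whole fragments,
# which is the usual choice for paired-end CUT&Tag data.
command -v bamCoverage >/dev/null && bamCoverage \
    --bam sample1.bam \
    --outFileName sample1.cpm.bw \
    --normalizeUsing CPM \
    --binSize 10 \
    --extendReads || true
```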
Any opinion would be very appreciated!
Many thanks!
Hi ATpoint, many thanks for the feedback. It worked well for my CUT&Tag data, where I have replicates of the same condition sequenced to differing read depths. Simple per-sample normalization fails to account for this, but TMM normalization does.
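For other readers: deepTools does not compute TMM factors itself. A sketch, assuming a TMM normalization factor has already been obtained elsewhere (e.g. from edgeR's calcNormFactors on a bin or peak count matrix), is to pass an explicit scale factor to bamCoverage:

```shell
# Effective library size = mapped reads * TMM factor, so the per-sample
# scale to pass to bamCoverage (instead of built-in CPM) is:
#   scale = 1e6 / (mapped_reads * tmm_factor)
# e.g. 25M mapped reads and a TMM factor of 1.25:
awk 'BEGIN {printf "scale factor: %.3f\n", 1000000/(25000000*1.25)}'

# Hypothetical file names; the guard skips the call if deepTools is absent.
command -v bamCoverage >/dev/null && bamCoverage \
    --bam sample1.bam \
    --outFileName sample1.tmm.bw \
    --normalizeUsing None \
    --scaleFactor 0.032 \
    --binSize 10 \
    --extendReads || true
```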
Quick question: aside from visual inspection, is there a metric that can be used to quantify the similarity between coverage files? In other words, how can we objectively show that one normalization method performs better than another? Thanks.
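One concrete option (not from this thread, just a common deepTools workflow): bin the genome with multiBigwigSummary and compute pairwise Pearson or Spearman correlations with plotCorrelation. Higher replicate-to-replicate correlation after normalization is an objective, if coarse, way to compare methods. File names below are hypothetical, and the guard skips the calls if deepTools is not installed:

```shell
# Summarize all bigWigs over 1 kb genomic bins into one matrix,
# then report pairwise correlation coefficients between samples.
command -v multiBigwigSummary >/dev/null && {
    multiBigwigSummary bins \
        --bwfiles rep1.tmm.bw rep2.tmm.bw rep1.cpm.bw rep2.cpm.bw \
        --binSize 1000 \
        --outFileName coverage.npz
    plotCorrelation \
        --corData coverage.npz \
        --corMethod spearman \
        --whatToPlot heatmap \
        --outFileCorMatrix correlation.tsv \
        --plotFile correlation.png
} || true
```

The written correlation.tsv holds the full coefficient matrix, so the comparison does not rely on eyeballing the heatmap.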