Question: How to calculate the joint entropy H(X,Y,Z) where X, Y, Z are positions of an MSA
sivanc7 wrote:

Hi to all!

I want to compute the mutual information (MI) among three or more positions of a multiple sequence alignment (MSA). So far, I know that

MI(X,Y) = sum( p(X,Y) * log( p(X,Y) / ( p(X) * p(Y) ) ) )
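
A minimal Python sketch of this pairwise estimate, assuming each MSA position is given as a list of residues (one per sequence); the function name and the toy columns are illustrative, not from the original post:

    from collections import Counter
    from math import log2

    def pairwise_mi(col_x, col_y):
        """Plug-in estimate of MI(X,Y) from two aligned columns."""
        n = len(col_x)
        px = Counter(col_x)               # marginal counts for X
        py = Counter(col_y)               # marginal counts for Y
        pxy = Counter(zip(col_x, col_y))  # joint counts for (X, Y)
        mi = 0.0
        for (x, y), c in pxy.items():
            p_joint = c / n
            p_indep = (px[x] / n) * (py[y] / n)
            mi += p_joint * log2(p_joint / p_indep)
        return mi

    # Toy example: two perfectly covarying columns from six sequences,
    # so MI(X,Y) comes out equal to H(X)
    print(pairwise_mi(list("AAACCG"), list("TTTGGA")))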

But the MI among three positions is:

MI(X,Y,Z) = MI(X,Y) + MI(X,Z) + MI(Y,Z) - [H(X) + H(Y) + H(Z) - H(X,Y,Z)]

where H(X) is calculated as:

H(X) = -sum( p(X) * log p(X) )

The problem here is that I don't know how to calculate the joint entropy H(X,Y,Z). Does anyone know how to calculate it? Also, if anyone knows how to extend the MI to more than three positions, I would be grateful for the info.

Thanks to all! =)

Tags: entropy, mutual information, rna

matted commented:

It's in Wikipedia: http://en.wikipedia.org/wiki/Joint_entropy.
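
For reference, the joint entropy on that page generalizes the single-position formula: H(X,Y,Z) = -sum( p(X,Y,Z) * log p(X,Y,Z) ), where p(X,Y,Z) is estimated by counting residue triples down the three columns. A minimal Python sketch under that assumption (the function names are illustrative, and each position is a list of residues, one per sequence):

    from collections import Counter
    from math import log2

    def joint_entropy(*columns):
        """H(X1,...,Xk): count residue k-tuples across the columns and
        apply H = -sum( p * log2(p) ) to the tuple distribution."""
        n = len(columns[0])
        counts = Counter(zip(*columns))  # one count per observed tuple
        return -sum((c / n) * log2(c / n) for c in counts.values())

    def three_way_mi(x, y, z):
        """MI(X,Y,Z) via the identity in the question: the sum of the
        pairwise MIs minus [H(X) + H(Y) + H(Z) - H(X,Y,Z)], using
        MI(X,Y) = H(X) + H(Y) - H(X,Y)."""
        h = joint_entropy
        mi_xy = h(x) + h(y) - h(x, y)
        mi_xz = h(x) + h(z) - h(x, z)
        mi_yz = h(y) + h(z) - h(y, z)
        return mi_xy + mi_xz + mi_yz - (h(x) + h(y) + h(z) - h(x, y, z))

Because joint_entropy accepts any number of columns, the same tuple-counting extends directly past three positions; for the many-position generalization asked about above, one common choice is the total correlation H(X1) + ... + H(Xk) - H(X1,...,Xk), though other multivariate extensions exist.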
Brian Bushnell wrote:

I don't really understand why this matters.  Why can you not take the minimum of the individual entropies of the different sequences?  They are all equally valid.  So, if two of them have very complex sequences, and the third is poly-A, well, you're not doing anything useful.  I posit that the information content of a multiple sequence alignment is constrained by the most uninformative sequence you decide to include.
