sivanc7 wrote:

Hi to all!

I want to compute the mutual information (MI) among three or more positions of a multiple sequence alignment (MSA). So far, I know that

MI(X,Y) = sum( p(X,Y) * log ( p(X,Y) / ( p(X) * p(Y) ) ) )
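To make the pairwise formula concrete, here is a minimal Python sketch of the plug-in estimator applied to two alignment columns. The function name and the toy columns are my own, and this ignores gaps and sequence weighting that a real MSA analysis would need:

```python
from collections import Counter
from math import log2

def mutual_information(col_x, col_y):
    """MI(X,Y) between two MSA columns, given as equal-length lists of residues.

    Uses empirical frequencies: p(x) = count(x)/n, p(x,y) = count(x,y)/n.
    """
    n = len(col_x)
    px = Counter(col_x)                 # marginal counts for column X
    py = Counter(col_y)                 # marginal counts for column Y
    pxy = Counter(zip(col_x, col_y))    # joint counts over aligned rows
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        mi += p_xy * log2(p_xy / ((px[x] / n) * (py[y] / n)))
    return mi

# Perfectly covarying columns give 1 bit; independent columns give 0:
mutual_information(list("AAGG"), list("TTCC"))  # -> 1.0
mutual_information(list("AAGG"), list("TCTC"))  # -> 0.0
```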

But the MI among three positions is:

MI(X,Y,Z) = MI(X,Y) + MI(X,Z) + MI(Y,Z) - [H(X) + H(Y) + H(Z) - H(X,Y,Z)]

where H(X) is calculated as:

H(X) = -sum( p(X) * log p(X) )

The problem is that I don't know how to calculate the joint entropy H(X,Y,Z). Does anyone know how to calculate it? Also, if anyone knows how to extend MI to more than three positions, I'd be grateful for the info.

Thanks to all! =)

written 5.3 years ago by sivanc7


It's in Wikipedia: http://en.wikipedia.org/wiki/Joint_entropy.
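To make that concrete: once each alignment row is treated as a single joint symbol, H(X,Y,Z) comes from the same plug-in estimator as H(X), and the same function handles any number of columns, which also answers the extension question. A minimal Python sketch (function names are mine; columns are equal-length lists of residues):

```python
from collections import Counter
from math import log2

def joint_entropy(*cols):
    """H(X1,...,Xk): each aligned row becomes one tuple-valued symbol."""
    n = len(cols[0])
    counts = Counter(zip(*cols))  # joint counts over rows
    return -sum((c / n) * log2(c / n) for c in counts.values())

def interaction_information(x, y, z):
    """The three-way MI from the question, rewritten purely in entropies:
    MI(X,Y,Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z)."""
    H = joint_entropy
    return (H(x) + H(y) + H(z)
            - H(x, y) - H(x, z) - H(y, z)
            + H(x, y, z))

# XOR-like triple (each column alone is random, any two determine the third)
# gives the classic negative (synergistic) value of -1 bit:
interaction_information(list("0011"), list("0101"), list("0110"))  # -> -1.0
```

Expanding the question's formula term by term (MI(X,Y) = H(X)+H(Y)-H(X,Y), etc.) gives exactly the entropy sum in `interaction_information`, so the two forms agree.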
