In probability theory and information theory, the mutual information (MI) of two random variables, also called transinformation, is a measure of their mutual dependence. <b>The relationship between mutual information and the various entropies is shown in the figure below:</b> <img src="https://res.cloudinary.com/montaigne-io/image/upload/v1720975877/D4062473-DC0A-46B3-94D0-710CD86324FD.png" style="background-color:initial;max-width:min(100%,1044px);max-height:min(606px);;background-image:url(https://res.cloudinary.com/montaigne-io/image/upload/v1720975877/D4062473-DC0A-46B3-94D0-710CD86324FD.png);height:auto;width:100%;object-fit:cover;background-size:cover;display:block;" width="1044" height="606"> Here H(X) and H(Y) are the marginal entropies, H(X|Y) and H(Y|X) are the conditional entropies, and H(X,Y) is the joint entropy of X and Y. Note that this set of relationships parallels those among set union, difference, and intersection, and can be depicted with a Venn diagram. The mutual information of two continuous random variables is defined as: <img src="https://res.cloudinary.com/montaigne-io/image/upload/v1720975876/A35FAE64-2C39-43CD-990E-50830382F80E.png" style="background-color:initial;max-width:min(100%,812px);max-height:min(130px);;background-image:url(https://res.cloudinary.com/montaigne-io/image/upload/v1720975876/A35FAE64-2C39-43CD-990E-50830382F80E.png);height:auto;width:100%;object-fit:cover;background-size:cover;display:block;" width="812" height="130"> The larger the mutual information, the stronger the dependence; the mutual information of two independent variables is 0.
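For discrete variables, the definition above becomes a sum over the joint distribution: I(X;Y) = Σ p(x,y) log[p(x,y) / (p(x)p(y))]. The sketch below (a minimal illustration, not from the original text; the function name and example tables are made up for this demo) computes mutual information in bits from a joint probability table and checks the independence claim:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits, given a 2-D joint probability table p_xy."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal P(X), column vector
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal P(Y), row vector
    prod = p_x @ p_y                        # product of marginals P(X)P(Y)
    mask = p_xy > 0                         # 0 * log 0 = 0 by convention
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

# Independent variables: the joint equals the product of marginals, so I = 0
indep = np.outer([0.5, 0.5], [0.25, 0.75])
print(mutual_information(indep))            # → 0.0

# Perfectly correlated fair bits: I(X;Y) = H(X) = 1 bit
corr = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(corr))             # → 1.0
```

The independent case prints 0 and the perfectly correlated case prints 1 bit, matching the claim that stronger dependence yields larger mutual information.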