
Computing Conditional Entropy and Mutual Information in MATLAB


Assignment 1_08116649_Chaoyun_Song

Solution: MATLAB code:

>> p = 0 : 0.01 : 1;
>> C = 1 + p.*log2(p) + (1-p).*log2(1-p);   % capacity of a BSC: C = 1 - H(p)
>> plot(p, C)

From the plot we can see how C changes with p. When p = 0 or p = 1 the channel capacity is 1 bit/symbol, since the output then determines the input exactly.

When p = 0.5 the output carries no information about the input, so the mutual information is 0. For 0.5 ≤ p ≤ 1 the curve mirrors the left side, because C(p) = C(1 - p).

So the capacity is minimized at p = 0.5, where the minimum value of C is 0.
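
At p = 0 and p = 1 the expression p.*log2(p) evaluates to NaN, so the endpoints are missing from the plot. One way around this is a small binary-entropy helper that applies the usual convention 0·log2(0) = 0; a minimal sketch (the name Hb is our own choice, not a built-in):

>> Hb = @(q) -q.*log2(q + (q==0)) - (1-q).*log2(1-q + (q==1));  % treats 0*log2(0) as 0
>> C = 1 - Hb(p);   % same capacity curve, now defined at p = 0 and p = 1
>> plot(p, C)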

(3) A binary non-symmetric channel is characterized by the probabilities P(0|1) = 0.1 and P(1|0) = 0.2.


Derive an expression for the mutual information I(X,Y).

Plot the mutual information I(X, Y) between the input and the output as a function of p, where p is the probability of transmitting a '1', i.e. P(X = 1).

For what value of p is the mutual information maximized? What is the value of this maximum?

The expression for the mutual information I(X, Y) is:

I(X, Y) = Σ_x Σ_y p(x, y) log2 [ p(x, y) / (p(x) p(y)) ]
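
As a sanity check, this definition can be evaluated directly from a joint distribution matrix. A minimal sketch for the channel of item (3), where the matrix Pxy and the example value p = 0.3 are our own illustrative choices:

>> % joint pmf (rows: x = 0, 1; columns: y = 0, 1) for P(X = 1) = 0.3
>> Pxy = [0.8*0.7, 0.2*0.7; 0.1*0.3, 0.9*0.3];
>> Px = sum(Pxy, 2);                   % marginal of X (column vector)
>> Py = sum(Pxy, 1);                   % marginal of Y (row vector)
>> T = Pxy .* log2(Pxy ./ (Px * Py));  % terms p(x,y) log2[p(x,y)/(p(x)p(y))]
>> T(Pxy == 0) = 0;                    % convention: 0*log2(0) = 0
>> I = sum(T(:))                       % mutual information in bits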

Equivalently, I(X, Y) = H(X) - H(X|Y) = H(Y) - H(Y|X). Since the channel is specified by its transition probabilities p(y|x), the second form is the more convenient one here.

Here P(X = 1) = p, and since the input probabilities sum to one, P(X = 0) = 1 - p. The channel gives P(0|1) = 0.1 and P(1|0) = 0.2, so P(0|0) = 0.8 and P(1|1) = 0.9. The output distribution is then

P(Y = 0) = 0.8(1 - p) + 0.1p
P(Y = 1) = 0.2(1 - p) + 0.9p

The two entropies we need are

H(Y) = -P(Y = 0) log2 P(Y = 0) - P(Y = 1) log2 P(Y = 1)
H(Y|X) = -Σ_x p(x) Σ_y p(y|x) log2 p(y|x)
       = -(1 - p)(0.8 log2 0.8 + 0.2 log2 0.2) - p(0.1 log2 0.1 + 0.9 log2 0.9)

So the mutual information is

I(X, Y) = H(Y) - H(Y|X)
        = -P(Y = 0) log2 P(Y = 0) - P(Y = 1) log2 P(Y = 1)
          + (1 - p)(0.8 log2 0.8 + 0.2 log2 0.2) + p(0.1 log2 0.1 + 0.9 log2 0.9)

Using MATLAB we can plot this:


Solution: MATLAB code:

>> p = 0 : 0.01 : 1;
>> P0 = 0.8*(1-p) + 0.1*p;             % P(Y = 0)
>> P1 = 0.2*(1-p) + 0.9*p;             % P(Y = 1)
>> HY = -P0.*log2(P0) - P1.*log2(P1);  % H(Y)
>> HYX = -(1-p)*(0.8*log2(0.8) + 0.2*log2(0.2)) - p*(0.1*log2(0.1) + 0.9*log2(0.9));  % H(Y|X)
>> I = HY - HYX;
>> plot(p, I)

From the plot we can see:

When p = 0 or p = 1 the input is deterministic, so the mutual information is 0; these are the minima. The mutual information reaches its maximum value of about I(X, Y) = 0.40 bits near p = 0.52 (slightly above 0.5 because the channel is not symmetric), and at this input distribution the channel operates at its capacity.
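
The location and value of the maximum can also be read off numerically from the vectors computed above, instead of from the plot:

>> [Imax, k] = max(I);
>> Imax    % maximum mutual information, about 0.40 bits
>> p(k)    % the input probability that achieves it, about 0.52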


3. Discussion

In this paper we completed three items. In the first item we wrote MATLAB code to calculate the entropy of the distributions A, B and C, using the expression -sum(A.*log2(A)). After searching the MATLAB help we found the built-in function entropy(), which looks simpler; note, however, that it belongs to the Image Processing Toolbox and computes the entropy of an image's gray-level histogram, so it is not a drop-in replacement for the expression above. The plotted entropy diagram was also not clearly defined. In the third item we obtained the curve with plot in MATLAB; the maximum mutual information is about I(X, Y) = 0.40 bits at p = 0.52, although reading the exact location off the plot alone is not accurate, which is why we also located the maximum numerically.
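
A small sketch of the difference, where A is a made-up probability vector for illustration:

>> A = [0.5 0.25 0.25];      % a probability distribution
>> H = -sum(A .* log2(A))    % entropy of the distribution: 1.5 bits
>> % entropy() expects an image, e.g. entropy(imread('cameraman.tif')), and
>> % computes the entropy of its gray-level histogram, so it cannot be
>> % applied directly to a probability vector like A.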

4. Conclusion

This paper covers basic exercises on the elements of information theory. First there is a short introduction to Shannon's information content, entropy and mutual information. Then there are three assignment items: calculating the entropy of given distributions; finding the channel capacity of a binary symmetric channel (BSC) by plotting it in MATLAB and locating the point of minimum capacity; and plotting the mutual information I(X, Y) of a binary non-symmetric channel and finding its maximum value. By doing these exercises we gained a more in-depth understanding of information theory and practiced calculating entropy, mutual information and channel capacity. This will help our future study of the subject.
