Assignment 1_08116649_Chaoyun_Song
Solution: Matlab code:
>> p = 0 : 0.01 : 1;
>> C = p.*log2(p) + (1-p).*log2(1-p) + 1;   % C = 1 - H(p); p = 0 and p = 1 give NaN from 0*log2(0)
>> plot(p, C)
From the diagram we can see how C changes with p. When p = 0 or p = 1 the channel capacity is 1 bit/symbol.
When p = 0.5 the output carries no information about the input, so the mutual information is 0. For 0.5 ≤ p ≤ 1 the curve mirrors the left side, since C(p) = C(1-p).
So the capacity is minimised at p = 0.5, where the minimum value of C is 0.
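As a cross-check (not part of the original Matlab solution), the same capacity curve C(p) = 1 - H(p) can be reproduced in Python with NumPy, handling the 0·log2(0) endpoints explicitly:

```python
import numpy as np

p = np.linspace(0.0, 1.0, 101)
# Binary entropy H2(p); at p = 0 and p = 1 NumPy produces NaN from 0*log2(0),
# which we replace with 0 following the convention 0*log2(0) = 0.
with np.errstate(divide="ignore", invalid="ignore"):
    H2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
H2 = np.nan_to_num(H2)
C = 1 - H2                      # BSC capacity in bits/symbol

print(C[0], C[50], C[100])      # capacity at p = 0, 0.5, 1
```

This confirms the reading of the plot: C = 1 at the noiseless endpoints and C = 0 at p = 0.5.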
(3). A binary non-symmetric channel is characterized by the probabilities P(0|1) =0.1 and P(1|0) = 0.2.
Derive an expression for the mutual information I(X,Y).
Plot the mutual information I(X,Y) between the input and the output as a function of p, where p is the probability of transmitting a `1', i.e. P(X = 1).
For what value of p is the mutual information maximised? What is the value of this maximum?
The expression for the mutual information I(X,Y) is:
I(X,Y) = Σx Σy p(x,y) log2[ p(x,y) / (p(x) p(y)) ]
I(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
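The two forms of the definition can be checked numerically. The sketch below uses a hypothetical 2×2 joint distribution (not taken from the assignment) and confirms that the double-sum form and the entropy-difference form give the same value:

```python
import numpy as np

# Hypothetical 2x2 joint distribution p(x,y), for illustration only
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px = pxy.sum(axis=1)             # marginal p(x)
py = pxy.sum(axis=0)             # marginal p(y)

# Form 1: double sum of p(x,y) log2( p(x,y) / (p(x)p(y)) )
I_def = (pxy * np.log2(pxy / np.outer(px, py))).sum()

# Form 2: H(X) - H(X|Y)
HX = -(px * np.log2(px)).sum()
pxgy = pxy / py                  # p(x|y): column j holds p(x|Y=j)
HXgY = -(py * (pxgy * np.log2(pxgy)).sum(axis=0)).sum()
I_chain = HX - HXgY

print(I_def, I_chain)            # the two forms agree
```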
Since P(X=1) = p and P(X=0) = 1-p, and the channel gives P(0|1) = 0.1 and P(1|0) = 0.2, we also have P(0|0) = 0.8 and P(1|1) = 0.9. The output probabilities are therefore
P(Y=0) = 0.8(1-p) + 0.1p
P(Y=1) = 0.2(1-p) + 0.9p
Using the form I(X,Y) = H(Y) - H(Y|X):
H(Y) = -P(Y=0) log2 P(Y=0) - P(Y=1) log2 P(Y=1)
H(Y|X) = -Σx p(x) Σy p(y|x) log2 p(y|x)
       = -(1-p)(0.8 log2 0.8 + 0.2 log2 0.2) - p(0.1 log2 0.1 + 0.9 log2 0.9)
Note that the conditional entropy is weighted by the input probabilities p(x), since the given channel probabilities are p(y|x), not p(x|y). So the mutual information is
I(X,Y) = H(Y) - H(Y|X)
Using Matlab we can get:
Solution: Matlab codes:
>> p = 0 : 0.01 : 1;
>> P0 = (1-p).*0.8 + p.*0.1;    % P(Y=0)
>> P1 = (1-p).*0.2 + p.*0.9;    % P(Y=1)
>> HY = -P0.*log2(P0) - P1.*log2(P1);
>> HYX = -(1-p).*(0.8*log2(0.8) + 0.2*log2(0.2)) - p.*(0.1*log2(0.1) + 0.9*log2(0.9));
>> I = HY - HYX;
>> plot(p, I)
From the diagram we can see that the mutual information is 0 at p = 0 and p = 1, since the input is then deterministic. The maximum is I(X,Y) ≈ 0.40 bits, reached at p ≈ 0.52; because the channel is non-symmetric, the maximising input distribution lies slightly above p = 0.5. This maximum value is the channel capacity.
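As a numerical check (in Python/NumPy rather than Matlab), a fine grid over p locates the peak of I(p) = H(Y) - H(Y|X):

```python
import numpy as np

# Channel: P(1|0) = 0.2, P(0|1) = 0.1; input distribution P(X=1) = p
p = np.linspace(0.0, 1.0, 10001)
q = 0.2 + 0.7 * p                                  # P(Y=1)
HY = -q * np.log2(q) - (1 - q) * np.log2(1 - q)    # q stays within [0.2, 0.9]
h02 = -(0.8 * np.log2(0.8) + 0.2 * np.log2(0.2))   # H2(0.2)
h01 = -(0.9 * np.log2(0.9) + 0.1 * np.log2(0.1))   # H2(0.1)
HYX = (1 - p) * h02 + p * h01                      # H(Y|X)
I = HY - HYX

k = int(np.argmax(I))
print(p[k], I[k])               # location and value of the peak
```

The grid search agrees with the plot: the peak sits just above p = 0.5 at roughly 0.40 bits, and I vanishes at both endpoints.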
3. Discussion
In this paper we completed three items. In the first item we wrote Matlab code to calculate the entropies of A, B and C using the expression -sum(A.*log2(A)); after searching the Matlab help we found that an entropy() function also exists (in the Image Processing Toolbox), which is simpler to call, although it is designed for images rather than probability vectors. The definition of the entropy diagram is not entirely clear. In the third item we obtained the diagram with plot in Matlab; the maximum of the mutual information is I(X,Y) ≈ 0.40 bits, reached near p ≈ 0.5, but reading this point off the diagram is not very accurate.
4. Conclusion
This paper covers basic exercises on the elements of information theory. First there is a short introduction to Shannon's information content, entropy and mutual information. Then there are three assignment items: calculating the entropy of distributions; finding the channel capacity of a binary symmetric channel (BSC) by plotting it in Matlab and locating the minimum capacity; and plotting the mutual information I(X,Y) of a binary non-symmetric channel and finding its maximum value. Through these exercises we gained a more in-depth understanding of information theory and practised calculating entropy, mutual information and channel capacity, which will help our future study of this subject.