HW6 – Introduction to Information Theory


1. Consider the probability distribution $p(x, y)$ for $x, y \in \{0, 1\}$:

$$
p(x, y) =
\begin{cases}
1/6, & x = 0,\; y = 0, \\
0, & x = 1,\; y = 0, \\
1/6, & x = 0,\; y = 1, \\
2/3, & x = 1,\; y = 1.
\end{cases}
\tag{1}
$$

Find

(a) $H(X)$, $H(Y)$,
(b) $H(X \mid Y)$, $H(Y \mid X)$,
(c) $H(X, Y)$,
(d) $I(X; Y)$ (a numerical check is sketched after this list).
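As a sanity check, a minimal Python sketch for computing these quantities from the joint distribution in Eq. (1), assuming base-2 logarithms so all results are in bits:

```python
import numpy as np

# Joint distribution from Eq. (1): rows index x, columns index y.
p_xy = np.array([[1/6, 1/6],
                 [0.0, 2/3]])

def H(p):
    """Shannon entropy in bits; 0 * log 0 is taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)          # marginal of X
p_y = p_xy.sum(axis=0)          # marginal of Y
H_X, H_Y = H(p_x), H(p_y)
H_XY = H(p_xy.flatten())        # joint entropy H(X, Y)
H_X_given_Y = H_XY - H_Y        # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X
I_XY = H_X + H_Y - H_XY         # mutual information

print(H_X, H_Y, H_X_given_Y, H_Y_given_X, H_XY, I_XY)
```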

2. Show that the following relations hold, where $X$, $Y$, $Z$ are jointly distributed random variables.

(a) $H(X, Y \mid Z) \ge H(X \mid Z)$,
(b) $I(X, Y; Z) \ge I(X; Z)$,
(c) $H(X, Y, Z) - H(X, Y) \le H(X, Z) - H(X)$,
(d) $I(X; Z \mid Y) \ge I(Z; Y \mid X) - I(Z; Y) + I(X; Z)$.
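These parts require proofs, but a numerical spot-check can catch a misread inequality direction. A sketch that samples random joint pmfs over three binary variables and tests part (c); the other parts can be checked the same way:

```python
import numpy as np

rng = np.random.default_rng(0)

def H(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

# Random joint pmf over (X, Y, Z), each binary.
for _ in range(1000):
    p = rng.random((2, 2, 2))
    p /= p.sum()
    H_XYZ = H(p.ravel())
    H_XY = H(p.sum(axis=2).ravel())   # marginalize out Z
    H_XZ = H(p.sum(axis=1).ravel())   # marginalize out Y
    H_X = H(p.sum(axis=(1, 2)))
    # (c): H(X,Y,Z) - H(X,Y) <= H(X,Z) - H(X)
    assert H_XYZ - H_XY <= H_XZ - H_X + 1e-9
print("inequality (c) held on all sampled distributions")
```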

3. Assume that we have four symbols, namely $a, b, c, d$. These symbols are mapped to the codewords 10, 00, 11, 110, respectively.

(a) Does the given code have the prefix property?
(b) Is this a uniquely decodable code? Prove it. (A mechanical check is sketched below.)
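For part (b) a proof is required, but the Sardinas–Patterson test (my choice of method, not mandated by the problem) gives a mechanical way to confirm the answer:

```python
def sardinas_patterson(code):
    """Return True iff the code is uniquely decodable (Sardinas-Patterson)."""
    code = set(code)

    def dangling(a, b):
        # Suffixes left over when a word of `a` is a proper prefix
        # of a word of `b`.
        return {y[len(x):] for x in a for y in b
                if y.startswith(x) and len(y) > len(x)}

    seen = set()
    current = dangling(code, code)          # C_1
    while current:
        if current & code:                  # a dangling suffix is a codeword
            return False
        if current <= seen:                 # no new suffixes: cycle, UD
            return True
        seen |= current
        current = dangling(code, current) | dangling(current, code)
    return True

print(sardinas_patterson({"10", "00", "11", "110"}))
```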

4. The probability distributions $p(x)$ and $q(x)$ are defined on the same random variable $X$ with 5 outcomes, $\{1, 2, 3, 4, 5\}$. $C_1(x)$ and $C_2(x)$ are two different codes on $X$:

Symbol   p(x)    q(x)   C1(x)   C2(x)
  1      1/2     1/2    0       0
  2      1/4     1/8    10      100
  3      1/8     1/8    110     101
  4      1/16    1/8    1110    110
  5      1/16    1/8    1111    111

(a) Find $H(p)$, $H(q)$, $D(p \,\|\, q)$, and $D(q \,\|\, p)$.
(b) Prove the optimality of $C_1$ and $C_2$ on $X$ under $p$ and $q$, respectively.
(c) We decide to use $C_2$ when the distribution is $p$. Find the average codeword length and its difference from the entropy of $p$.
(d) We decide to use $C_1$ when the distribution is $q$. Find the average codeword length. How much do we lose due to miscoding? (A numerical check follows the list.)
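A sketch for checking parts (a), (c), and (d) numerically, with the table values hard-coded and all logarithms in base 2:

```python
import numpy as np

p = np.array([1/2, 1/4, 1/8, 1/16, 1/16])
q = np.array([1/2, 1/8, 1/8, 1/8, 1/8])
len_C1 = np.array([1, 2, 3, 4, 4])   # |C1(x)|: 0, 10, 110, 1110, 1111
len_C2 = np.array([1, 3, 3, 3, 3])   # |C2(x)|: 0, 100, 101, 110, 111

H = lambda r: -np.sum(r * np.log2(r))          # entropy in bits
D = lambda r, s: np.sum(r * np.log2(r / s))    # KL divergence in bits

print("H(p) =", H(p), " H(q) =", H(q))
print("D(p||q) =", D(p, q), " D(q||p) =", D(q, p))
print("E_p[len C2] =", p @ len_C2, " vs H(p) =", H(p))   # part (c)
print("E_q[len C1] =", q @ len_C1, " vs H(q) =", H(q))   # part (d)
```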

5. For each of the following codes, find a probability assignment for which Huffman coding produces that code. If no such probability assignment exists, explain why.

(a) {0, 10, 11}
(b) {00, 01, 10, 110}
(c) {01, 10}
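One useful screening fact: a binary Huffman code is complete, so its codeword lengths satisfy the Kraft inequality with equality. A sketch that computes the Kraft sum for each candidate code (equality is necessary, though the full argument still has to be made per code):

```python
def kraft_sum(code, arity=2):
    """Sum of arity**(-len(w)) over codewords; Huffman codes achieve exactly 1."""
    return sum(arity ** -len(w) for w in code)

for code in [{"0", "10", "11"}, {"00", "01", "10", "110"}, {"01", "10"}]:
    print(sorted(code), kraft_sum(code))
```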

6. Consider the probability distribution on $X$ with 8 outcomes:

$p(X{=}1) = p(X{=}2) = 2p(X{=}3) = 2p(X{=}4) = 4p(X{=}5) = 4p(X{=}6) = 4p(X{=}7) = 4p(X{=}8)$.

(a) Give the Huffman binary code.
(b) Find the expected code length of (a) and $H(X)$.
(c) Find the Huffman 3-ary code.
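A sketch of binary Huffman construction with a min-heap, to cross-check parts (a) and (b). The probabilities come from normalizing the stated proportions 4:4:2:2:1:1:1:1; tie-breaking is arbitrary, so individual codewords may differ from a hand-drawn tree even though the expected length agrees:

```python
import heapq

# Proportions 4:4:2:2:1:1:1:1 normalized (they sum to 16).
p = [4/16, 4/16, 2/16, 2/16, 1/16, 1/16, 1/16, 1/16]

heap = [(prob, i, {i: ""}) for i, prob in enumerate(p)]
heapq.heapify(heap)
counter = len(p)                      # tie-breaker for equal probabilities
while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)   # two least-probable subtrees
    p1, _, c1 = heapq.heappop(heap)
    merged = {s: "0" + w for s, w in c0.items()}
    merged.update({s: "1" + w for s, w in c1.items()})
    heapq.heappush(heap, (p0 + p1, counter, merged))
    counter += 1

codebook = heap[0][2]
avg_len = sum(p[s] * len(w) for s, w in codebook.items())
print(codebook, avg_len)
```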

7. For each of the following, find the differential entropy, $h(X) = -\int f \ln f$.

(a) $f(x) = e^{-x}$ for nonnegative $x$.
(b) $f(x) = \frac{1}{2} e^{-|x|}$ for real $x$.
(c) $f(x)$, where $X = X_1 + X_2$ and $X_1$, $X_2$ are independent with $f(x_i) = \mathcal{N}(\mu_i, \sigma_i^2)$, $i = 1, 2$.
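A sketch approximating $h(X)$ for case (a) by a Riemann sum on a truncated grid (the truncation point and step size are arbitrary choices here); since it uses the natural log, the result is in nats:

```python
import numpy as np

# Exponential density f(x) = e^{-x}, x >= 0; the closed form gives h = 1 nat.
dx = 1e-4
x = np.arange(dx, 40, dx)           # truncate the tail at x = 40
f = np.exp(-x)
h = -np.sum(f * np.log(f)) * dx     # Riemann-sum approximation of -∫ f ln f
print(h)                            # ≈ 1.0
```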

8. Consider the channel transition matrix $Q$:

$$
Q = \begin{pmatrix} 1/2 & 1/2 \\ 1 & 0 \end{pmatrix}
\tag{2}
$$

where $Q(i, j)$ gives the probability of receiving $y_j$ if $x_i$ is transmitted. Assume $x_1 = y_1 = 0$ and $x_2 = y_2 = 1$.

(a) Draw the channel model with the transmitted bits on the left side, the received bits on the right side, and the transition probabilities connecting them. What is the name of this channel?
(b) Find the capacity of this channel.
(c) Find the input distribution that achieves the capacity. (See the sketch after this list.)
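For checking parts (b) and (c), the Blahut–Arimoto iteration (a standard capacity algorithm, not mandated by the problem) converges quickly for a 2x2 channel. A sketch:

```python
import numpy as np

Q = np.array([[0.5, 0.5],    # Q[i, j] = P(receive y_j | x_i sent), Eq. (2)
              [1.0, 0.0]])

def divergences(p):
    """D(Q(.|x_i) || q_Y) in bits for each input i, with 0 log 0 := 0."""
    q_y = p @ Q                                   # output distribution
    return np.array([sum(Q[i, j] * np.log2(Q[i, j] / q_y[j])
                         for j in range(Q.shape[1]) if Q[i, j] > 0)
                     for i in range(Q.shape[0])])

p = np.full(2, 0.5)                  # start from the uniform input
for _ in range(2000):                # Blahut-Arimoto fixed-point iteration
    w = p * 2.0 ** divergences(p)
    p = w / w.sum()

print("capacity ≈", p @ divergences(p), "bits; input distribution ≈", p)
```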
