Probability and Computing: Chapter 3 Exercises (cont.3)

Exercise 3.21: A fixed point of a permutation \pi : [1,n] \to [1,n] is a value x for which \pi (x) = x. Find the variance in the number of fixed points of a permutation chosen uniformly at random from all permutations. (Hint: Let X_{i} be 1 if \pi (i) = i, so that \sum_{i=1}^{n}X_{i} is the number of fixed points. You cannot use linearity to find Var[\sum_{i=1}^{n}X_{i}], but you can calculate it directly.)

Solution:
Var[\sum_{i = 1}^{n}X_{i}] = E[(\sum_{i=1}^{n}X_{i})^{2}]-(E[\sum_{i=1}^{n}X_{i}])^{2}

First we will calculate E[\sum_{i=1}^{n}X_{i}].

E[\sum_{i=1}^{n}X_{i}]=\sum_{i=1}^{n}E[X_{i}]=\sum_{i=1}^{n}Pr(\pi (i) = i) = \sum_{i=1}^{n}\frac{ 1}{n} = 1

About E[(\sum_{i=1}^{n}X_{i})^{2}]:

E[(\sum_{i=1}^{n}X_{i})^{2}]=E[\sum_{i=1}^{n}X_{i}^{2}+2\sum_{i=1}^{n-1}\sum_{j = i+1}^{n}X_{i}X_{j}]

= \sum_{i=1}^{n}E[X_{i}^{2}] + 2\sum_{i=1}^{n-1}\sum_{j = i+1}^{n}E[X_{i}X_{j}]

Since each X_{i} is an indicator we have X_{i}^{2} = X_{i}, so we can calculate E[X_{i}^{2}] in the same way as E[X_{i}]:

\sum_{i=1}^{n}E[X_{i}^{2}] = \sum_{i=1}^{n}Pr(X_{i}=1) = \sum_{i=1}^{n}Pr\big(\pi (i) = i\big) = \sum_{i=1}^{n}\frac{1}{n} = 1

E[X_{i}X_{j}] can be worked out as follows:

E[X_{i}X_{j}] =Pr(X_{i}X_{j} = 1) = Pr\big( (X_{i}=1) \cap (X_{j} = 1)\big)

= Pr\Big(\big(\pi (i) = i \big) \cap \big(\pi (j) = j \big)\Big) = \frac{(n-2)!}{n!} = \frac{1}{n(n-1)},

since exactly (n-2)! of the n! permutations fix both i and j.

Then

2\sum_{i=1}^{n-1}\sum_{j =i+1}^{n}E[X_{i}X_{j}] = 2\binom{n}{2}\frac{1}{n(n-1)} = 1,

since the double sum has \binom{n}{2} identical terms.

So

Var[\sum_{i=1}^{n}X_{i}] = \sum_{i=1}^{n}E[X_{i}^{2}] + 2\sum_{i=1}^{n-1}\sum_{j=i+1}^{n}E[X_{i}X_{j}] - (E[\sum_{i=1}^{n}X_{i}])^{2}

= 1 + 1 - 1 = 1
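As a sanity check on this answer, here is a minimal Python simulation sketch (the parameters n = 100 and trials = 200_000 are arbitrary choices); both the empirical mean and the empirical variance of the number of fixed points should come out close to 1.

import random

def fixed_points(n):
    # Count fixed points of a uniformly random permutation of {0, ..., n-1}.
    perm = list(range(n))
    random.shuffle(perm)  # uniform over all n! permutations
    return sum(1 for i, x in enumerate(perm) if i == x)

n, trials = 100, 200_000
samples = [fixed_points(n) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
print("empirical mean:", round(mean, 3))      # expected: close to 1
print("empirical variance:", round(var, 3))   # expected: close to 1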

Exercise 3.22: Suppose that we flip a fair coin n times to obtain n random bits. Consider all m=\binom{n}{2} pairs of these bits in some order. Let Y_{i} be the exclusive-or of the ith pair of bits, and let Y=\sum_{i=1}^{m}Y_{i} be the number of Y_{i} that equal 1.

(a) Show that each Y_{i} is 0 with probability 1/2 and 1 with probability 1/2.

(b) Show that the Y_{i} are not mutually independent.

(c) Show that the Y_{i} satisfy the property that E[Y_{i}Y_{j}]=E[Y_{i}]E[Y_{j}].

(d) Using Exercise 3.15, find Var[Y].

(e) Using Chebyshev’s inequality, prove a bound on Pr( \mid Y - E[Y] \mid \geq n).
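No full solution is written out here, but the claims can be checked empirically. Below is a minimal Python sketch (the parameters n = 20 and trials = 100_000 are arbitrary choices); by parts (a) and (c) together with Exercise 3.15, one expects E[Y] = m/2 and Var[Y] = m/4.

import random
from itertools import combinations

def xor_pair_sum(n):
    # Y = number of pairs of bits whose exclusive-or equals 1.
    bits = [random.randint(0, 1) for _ in range(n)]
    return sum(a ^ b for a, b in combinations(bits, 2))

n, trials = 20, 100_000
m = n * (n - 1) // 2  # number of pairs
samples = [xor_pair_sum(n) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
print("empirical E[Y]:", round(mean, 2), " predicted:", m / 2)
print("empirical Var[Y]:", round(var, 2), " predicted:", m / 4)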

Exercise 3.25: The weak law of large numbers states that, if X_{1},X_{2},X_{3}, \ldots are independent and identically distributed random variables with mean \mu and standard deviation \sigma, then for any constant \epsilon > 0 we have

\lim_{n \rightarrow \infty} Pr(\mid \frac{X_{1}+X_{2}+\ldots +X_{n}}{n} -\mu \mid > \epsilon ) = 0.
Use Chebyshev’s inequality to prove the weak law of large numbers.

Solution: Let X =\frac{ X_{1}+X_{2}+\ldots +X_{n}}{n}.

E[X] = E[\frac{X_{1}+X_{2}+\ldots +X_{n}}{n}] = \frac{1}{n}\sum_{i=1}^{n}E[X_{i}] = \mu

Var[X]= Var[\frac{X_{1}}{n}+\frac{X_{2}}{n}+\ldots +\frac{X_{n}}{n}]

=Var[\frac{X_{1}}{n}]+Var[\frac{X_{2}}{n}] + \ldots +Var[\frac{X_{n}}{n}] (by independence of the X_{i})

=n\frac{\sigma^{2}}{n^{2}}

= \frac{\sigma^{2}}{n}.

Thus X has standard deviation \sigma_{X} = \frac{\sigma}{\sqrt{n}}.

For any constant \epsilon > 0, applying Chebyshev's inequality yields

Pr(\mid \frac{X_{1}+X_{2}+\ldots +X_{n}}{n} -\mu \mid > \epsilon )

= Pr(\mid X - E[X] \mid > \epsilon)

\le \frac{Var[X]}{\epsilon^{2}} = \frac{\sigma_{X}^{2}}{\epsilon^{2}} = \frac{\sigma^{2}}{n\epsilon^{2}},

and the right-hand side tends to 0 as n \rightarrow \infty because \sigma and \epsilon are constants.

Therefore \lim_{n \rightarrow \infty} Pr(\mid \frac{X_{1}+X_{2}+\ldots +X_{n}}{n} -\mu \mid > \epsilon ) = 0 \qquad \square
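To illustrate the weak law numerically, here is a minimal Python sketch; the draws are Uniform(0, 1) purely as an assumed example distribution (so \mu = 1/2), and the estimated deviation probability should shrink toward 0 as n grows, consistent with the \frac{\sigma^{2}}{n\epsilon^{2}} bound above.

import random

def sample_mean(n):
    # Mean of n i.i.d. Uniform(0, 1) draws; mu = 0.5 (assumed example distribution).
    return sum(random.random() for _ in range(n)) / n

mu, eps, trials = 0.5, 0.05, 20_000
for n in (10, 100, 1000):
    hits = sum(abs(sample_mean(n) - mu) > eps for _ in range(trials))
    print("n =", n, " estimated Pr(|mean - mu| > eps):", hits / trials)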
