## Archive for the ‘Solutions’ Category

### Probability and Computing: Chapter 3 Exercises (cont.3)

Exercise 3.21: A fixed point of a permutation $\pi : [1,n] \to [1,n]$ is a value $x$ for which $\pi (x) = x$. Find the variance in the number of fixed points of a permutation chosen uniformly at random from all permutations. (Hint: Let $X_{i}$ be $1$ if $\pi (i) = i$ and $0$ otherwise, so that $\sum_{i=1}^{n}X_{i}$ is the number of fixed points. You cannot use linearity to find $Var[\sum_{i=1}^{n}X_{i}]$, but you can calculate it directly.)
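As a sanity check (not part of the solution itself), the variance can be computed exactly for small $n$ by enumerating all $n!$ permutations; it comes out to $1$ for every $n \geq 2$:

```python
from fractions import Fraction
from itertools import permutations

def fixed_point_variance(n):
    """Exact variance of the number of fixed points over all n! permutations."""
    counts = [sum(1 for i, x in enumerate(p) if x == i)
              for p in permutations(range(n))]
    total = len(counts)
    mean = Fraction(sum(counts), total)
    return Fraction(sum(c * c for c in counts), total) - mean * mean

print([fixed_point_variance(n) for n in range(2, 7)])  # every entry is 1
```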

### Probability and Computing: Chapter 3 Exercises (cont.2)

Exercise 3.15: Let the random variable $X$ be representable as a sum of random variables $X=\sum_{i=1}^{n}X_{i}$. Show that, if $E[X_{i}X_{j}]=E[X_{i}]E[X_{j}]$ for every pair of $i$ and $j$ with $1 \leq i < j \leq n$, then $Var[X] = \sum_{i=1}^{n}Var[X_{i}]$.
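A classic setting where this applies without full independence: let $X_{1}, X_{2}$ be fair $\pm 1$ coins and $X_{3} = X_{1}X_{2}$. These are pairwise independent (hence satisfy the hypothesis) but not mutually independent. A quick exact check that the variances still add:

```python
from itertools import product

# X1, X2: fair +-1 coins; X3 = X1*X2; each Var[Xi] = 1.
outcomes = [(a, b, a * b) for a, b in product([-1, 1], repeat=2)]
sums = [a + b + c for a, b, c in outcomes]
mean = sum(sums) / len(sums)
var = sum((s - mean) ** 2 for s in sums) / len(sums)
print(var)  # 3.0, matching Var[X1] + Var[X2] + Var[X3]
```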

### Probability and Computing: Chapter 3 Exercises

Exercise 3.2: Let $X$ be a number chosen uniformly at random from $[-k,k]$. Find $Var[X]$.
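Reading $X$ as uniform over the $2k+1$ integers in $[-k,k]$ (the intended interpretation), $E[X] = 0$ by symmetry, and a direct computation of $E[X^2]$ gives $Var[X] = k(k+1)/3$. A quick exact check:

```python
from fractions import Fraction

def var_uniform(k):
    """Exact variance of a uniform choice from the integers -k, ..., k."""
    vals = range(-k, k + 1)
    n = 2 * k + 1
    mean = Fraction(sum(vals), n)  # 0 by symmetry
    return Fraction(sum(v * v for v in vals), n) - mean ** 2

for k in range(1, 8):
    assert var_uniform(k) == Fraction(k * (k + 1), 3)
```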

### Probability and Computing: Chapter 2 Exercises (Cont. 4)

Exercise 2.34: We roll a standard fair die over and over. What is the expected number of rolls until the first pair of consecutive sixes appears? (Hint: The answer is not 36.)
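One standard route (a sketch, not necessarily the book's intended solution) is a two-state recurrence: let $E_{0}$ be the expected number of remaining rolls when the previous roll was not a six and $E_{1}$ when it was. Then $E_{0} = 1 + (1-p)E_{0} + pE_{1}$ and $E_{1} = 1 + (1-p)E_{0}$, which solve to $E_{0} = (1+p)/p^{2} = 42$ for $p = 1/6$:

```python
from fractions import Fraction

p = Fraction(1, 6)  # probability a single roll is a six
# Substituting E1 = 1 + (1-p)*E0 into E0 = 1 + (1-p)*E0 + p*E1
# gives p^2 * E0 = 1 + p.
e0 = (1 + p) / (p * p)
print(e0)  # 42
```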

### Probability and Computing: Chapter 2 Exercises (Cont. 2)

Exercise 2.18: The following approach is often called reservoir sampling. Suppose we have a sequence of items passing by one at a time. We want to maintain a sample of one item with the property that, at each step, it is uniformly distributed over all the items that we have seen so far. Moreover, we want to accomplish this without knowing the total number of items in advance or storing all of the items that we see.
Consider the following algorithm, which stores just one item in memory at all times. When the first item appears, it is stored in memory. When the $k$th item appears, it replaces the item in memory with probability $1/k$. Explain why this algorithm solves the problem.
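The algorithm itself is a few lines; a minimal sketch in Python (function name mine):

```python
import random

def reservoir_sample(stream):
    """Return one item uniformly at random from a stream of unknown length."""
    sample = None
    for k, item in enumerate(stream, start=1):
        if random.random() < 1 / k:   # the k-th item replaces the sample w.p. 1/k
            sample = item
    return sample
```

The invariant to verify is that after $n$ items, each item is held with probability exactly $1/n$: item $i$ is kept with probability $\frac{1}{i}\prod_{k=i+1}^{n}\left(1-\frac{1}{k}\right) = \frac{1}{n}$.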

### Probability and Computing: Chapter 2 Exercises (Cont. 1)

Exercise 2.14: The geometric distribution arises as the distribution of the number of times we flip a coin until it comes up heads. Consider now the distribution of the number of flips $X$ until the $k$th head appears, where each coin flip comes up heads independently with probability $p$. Prove that this distribution is given by

$Pr(X=n) = \binom{n-1}{k-1}p^{k}(1-p)^{n-k}$

for $n \geq k$. (This is known as the negative binomial distribution.)
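The formula can be checked against brute-force enumeration for small cases: the event $\{X = n\}$ means exactly $k$ heads among the first $n$ flips with the $n$th flip a head, and there are $\binom{n-1}{k-1}$ such sequences, each of probability $p^{k}(1-p)^{n-k}$. A sketch of that check:

```python
from fractions import Fraction
from itertools import product
from math import comb

def pmf_bruteforce(n, k, p):
    """Pr(the k-th head appears exactly on flip n), by enumerating all 2^n sequences."""
    total = Fraction(0)
    for seq in product([0, 1], repeat=n):      # 1 = heads, 0 = tails
        if sum(seq) == k and seq[-1] == 1:     # k heads in total, flip n is a head
            total += p ** k * (1 - p) ** (n - k)
    return total

p = Fraction(1, 3)
for n in range(2, 8):
    assert pmf_bruteforce(n, 2, p) == comb(n - 1, 1) * p ** 2 * (1 - p) ** (n - 2)
```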

### Probability and Computing: Chapter 2 Exercises

Exercise 2.5: If $X$ is a $B(n,1/2)$ random variable with $n \geq 1$, show that the probability that $X$ is even is $1/2$.

$Pr(X=j) = \binom{n}{j}p^{j}(1-p)^{n-j}=\binom{n}{j}p^{n}$, since $p = 1-p = \tfrac{1}{2}$.

Since $Pr(X = j)$ depends on $j$ only through $\binom{n}{j}$, it suffices to prove that $\sum_{\text{i is even}}\binom{n}{i} = \sum_{\text{j is odd}}\binom{n}{j}$.

Indeed, we have $0= (1-1)^n=\sum_{k=0}^{n}\binom{n}{k}(-1)^{k}=\sum_{\text{i is even}}\binom{n}{i} - \sum_{\text{j is odd}}\binom{n}{j}$.
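A quick numerical confirmation of the identity (not needed for the proof):

```python
from math import comb

# For every n >= 1 the even- and odd-index binomial sums agree (each is 2^(n-1)),
# so Pr(X even) = Pr(X odd) = 1/2 when p = 1/2.
for n in range(1, 12):
    even = sum(comb(n, i) for i in range(0, n + 1, 2))
    odd = sum(comb(n, j) for j in range(1, n + 1, 2))
    assert even == odd == 2 ** (n - 1)
```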

Exercise 2.6: Suppose that we independently roll two standard six-sided dice. Let $X_{1}$ be the number that shows on the first die, $X_{2}$ the number on the second die, and $X$ the sum of the numbers on the two dice.
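As an illustration of this setup (not the exercise's question itself), enumerating the 36 equally likely outcomes recovers $E[X_{1}] = E[X_{2}] = 7/2$ and, by linearity of expectation, $E[X] = 7$:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))       # the 36 equally likely outcomes
e_x1 = Fraction(sum(a for a, _ in rolls), 36)      # E[X1] = 7/2
e_x = Fraction(sum(a + b for a, b in rolls), 36)   # E[X] = E[X1] + E[X2] = 7
print(e_x1, e_x)
```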