## Posts Tagged ‘Probability’

### Handbook of Markov Chain Monte Carlo

Today I received my copy of "Handbook of Markov Chain Monte Carlo". Up to now I have consulted "Monte Carlo Strategies in Scientific Computing" by Jun S. Liu. In this post I will give my first thoughts on the new book.

Liu's book is indeed awesome. The author has himself made many original contributions to sampling algorithms, and many of his ideas are explained in the book. The chapters on the Gibbs sampler (chapter 6), general conditional sampling (chapter 7), and multi-chain MCMC (chapters 10 and 11) are excellent. But given that the book was written ten years ago, many recent developments are missing, and some algorithms are not given enough space (in particular, Reversible Jump MCMC gets only 2 pages!).


### RJMCMC in clustering

Slides from a 30-minute presentation. There are some mistakes in the slides.

### Probability and Computing: Chapter 7 Exercises

Exercise 7.12: Let $X_{n}$ be the sum of $n$ independent rolls of a fair die. Show that, for any $k > 2$, $\lim_{n \rightarrow \infty}\Pr(X_{n} \text{ is divisible by } k) = \frac{1}{k}$.
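A quick Monte Carlo sanity check (not a proof, of course): simulate sums of $n$ rolls of a six-sided die and estimate how often the sum is divisible by $k$. The function name and parameters here are my own, just for illustration.

```python
import random

def divisibility_freq(n, k, trials=50_000, seed=0):
    """Estimate P(X_n is divisible by k), where X_n is the sum of
    n independent rolls of a fair six-sided die."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        total = sum(rng.randint(1, 6) for _ in range(n))
        if total % k == 0:
            hits += 1
    return hits / trials

# For moderately large n the estimate should already be close to 1/k.
print(divisibility_freq(n=50, k=7))   # roughly 1/7 ≈ 0.143
```

Intuitively, the residue of $X_n$ modulo $k$ is a finite Markov chain whose stationary distribution is uniform on the $k$ residues, which is what the exercise asks you to make precise.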

### An exercise about Markov Chains

Lately I have stopped reading "Probability and Computing", since I found some gaps in the exposition, especially in chapter 7, "Markov Chains and Random Walks", where the authors leave some terminology, such as absorption, undefined. Certainly this is not the text for anybody who has little background in probability and wants to learn it rigorously (though it is a good introductory text for randomized algorithms). So I bought "Markov Chains" by James Norris.
Exercises in "Markov Chains" are easy (at least in the first chapter), though there are some problems that I am not quite sure about. Here is one of them:

### Linear algebra viewpoint of function of discrete random variable

Suppose $X$ is a discrete random variable that can take on $n$ values $x_{1} < x_{2} < \ldots < x_{n}$ with probabilities $p_{x_{1}}, p_{x_{2}}, \ldots, p_{x_{n}}$, and $Y$ is a random variable defined by $Y = g(X)$.
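One way to phrase the linear-algebra viewpoint: the pmf of $Y$ is a matrix-vector product $q = Ap$, where $A$ is a 0/1 indicator matrix with $A_{ji} = 1$ iff $g(x_i) = y_j$. A small sketch (the particular values and the function $g$ are made up for illustration):

```python
import numpy as np

# Support and pmf of X (values chosen just for illustration).
x = np.array([-2, -1, 0, 1, 2])
p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])

def g(v):
    return v * v          # Y = g(X) = X^2

# Distinct values of Y, sorted.
y = np.unique(g(x))       # [0, 1, 4]

# Indicator matrix A with A[j, i] = 1 iff g(x_i) = y_j.
A = (g(x)[None, :] == y[:, None]).astype(float)

# The pmf of Y is the matrix-vector product A @ p:
# each row of A sums the probabilities of the x_i mapped to y_j.
q = A @ p
print(dict(zip(y.tolist(), q.tolist())))
```

Each row of $A$ simply collects the probability mass of all the $x_i$ that $g$ maps to the same value $y_j$; if $g$ is injective, $A$ is a permutation matrix.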

### Probability and Computing: Chapter 3 Exercises (cont.3)

Exercise 3.21: A fixed point of a permutation $\pi : [1,n] \to [1,n]$ is a value $x$ for which $\pi (x) = x$. Find the variance in the number of fixed points of a permutation chosen uniformly at random from all permutations. (Hint: Let $X_{i}$ be $1$ if $\pi (i) =i$, so that $\sum_{i=1}^{n}X_{i}$ is the number of fixed points. You cannot use linearity to find $Var[\sum_{i=1}^{n}X_{i}]$, but you can calculate it directly.)
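The answer works out to exactly $1$ for $n \geq 2$ (as does the mean), which is easy to sanity-check by simulation before doing the calculation. A small sketch, with helper names of my own choosing:

```python
import random

def fixed_points(perm):
    """Count positions i with perm[i] == i (0-indexed)."""
    return sum(1 for i, v in enumerate(perm) if i == v)

def sample_moments(n, trials=50_000, seed=0):
    """Empirical mean and variance of the number of fixed points
    of a uniformly random permutation of n elements."""
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)
        counts.append(fixed_points(perm))
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials
    return mean, var

# Both moments should come out close to 1 for any n >= 2.
print(sample_moments(10))
```

This also matches the classical fact that the number of fixed points converges in distribution to a Poisson(1) random variable as $n \to \infty$.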

### Probability and Computing: Chapter 3 Exercises (cont.2)

Exercise 3.15: Let the random variable $X$ be representable as a sum of random variables $X=\sum_{i=1}^{n}X_{i}$. Show that, if $E[X_{i}X_{j}]=E[X_{i}]E[X_{j}]$ for every pair $i$ and $j$ with $1 \leq i < j \leq n$, then $Var[X] = \sum_{i=1}^{n}Var[X_{i}]$.
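A concrete instance worth keeping in mind: pairwise independence (which implies the condition $E[X_iX_j] = E[X_i]E[X_j]$) is strictly weaker than mutual independence, yet it already makes the variance additive. The classic example is two independent fair bits together with their XOR; the construction below checks the identity exactly over all four outcomes:

```python
from itertools import product

# X1, X2 are independent fair bits; X3 = X1 XOR X2.  The three are
# pairwise independent (so E[Xi*Xj] = E[Xi]E[Xj]) but not mutually
# independent, yet the variance of the sum still splits.
outcomes = [(a, b, a ^ b) for a, b in product([0, 1], repeat=2)]  # each w.p. 1/4

def E(f):
    """Expectation of f over the uniform distribution on outcomes."""
    return sum(f(o) for o in outcomes) / len(outcomes)

def var(f):
    m = E(f)
    return E(lambda o: (f(o) - m) ** 2)

var_sum = var(lambda o: sum(o))
sum_of_vars = sum(var(lambda o, i=i: o[i]) for i in range(3))
print(var_sum, sum_of_vars)   # both equal 0.75
```

Knowing that $X_3$ is a deterministic function of $X_1$ and $X_2$ makes it clear why linearity of variance cannot be taken for granted; it is the pairwise condition that saves it.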