Q

(These are some thoughts I had while reading the inspiring book by James D. Watson, "Avoid Boring People and Other Lessons from a Life in Science." Maybe I will give a full summary of the lessons from the book later.)

Continue reading

Mathematics and the Unexpected

(Just some thoughts from reading Mathematics and the Unexpected and Innumeracy: Mathematical Illiteracy and Its Consequences. Maybe a more detailed post will follow later.)

Continue reading

Probability and Computing: Chapter 7 Exercises

Exercise 7.12: Let X_{n} be the sum of n independent rolls of a fair die. Show that, for any k \geq 2, \lim_{n \rightarrow \infty} \Pr(X_{n} \text{ is divisible by } k) = \frac{1}{k}.
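
Before proving the limit, it is easy to check it empirically. Below is a minimal simulation sketch (my own, not from the book); the sample sizes and the helper name estimate_divisibility are arbitrary choices.

import random

def estimate_divisibility(n, k, trials=100_000):
    # Estimate Pr(X_n is divisible by k) for the sum of n fair die rolls.
    hits = 0
    for _ in range(trials):
        total = sum(random.randint(1, 6) for _ in range(n))
        if total % k == 0:
            hits += 1
    return hits / trials

# For moderately large n the estimate should already be close to 1/k.
for k in (3, 4, 5, 7):
    print(k, estimate_divisibility(50, k), 1 / k)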

Continue reading

An exercise about Markov Chains

Lately I have stopped reading "Probability and Computing", since I found some gaps in the exposition, especially in Chapter 7, "Markov Chains and Random Walks": the authors leave some terminology, such as absorption, undefined. This is certainly not the text for someone with little background in probability who wants to learn it rigorously (though it is a good introductory text on randomized algorithms). So I bought "Markov Chains" by James Norris.
The exercises in "Markov Chains" are easy (at least in the first chapter), though there are some problems I am not quite sure about. Here is one of them:

Continue reading

Linear algebra viewpoint of function of discrete random variable

Suppose X is a discrete random variable that can take on n values x_{1} < x_{2} < \ldots < x_{n} with probabilities p_{x_{1}}, p_{x_{2}}, \ldots, p_{x_{n}}, and Y is a random variable defined by Y = g(X).
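
To make the viewpoint concrete, here is a minimal sketch (my own illustration, with a made-up g and distribution): the pmf of Y = g(X) is obtained from the pmf vector of X by applying a 0-1 matrix whose (j, i) entry is 1 exactly when g(x_{i}) equals the j-th value of Y.

import numpy as np

# Hypothetical example: X is uniform on {1, 2, 3, 4} and Y = g(X) = X mod 2.
x_values = [1, 2, 3, 4]
p_x = np.array([0.25, 0.25, 0.25, 0.25])      # pmf of X as a vector

g = lambda x: x % 2
y_values = sorted(set(g(x) for x in x_values))

# A[j, i] = 1 exactly when g(x_i) = y_j, so the pmf of Y is A @ p_x.
A = np.array([[1 if g(x) == y else 0 for x in x_values] for y in y_values])
p_y = A @ p_x

print(dict(zip(y_values, p_y)))               # {0: 0.5, 1: 0.5}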

Continue reading

Probability and Computing: Chapter 3 Exercises (cont.3)

Exercise 3.21: A fixed point of a permutation \pi : [1,n] \to [1,n] is a value x for which \pi(x) = x. Find the variance in the number of fixed points of a permutation chosen uniformly at random from all permutations. (Hint: Let X_{i} be 1 if \pi(i) = i, so that \sum_{i=1}^{n}X_{i} is the number of fixed points. You cannot use linearity to find Var[\sum_{i=1}^{n}X_{i}], but you can calculate it directly.)
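
Whatever closed form the calculation gives, it can be sanity-checked by simulation. A quick sketch (mine, not the book's; the parameters are arbitrary):

import random
from statistics import pvariance

def fixed_points(n):
    # Count fixed points of a uniformly random permutation of {0, ..., n-1}.
    perm = list(range(n))
    random.shuffle(perm)
    return sum(1 for i, p in enumerate(perm) if i == p)

n, trials = 10, 200_000
samples = [fixed_points(n) for _ in range(trials)]
print(pvariance(samples))   # should match the variance computed by hand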

Continue reading

Probability and Computing: Chapter 3 Exercises (cont.2)

Exercise 3.15: Let the random variable X be representable as a sum of random variables X = \sum_{i=1}^{n}X_{i}. Show that, if E[X_{i}X_{j}] = E[X_{i}]E[X_{j}] for every pair of i and j with 1 \leq i < j \leq n, then Var[X] = \sum_{i=1}^{n}Var[X_{i}].
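
The key step is expanding the second moment of the sum; a standard sketch of the computation (not necessarily the book's intended solution):

Var[X] = E[X^{2}] - (E[X])^{2} = \sum_{i=1}^{n}E[X_{i}^{2}] + 2\sum_{1 \leq i < j \leq n}E[X_{i}X_{j}] - \left(\sum_{i=1}^{n}E[X_{i}]\right)^{2}.

By the hypothesis E[X_{i}X_{j}] = E[X_{i}]E[X_{j}], the cross terms 2\sum_{i<j}E[X_{i}]E[X_{j}] cancel the cross terms produced by squaring \sum_{i}E[X_{i}], leaving

Var[X] = \sum_{i=1}^{n}\left(E[X_{i}^{2}] - (E[X_{i}])^{2}\right) = \sum_{i=1}^{n}Var[X_{i}].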

Continue reading