Archive for the ‘Probability’ Category

Information Criterion

This post is my summary of the Akaike Information Criterion (AIC) and the Takeuchi Information Criterion (TIC). In particular, derivations of AIC and TIC are shown. If I come to understand the Generalized Information Criterion better, I will cover it as well.
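As a one-line reminder of the quantity whose derivation the post covers: for a model with k estimated parameters and maximized log-likelihood \ln\hat{L}, AIC = 2k - 2\ln\hat{L}, and the model with the lower AIC is preferred. A minimal Python sketch (the helper name and the example numbers are mine, purely for illustration):

```python
def aic(log_likelihood, k):
    """Akaike Information Criterion: AIC = 2k - 2*ln(L-hat),
    where k is the number of estimated parameters and
    log_likelihood is the maximized log-likelihood ln(L-hat)."""
    return 2 * k - 2 * log_likelihood

# Hypothetical example: two fits to the same data set
print(aic(-120.5, 3))  # 3-parameter model  -> 247.0
print(aic(-118.0, 6))  # 6-parameter model -> 248.0; the 3-parameter model wins
```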

Continue reading

Handbook of Markov Chain Monte Carlo

Today I received my copy of “Handbook of Markov Chain Monte Carlo”. Up to now I have consulted “Monte Carlo Strategies in Scientific Computing” by Jun S. Liu. In this post I will give my first thoughts on the new book.

Liu’s book is indeed awesome. The author himself has made many original contributions to sampling algorithms, and many of his ideas are explained in the book. The chapters on the Gibbs Sampler (chapter 6), General Conditional Sampling (chapter 7), and multi-chain MCMC (chapters 10, 11) are excellent. But given that the book was written 10 years ago, many recent developments are missing, and some algorithms were not given enough space (in particular, Reversible Jump MCMC was given only 2 pages!).

Continue reading

Formulating the PCA

Today I have been thinking about how one can formulate the Principal Component Analysis (PCA) method. In particular, I want to reformulate PCA as the solution of a regression problem. The idea of reformulating PCA as the solution of a regression problem is useful in Sparse PCA, in which an L_1 regularization term is added to a ridge regression formula to enforce sparseness of the coefficients (i.e. the elastic net). There are at least two equivalent ways to motivate PCA. In this post I will first give a formulation of PCA based on orthogonal projection, and then discuss a regression-type reformulation of PCA.
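To make the orthogonal-projection view concrete, here is a minimal NumPy sketch (the data is synthetic, purely for illustration): the principal directions are the right singular vectors of the centered data matrix, and they carry the same variances as the eigenvalues of the sample covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X = X - X.mean(axis=0)                      # center the data first

# Projection view: SVD of the centered data matrix.
# Rows of Vt are the principal directions (orthonormal loadings).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt.T                           # coordinates in the principal basis
explained_var = s**2 / (len(X) - 1)         # variance captured by each direction

# Sanity check: the same variances come out of the covariance matrix.
eigvals = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
print(np.allclose(eigvals, explained_var))  # True
```

The regression-type reformulation discussed in the post replaces this eigenproblem with a (ridge) regression objective, which is what makes the L_1-penalized variant tractable.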

Continue reading

RJMCMC in clustering

Slides from a 30-minute presentation. There are some mistakes in the slides.

Continue reading

Nonparametric Bayesian Seminar 1 : Notes

(The main purpose is to write this down so I don’t forget, so it will be messy. No figures.)

Paper: Introduction to Nonparametric Bayesian Models (Naonori Ueda, Takeshi Yamada)

Continue reading

Probability and Computing: Chapter 7 Exercises

Exercise 7.12: Let X_{n} be the sum of n independent rolls of a fair die. Show that, for any k > 2, \lim_{n \rightarrow \infty} \Pr(X_{n} \text{ is divisible by } k) = \frac{1}{k}.
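Before looking at a proof, the limit can be checked exactly by propagating the distribution of X_{n} \bmod k one roll at a time: only the residue of the running sum matters. A minimal Python sketch (the function name prob_divisible is mine, not from the book):

```python
from fractions import Fraction

def prob_divisible(n, k):
    """Exact P(X_n is divisible by k) for X_n = sum of n fair-die rolls,
    tracking the distribution of the running sum modulo k."""
    dist = [Fraction(0)] * k
    dist[0] = Fraction(1)                   # before any roll the sum is 0
    for _ in range(n):
        nxt = [Fraction(0)] * k
        for r, p in enumerate(dist):
            for face in range(1, 7):        # fair six-sided die
                nxt[(r + face) % k] += p / 6
        dist = nxt
    return dist[0]

for k in (3, 4, 5):
    print(k, float(prob_divisible(40, k)))  # each value is very close to 1/k
```

The recursion is exactly a Markov chain on the residues \{0, \ldots, k-1\}, which is the structure the chapter is about; its stationary distribution is uniform, giving the limit \frac{1}{k}.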

Continue reading

An exercise about Markov Chains

Lately I have stopped reading “Probability and Computing”: I found some gaps in the exposition of the text, especially in chapter 7, “Markov Chains and Random Walks”, where the authors leave some terminology, such as absorption, undefined. This is certainly not the text for someone who has little background in probability and wants to learn it rigorously (though it is a good introductory text on randomized algorithms). So I bought “Markov Chains” by James Norris.
The exercises in “Markov Chains” are easy (at least in the first chapter), though there are a few problems I am not quite sure about. Here is one of them:

Continue reading