Matrix decomposition (or factorization) plays an important role in many research areas, especially in data analysis; for example, PCA relies on the SVD or the EVD. There are more than ten matrix decomposition methods, which researchers generally divide into four types: diagonal factorizations (such as the SVD), triangular factorizations (such as the LU decomposition), triangular-diagonal decompositions (such as the Schur decomposition), and tri-diagonal decompositions. The triangular factorizations are discussed first.

1 Cholesky factorization
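As a minimal sketch of what this section's title names (assuming NumPy is available; the matrix below is an illustrative example, not from the post): a symmetric positive-definite matrix $A$ factors as $A = LL^T$ with $L$ lower triangular.

```python
import numpy as np

# A symmetric positive-definite matrix (hypothetical example)
A = np.array([[4.0, 2.0, 2.0],
              [2.0, 3.0, 1.0],
              [2.0, 1.0, 3.0]])

# Cholesky factorization: A = L @ L.T with L lower triangular
L = np.linalg.cholesky(A)

# The factor is lower triangular and reconstructs A exactly (up to rounding)
assert np.allclose(L, np.tril(L))
assert np.allclose(L @ L.T, A)
```

Because the factorization exploits symmetry, it costs roughly half the operations of an LU decomposition, which is why it is the standard choice for solving symmetric positive-definite systems.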

When I received my diploma and bachelor's degree certificates, I realized that I had truly graduated from Zhejiang University. Although I encountered numerous difficulties during these four years, I still felt very happy and satisfied. In any case, the most important goal has been achieved. Now I should set some new goals for the next several years, and I will try my best to complete my PhD studies at Virginia Tech.

Now I have arrived in Blacksburg, a very beautiful town in Virginia. It is surrounded by mountains and big trees, so the sky is pretty blue and the air is very fresh. The campus of VT is also impressive, and the local residents are very nice. Despite all this, I still have trouble adjusting to life here. For instance, my spoken English is not very good, and I am not familiar with the traffic rules of the United States. Even so, I believe I will overcome these troubles and have a great time in the future. Thanks to everyone who has helped me and cared for me. Thank you very much.

Most theories and methods in machine learning depend on or derive from mathematics and probability, so it is worth reviewing probability before starting to study machine learning.

2 Review of Probability (just an outline)

2.1 Probability

Some Probability Formulas:

(1) Sum rule: $Pr[A\cup B]=Pr[A]+Pr[B]-Pr[A\cap B]$

(2) Union bound: $Pr[\cup_{i=1}^n A_i]\le\sum\limits_{i=1}^nPr[A_i]$

(3) Conditional probability: $Pr[A\mid B]=\dfrac{Pr[A\cap B]}{Pr[B]}$, defined when $Pr[B]>0$
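The three formulas above can be checked numerically on a small discrete sample space. A minimal sketch, assuming a fair six-sided die (the events $A$ and $B$ below are illustrative choices, not from the post):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die
omega = set(range(1, 7))

def pr(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}   # roll is even
B = {4, 5, 6}   # roll is at least 4

# (1) Sum rule: Pr[A ∪ B] = Pr[A] + Pr[B] - Pr[A ∩ B]
assert pr(A | B) == pr(A) + pr(B) - pr(A & B)

# (2) Union bound: Pr[A ∪ B] <= Pr[A] + Pr[B]
assert pr(A | B) <= pr(A) + pr(B)

# (3) Conditional probability: Pr[A | B] = Pr[A ∩ B] / Pr[B]
assert pr(A & B) / pr(B) == Fraction(2, 3)
```

Using `Fraction` keeps the arithmetic exact, so the identities hold with `==` rather than approximate floating-point comparisons.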