## The Primal Question of Optimization

*This note is still open.* Link to Machine Learning Notes I. Link to Machine Learning Notes II. A general optimization problem can usually be rewritten as maximizing or minimizing some function under several constraints; for example, you may want the optimized value to be non-negative. The most basic form of such a problem is called the primal question, which looks like this: And...
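The excerpt is cut off before the formula it promises. The standard primal form it is describing (a sketch using generic symbols $f$, $g_i$, $h_j$, which are not taken from the post itself) is:

$$
\begin{aligned}
\min_{x} \quad & f(x) \\
\text{s.t.} \quad & g_i(x) \le 0, \quad i = 1, \dots, m \\
& h_j(x) = 0, \quad j = 1, \dots, p
\end{aligned}
$$

A constraint like the non-negativity example in the excerpt fits this template as $g(x) = -x \le 0$.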

## Machine Learning Notes II

Link to Machine Learning Notes I. The least squares estimates of α and β: for simple linear regression, we have: Linear Regression way. We could of course use the NN method to solve the regression problem, but that makes it nearly impossible to locate exactly which layer captures which feature of the data. Perhaps the better way, then, is to upscale the dimension of the linear regression method. That...
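The excerpt truncates before the estimates themselves. Assuming the usual simple linear regression model $y_i = \alpha + \beta x_i + \varepsilon_i$, the closed-form least-squares estimates the heading refers to are:

$$
\hat{\beta} = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad \hat{\alpha} = \bar{y} - \hat{\beta}\,\bar{x}
$$

where $\bar{x}$ and $\bar{y}$ are the sample means. These minimize the sum of squared residuals $\sum_i (y_i - \alpha - \beta x_i)^2$.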

## Bayes’ Rule

When I was in high school taking AP Statistics, I learned the formula $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$, which can be transformed into $P(A \cap B) = P(A \mid B)\,P(B)$. $P(A \mid B)$ is called "conditional probability," which pretty much explains itself. Back then I only knew the meaning of each element, not the whole idea; all I did was plug in numbers, because it is quite abstract to understand on its own: "The probability of...
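The rule this post is building toward follows directly from the definition of conditional probability. For events $A$ and $B$ with $P(B) > 0$, the intersection can be factored in either order:

$$
P(A \cap B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A)
$$

Dividing through by $P(B)$ gives Bayes' rule:

$$
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
$$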

## Machine Learning Notes I

Lately I have been studying machine learning, and output (taking notes) is a vital step of that process. Here I am following Andrew Ng's Stanford Machine Learning course on Coursera, which uses MATLAB, so by default the code I write in this post is MATLAB. What is ML? "A computer program is said to learn from experience E with respect to some...