Machine learning pt.3

Why regularization is necessary in linear regression

  • When there are too many weights in linear regression, the weights also tend to grow large.
  • The model then becomes overly complex, which raises the risk of overfitting.
  • Regularization is needed to restrain this model complexity (a short sketch follows this list).
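
A minimal sketch of the idea above, assuming a small synthetic dataset and scikit-learn; the alpha value is an arbitrary choice for illustration. It compares the size of the weights learned with and without a (Ridge) penalty.

```python
# Compare weight sizes with and without regularization (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))            # many features relative to samples
y = X[:, 0] * 3.0 + rng.normal(size=50)  # only the first feature matters

plain = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)       # alpha controls the penalty strength

# The regularized model keeps its weights smaller, restraining complexity.
print("sum |w| (plain):", np.abs(plain.coef_).sum())
print("sum |w| (ridge):", np.abs(ridge.coef_).sum())
```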

Regularization

Please refer to the table in the previous post comparing Ridge, Lasso, and Elastic Net.

  • However, too much regularization can lead to underfitting of the data.
  • Lasso (L1 norm) regularization: if a feature's weight is still not 0 even after the Lasso penalty is imposed, that feature must be a significant one (so Lasso acts as a feature selector, eliminating many of the features; see the sketch after this list).
  • Formula for Elastic Net's cost function:
    $\displaystyle \text{cost}(w) = \text{error}(w) + \lambda_1 \sum |w| + \lambda_2 \sum w^2 \\
    = \text{error}(w) + \lambda \left( \text{l1\_ratio} \cdot \sum |w| + (1 - \text{l1\_ratio}) \cdot \sum w^2 \right)$
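
A minimal sketch of both points, assuming synthetic data and scikit-learn; the alpha and l1_ratio values are arbitrary choices for illustration. Lasso should drive the weights of the unimportant features to exactly 0, while Elastic Net mixes the L1 and L2 penalties through l1_ratio, matching the second form of the cost function above.

```python
# Lasso as a feature selector, and Elastic Net mixing L1 and L2 penalties.
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 4.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=100)  # only 2 useful features

# Weights that survive the L1 penalty belong to the significant features.
lasso = Lasso(alpha=0.1).fit(X, y)
print("non-zero Lasso weights:", np.flatnonzero(lasso.coef_))

# l1_ratio = 1.0 -> pure Lasso, l1_ratio = 0.0 -> pure Ridge.
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
print("Elastic Net weights:", np.round(enet.coef_, 3))
```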

Multiple linear regression

  • A kind of linear regression that uses several features rather than only one.
  • The value of each weight shows how much influence the corresponding feature of X has (see the sketch after this list).
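
A minimal sketch, assuming synthetic data whose true influences are roughly 5, 1, and 0; the learned weights should come out close to those values, showing how each weight reflects its feature's influence.

```python
# Each weight of a multiple linear regression shows its feature's influence.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # three features
y = 5.0 * X[:, 0] + 1.0 * X[:, 1] + 0.0 * X[:, 2] + rng.normal(size=200)

model = LinearRegression().fit(X, y)
for i, w in enumerate(model.coef_):
    print(f"feature {i}: weight = {w:.2f}")  # roughly 5, 1, and 0
```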

Types of linear classification models

  • Logistic regression
  • SVM (Support Vector Machine)

SPOILER of the next post!!

The next post will continue with linear classification models. It will also include Python code for machine learning, with some explanation!