Adaboost

Figure: the AdaBoost algorithm (http://www.csuldw.com/assets/articleImg/adaboost-algorithm.png)

Steps

  1. Initialize every sample weight to $\frac{1}{N}$, where $N$ is the number of training samples.
  2. Iterate $M$ times. Each time, update the training data weights according to the weighted training error rate $e_m$. The basic rule is: increase the weights of misclassified samples and reduce the weights of correctly classified ones.
  3. According to the weights $a_m$ of the weak learners, combine all $M$ learners and output $G(x) = \sum_{m=1}^{M} a_m G_m(x)$ (a runnable sketch follows this list).
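
Below is a minimal sketch of these steps in Python, using one-level decision stumps as the weak learners $G_m$. The function names (`adaboost_fit`, `_best_stump`, etc.) are illustrative, not from any library, and the $\frac{1}{2}\ln\frac{1-e_m}{e_m}$ learner weight comes from the standard AdaBoost formulation rather than this document; labels are assumed to be in $\{-1, +1\}$.

```python
import numpy as np

def _stump_predict(stump, X):
    """Predict with a one-level decision stump (feature, threshold, sign)."""
    f, thr, sign = stump
    return np.where(X[:, f] <= thr, sign, -sign)

def _best_stump(X, y, w):
    """Exhaustively pick the stump with the lowest weighted error."""
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            for sign in (1.0, -1.0):
                pred = np.where(X[:, f] <= thr, sign, -sign)
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best, best_err = (f, thr, sign), err
    return best

def adaboost_fit(X, y, M=10):
    """Fit M weighted weak learners; y must be in {-1, +1}."""
    N = len(y)
    w = np.full(N, 1.0 / N)                      # step 1: every weight = 1/N
    learners, alphas = [], []
    for m in range(M):                           # step 2: iterate M times
        stump = _best_stump(X, y, w)
        pred = _stump_predict(stump, X)
        e_m = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        a_m = 0.5 * np.log((1 - e_m) / e_m)      # learner weight a_m
        # increase weights of misclassified samples, reduce the rest
        w = w * np.exp(-a_m * y * pred)
        w /= w.sum()                             # renormalize to sum to 1
        learners.append(stump)
        alphas.append(a_m)
    return learners, alphas

def adaboost_predict(learners, alphas, X):
    """Step 3: class label = sign of sum_m a_m * G_m(x)."""
    score = sum(a * _stump_predict(s, X) for s, a in zip(learners, alphas))
    return np.sign(score)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    learners, alphas = adaboost_fit(X, y, M=20)
    print("training accuracy:",
          np.mean(adaboost_predict(learners, alphas, X) == y))
```

Note that `adaboost_predict` takes the sign of the weighted sum $\sum_m a_m G_m(x)$ to turn the score into a class label, and the exponential update in `adaboost_fit` is exactly the "increase misclassified / reduce correct" rule from step 2.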

Derivation

For the full derivation, please refer to the references below.
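
For quick reference, these are the headline identities the standard derivation arrives at, treating AdaBoost as stagewise minimization of the exponential loss (see [1] and [2]); the notation $w_i^{(m)}$ and $Z_m$ is assumed here, not defined in this document:

```latex
% Stagewise additive modeling with exponential loss L(y, f(x)) = e^{-y f(x)}:
f_m(x) = f_{m-1}(x) + a_m G_m(x)
% Minimizing over a_m yields the learner weight and the sample weight update:
a_m = \frac{1}{2}\ln\frac{1 - e_m}{e_m},
\qquad
w_i^{(m+1)} = \frac{w_i^{(m)} \exp\!\left(-a_m\, y_i\, G_m(x_i)\right)}{Z_m}
% where Z_m normalizes the weights so they sum to 1.
```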

[1] Adaboost - 新的角度理解权值更新策略 (in Chinese)

[2] AdaBoost, Wikipedia
