@pagal_guy wrote:
Hello,
While learning about boosting, I have noticed a few things:
There are several variants of boosting: AdaBoost, Gradient Boosting, and Stochastic Gradient Boosting. While AdaBoost implements the concept of re-weighting records after each classifier's output, Stochastic Gradient Boosting uses bootstrap samples to implement boosting.
So while AdaBoost uses a single dataset with varied weights at each iteration, stochastic GBM draws a new bootstrapped sample each time, as can be seen in the above output.
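For example, in scikit-learn the two modes would look roughly like this (a minimal sketch; the dataset and parameter values are just illustrative):

```python
# Rough sketch of the two boosting modes in scikit-learn; the dataset
# and parameter values here are made up purely for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: every round fits on the full dataset, with per-record weights
# that are increased for the records the previous round misclassified.
ada = AdaBoostClassifier(n_estimators=50, random_state=0)
ada.fit(X_train, y_train)

# Stochastic gradient boosting: subsample < 1.0 makes each tree fit on a
# fresh random fraction of the rows (drawn without replacement in
# scikit-learn) instead of reweighting them.
sgb = GradientBoostingClassifier(n_estimators=50, subsample=0.8,
                                 random_state=0)
sgb.fit(X_train, y_train)

print("AdaBoost accuracy:      ", ada.score(X_test, y_test))
print("Stochastic GBM accuracy:", sgb.score(X_test, y_test))
```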
My doubt is: if bootstrapped samples are drawn at each iteration, how is the weight updating done?
I mean, if I do bootstrap sampling again after one round, some records whose weights have changed might be left out of the new sample.
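For example, a quick check (illustrative numbers only) shows that a bootstrap sample of n records leaves out roughly 1/e, or about 37%, of them on average:

```python
# Quick check: how many of n records never appear in one bootstrap sample?
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
boot = rng.integers(0, n, size=n)       # bootstrap: n draws with replacement
left_out = n - np.unique(boot).size     # records missing from this sample
print(f"left out: {left_out / n:.1%}")  # ~36.8% on average (1/e)
```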
Is this the case, or am I missing something here?
Can someone please help me with this?