bagging
An ensemble ML technique to reduce overfitting and variance.
From a training dataset T, generate n new datasets by sampling observations in T with replacement. Then train n models separately on these sampled datasets. Finally, use a consensus among the models (e.g., a majority vote for classification or an average for regression) to produce the final prediction.
This technique is typically used with decision trees, which are especially prone to overfitting; averaging over many trees trained on different resamples reduces the variance of the combined model.
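The procedure above can be sketched in plain Python. This is a minimal illustration, not a production implementation: the base learner here is a toy 1-nearest-neighbour classifier on 1-D inputs, and the function names (`bootstrap_sample`, `bag`) are made up for this example.

```python
import random
from collections import Counter

def bootstrap_sample(X, y, rng):
    # Draw len(X) observations from (X, y) with replacement.
    idx = [rng.randrange(len(X)) for _ in range(len(X))]
    return [X[i] for i in idx], [y[i] for i in idx]

def fit_1nn(X, y):
    # Toy base learner: 1-nearest-neighbour on 1-D inputs.
    def predict(x):
        i = min(range(len(X)), key=lambda j: abs(X[j] - x))
        return y[i]
    return predict

def bag(X, y, n_models=25, seed=0):
    # Train n_models base learners, each on its own bootstrap resample.
    rng = random.Random(seed)
    models = [fit_1nn(*bootstrap_sample(X, y, rng))
              for _ in range(n_models)]
    def predict(x):
        # Consensus: majority vote across the ensemble.
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]
    return predict

# Two well-separated clusters with labels 0 and 1.
X = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
y = [0, 0, 0, 1, 1, 1]
clf = bag(X, y)
print(clf(2.5), clf(10.5))  # → 0 1
```

In practice one would use a library implementation (e.g. scikit-learn's `BaggingClassifier` with a decision-tree base estimator) rather than hand-rolling the resampling and voting.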
See also: decision-trees overfitting variance
AKA: bootstrap aggregating