Bagging (bootstrap aggregating) is an ensemble ML technique to reduce overfitting and variance.

From a training dataset T, generate 'n' new datasets by sampling observations from T with replacement. Then train 'n' models independently, one on each bootstrap sample. Finally, aggregate the models' outputs (majority vote for classification, averaging for regression) to get the final prediction.
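The steps above can be sketched as follows. This is a minimal NumPy illustration with a toy nearest-centroid base learner standing in for a real model (in practice the base learners would usually be decision trees); the class and function names are my own, not from any library.

```python
import numpy as np

def bootstrap_sample(rng, X, y):
    # Sample len(X) observations with replacement (a bootstrap sample of T)
    idx = rng.integers(0, len(X), size=len(X))
    return X[idx], y[idx]

class NearestCentroid:
    # Toy base learner: predicts the class of the closest class mean
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        dists = ((X[:, None, :] - self.centroids_[None, :, :]) ** 2).sum(axis=-1)
        return self.classes_[dists.argmin(axis=1)]

def bagging_predict(models, X):
    # Consensus step: majority vote across the n models' predictions
    preds = np.stack([m.predict(X) for m in models])  # shape (n_models, n_samples)
    return np.array([np.bincount(col).argmax() for col in preds.T])

rng = np.random.default_rng(0)
# Two well-separated Gaussian clusters as a toy classification dataset
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Train n = 11 models, each on its own bootstrap sample
models = [NearestCentroid().fit(*bootstrap_sample(rng, X, y)) for _ in range(11)]
acc = (bagging_predict(models, X) == y).mean()
```

An odd number of models is used here so the majority vote on a binary problem cannot tie.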

This technique is typically used with decision trees, since deep, unpruned trees have high variance and are especially prone to overfitting.