Combining multiple models (such as bagging) aims to reduce expected error, which is composed of: (more than one may apply)
1) bias, i.e. expected error due to the learning model/classifier/algorithm
2) outliers, i.e. expected error due to inconsistent data collection
3) variance, i.e. expected error due to particular training
4) data inaccuracy, i.e. expected error due to inaccurate data observation/measurement
Solution:
Statement 3 is True.
3) variance, i.e. expected error due to particular training
Explanation:
Bagging, also called Bootstrap Aggregating, is an ensemble method. In this method we first create random samples of the training data set (subsets drawn with replacement), and then build a classifier on each sample. Finally, the results of these multiple classifiers are combined by averaging or majority voting. Bagging helps to reduce the variance error, and it also helps to avoid overfitting.
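To make the procedure concrete, here is a minimal sketch of bagging with majority voting. It is not from the question itself; it assumes scikit-learn's DecisionTreeClassifier as the base learner and a synthetic toy data set, just to illustrate the three steps (bootstrap sampling, fitting one classifier per sample, combining by vote).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Hypothetical toy data set for illustration only
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

rng = np.random.RandomState(0)
n_estimators = 25
models = []

# Step 1 & 2: draw bootstrap samples (random samples with replacement)
# and fit one classifier on each sample
for _ in range(n_estimators):
    idx = rng.randint(0, len(X), size=len(X))  # sample indices with replacement
    models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# Step 3: combine the classifiers by majority voting
all_preds = np.stack([m.predict(X) for m in models])        # shape: (n_estimators, n_samples)
majority_vote = (all_preds.mean(axis=0) >= 0.5).astype(int)  # majority of 0/1 votes

print("Training accuracy of the bagged ensemble:", (majority_vote == y).mean())
```

Because each tree sees a slightly different bootstrap sample, their individual errors are partly uncorrelated, and averaging their votes cancels much of that variation, which is why bagging mainly reduces the variance component of the error.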