Question

Combining multiple models (as in bagging) aims to reduce expected error, which is composed of: (select more than one)

1) bias, i.e. expected error due to the learning model/classifier/algorithm

2) outliers, i.e. expected error due to inconsistent data collection

3) variance, i.e. expected error due to particular training

4) data inaccuracy, i.e. expected error due to inaccurate data observation/measurement

Homework Answers

Answer #1

Solution:

Statements 1, 3, and 4 are true. (The question asks for more than one option, and the standard decomposition of expected error has three components: bias, variance, and noise.)

1) bias, i.e. expected error due to the learning model/classifier/algorithm

3) variance, i.e. expected error due to particular training

4) data inaccuracy, i.e. expected error due to inaccurate data observation/measurement (this is the irreducible noise term)

Statement 2 is false: outliers from inconsistent data collection are not a component of the bias-variance-noise decomposition.
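For reference, the decomposition behind this question can be written out for squared-error loss (the classification case is analogous), where y = f(x) + ε and f̂_D is the model trained on dataset D:

```latex
% Expected squared error at a point x, averaged over
% training sets D and label noise \varepsilon:
\mathbb{E}_{D,\varepsilon}\!\left[\big(y - \hat{f}_D(x)\big)^2\right]
  = \underbrace{\big(\mathbb{E}_D[\hat{f}_D(x)] - f(x)\big)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}_D\!\left[\big(\hat{f}_D(x) - \mathbb{E}_D[\hat{f}_D(x)]\big)^2\right]}_{\text{variance}}
  + \underbrace{\sigma_\varepsilon^2}_{\text{noise}}
```

The bias term reflects the learning algorithm's systematic error, the variance term reflects sensitivity to the particular training sample, and the noise term reflects measurement/observation error in the data itself.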

Explanation:

Bagging, also called Bootstrap Aggregating, is an ensemble method. First, we create random samples of the training data set by drawing with replacement (bootstrap samples), and then we build a classifier on each sample. Finally, the outputs of these multiple classifiers are combined by averaging (for regression) or majority voting (for classification). Because averaging cancels out fluctuations that depend on the particular training sample, bagging primarily reduces the variance component of the error and helps avoid overfitting.
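The bootstrap-and-vote procedure described above can be sketched from scratch in a few lines of Python. This is a minimal illustration only: the decision-stump base learner, the toy 1-D dataset, and all function names here are my own choices, not part of the original question.

```python
import random
from collections import Counter

def train_stump(data):
    """Fit a 1-D decision stump: predict 1 when x >= threshold.

    data is a list of (x, label) pairs; the threshold is chosen
    from the sample's own x values to maximize training accuracy.
    """
    best_thr, best_acc = None, -1.0
    for thr, _ in data:
        preds = [1 if x >= thr else 0 for x, _ in data]
        acc = sum(p == y for p, (_, y) in zip(preds, data)) / len(data)
        if acc > best_acc:
            best_thr, best_acc = thr, acc
    return best_thr

def bagging_fit(data, n_models=25, seed=0):
    """Train n_models stumps, each on a bootstrap sample of data."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        # Bootstrap: draw len(data) points with replacement.
        sample = [rng.choice(data) for _ in range(len(data))]
        models.append(train_stump(sample))
    return models

def bagging_predict(models, x):
    """Combine the stumps' predictions by majority vote."""
    votes = Counter(1 if x >= thr else 0 for thr in models)
    return votes.most_common(1)[0][0]

# Toy separable dataset: low x -> class 0, high x -> class 1.
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]
models = bagging_fit(data, n_models=25)
print(bagging_predict(models, 0.85), bagging_predict(models, 0.15))
```

Each stump is individually unstable (its threshold shifts with the bootstrap sample it sees), but the majority vote over many stumps is much less sensitive to any single resample, which is exactly the variance reduction the answer describes. In practice one would use a library implementation such as scikit-learn's `BaggingClassifier` rather than this sketch.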
