Why are random forests a good model to use for classification predictions/problems? What are the pros and cons of using random forests for classification predictions?
1. The large number of decision trees involved makes a random forest highly accurate and robust, and it is far less prone to overfitting than a single decision tree. The key reason is that the final prediction is a majority vote (or average) over all the trees, which smooths out the errors of the individual trees. That is why a random forest is a good model for classification problems.
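As an illustration (a minimal sketch assuming scikit-learn and a synthetic dataset, neither of which appears in the original answer), the forest's majority vote over many trees typically generalises better than a single deep tree:

```python
# Sketch: compare one decision tree with a forest of 200 trees.
# Assumes scikit-learn is installed; the dataset is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=5, random_state=0)

single_tree = DecisionTreeClassifier(random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0)

# The forest aggregates the votes of 200 trees trained on bootstrap
# samples, which usually yields higher cross-validated accuracy than
# one fully grown tree.
print("single tree  :", cross_val_score(single_tree, X, y, cv=5).mean())
print("random forest:", cross_val_score(forest, X, y, cv=5).mean())
```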
2. The pros of using random forests: missing values can also be handled by random forests. There are two common strategies: missing continuous variables are replaced with the column median, and missing values can also be filled in with a proximity-weighted average taken from similar (nearby) observations.
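For example (a minimal sketch assuming scikit-learn, where imputation is usually done explicitly; the tiny dataset and pipeline below are illustrative, not part of the original answer), the median strategy for continuous variables can be applied before fitting the forest:

```python
# Sketch: median imputation of missing continuous values, then a forest.
# Assumes scikit-learn; the small dataset is made up for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline

X = np.array([[1.0, 2.0],
              [np.nan, 3.0],   # missing continuous value
              [4.0, 1.5],
              [5.0, np.nan]])
y = np.array([0, 1, 0, 1])

# Each missing entry is replaced by its column's median before training;
# the same medians are reused when predicting on new data.
model = make_pipeline(SimpleImputer(strategy="median"),
                      RandomForestClassifier(n_estimators=100, random_state=0))
model.fit(X, y)
print(model.predict([[3.0, np.nan]]))
```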
The cons of using random forests: random forests produce predictions slowly because they contain many decision trees. Whenever a prediction is requested, every tree in the forest must make its own prediction and vote on the same input, which makes prediction a time-consuming process.
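A rough illustration of this cost (a minimal sketch assuming scikit-learn; the dataset sizes and timings are illustrative only) is that prediction time grows with the number of trees, since every tree must vote on each sample:

```python
# Sketch: prediction latency versus number of trees in the forest.
# Assumes scikit-learn; dataset sizes are arbitrary.
import time
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

for n_trees in (10, 100, 500):
    forest = RandomForestClassifier(n_estimators=n_trees,
                                    random_state=0).fit(X, y)
    start = time.perf_counter()
    forest.predict(X)  # every one of the n_trees trees votes
    print(f"{n_trees:>3} trees -> {time.perf_counter() - start:.3f} s to predict")
```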