k-nearest neighbors and regression trees are methods that can be used to predict numeric outcomes. Both tend to outperform linear regression for very large data sets. Explain one specific advantage that these methods have over linear regression.
Advantages of using k-NN and decision trees over linear regression:
1. K-NN and decision trees are non-parametric algorithms, which means they make no assumptions about the functional form of the relationship between predictors and the outcome. A parametric model like linear regression assumes the outcome is a linear function of the predictors (and, for inference, further assumptions such as independent, homoscedastic errors); if those assumptions fail, its predictions suffer. K-NN and decision trees can fit nonlinear relationships directly from the data.
2. K-NN and decision trees are intuitive and easy to explain: k-NN predicts by averaging the outcomes of the most similar training points, and a decision tree's splits can be read off as simple if-then rules, which can be easier to communicate than regression coefficients.
3. K-NN is a memory-based (instance-based) approach: it has no training phase, so it adapts immediately as new training data are collected, simply by including the new points in the stored data set. This lets it respond quickly to changes in the data during real-time use. (Note this applies to k-NN specifically; a decision tree must be re-grown to incorporate new data.)
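To illustrate advantage #1, here is a minimal sketch (pure NumPy, with a hand-rolled k-NN for self-containment) where the true relationship is nonlinear (a sine curve): linear regression cannot capture it, while k-NN fits it well with no assumed functional form. The data, k=5, and the helper `knn_predict` are all illustrative choices, not part of the original answer.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + rng.normal(0, 0.1, 200)  # nonlinear truth + noise

# Linear regression: best-fit straight line (least squares, degree 1)
slope, intercept = np.polyfit(x, y, 1)
pred_lin = slope * x + intercept

def knn_predict(x_train, y_train, x_query, k=5):
    """k-NN regression: average the targets of the k nearest neighbors."""
    preds = []
    for xq in np.atleast_1d(x_query):
        idx = np.argsort(np.abs(x_train - xq))[:k]
        preds.append(y_train[idx].mean())
    return np.array(preds)

pred_knn = knn_predict(x, y, x)

mse_lin = np.mean((y - pred_lin) ** 2)
mse_knn = np.mean((y - pred_knn) ** 2)
print(f"linear MSE: {mse_lin:.3f}, k-NN MSE: {mse_knn:.3f}")
```

On this data the linear model's error stays large (the line cannot bend with the sine wave) while k-NN's error is close to the noise level, since k-NN assumed nothing about the curve's shape.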