True/False questions about statistical analysis methods
1. LDA is preferable to QDA if they both have the same training error.
2. LDA and QDA are both robust to outliers.
3. The Naive Bayes classifier usually works better than LDA/QDA when the number of input features is large.
4. The computational cost of the Naive Bayes classifier is higher than that of LDA/QDA.
5. Compared to KNN, a Generalized Additive Model produces a model that is easier to interpret.
6. A Generalized Additive Model can achieve zero test error by producing a sufficiently complicated model.
7. A Generalized Additive Model can achieve zero training error by producing a sufficiently complicated model.
1. LDA is preferable to QDA if they both have the same training error. - True
With equal training error, LDA is the simpler model: it estimates a single shared covariance matrix rather than one per class, so it has far fewer parameters and lower variance, and it therefore tends to generalize better.
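One way to make the variance argument concrete is to count parameters. The sketch below (a hypothetical helper, using the standard Gaussian discriminant parameterization with k classes and p features) shows how much larger the QDA parameter count is when each class gets its own covariance matrix:

```python
def n_params(p, k, shared_cov=True):
    """Rough parameter count for Gaussian discriminant analysis.

    p: number of input features, k: number of classes.
    shared_cov=True corresponds to LDA (one covariance matrix for all
    classes); False corresponds to QDA (one per class).
    """
    means = k * p                     # one p-dimensional mean per class
    cov = p * (p + 1) // 2            # free entries in a symmetric p x p matrix
    return means + (cov if shared_cov else k * cov)

# Illustrative comparison for p = 50 features and k = 3 classes:
lda_params = n_params(50, 3, shared_cov=True)    # 150 + 1275 = 1425
qda_params = n_params(50, 3, shared_cov=False)   # 150 + 3825 = 3975
```

The gap widens quadratically in p, which is why the extra flexibility of QDA only pays off when there is enough data to estimate all those per-class covariances reliably.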
2. LDA and QDA are both robust to outliers. - False
Both methods fit class means and covariance matrices, and these sample estimates are highly influenced by outliers.
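A tiny numerical illustration of that sensitivity (toy data, not from the question): appending a single outlier to a small sample shifts the sample mean and inflates the sample variance, the very quantities LDA/QDA plug into their decision rules.

```python
import numpy as np

# A small 1-D sample, and the same sample with one outlier appended
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x_out = np.append(x, 100.0)

mean_shift = x_out.mean() - x.mean()   # mean jumps from 3.0 to ~19.2
var_ratio = x_out.var() / x.var()      # variance inflates by a factor of ~650
```

Since the discriminant boundaries are functions of these estimates, one bad point can move the boundary substantially.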
3. The Naive Bayes classifier usually works better than LDA/QDA when the number of input features is large. - False
LDA works well when the number of input features is large. The basic Naive Bayes classifier works only with categorical variables, so continuous features must be discretized first, which throws away a lot of information. A continuous variable in the data is therefore a strong sign against Naive Bayes.
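To illustrate the discretization step the answer refers to, here is a minimal sketch (hypothetical data) of binning a continuous feature into equal-width categories before handing it to a categorical Naive Bayes model. Every value inside a bin becomes indistinguishable, which is exactly the information loss the answer describes:

```python
import numpy as np

# Hypothetical continuous feature values
x = np.array([0.1, 0.4, 0.5, 0.9, 1.3, 2.0])

# Three equal-width bins over the observed range
edges = np.linspace(x.min(), x.max(), 4)

# Map each value to its bin index (0, 1, or 2); the interior edges
# act as the cut points
codes = np.digitize(x, edges[1:-1])
```

After binning, 0.1, 0.4, and 0.5 all land in the same category, so any distinction among them is lost to the classifier.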
Please raise a separate request for each question.