Suppose you wanted to understand the relationship between a customer's yearly income (X) and the number of movies (Y) the customer watched in a year. You gather data on incomes and the number of movies watched in a year. The range of incomes in your data set is $5K to $150K. After fitting a simple linear model and performing all the appropriate diagnostics, the model showed that, on average, for every $10K in income, the customer watched 1.5 movies in the year. So, for example, if a customer earned $60K in a year, he or she would be expected to watch nine movies during the year. Now you want to apply this model to your very wealthy friend who will earn $1 million in the next year. Is this an appropriate application of your model? Why or why not? Provide specific examples to justify your opinion.
No, this is not an appropriate application of the model, because an income of $1 million lies far outside the range of incomes used to fit it ($5K to $150K). Making a prediction there is extrapolation.
When we predict for an observation outside the observed range, the true value may deviate substantially from the predicted value, because the behaviour of the true relationship may be different outside that range. As a concrete example, the fitted rate of 1.5 movies per $10K would predict that a friend earning $1 million (100 times $10K) watches about 150 movies a year, roughly three per week, which is almost certainly unrealistic; the relationship between income and movies watched very likely flattens out well before incomes reach that level.
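To make the extrapolation issue concrete, here is a minimal sketch in Python on simulated (hypothetical) data that follows the stated relationship of roughly 1.5 movies per $10K of income; the data, the variable names, and the noise level are assumptions for illustration, not part of the original problem.

    # Hypothetical data: incomes between $5K and $150K, about 1.5 movies per $10K plus noise
    import numpy as np

    rng = np.random.default_rng(0)
    income_k = rng.uniform(5, 150, size=200)           # yearly income in $K, inside the observed range
    movies = 0.15 * income_k + rng.normal(0, 1, 200)   # 0.15 movies per $1K = 1.5 per $10K

    slope, intercept = np.polyfit(income_k, movies, deg=1)   # ordinary least squares line

    def predict(income_in_k):
        return intercept + slope * income_in_k

    print(round(predict(60), 1))     # about 9 movies: inside the $5K-$150K fitting range
    print(round(predict(1000), 1))   # about 150 movies: extrapolating far outside the range

The second prediction is produced mechanically by the fitted line, but no data anywhere near $1 million supports it, which is exactly why applying the model to your wealthy friend is inappropriate.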
Thank you.