How do you measure forecast accuracy using MSE? Suppose you have MSE = 28.625.
Mean squared error (MSE) is a measure of the accuracy of forecasts produced by any time series method. It is the average of the squared forecast errors, and because the errors are squared, it gives the largest weight to the biggest errors.
MSE = (sum of squared errors) / (number of periods)
Given MSE = 28.625, we can say the forecasts contain error, and squaring inflates the contribution of the larger individual errors. MSE is scale-dependent, so there is no universal threshold for a "good" value; a lower MSE means a more accurate forecast, and the value is best used to compare competing forecasting methods on the same data. Taking the square root (RMSE = √28.625 ≈ 5.35) expresses the typical error in the original units of the series.
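As a sketch of the calculation, here is a small Python example using hypothetical actual and forecast values (made up for illustration) whose squared errors happen to average to 28.625:

```python
# Hypothetical actual demand and forecasts for 8 periods (illustrative data).
actual = [110, 98, 106, 104, 103, 102, 100, 100]
forecast = [100, 90, 100, 100, 100, 100, 100, 100]

# Forecast error for each period: actual minus forecast.
errors = [a - f for a, f in zip(actual, forecast)]

# MSE = (sum of squared errors) / (number of periods)
mse = sum(e ** 2 for e in errors) / len(errors)
print(mse)  # 28.625
```

Note how the single largest error (10) contributes 100 to the sum of squares, far more than the four smallest errors combined, which illustrates why MSE penalizes big misses heavily.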