You want to estimate the average household income for Ohio. You want the margin of error to be no more than $1,000. Prior data shows the standard deviation of household income is $30,000.
How many households should we sample to achieve the desired margin of error?
Suppose you want to cut the margin of error down to $500 next time. What should your sample size be?
Suppose you want to cut the margin of error down to 1/3 of what it started at (1/3 of $1,000). What does your sample size have to be? (Be careful!)
Why do we always round up when finding the appropriate sample size to achieve a certain margin of error, even when the value after the decimal point is less than .5? (For example, if we solve for n and get 422.2 households, why do we round this up to 423 households when reporting the required n?)
ME = 1000, s = 30000
z value at 95% confidence = 1.96
ME = z*(s/sqrt(n))
1000 = 1.96*(30000/sqrt(n))
sqrt(n) = (1.96*30000)/1000 = 58.8
n = 58.8^2 = 3457.44, which rounds up to n = 3458 households
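As a quick check, here is a minimal Python sketch of the same calculation (the helper name required_n is just for illustration):

```python
import math

def required_n(me, s, z=1.96):
    # Smallest whole n with z * s / sqrt(n) <= me: solve for n, then round up.
    return math.ceil((z * s / me) ** 2)

print(required_n(1000, 30000))  # 3458  (exact value 3457.44, rounded up)
```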
ME = 500
z value at 95% confidence = 1.96
ME = z*(s/sqrt(n))
500 = 1.96*(30000/sqrt(n))
sqrt(n) = (1.96*30000)/500 = 117.6
n = 117.6^2 = 13829.76, which rounds up to n = 13830 households
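Continuing with the required_n sketch from above, the $500 case is just another call:

```python
print(required_n(500, 30000))  # 13830  (exact value 13829.76, rounded up)
```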
ME = 1000/3 ≈ 333.33
z value at 95% confidence = 1.96
ME = z*(s/sqrt(n))
1000/3 = 1.96*(30000/sqrt(n))
sqrt(n) = (1.96*30000)/(1000/3) = 176.4
n = 176.4^2 = 31116.96, which rounds up to n = 31117 households (cutting the margin of error to one third multiplies the unrounded sample size by 3^2 = 9)
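The same sketch confirms the "be careful" point: because ME shrinks like 1/sqrt(n), cutting the margin of error to one third multiplies the unrounded sample size by 9:

```python
print(required_n(1000 / 3, 30000))  # 31117  (9 * 3457.44 = 31116.96, rounded up)
```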
If meeting the margin-of-error requirement calls for any fraction of a household, we need the whole household, so we always round up to the next whole number. Rounding down would leave the sample slightly too small, giving a margin of error slightly larger than the one you asked for.
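One way to see this, under the same assumptions (z = 1.96, s = 30000): plug the rounded-down and rounded-up values of n back into the margin-of-error formula, and only the larger one actually meets the $1,000 target.

```python
import math

def achieved_me(n, s=30000, z=1.96):
    # Margin of error actually delivered by a sample of size n.
    return z * s / math.sqrt(n)

print(achieved_me(3457))  # ~1000.06, just misses the $1,000 target
print(achieved_me(3458))  # ~999.92, meets it
```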