Last week we looked at the properties of the normal distribution and found that it is the basis for much of inferential statistics. But not all data follow a normal distribution. Are we in trouble? No. Enter the Central Limit Theorem (CLT). What is the CLT? Why does it make the common inferential tools usable even when the raw data are not normally distributed?
We know that,
In probability theory, the central limit theorem (CLT) states that the distribution of sample means approaches a normal distribution (the familiar "bell curve") as the sample size becomes larger, regardless of the shape of the population distribution, provided all samples are the same size.
Hence,
If the data do not follow a normal distribution, we need not worry. By the central limit theorem, the distribution of the sample mean will be approximately normal as long as we take a sufficiently large sample size (commonly more than 30), so the usual inferential tools can still be applied.
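To make this concrete, here is a minimal simulation sketch (my own illustration, not part of the original answer) that draws repeated samples of size 30 from a clearly non-normal (exponential) population and checks that the sample means cluster around the population mean with spread close to sigma / sqrt(n), as the CLT predicts. It assumes NumPy is available; the distribution, seed, and sample counts are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)

scale = 2.0            # exponential population: mean = 2.0, sd = 2.0 (skewed, non-normal)
n = 30                 # sample size, the usual "more than 30" rule of thumb
n_samples = 5_000      # number of repeated samples

# Each row is one sample of size n; take the mean of every row.
sample_means = rng.exponential(scale=scale, size=(n_samples, n)).mean(axis=1)

print("population mean:        ", scale)
print("mean of sample means:   ", sample_means.mean())       # close to 2.0
print("theoretical std. error: ", scale / np.sqrt(n))        # sigma / sqrt(n)
print("observed std. error:    ", sample_means.std(ddof=1))  # close to the line above

A histogram of sample_means would look roughly bell-shaped even though the underlying exponential data are strongly skewed, which is exactly what the theorem promises.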
Thank you.