Why, if X is a G-measurable random variable, does knowing the sigma-field imply knowing the random variable X? Show why this is true.
Answer:
Sigma (the standard deviation) is a measure of the spread of a random variable: it tells us how widely (or closely) the values of the random variable are spread out.
Suppose you have a group of people and you are told that their mean age is 30. Why are you given the average age and not the age of each person separately? For ease of analysis: they don't want you to stare blankly at ten different values. Of course, when we represent the information contained in ten values by a single value, there is a trade-off: you lose accuracy. So there is a need to find out how good this 'easy representation' is.
Let's take a case. The average age of 10 people is 30. Many combinations of ages are possible:
they could be i) 1, 5, 18, 25, 30, 30, 35, 42, 55, 59, or
they could be ii) 30, 30, 30, 30, 30, 30, 30, 30, 30, 30.
In the first case, the actual values are spread far away from the mean, and the value of sigma will be significant (for case #1 the population standard deviation works out to about 18.1).
In the second case, all of them are exactly at the mean. How do we distinguish between these two situations? This is where the standard deviation (sigma) comes in. A high standard deviation signifies a large deviation of the data points from the mean, a moderate SD signifies a moderate deviation, and a low SD (it can even be zero) signifies that the data points are close to the mean.
In case #2, all the data points are the same, so there is no variability in the data and sigma is 0; this is the degenerate case of a constant (single-valued) distribution.
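The two cases can be checked numerically. A quick sketch using Python's statistics module, with ten illustrative ages that have mean 30 (a slight variant of the list in the text, trimmed to exactly ten values):

```python
from statistics import mean, pstdev

# Case i: ten ages spread far from the mean (illustrative values, mean = 30).
spread_ages = [1, 5, 18, 25, 30, 30, 35, 42, 55, 59]
# Case ii: ten identical ages, all exactly at the mean.
constant_ages = [30] * 10

# Same mean, very different sigma (population standard deviation).
print(mean(spread_ages), pstdev(spread_ages))      # large sigma: values far from 30
print(mean(constant_ages), pstdev(constant_ages))  # sigma is 0: no variability
```

Both lists report a mean of 30, but the first has a sigma of roughly 18.1 while the second has a sigma of exactly 0, which is precisely the distinction the text describes.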
So, by looking at the sigma value of a random variable, one can easily draw some inference about how spread out the distribution of the random variable is.
Thank You...