Saturday, May 22, 2010

Why divide by the square root of 12 when calculating the standard deviation of a uniform distribution?

My teacher does not know the answer to this one, so maybe someone can help me out with this.





For a uniform distribution, the probability distribution of a uniform random variable can be calculated with some simple formulas. The standard deviation is calculated with the formula





St. dev. = (d - c) / √12

where

d = the highest value
c = the lowest value.
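As a quick numeric illustration of the formula, here is a minimal sketch in Python; the interval [0, 12] is just a hypothetical example, not from the original question:

```python
import math

c, d = 0.0, 12.0                  # hypothetical example: c = lowest value, d = highest value
sd = (d - c) / math.sqrt(12)      # standard deviation of a uniform variable on [c, d]
print(round(sd, 4))               # 3.4641
```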





I understand that d - c is the length of the range of possible values for the variable. But why should we divide by the square root of 12 to get the standard deviation? Where does this come from?

Reply: the variance is (d - c)²/12, so the SD is the square root of the variance...

so the denominator is the square root of 12
Reply: variance = integral(x from c to d) (x-m)² f(x) dx





where m is the mean = (c+d)/2 and f(x) = 1/(d-c) is the probability density function. So





variance = integral(x from c to d) (x-m)²/(d-c) dx
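Before evaluating this integral by hand, it can be checked numerically; a rough midpoint Riemann-sum sketch in Python, assuming the example interval [2, 10] (not from the original thread):

```python
c, d = 2.0, 10.0                   # example endpoints: c = lowest, d = highest
m = (c + d) / 2                    # mean of the uniform distribution
f = 1 / (d - c)                    # probability density on [c, d]

n = 200_000                        # number of midpoint-rule slices
dx = (d - c) / n
variance = sum(((c + (i + 0.5) * dx - m) ** 2) * f * dx for i in range(n))

print(round(variance, 4))          # 5.3333  (= (d - c)²/12 = 64/12)
```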





The antiderivative is (x-m)³ / (3(d-c)); plug in x = d and x = c and subtract the two. We get





variance = (d-m)³/(3(d-c)) - (c-m)³/(3(d-c))

= [(r/2)³ - (-r/2)³] / (3r)   where r = d-c

= (r³/4) / (3r)

= r²/12.
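The result r²/12 (and hence SD = (d - c)/√12) can also be sanity-checked with a Monte Carlo simulation; a minimal sketch using Python's random module, with [2, 10] as a hypothetical example interval:

```python
import random
import math

c, d = 2.0, 10.0                       # example endpoints
r = d - c

theory_var = r ** 2 / 12               # variance = r²/12 from the derivation above
theory_sd = r / math.sqrt(12)          # hence SD = (d - c)/√12

random.seed(0)
samples = [random.uniform(c, d) for _ in range(200_000)]
mean = sum(samples) / len(samples)
sample_var = sum((x - mean) ** 2 for x in samples) / len(samples)

print(round(theory_var, 4))            # 5.3333
print(abs(sample_var - theory_var) < 0.1)   # True: the sample variance agrees
```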
Reply: Try looking here:





http://en.wikipedia.org/wiki/Standard_de...





I think your formula is wrong, but the number you divide by is related to the number of samples you have.
