# Standard deviation

Six Sigma – iSixSigma Forums › Old Forums › General › Standard deviation

Viewing 12 posts - 1 through 12 (of 12 total)
• #42932

KS
Participant

Will someone help me understand why we square the deviations from the mean while calculating the standard deviation? Why can't we take the absolute values and sum them?

#135778

Vinay Goyal
Member

http://www.pacific.edu/college/psychology/Statistics/sd.html

#135781

KS
Participant

Thanks, but my basic question was: why NOT just add the absolute values?

#135782

sumant
Member

KS, if you are aware of the MGF (moment generating function), then I think it will clarify your question. Both the mean and the standard deviation can be generated from the MGF: the mean comes from the first moment, and the variance (the square of the SD) from the second central moment.

#135791

Griswold
Participant

| Data Points | X - Mean | Absolute Deviation \|X - Mean\| | (X - Mean) Squared |
| --- | --- | --- | --- |
| 10 | -20 | 20 | 400 |
| 20 | -10 | 10 | 100 |
| 30 | 0 | 0 | 0 |
| 40 | 10 | 10 | 100 |
| 50 | 20 | 20 | 400 |
| Sum: 150 | 0 | 60 | 1000 |
| Average: 30 | 0 | 12 | 200 |

Average absolute deviation: 12.00. Root mean square deviation: sqrt(200) = 14.14.

If you use the absolute values you get 12; if you use the root mean square (the population standard deviation) you get 14.14. So the two do calculate differently. I was once told that absolute values can have some funky characteristics when the "math" becomes more complicated, too. I'm not sure, but this seems to go along with an earlier post regarding standard error and fitted-line calculations.
Anyway, just a thought.
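A quick Python sketch (my own check, not part of the original post) that reproduces the numbers in the table:

```python
# Mean absolute deviation vs. root-mean-square (population) standard
# deviation for the same five data points.
data = [10, 20, 30, 40, 50]
mean = sum(data) / len(data)                           # 30.0
deviations = [x - mean for x in data]                  # [-20, -10, 0, 10, 20]
mad = sum(abs(d) for d in deviations) / len(data)      # 60 / 5 = 12.0
variance = sum(d * d for d in deviations) / len(data)  # 1000 / 5 = 200.0
stdev = variance ** 0.5                                # sqrt(200), about 14.14
print(mad, stdev)
```

This gives the same figures as the table: 12 for the absolute-value measure and about 14.14 for the standard deviation.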

#135792

Griswold
Participant

Ignore the last sentence of my last post… having a rough day here, thinking about two separate topics. I believe it is now Miller time.

#135803

statisticsKU
Member

The mean is the first moment, whereas the SD comes from the second moment of the MGF. But how do you relate that to the question under discussion?

#135812

Rajesh Mohandas
Participant

To calculate variance, the mean of a group of scores is subtracted from each score to give a group of "deviations". If we then averaged these deviations we would have a measure of how the scores vary about the mean. Unfortunately, if you do this you will always end up with the same result: zero! This is because the positive and negative deviations around the mean exactly balance, so they cancel out. To get around this we can square each deviation before averaging; this gets rid of the negative signs, so we no longer end up with zero. The statistic you do end up with is called the variance.
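The cancellation described above can be seen in a couple of lines of Python (my own illustration, not from the post):

```python
# Raw deviations from the mean always cancel out; squaring removes the signs.
data = [10, 20, 30, 40, 50]
mean = sum(data) / len(data)
raw_sum = sum(x - mean for x in data)                      # 0.0: signs cancel
variance = sum((x - mean) ** 2 for x in data) / len(data)  # 200.0: no cancellation
print(raw_sum, variance)
```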

#135813

Rajesh Mohandas
Participant

KS, your question is interesting: why the square, and not the absolute value? I think the explanation below should be helpful.
A fundamental task in many statistical analyses is to characterize the spread, or variability, of a data set. Measures of scale are simply attempts to estimate this variability.
When assessing the variability of a data set, there are two key components:

How spread out are the data values near the center?
How spread out are the tails?
The variance is roughly the arithmetic average of the squared distance from the mean. Squaring the distance from the mean has the effect of giving greater weight to values that are further from the mean. For example, a point 2 units from the mean adds 4 to the above sum while a point 10 units from the mean adds 100 to the sum. Although the variance is intended to be an overall measure of spread, it can be greatly affected by the tail behavior.
The standard deviation is the square root of the variance. The standard deviation restores the units of the spread to the original data units (the variance squares the units).
The average absolute deviation does not square the distance from the mean, so it is less affected by extreme observations than the variance and standard deviation are. Hope this clears your doubt.
Rajesh Mohandas.
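The weighting effect described above can be demonstrated with a short sketch (my own illustration, not from the post): giving the same data one extreme tail value moves the squared measure proportionally more than the absolute one.

```python
# Compare the average absolute deviation with the population standard
# deviation when one value in the tail becomes extreme.
def avg_abs_dev(xs):
    m = sum(xs) / len(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

def pop_stdev(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

base = [10, 20, 30, 40, 50]
tailed = [10, 20, 30, 40, 150]  # same data, one extreme tail value

print(avg_abs_dev(base), pop_stdev(base))      # 12.0 and about 14.14
print(avg_abs_dev(tailed), pop_stdev(tailed))  # 40.0 and about 50.99
```

The outlier's deviation of 100 contributes 10000 to the squared sum but only 100 to the absolute sum, which is why the standard deviation grows faster than the average absolute deviation.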

#135814

aparna
Participant

Hi, let me give a very simple answer. The average is the sum of the observations divided by the number of observations. If you take the difference of each observation from the mean and sum these differences, the result is 0! That is why, for the standard deviation, we square each difference from the mean, then take the mean of the squares, and then the square root.

#135819

KS
Participant

Thanks Rajesh, I got it.

#135886

sumant
Member

OK. How this relates to the discussion: both the mean and the SD are derived from a probability distribution, and for each distribution they can be obtained via the MGF (i.e., from the moments). The variance is the second central moment,

σ² = E[(X − μ)²],

which for a continuous distribution with density f(x) is given by

σ² = ∫ (x − μ)² f(x) dx.

When you derive it, you get the formula for the SD, i.e.

σ = √(E[(X − μ)²]).

Just for understanding purposes, what Rajesh said is correct.
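As a sanity check on the moment relationships (my own illustration, using a fair six-sided die as the distribution, not part of the thread):

```python
# The variance equals the second central moment,
# E[(X - mu)^2] = E[X^2] - (E[X])^2.
# Example distribution: fair die, P(X = k) = 1/6 for k = 1..6.
from fractions import Fraction

outcomes = range(1, 7)
p = Fraction(1, 6)
mean = sum(k * p for k in outcomes)                    # E[X] = 7/2
second_raw = sum(k * k * p for k in outcomes)          # E[X^2] = 91/6
variance = sum((k - mean) ** 2 * p for k in outcomes)  # second central moment
assert variance == second_raw - mean ** 2              # both equal 35/12
sd = float(variance) ** 0.5                            # the standard deviation
print(mean, variance, sd)
```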

