### Average is Not Subjective

In the course of research for a class assignment, I found this article. Although I agree with her overall argument, the author makes a common and annoying mistake:


*"To complicate matters, definition #4 has arrived on the scene to indicate that normal has to do with any intelligence and development that is 'average', and, furthermore, that normal means 'sane' or 'free from mental disorder.' But what is 'average', but an artificial and impossible construct that none of us statistically embody, and what does it mean to be 'sane'? Simply put, #4 has circled back and elaborated upon the first (#2) reading of normal: that a society (or its power-owners) decide what the ranges of 'normal' intelligence and development are, and whatever falls outside of those ranges is then labeled as sub-normal, extraordinary, or insane. In this sense then, we all exist in some way as deviations, or as 'abnormals'."*

This indicates a complete lack of understanding of the statistical meaning of the term average. The average is not an artificial construct arbitrarily defined by people with power; it is a mathematical property of a group of data points.

To decide, based on some measurement scale, whether a characteristic of a person is 'normal' or not requires three things. Firstly, you have to have an accurate and reliable scale (otherwise the data points may or may not be normally distributed, but they won't reflect anything 'real' about the person). No scale is completely accurate - for example, how many rulers measure height/length to 1/5 of a millimeter? (And even one that did would have some level of precision it couldn't reach.) But as long as the scale is close enough, it can tell you something of use.
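The "close enough" point can be illustrated with a small sketch (made-up numbers, assuming Python): even when every individual reading is rounded off by the instrument, the average of the rounded readings stays very close to the average of the true values, because the rounding errors tend to cancel out.

```python
from statistics import mean

# Hypothetical example: "true" heights in millimeters, and the same heights
# as read off a ruler that only resolves to the nearest whole millimeter.
true_heights = [1723.4, 1681.9, 1755.2, 1702.7, 1738.1]
measured = [round(h) for h in true_heights]  # the ruler rounds to 1 mm

# The individual readings are each off by up to half a millimeter,
# but the two averages differ by far less than that.
print(mean(true_heights), mean(measured))
```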

Secondly, you have to have a continuous variable. Although you may not be able to measure each data point, all the data points in between the extremes must be *possible*. In terms of height, for example, you could be 5 feet 8 inches exactly, or 5 feet 8 1/2 inches, or 5 feet 8 1/4 inches, or even 5 feet 8 7/276 inches. Some of those heights may not be measured precisely, but they do *exist*, or at least could. In contrast, whether or not you are an American citizen is not a continuous variable. You either are or you aren't. If there are intermediate points (e.g. 'landed immigrant'), there is a clearly definable number of those options.

Thirdly, the population must have a normal distribution. This means that if you find the average of the data points (calculated by adding all the points together and dividing by the number of points), most of the data points will be pretty close to this average. In fact, there is a number known as the standard deviation, calculated by taking the difference between each point and the average, squaring it, averaging those squared differences across all the points, and taking the square root of the result. In a normally distributed population, 68% of the points will be within one standard deviation of the average, and 95% will be within two standard deviations. (For an example of a trait that is *not* normally distributed in our population, take breast size. There are two separate peaks in the distribution - one for men and one for women. But if you separate out the genders, each gender has a normal distribution of breast size.)

So, for a given data point, representing a specific person's score on a specific measurement, how 'normal' they are represents how close they are to the average. Typically, we define any score that is two or more standard deviations from the average as 'abnormal'. For example, IQ meets the criteria of reliability, validity and normal distribution, and approximates a continuous variable. The average IQ score is 100, and the standard deviation is 15, so the roughly 5% of people with abnormal IQs are equally divided between those with an IQ over 130 and those with an IQ below 70. (In other words, about 2.5% of the population has a cognitive disability, and about 2.5% are intellectually gifted.)

So average is *not* an arbitrary number. No matter what weight people think they *should* be, the average weight is always the sum of all people's weights divided by the number of people. (This average may, and in fact *has*, changed over time, but it's a 'real' change, not a change in perception.) Even if no one happens to get the exact measurement that constitutes the average, this does *not* mean that everyone is abnormal, because not everyone is the same distance from the average.

I know I'm abnormal. To me, this is a neutral statement. On some highly significant psychological characteristics, I score more than two standard deviations away from the average score. (For example, my tested IQ is 137.) Abnormal does not mean bad - it means atypical, in a measurable statistical way. Normal is a real, measurable thing, and so is abnormal. And admitting that does not mean you have to accept the societal baggage applied to those terms.
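"Distance from the average" has a standard name - the z-score - and the two-standard-deviations rule falls out of it directly. A minimal sketch, using the IQ figures from the post (the function names are my own, not standard terminology):

```python
def z_score(value, average, std_dev):
    """How many standard deviations a value sits from the average."""
    return (value - average) / std_dev

def is_abnormal(value, average, std_dev, cutoff=2.0):
    """'Abnormal' = two or more standard deviations from the average,
    in either direction."""
    return abs(z_score(value, average, std_dev)) >= cutoff

# IQ: average 100, standard deviation 15.
print(z_score(137, 100, 15))      # ~2.47 standard deviations above average
print(is_abnormal(137, 100, 15))  # True
print(is_abnormal(108, 100, 15))  # False: within the normal range
```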

## 1 Comment:

Another thing that seems to confuse people about the concept of average:

When we say, "People in group X, on average, are more likely to express homophobic attitudes than people in group Y", some people who misunderstand the concept of average take this to mean "every single individual in Group X expresses more homophobic attitudes than every single individual in Group Y". And because of high in-group variation, they can easily find case examples to counter that idea, and then use those examples to argue that the whole concept of "average" is bogus.

I have had a number of conversations over the years in which I tried to explain what "average" really means to people who don't understand mathematics (and who would thus have trouble with a more math-oriented explanation ... although good with math, I am not strongly math-oriented myself). Not all of these attempts have been successful.

But finally I had a conversation in which I explained that what "average" really means in this context is: YES, there is a lot of variation in Group X. And there is a lot of variation in Group Y. You can find very homophobic people in both groups, and very non-homophobic people in both groups. The concept of "average" does not deny this in-group variation. What it DOES say is that, even though there are still plenty of non-homophobic people in Group X, there are still more homophobic people *mixed in among them* compared to Group Y. And the person I was talking to then seemed to grasp this much more easily and came to accept that the concept of "average", presented in this way, could have relevance and meaning even if not everyone in the group necessarily fit the "average".
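The commenter's point can be shown with made-up numbers (a sketch, assuming Python; the scores are invented for illustration): two groups can differ on average even though their ranges overlap heavily, so individual counterexamples don't refute the group-level claim.

```python
from statistics import mean

# Hypothetical attitude scores (higher = more of the attitude); invented data.
group_x = [1, 3, 4, 6, 7, 8, 9]
group_y = [1, 2, 3, 4, 5, 6, 9]

# Group X is higher *on average*...
print(mean(group_x) > mean(group_y))  # True

# ...yet individuals still run against the trend: some members of X
# score lower than some members of Y.
print(min(group_x) < max(group_y))    # True
```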
