
Symbols, Statistics, and Stereotypes

Yes, you should judge people as individuals.

In his elegant meditation on stereotyping, Theodore Dalrymple gives one example that seems clearly not to fit with the others: belief that the Antarctic tends to be cold. This is a generalization fully reducible to observation and statistics. The other examples are symbols, which enable you to intuit something about another mind that cannot be observed (the thought it wishes to express, the reference it wishes to make, the attitude it wishes to convey) and that you would not come to understand through statistical methods or define in statistical terms.

Some of the symbols Mr. Dalrymple mentions have explicit meanings, such as the India-ink tattoo by which British youths convey the message that they have been to reformatory, while others are styles of grooming and attire that have a cultural or ideological significance.

So there are two distinctions here: a main distinction between symbols, on one hand, and generalizations from experience, on the other; and a secondary distinction, among symbols, between explicit meaning and sociocultural connotation.

It would be useful to identify what it is that people object to in what are called “stereotypes.” I think it’s roughly this: the making of assumptions or judgments about someone’s mind or character on the basis of anything other than his own expressions of them. No one detects prejudice in the observation that a person of African descent is more likely than one of European descent to have sickle-cell anemia, but when it comes to our minds, we want to be seen — or at least we think we want to be seen — as individuals.

For this reason, the main distinction is important. Symbols cannot exist absent a social and cultural background, but understanding their meanings does not depend on comparing any occasion of use with any other. If the symbol is ambiguous, the ambiguity on one occasion of use could not be resolved by observing another occasion (although the possibility of ambiguity could be seen by observing various cases). And all of this is true even when a symbol establishes group membership. If someone is wearing a military uniform, you know that he has identified himself with a well-defined group and embraced its ethic and ethos, but you are making that judgment on the basis of something that he individually has done to communicate a meaning.

Meaning is murky in cases that fall on the “sociocultural” side of the secondary distinction. So I must disagree with such passages in Mr. Dalrymple’s essay as this one: “The fashion among young males for low-slung trousers . . . originated as a symbolic identification with prisoners, who have their belts removed from them on arrival in prison. . . . Those who see, or rather intuit, in this fashion an insolent defiance, a deliberate rejection of what would once have been called respectability, are surely right to do so even if they do not know the origin of the fashion. The same is true, incidentally, of those who obey the fashion; they may not know its origin, but they are fully aware of the effect it is likely to have on those whom they wish to offend.”

Of course it is likely to have that effect on those who believe, a priori, that anyone who obeys the fashion wishes to offend them. But what reason have they to believe this? Whence this certainty? What justifies the “surely” and “fully aware”? Many baggy-panted individuals have presumably adopted the style, which is now quite common, without giving it a moment’s thought. Moreover, unlike that of a military uniform, the meaning of baggy pants was not fully determinate even in origin: It could equally well have expressed an endorsement of criminality or a protest of perceived systemic injustice. Which meaning someone had in mind would certainly have been relevant to your assessment of him, and today he may mean nothing at all. Ideologically or culturally charged styles tend to lose their meaning with time, until they become, simply, styles. To assume that anyone who adopts the style does so for the same reason as those who inaugurated it is indeed to make an assumption about his mind on the basis of something other than his own expression of it. There is no clear line here, but there is a clear general principle: The less definite a symbol’s meaning, the more hesitant we should be to draw conclusions on its basis, and the more skeptical we should be of our starting assumptions.

Further, even a fully determinate symbol does not necessarily imply anything determinate about an individual’s likely conduct. The India-ink tattoo is an excellent example of this. What it means is “I have been to reformatory.” What it connotes is “I am not to be trifled with.” And what that means is anyone’s guess. Caution in dealing with such a person is understandable, but we should remember that his character cannot be reduced to, nor his likely conduct inferred from, his history.

There is, finally, a lesson here for the individual who wishes to avoid being stereotyped: If you want others to see you as sui generis, you must first see yourself that way. An extreme individualist would avoid the realm of sociocultural connotation as much as possible, and might therefore end up looking quite conventional.

Although it seems to me that some of Mr. Dalrymple’s conclusions depend on attributing a false determinacy to ambiguous symbols, I think his essay is an important and humane corrective to the misuse of statistics as a guide to interpersonal dealings. In this connection, a word now about the much-abused thought experiment “Whom would you rather meet in a dark alley?”

There are two ways of approaching that question. The first is to make statistical arguments about what a certain type of person is likely to do in a certain type of situation. If the resulting conclusions are supposed to guide our conduct, it is the actual probability of some outcome that matters, not the mere existence of a statistically significant correlation. There are all kinds of statistically significant correlations we ignore because the probabilities associated with them in any particular case are too low to worry about. For example, there is a statistically significant correlation between rock-climbing and falling to one’s death, but the probability of falling to one’s death is low enough that rock climbers practice their sport regardless. This is not irrational. What would be irrational is to live your life in a state of paranoia about even very small probabilities.

From statistics about crime by demographic group, how are we supposed to calculate the probability that that guy will mug us here and now? Our statistics are about crime in large populations over long periods of time, so we’ll have to do something like this: First we calculate the (low) probability that a hypothetical average person of that guy’s profile will commit a certain type of crime over some long period of time; then we choose some very small unit of time to serve as a rough approximation of now and calculate the number of units in the long period; and then we divide the probability by the total number of units (thus breaking our initial time interval into a very large number of much shorter time intervals). The “here” drops out of the picture, since our statistics are not sorted by type of location: dark alley vs. busy street vs. grassy park. The final probability will be minuscule, and completely unilluminating. That is because our calculation has abstracted away from the myriad situational factors — themselves unrelated to demographic profile — that we would have to take into account in order to say with confidence that someone was likely to commit a crime. Statistical methods are not fine-grained enough to capture these things.
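The back-of-the-envelope division described above can be sketched with wholly invented figures (every number below is hypothetical, chosen only to show the order of magnitude):

```python
# All figures hypothetical; only the order of magnitude matters.
p_long_period = 0.02                # chance the "average" person of some
                                    # profile commits the crime in 10 years
years = 10
minutes_in_period = years * 365 * 24 * 60   # ~5.26 million minutes

# Approximate "here and now" as one minute and spread the long-run
# probability evenly across every minute in the period.
p_now = p_long_period / minutes_in_period

print(f"{p_now:.1e}")               # prints 3.8e-09: minuscule
```

The result is on the order of a few billionths: a number that, as the paragraph says, tells you nothing useful about the person in front of you.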

If there is a statistically significant correlation between demographic group and crime, we could adopt a vague principle of “extra caution” in our dealings with members of the group, but we must admit that we have no idea how likely even the hypothetical average member is to commit a crime in some particular situation, let alone the actual human being who stands before us. Nor can we say how much likelier he is than some other type of person to commit a crime in that situation. We have simply translated the demographic correlation into an abstract character trait such as “prone to violence,” predicated the trait of everyone in the group, and let our imagination run wild about what someone “prone to violence” might do. There is nothing scientific about this. It is not statistical common sense. It is a pretext for prejudice.

There is an underlying point here about the crudeness of using social-scientific methods to predict the conduct of individuals. Even if we had statistics that controlled for the myriad situational factors, different people could not be expected to act the same way in the same type of situation. This is true in principle if it is false that human conduct is causally determined. If determinism is true, the causal explanation of conduct will have to be given in terms of neurobiological descriptions of particular brains. This is almost unfathomably remote from social science, and even from most of biology. It is also inapplicable to our actual dealings with one another, unless we devise a way to go around filming people’s brains with real-time neural video cameras.

The relevance of non-situation-specific probabilities is more debatable if we are deciding how to proceed in a very large number of cases. A police force might practice various kinds of profiling if its sole concern were to stop as much crime as possible, because even tiny differences between the probabilities that different types of people will commit, or have committed, a crime could result in significantly different aggregate outcomes when multiplied by thousands or millions of individuals. To this we should reply that stopping as much crime as possible ought not to be our sole concern. The same principle that leads us to think it better that a guilty person should go free than that an innocent person should be found guilty will lead us to eschew profiling.
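The aggregate point can be made concrete with invented numbers (hypothetical rates, not real data):

```python
# Hypothetical hit rates for two groups, differing by a tenth of a point.
stops_per_group = 1_000_000
rate_a = 0.010   # 1.0% of stops in group A find evidence of a crime
rate_b = 0.011   # 1.1% in group B

hits_a = round(stops_per_group * rate_a)   # 10,000
hits_b = round(stops_per_group * rate_b)   # 11,000

# A difference too small to matter in any one encounter becomes a
# difference of a thousand cases when multiplied across a million stops.
print(hits_b - hits_a)   # prints 1000
```

This is why the aggregate argument has force only at scale, and why the essay's reply shifts the ground from efficiency to principle.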

The second approach to the dark-alley thought experiment is to think of it as an interaction between persons rather than an event type to be studied empirically. This corresponds to what we said above about understanding the meaning of a symbol, with “action” substituted for “symbol” and “purpose” or “intent” for “meaning.” If someone points a gun at you and says, “Your wallet or your life,” you need not reflect on the statistical probability that he is being serious rather than playing a practical joke, or that he will shoot you if you refuse the demand, or that by the word “wallet” he refers to your wallet. This would be contrived and absurd. If God were a social scientist (God forfend), He would possess statistics on these things, but we would have no need of them. We already understand the mind of the mugger, and any ambiguity could not be resolved by consulting other cases.

Statistics play a part in establishing the dark-alley fear, but it is small and informal: Your awareness that there is a lot of crime in the neighborhood makes you worry about being a crime victim. For reasons similar to those discussed above, a calculation of the probability of crime here and now would yield a very small number and depend on a great many simplifying assumptions (though not nearly so many as are involved in attempts to predict the conduct of persons). But really you are not thinking statistically about the probability of crime here and now. Rather, as you walk down the dark alley, you say to yourself, “My, what a fine place to mug someone . . .” This way of thinking is fundamentally imaginative and interpersonal, not empirical and quantitative.

A frisson of fear may then run through you if you hear footsteps, but you feel it before you know whose footsteps they are. This “before” is important. It implies that the identity of the individual has only a negative function: It does not establish your fear — the dark alley does that — and can only diminish it. (If you see the person before hearing his footsteps, your frisson is due simply to seeing someone rather than to the particulars of his identity. These things are conceptually distinct even if there is no perceptible delay between seeing someone and noticing the particulars.)

Why might the identity of the person diminish your fear? Perhaps you make a snap judgment about his physical capacity to overpower you, such that you do not fear the elderly (although they too can carry guns). Perhaps you see a child, who is not mentally or physically mature enough to mug someone. Perhaps the person is wearing a professional uniform, which shows him to be employed. These are not stereotypes in the sense given above.

Other people will diminish your fear because they deviate from your mental paradigm of a mugger. Inevitably we keep these sorts of stock images in our heads. When you imagine a mugger, you probably see a young man rather than a woman or an old man, and you do not see him wearing a cowboy hat. You have probably acquired this image informally, from news reports and portrayals of muggers in popular culture, but it could be given a more plausible statistical defense than a general suspicion of young urban men could be. For while it is not true that most young urban men are muggers, it is true that most muggers are young urban men. Accordingly, if someone is a mugger, there is a high probability that he is a young urban man, although it is untrue that young urban men are likely to be muggers. Further, without knowing the actual probability that being a young urban man confers on someone’s being a mugger, you can safely assume that, among those who are muggers, you will not find old women in cowboy hats.
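The asymmetry between the two conditional probabilities can be illustrated with invented counts (all numbers hypothetical; only the asymmetry matters):

```python
# Hypothetical counts for one city.
young_urban_men = 150_000
muggers = 500
muggers_who_are_yum = 400   # "most muggers are young urban men"

# P(young urban man | mugger): high.
p_yum_given_mugger = muggers_who_are_yum / muggers          # 0.8

# P(mugger | young urban man): tiny.
p_mugger_given_yum = muggers_who_are_yum / young_urban_men  # ~0.0027

print(p_yum_given_mugger, round(p_mugger_given_yum, 4))
```

Conditioning in one direction yields a near-certainty; conditioning in the other yields a probability of roughly a quarter of one percent, which is why the high base rate among muggers licenses no suspicion of young urban men generally.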

This last is the vital point, and it must be understood in connection with what we have said about the purely negative function of the individual’s identity. What establishes your fear is the dark alley itself: Being in it causes you to see everybody as a potential mugger. If you see an old woman in a cowboy hat, you rule her out, but a young urban man adds nothing to your fear — he simply fails to neutralize it. What you have directly stereotyped is muggers, not young urban men. Indirectly, you have stereotyped old women in cowboy hats as people who can be assumed not to be muggers, and young urban men as people about whom this assumption cannot be made. These indirect stereotypes are operative only in the special circumstance of the alley, where you are thinking first about muggers. They are not based on anyone’s probability of mugging you right now, but rather on whether someone might at any time be a mugger. And since you are ruling people out as possible muggers, the threshold of exclusion will be very high. In particular, you will not rule someone out purely on the basis of his race, since people of all races are represented among muggers, although not necessarily in proportion to their share of the population. In all of these ways, the submerged rationale of dark-alley reactions differs from the habit of thought we dismissed as prejudiced a few paragraphs ago.

It is nonetheless a sideshow. A frisson is not a judgment, and it implies nothing about how you should decide to act. The thousand subtleties of the encounter that follows are vastly more important than whether someone “looks like a mugger,” and they cannot be studied from afar. How does the person approach? Does he leap from a hidden place? Are his hands visible and empty? Does he greet you and introduce himself? Et cetera, et cetera, et cetera. What constitutes an appropriate response to these things has nothing to do with assumptions about types of people. It’s all right there between the two of you. So, for that matter, is the crucial stipulation that he approaches you rather than vice versa — a stipulation that shows the inaptness of the dark-alley scenario as a point of comparison with the Zimmerman–Martin encounter or with President Obama’s experience of having been tailed in a department store.

When we apply statistical abstractions to what is immediate and personal, we reach conclusions that are bluntly, blindly, stupidly categorical.

Perhaps this discussion has seemed hair-splitting and abstruse, but I think we have simply drawn out the implications of “Judge people as individuals” when this principle is not subjected to a trivializing interpretation. It is a principle in which I believe very strongly. It does not mean that we should ignore the social and cultural contexts in which people act and on which the meanings of their actions depend. To the contrary, it means we should pay very precise attention to these things, while remembering that each occasion of action — and each mind — is a type of its own.

— Jason Lee Steorts is the managing editor of National Review.
