Saturday, September 22, 2012
The slogan "No means no" is a strategic mistake
Monday, September 06, 2010
Charles Darwin on immigration
Reference:
Darwin, C. R. 1882. The descent of man, and selection in relation to sex. London: John Murray. 2nd edition. (p.142)
Wednesday, March 24, 2010
"Immigrants are taking all our jobs"
Population increases, whether due to immigration or births, mean more workers vying for whatever jobs are available, but they also mean more jobs will be available, because there will be more customers who need to be served and hence more staff needed to serve them. This is why a country with 100 million people does not generally have a higher unemployment rate than a country with 50 million.
Wednesday, December 03, 2008
"Same-sex marriage should be banned to protect what is fundamentally a religious institution" (plurium interrogationum)
Wednesday, February 13, 2008
Australia's Apology
When Australia was invaded in the late 18th century, white settlers brought with them viruses that had never existed in that part of the world before, including influenza, to which native peoples had no prior immunity. These viruses alone wiped out entire tribes of peoples. I refer to them as 'peoples' in the plural because, although indigenous Australians are often lumped together as a single group, there were 500 distinct languages spoken around the continent at the time of the British invasion, which should give us some idea of the diversity and cultural richness that existed, especially when we compare this to Europe, where there are only around 50 distinct languages today. This richness developed over a very long time, modern Homo sapiens having settled the continent of Australia some 50,000 years ago, before they had even set foot in Europe.
Following initial white settlement, Aboriginal deaths were often of a more deliberate nature, with dozens of recorded massacres occurring throughout the 19th and early 20th centuries. Many Aboriginal people were also driven off their land and thereby left to starve, deprived of traditional sources of food and water.
It wasn't until as late as 1962 that indigenous Australians were even given the right to vote, and for a hundred years ending in 1969, the Australian government had a policy of forcibly removing Aboriginal children from their families in the hope of assimilating them into the dominant white culture. The youngest members of these 'stolen generations' are only in their 40s today and can still clearly remember the trauma of being separated from their mothers as children and forced to live in state-run orphanages.
Thursday, August 09, 2007
Prejudice is the milk of a sacred cow
There is a popular myth, and that is all it is, that prejudices arise from a certain kind of ignorance that leads people to overgeneralise properties observed in a small number of unrepresentative individuals to all individuals belonging to their group. But even if some prejudices are acquired this way, this process can hardly account for the majority of cases. The reason is that it implies that everyone who acquires a particular prejudice does so independently of everyone else, but if this were so, we'd have no way of explaining why particular prejudices attach themselves to particular sub-cultures during particular periods in history and not others. A more plausible explanation is that prejudices are mostly acquired as part of the set of unexamined beliefs and attitudes that constitute a person's culture, and although reports of negative experiences are central to the justifications that people give for their views, personal experiences of this kind actually play very little part in establishing them.
Satisfaction in the absence of reason and evidence
The drinking habits of people from the neighbouring cities of Cologne and Düsseldorf are suspicious in this sense. In the former city, the beer of choice is usually Kölsch, and in the latter, Altbier. Given that both brands are available in both cities, it would be perfectly possible for the inhabitants of each to sample both and come to their own independent conclusions about which they prefer, but they clearly don’t. Brand loyalty is one of many manifestations of the friendly rivalry between the two cities and serves to distinguish group identities. People from Cologne not only prefer Kölsch, but also people who prefer Kölsch, with Altbier serving the same function in Düsseldorf. Hence, it would be unsurprising if the pressure to go along with the crowd is what provides that extra incentive to drink the local brand over and above whatever intrinsic value the beers possess. This would all seem unremarkable if it were not for the observation that the people from these cities often seem completely earnest when they declare their preferences, suggesting that they are not merely overriding their personal judgements for the sake of conformity. There may be a few Machiavellian types who prefer the non-local brew secretly but pretend otherwise to reinforce their identity within the group, but this requires consciously distinguishing between the taste of the beer and the benefits of conformity. It seems just as plausible that people are actually being earnest about their preferences but simply conflating the sensation of taste with the positive feeling of belonging, the emotions being indistinguishable in the mix so that when they say they prefer their local brew, what they are actually evaluating is the whole range of emotions associated with the experience, not least those having to do with the camaraderie of it all. When they drink the rival beer, those positive associations are absent, and this would explain why they fail to hold it in the same regard. Irrespective of how this pattern of preferences arises, the positive feelings individuals attribute to their local brew can be so strong that it is incomprehensible to many of them that people from the rival city could genuinely hold exactly the opposite opinion. It seems so obvious which beer is better that those who disagree appear insincere or foolish, qualities that ordinarily make a person deserving of scorn. Negative feelings towards drinkers of the rival brand would not be readily explained if drinking preferences were merely self-conscious, Machiavellian displays.
I don’t understand, therefore it’s stupid
When we are thoroughly convinced of something, it is virtually inevitable that we see those who disagree with us as foolish or dishonest, thereby attributing low status to them, but if our views are not founded on reason or evidence, the unavoidable conclusion is that our downward glances are examples of prejudice. If feelings of belonging are central to understanding how people unconsciously come to be satisfied with their beliefs, it is possible for prejudices to develop without a person being directly taught that their targets are of low status. The targets will seem unworthy simply because they fail to embrace what appear to us to be so obviously the right standards. Even racism, which is generally taken to be about superficial features rather than cultural differences, can be understood in these terms, since a person who inherits a given set of racial characteristics will also very likely inherit a culture specific to it (culture being passed down in families too), and differences between the values of different cultures may account for why the other is taken to be foolish or dishonest, differences in outward physical characteristics simply being used to identify the groups involved. This is most evident when race is determined by ancestry in the complete absence of any physically identifying features. As for sexism, conformity to separate gender identities would explain why people are often dismissive of the concerns of the opposite sex, simply failing to understand why certain issues are important to them. The leap from failing to understand a person's motives to concluding that they are a product of ignorance is a seductive one.
Received wisdom
Some opinions may not be based on reason or evidence directly, but happen to be correct nonetheless, either by sheer luck or because they are based on the reasoning or experience of those who have passed their wisdom down to us. Moral lessons that were hard-won by previous generations are often simply passed down in the form of myths and rituals without subsequent generations critically evaluating how they help us get along in the world. It is likely that the cultural values that have survived the ages have done so precisely because they gave strength and stability to the societies that embraced them, while the values that didn't survive were lost along with the societies they helped destabilise and destroy. This means we shouldn't dismiss traditional values purely on the basis that they are passed on uncritically rather than discovered, but nor should we trust them unquestioningly, for a number of reasons. Firstly, a value might have been desirable under the conditions in which it first appeared in the remote past, but serve no function in modern societies. For instance, a taboo against sex outside of a permanent relationship was quite practical before the advent of contraception because the potential for negative consequences was far greater for single mothers. Another reason to question the values passed down to us from previous generations is that the stability and strength of a society is not necessarily an indication of fairness. History is filled with examples of enduring empires that achieved their strength and stability by brutally suppressing dissent. The value systems that survived in these societies were mostly geared to the interests of those in positions of power, with mechanisms in place to punish disloyalty.
Cognitive dissonance and the suspicious clustering of beliefs
The set of values that identify a group can be quite arbitrary. There is, for instance, no obvious reason why environmentalism and opposition to the Vietnam War were correlated, or why opposition to abortion ought to go with support for tax cuts. We have to suspect that when apparently independent values cluster together more often than chance would dictate, group loyalties have largely overridden rational judgements about the issues involved. The notion that a person's political views can be summarised as left-wing or right-wing is a paradigm example of this and is indefensible, perversely conflating every issue of contention into a single dimension of disagreement [though click here for an intriguing attempt to partially explain why the views of liberals and conservatives cluster the way they do]. Rational debate cannot really be sustained under such conditions, and as a consequence, those with party loyalties will be forced to make tortuous justifications that belie the fact that their values have been acquired without the mediation of reason and evidence, justifications that look for all the world like insincerity, but which are plausibly just elaborate forms of self-deception. The 'proper' view within a given social sphere is just whatever it is rewarding to believe, with rewards provided in the currency of acceptance and respect from the group that cultivates the individual.
The inertia of the status quo and the social function of irrational behaviour
If we dismissed all of this as ignorance rather than culture, we'd miss another interesting question. We'd be forced to conclude that where irrational beliefs endure, it is simply because the truth is too complex for the ignorant masses to grasp. But even if the masses are ignorant, the constant buffeting of even mildly challenging evidence should, given enough time, drive the evolution of a group's beliefs towards those that are rational and consistent, unless there is something actively preventing this. To the extent that this rationality is lacking, we should consider the counterintuitive possibility that irrational beliefs actually serve a function.
The unfinished business of The Enlightenment
The Enlightenment raised consciousness about a new set of methods for acquiring knowledge in a reliable way. It was only as recently as the sixteenth century that a word was even coined to label the concept of a fact, appearing first in the context of the law and later being extended to other domains. And it was only four hundred years ago that Galileo was arguing that we need to make observations of the world in order to understand it, rather than simply contemplating it. Prior to this period, the dogma of the church followed Plato’s view that worldly objects were degenerate copies of those to be found in the ‘realm of perfect forms’, only the latter being regarded as the proper objects of inquiry. Beginning with pioneers like Galileo, the seventeenth century saw the birth of the scientific method, which meant testing predictions via observations and experiments. With it, nature began to reveal herself apace, leading directly to the industrial revolution and the associated growth in technology. We take these ideas for granted today, forgetting that there was ever a time in which a case had to be made for embracing them, but there is another question lurking here, one which hasn’t received as much attention, but which is arguably necessary to finally break free from lingering vestiges of the Dark Ages. The question concerns how people were forming beliefs before the Enlightenment, and to what extent these habits have endured alongside our newfound appreciation for the scientific method.
I have attempted to argue here that a key factor in establishing unreliable beliefs is the confounding influence of membership within various social groups. This is backed up by studies of social conformity suggesting that what other members of a group say, even about matters of direct experience, can strongly bias not only what individuals report seeing, but also what they think they actually see [source]. Prior to the Enlightenment, too, disagreements about matters of opinion must have been settled in an almost entirely 'tribal' way, claims being accepted or rejected principally on the basis of whether those who expressed them were assumed to be on one's own side or not, ignoring matters of actual substance.
When beliefs and values are among the badges people wear to identify themselves as members of a group, this also sets up a dynamic in which people compete for status by adopting extremist positions. Extremism is not confined to religion, but arises in social groups generally. We can see it in teenagers who drink to excess to demonstrate their party credentials to one another, in musicians who compete amongst themselves by pursuing technical prowess rather than melodic qualities, and so on. These kinds of things are often harmless, frequently pointless, but obviously dangerous when the extremist positions involve advocating murder, terrorism and genocide.
Friday, June 22, 2007
"HIV/AIDS is God's way of punishing gays" (non sequitur)
Homosexual males have a higher risk of HIV infection than heterosexuals, who in turn have a higher risk of infection than lesbians. If HIV is part of God’s purpose, then we would have to infer that He dislikes gay men the most, dislikes heterosexuals somewhat and thoroughly approves of lesbians. HIV/AIDS clearly doesn’t differentiate between homosexuals and heterosexuals in the way that some people would like it to.
Monday, May 22, 2006
“Deafness is not a disability” (argumentum ad consequentiam)
Many deaf and hard of hearing individuals resent being told they have a disability. Passionate opposition to this idea is found, for instance, in the official documents of the National Association of the Deaf in the United States:
Many within the medical profession continue to view deafness essentially as a disability and an abnormality and believe that deaf and hard of hearing individuals need to be "fixed" by cochlear implants. This pathological view must be challenged and corrected by greater exposure to and interaction with well-adjusted and successful deaf and hard of hearing individuals. [source]
Extreme proponents of this view regard giving a deaf child a cochlear implant or hearing aids as akin to 'correcting' the colour of a black person's skin by making them white. They argue that deaf people are simply different from hearing people, and that deafness is only a disadvantage to the extent that deaf people are unfairly excluded from a society engineered around the concerns of the hearing.
From an evolutionary point of view, this position is almost impossible to maintain. It would be to claim that the sense of hearing has no selective advantage in humans. But without it, an entire channel of information about potential threats and opportunities is closed off, information that a hearing person can exploit to great advantage. Hearing people have access to information about events occurring behind their backs, through occlusions and in complete darkness. Without looking, they have access to exceedingly subtle information about their surroundings. They can tell whether they are in a city or in the countryside, indoors or outdoors, in a large room or a small room. They can tell the difference between sounds produced in a room with tiles versus carpet. Even the sound produced by tapping on something reveals subtle information about the kind of material it is made of and whether it is hollow.
Hearing people take this information for granted and tend to focus on the consequences that hearing loss might have for their ability to communicate. But in the context of language, a stronger case can be made that the disadvantages associated with deafness are basically arbitrary, a result of the broader culture communicating via a different but equally expressive code. If hearing people communicated using signed languages, the code would be accessible to deaf people and in terms of expressiveness there would be nothing lost, signed languages being just as systematic as spoken languages and capable of conveying meanings that are just as subtle and precise. Indeed, a perfect comparison can be made with signed English, which is English in exactly the same sense that both the spoken and written forms are – just conveyed via yet another medium. In terms of what can be expressed, signed and spoken languages are equivalent, but there are nevertheless advantages and disadvantages associated with each. If you know a signed language, you can communicate through panes of glass (useful for expressing final sentiments through the window of a train as it pulls away from the platform), and in noisy environments such as nightclubs and factory floors. You can converse without waking babies and without alerting enemy soldiers. And you can talk with your mouth full. On the other hand, if you can speak and hear, you can communicate with someone in another room or in the dark. You can also converse while your hands and eyes are occupied – chopping vegetables or whatever. Hearing people can of course learn to sign if they choose, thus availing themselves of all the benefits of both modes of communication, but the reverse is not true. The ability for deaf people to communicate via the auditory channel is incontrovertibly impaired.
Sound carries subtle information about the environment and allows communication in contexts that sign language does not, but sound is also a source of subjective pleasure, the experience of sound having a richness that is very difficult to explain to the congenitally deaf. How can one explain the contagious effect of laughter, the drama of rolling thunder, or the penetrating effect of a baby’s cry? These sounds get into us, affecting our emotional state directly. The same is of course true of music, but its effect on the hearing must seem almost mystical to those who have never heard it.
It is hard to avoid the conclusion that deaf people are genuinely missing out on something that hearing people experience, and there are almost no advantages to being deaf that would compensate for this, the exceptions being situations in which a deaf person is untroubled by sounds that a hearing person would find unpleasant or distracting (noisy neighbours, elevator music, construction sites, and so on). But even in these cases, a hearing person has the option of plugging their ears.
To most hearing people, going deaf would be far from an insignificant event, which is why it is often surprising for them to find that those who are born deaf usually view their condition quite differently. Hearing people and those with acquired deafness usually equate it with suffering, but those who have been deaf all their lives generally don't think about it in this way. After all, they have never known any other life, so they have nothing to compare it to. It would be like a person lamenting that they couldn't fly. We are not filled with bitterness and regret about being unable to fly because we've never had any serious expectations of being able to, beyond a few bone-breaking experiments in childhood. This doesn't mean that many of us wouldn't like to be able to if we had the option, but it is senseless to pine for something that is impossible to obtain. With this in mind, imagine birds looking down at us from the treetops, endlessly offering their sympathy to us – their unfortunate flightless companions – and you may go some way to understanding why the deaf resent the 'poor you' attitude of hearing people. For the average hearing person, meeting a deaf person is a novelty that confronts them with a contrast that fills them with pity. But for deaf people who know nothing else, these expressions of sympathy are jarring and misplaced. They are justifiably frustrated at constantly being made to feel like victims and yearn to be seen as more than just deaf [example]. This may go some way to explaining why there are those who deny that deafness is a disability, but to deny it is to miss the target of a legitimate complaint. The legitimate complaint is the same as that of anyone who stands out from the crowd for reasons that are beyond their control. People get described as the 'tall guy', the 'fat girl', the 'one with the big nose' and so on, descriptions that obviously overlook virtually everything that's important about a person. Likewise, that 'deaf girl' is never just that. She might also be someone's sister, love animals, have a keen interest in photography, or whatever. She is a complete person.
There may also be more familiar reasons for denying that deafness is a disability, reasons having to do with people simply believing what they want to believe, but there is an important difference between dealing with something that we accept is unpleasant and dealing with it by denying that it is. This is of course easier said than done. After someone dies, for instance, there is no easy way to reconcile ourselves with the intolerable proposition, which leaves us shaking our heads in disbelief, that we will never be able to talk to our loved one again. Nor is there an easy way to accept, when a relationship ends, the intolerable proposition, which leaves us unable to breathe, that the person who knew us more intimately than anyone else in the world, and who is therefore most qualified to pass judgements about us, has judged that we are no longer worthy of their love. Also intolerable is the proposition that there is something wrong with us, like being deaf, that will put us at a disadvantage for the rest of our lives. It is very difficult to console ourselves when confronted with the unbearable permanence of these things, so it is little wonder we find ourselves concluding – without evidence – that our dearly departed are now "in a better place" where we will someday be reunited with them, that our lover will someday come back into our arms once they realise their mistake, or that the problem lies not with us, but with a condescending society that treats us as though we are broken and need to be fixed.
Most of us see it as profoundly insensitive to challenge the views of those who we think are suffering, which is no doubt why victims get away with saying blood-curdling things on the steps of criminal courts, why there was so little opposition to the US invasion of Afghanistan directly after the 9/11 atrocities, and why there was so little opposition to establishing the state of Israel after the Second World War. But, at least in these cases, the 'profoundly insensitive' option may actually be better than the consequences of not debating the issues.
Though at a different scale, one consequence of uncritically denying that deafness is a disability would be to legitimise the attempts of deaf parents to have deaf children by design [example], or their attempts to deny their deaf children hearing aids or cochlear implants during the critical period of brain development when auditory input is necessary for the proper development of auditory processing capacities. Both of these decisions would lead to avoidable suffering, and without the child having any say in the matter. If, when the child grows up, they wish to discard their hearing aids or switch off their cochlear implant, they have that choice, but the reverse is not true – you cannot raise a deaf child without these devices and let them choose whether they want them as adults because, by then, the critical period of brain development will have long since passed and they will never be able to make sense of the sounds they hear.
The consequences of a belief being true frequently account for why people hold it far better than the evidence supporting it does (this is the logical fallacy of argumentum ad consequentiam), and the issue of whether deafness is a disability is no different. For deaf people, these consequences are emotional – denial liberates them from the intolerable proposition that they are physically flawed, while also providing an intellectually convenient way of dismissing the understandably frustrating attitudes held by many hearing people. On the other hand, to deny that deafness is a disability would be to legitimise decisions of deaf parents to have deaf children by design, or to deny a deaf child hearing aids or a cochlear implant when they could otherwise benefit from them. These consequences tell us why it matters to those on either side of the debate, but the truth is what it is, good or bad. Only evidence can tell us whether deafness is a disability, and on this basis, it is hard to deny that it presents a disadvantage that, although not as grave as some other disabilities, is a disability nonetheless.
Thursday, December 01, 2005
“Homosexuality is immoral because it is unnatural” (non sequitur)
An unnecessary but common response to this claim is to dispute that homosexuality is unnatural by pointing to the abundant evidence of homosexuality in non-human species. But even if homosexuality were unnatural, this wouldn't make it wrong or unhealthy. If we were to shun everything that is unnatural, we'd be forced to abandon our houses, clothes, medicines, technology, art and much else. Clearly there are some unnatural things we consider good. Likewise, we take earthquakes, tsunamis, snake bites and sunburn to be natural, but not good. And those items on the supermarket shelf proclaiming 'all natural ingredients' like salt, sugar and fat are not good for your health. Whether homosexuality is natural or unnatural tells us nothing about whether it is right or wrong, desirable or undesirable. The argument is a non-starter (and for those interested, an example of an argumentative fallacy called the appeal to nature).
Suppose for a moment that for some reason all the other unnatural things that we like are permissible because they have nothing to do with sex, that sex is a special case in which virtuousness always coincides with naturalness (we'd be guilty of a different argumentative fallacy called special pleading, but let's allow it in this case). If this were so, then we might look to the sexual behaviour of other species for examples of how to behave. We could model our behaviour on bonobo chimpanzees, who continually use sex to reinforce social bonds within a group (same sex or not)[source], female praying mantises who eat their mates once they have served their function, ducks who engage in homosexual necrophilia [source], dogs who mount their owners' legs, and so on. Homosexuality and a whole lot of other things would be permitted under this kind of moral philosophy. Not only that, but our guidance would be riddled with contradictions arising from the fact that different species have different sexual practices. For some species, promiscuity is the norm. For others, monogamy is, and so on. We would have to ask what is natural for humans specifically, but if we are contrasting 'natural' with 'man-made', what sense could there be to such a question?
Even if we wish to cling to the view that 'natural' equals 'good' and even if we can ignore the fact that nature seems to make contradictory judgements about what is good and bad, it would still be difficult to argue that homosexuality in humans is any more natural than practices like celibacy, something that few are inclined to condemn. If homosexuality is wrong or unhealthy, this kind of argument simply does not show it.
Note that the claim that homosexuality is immoral because it is unnatural is a part of a more general family of arguments, which also includes justifying promiscuity by arguing that monogamy is unnatural, or eating meat by arguing that vegetarianism is unnatural. As with homosexuality, nature has no consistent attitude towards these things, and even if it did, no moral conclusions could be drawn about these practices (at least on this basis) since 'natural' simply cannot be equated with 'good'.
Saturday, November 26, 2005
"Women are partially responsible for being raped"
A woman is totally or partially responsible for being raped if she “behaved in a flirtatious manner”, “is drunk”, “is wearing sexy or revealing clothing”, “has had many sexual partners”, “has failed to say ‘no’ clearly to the man” or “is alone and walking in a dangerous or deserted area”.
EVALUATION
The publication of these statistics was widely covered in the media on Monday, both in the UK and internationally, generating headlines such as "Women 'to blame' for being raped", "1 in 3 Brits blame rape on women", and "Rape victims were 'asking for it' - Shock Report". It is far from obvious, however, that this emphasis on 'blame' constitutes a fair interpretation of the views of the survey's respondents. That interpretation was almost universally accepted in the media, and it has its origins in the original press release issued by Amnesty, in which their UK director Kate Allen is quoted as saying "It is shocking that so many people will lay the blame for being raped at the feet of women themselves", referring to a "sexist blame culture".
The wording of the questions featured in the survey leaves the views of respondents open to a number of interpretations. By holding the victim (partially) responsible in a particular context, some of the respondents might have meant that she deserved to be raped or that the rapist should receive a more lenient sentence. Others might have meant simply that a woman is more likely to be raped if she puts herself in the stated situation. But a respondent is very unlikely to interpret 'responsibility' in terms of deserving to be raped if they consider it unthinkable that this could depend on the context, and why would the survey be asking about the level of responsibility if it didn't vary from condition to condition? Given this, it is plausible that a significant proportion of respondents interpreted 'responsibility' in terms of whether the woman had placed herself in a situation in which she was more likely to fall victim to rape, regardless of any attribution of blame that might go along with that. There is an empirical question about whether there is actually an increased risk of rape in the contexts examined in the survey; indeed, there is a distinct question for each scenario. But if there are measures that a victim could have taken to reduce the risk in a particular case, would that justify "lay[ing] the blame for being raped at the feet of women themselves"?
Taking measures to prevent becoming a victim of a crime is normal and uncontroversial in other domains. We take precautions when using our credit cards online, we ensure that valuables aren't visible through our car windows, and we leave a light on at home when going on holiday. It is therefore not grossly implausible that there are things a woman could do to reduce the risk of becoming a victim of rape. She can, for instance, refuse to accept a drink from an unknown man under conditions where he could have drugged it (go here for more advice on date rape drugs).
Most of us would of course prefer a world in which there were no criminals so that we wouldn’t have to take such precautions. When we compare this ideal world against the reality, the person who takes no precautions is behaving exactly as we would all like to be able to, but the criminal is not behaving as we’d want. This is why we attempt to deter the criminal with threats of punishment and not the victim even if the victim could have prevented the crime by behaving differently. Our desire to change the behaviour of the criminal is motivated by idealism, whereas our desire to change the behaviour of potential victims is motivated by pragmatism. The notions of blame and punishment are only compatible with the former.
As for the specific claims in the survey, it would be deeply irresponsible for Amnesty to give the impression that these factors do not carry any risk if they in fact do. On the issue of drinking, studies do support a strong link between rape and alcohol use both on the part of the rapist and the victim [source, source]. There is also a correlation with the number of sexual partners a woman has had [source, source]. On the other hand, the risk associated with walking alone in a dangerous or deserted area may not be as large as many assume. According to UK Home Office statistics, 92% of rapes involving female victims are committed by people known to the victim (45% by current partners). Given this, it is unsurprising that more than half of all rapes (55%) actually occur within the victim’s own home, with a further 20% occurring in the perpetrator’s home. 13% occur in public places, often in the vicinity of licensed premises [source].
The risks associated with dressing provocatively, flirting, and sending mixed signals are more difficult to quantify because of their subjectivity. It is often suggested that rape is not carried out with a motive of sexual gratification, but rather out of a desire to have power over the victim. If so, the way a woman is dressed would be irrelevant. To make this case convincingly, it would be necessary to show why perpetrators choose sexual means to this end rather than other conceivable forms of degradation. There are also difficulties reconciling this view with many cases involving date rape drugs, where the rapist's intention appears to be to get away with the crime without the woman having any clear indication that it occurred. There is some sense in which the rapist has power over the victim in these cases, but the victim is not necessarily aware of being subjugated.
While there are certain kinds of behaviour that appear to be correlated with rape, a rational response is not necessarily to avoid them, since a woman may judge these activities to be rewarding enough to warrant exposing herself to the associated risks. Being informed of the risks allows a woman to make this choice. The British Crime Survey estimate for the year 2000 was that approximately nine in every thousand women between the ages of 16 and 59 suffer some form of sexual victimisation every year, with four of these cases involving rape [source].
Amnesty International also used the publication of their report to publicise the fact that the conviction rate in rape cases is extremely low, citing the “sexist blame culture” that the survey allegedly highlights as the reason that juries fail to convict. However, there are genuine difficulties in demonstrating guilt in rape cases that could at least partially account for the low conviction rates. Demonstrating whether consent was given often comes down to one person’s word against another’s. It is also far from obvious what the optimal conviction rate should be since there will be some presumably small proportion of cases that are based on fabricated charges that should not result in convictions.
Amnesty's report could have a number of serious negative consequences aside from giving the false impression that women are powerless to reduce the risk of being raped. Publicising how low conviction rates actually are could plausibly embolden rapists, who may now believe their chances of getting away with it are much better than they previously thought (I won't repeat the statistics here). It might also discourage rape victims from reporting incidents if they become convinced that they will be treated unfairly by the courts. If so, it would be grossly irresponsible for Amnesty International to do this, especially if the low conviction rate has nothing to do with a "sexist blame culture". It would of course be equally irresponsible for the media to repeat these claims uncritically. Given the difficulties inherent in convicting rapists, it may be more practical to reduce the incidence of rape by focussing on changing the attitudes of potential perpetrators.
By making casual claims about prejudice, we run the risk of generating cynicism about similar claims that are made in a more measured way. The potential damage that reports like Amnesty’s can do therefore extends well beyond the reputation of any particular organisation.