Saturday, September 22, 2012

The slogan "No means no" is a strategic mistake

I would think that the inability to tell whether a potential sexual partner is willing or not is a symptom of a severe cognitive deficit. If the slogan 'No means no' succeeded in getting this point across more often than it started discussions about whether no can sometimes mean yes, I'd be all for it, but unfortunately the stock people place in the latter discussion suggests they have mistaken it for the one that matters. Most people will happily acknowledge that no can mean yes when used sarcastically, for instance, and really, the idea that this word is unique in the lexicon in being inflexibly bound to its literal meaning is implausible. I wouldn't want to hitch the fate of my cause to that kind of assertion. But we don't need to. The underlying message is perfectly sound: it is implausible that any mentally competent individual could fail to realize that their partner is an unwilling participant in a sexual act. That is true regardless of whether the participants are using words literally or not. Even if non-literal could be equated with ambiguous (and it certainly can't), it's very unlikely that a misunderstanding could endure for long under those circumstances without being corrected. Indeed, to defend a rape charge on the grounds that there was ambiguity over consent is tantamount to arguing that the defendant is incompetent to stand trial.

Monday, September 06, 2010

Charles Darwin on immigration

Based on the enormous number of people migrating to the United States during his lifetime, Charles Darwin predicted that the U.S. would soon be counted alongside the most powerful empires in history. His reasoning was simple: to move from one country to another requires an enormous amount of courage and energy - to take that journey you have to leave everything behind, cross an ocean and start a new life for yourself without any guarantee of success on the other side. Therefore, after many waves of immigration into the United States, the country would inevitably be brimming with so many highly motivated people (as demonstrated by their willingness to take that journey in the first place) that its workforce would have to be one of the most industrious in the world. Migration was, in Darwin's view, a kind of natural selection acting to weed out people who lacked the courage and ambition to leave home.


Reference:
Darwin, C. R. 1882. The descent of man, and selection in relation to sex. London: John Murray. 2nd edition. (p.142)

Wednesday, March 24, 2010

"Immigrants are taking all our jobs"

Population increases, whether due to immigration or births, mean more workers vying for whatever jobs are available, but they also mean that more jobs will be available, because there will be more customers who need to be served and hence more staff needed to serve them. This is why a country with 100 million people will not generally have a higher unemployment rate than a country with 50 million.
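
To make the arithmetic behind this point concrete, here is a minimal toy sketch in Python. The participation rate and jobs-per-person figures are invented purely for illustration, not taken from any data; the point is only that if both the labour force and the number of jobs scale in proportion to population, the unemployment rate comes out the same whether the country has 50 million or 100 million people.

    # Toy model with hypothetical numbers: workers and jobs both scale with population,
    # so the unemployment rate does not depend on the size of the country.
    def unemployment_rate(population, participation=0.5, jobs_per_person=0.45):
        labour_force = population * participation   # people seeking work
        jobs = population * jobs_per_person         # jobs created to serve the population
        unemployed = max(labour_force - jobs, 0)
        return unemployed / labour_force

    for population in (50_000_000, 100_000_000):
        print(f"{population:>11,} people -> unemployment rate of {unemployment_rate(population):.1%}")
    # Both lines print 10.0%: doubling the population doubles workers and jobs alike.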

Wednesday, December 03, 2008

"Same-sex marriage should be banned to protect what is fundamentally a religious institution" (plurium interrogationum)

Whether marriage should be considered a religious institution has consequences for how it should be legally defined. Marriage is certainly an important part of a religion like Christianity, and the Christian Bible prescribes how Christians should go about it, but non-Christians marry as well, and people of other faiths are bound by different sets of commandments concerning marriage that are incompatible with the traditional teachings of Christianity. In Islam, for instance, a man is technically permitted to take up to four wives so long as he treats each of them fairly (Qur'an 4:3). A Hindu marriage traditionally isn't preceded by courtship or even love, but is arranged by family to avoid the blinding influence of infatuations. And if we open our eyes completely, we can't help but notice that non-religious people are in the habit of marrying as well, which means religion's claim on the institution of marriage appears rather dubious. But if marriage isn't specific to any one religion or to religion at all, then which (if any) religious values should be reflected in the laws governing it?

The only way to protect religious freedom is to enact laws that are neutral with respect to the manner in which marriages are carried out. This means that no ban should be imposed that would prohibit a practice sanctioned by a religion. A ban on same-sex marriages, for instance, would prevent any religious denomination that sanctions them, such as the United Church of Christ, from carrying them out. On the other hand, if same-sex marriages were legal, it wouldn't force denominations that disapprove of them to carry them out any more than it would force them to allow polygamy or any other non-Christian forms of marriage. A ban on same-sex marriages is therefore not, as it is often presented, protecting religious freedoms, but quite directly attacking them.

Due to the offence that a same-sex inclusive definition of 'marriage' causes to certain religious sensibilities, one solution that has been adopted in various countries such as the UK and Australia has been to grant homosexuals the right to enter into a legal union that goes by a different name. These 'civil unions' confer on partners in same-sex relationships the legal status of next of kin, which has consequences for how individuals are regarded when visiting their partners in hospital, how inheritance and custody issues are dealt with when one of them dies, and so on. In many jurisdictions, civil unions confer all of the same rights that are enjoyed by married couples, just under a different name. As they say, if something looks like a dog, smells like a dog and barks like a dog, it probably is a dog, but when it comes to same-sex partnerships, there are those who insist that the institution that looks exactly like marriage and which confers all the rights of marriage isn't a 'marriage', but a 'civil union' and should be distinguished as such under the law. Put that way, it's surprising that anyone cares one way or the other whether these partnerships go by the same or different names, unless of course we acknowledge that this issue is acting as a proxy for a separate and much more fierce debate. The underlying conflict is of course over the morality of homosexuality itself, but it's being waged under the more widely palatable cover of concerns about religious freedoms and quibbles over the definition of 'marriage'. So the hostilities find a crack through which their flow is tolerated, except there's so much pressure behind them that we find a torrent where we'd expect a trickle.

Wednesday, February 13, 2008

Australia's Apology

When Australia was invaded in the late 18th century, white settlers brought with them viruses that had never existed in that part of the world before, including influenza, to which native peoples had no prior immunity. These viruses alone wiped out entire tribes of peoples. I refer to them as 'peoples' in the plural because although indigenous Australians are often lumped together as a single group, there were 500 distinct languages spoken around the continent at the time of the British invasion, which should give us some idea of the diversity and cultural richness that existed, especially when we compare this to Europe, where there are only around 50 distinct languages today. This richness developed over a very long time, modern Homo sapiens having settled the continent of Australia some 50,000 years ago, before they'd even set foot in Europe. Yes, Australia was settled before Europe! Despite this, Australia's long history is barely acknowledged to this day.


Following initial white settlements, Aboriginal deaths were often of a more deliberate nature, with dozens of recorded massacres occurring throughout the 19th and early 20th centuries. Many Aboriginal people were also driven off their land and thereby left to starve after being deprived of traditional sources of food and water.


It wasn't until as late as 1962 that indigenous Australians were even given the right to vote, and for a hundred years ending in 1969, the Australian government had a policy of forcibly removing Aboriginal children from their families in the hope of assimilating them into the dominant white culture. The youngest members of these 'stolen generations' are only in their 40s today and can still clearly remember the trauma of being separated from their mothers as children and forced to live in state-run orphanages.


Today, February 13th, 2008, after many years of campaigning and a recent change in federal government, the new Prime Minister of Australia, Kevin Rudd, will, for the first time, acknowledge the injustices that have been inflicted on indigenous Australians with an official government apology. The symbolic importance of this gesture for reconciliation is hard to overstate. The apology has been long sought, but was refused by the previous Howard government on the basis that it could potentially expose it to compensation claims from those affected by the policies of the past. The position of the Howard government was that even if previous policies were wrong, it was never going to admit as much. The present government is also ruling out compensation for those affected by the policies of previous governments, but is still going ahead with a full apology, at last. There has been a catalogue of injustices, and the refusal to apologise was among them. And millions of Australians who care about putting things right are joining in solidarity today to say sorry too.

Thursday, August 09, 2007

Prejudice is the milk of a sacred cow

The irrelevance of personal experience

There is a popular myth, and that is all it is, that prejudices arise from a certain kind of ignorance that leads people to overgeneralise properties observed in a small number of unrepresentative individuals to all individuals belonging to their group. But even if some prejudices are acquired this way, it can hardly account for the majority of cases. The reason is that it implies that everyone who acquires a particular prejudice does so independently of everyone else, but if this were so, we’d have no way of explaining why particular prejudices attach themselves to particular sub-cultures during particular periods in history and not others. A more plausible explanation is that prejudices are mostly acquired as part of the set of unexamined beliefs and attitudes that constitute a person’s culture, and although reports of negative experiences are central to the justifications that people give for their views, personal experiences of this kind actually play very little part in establishing them.


The road to understanding prejudice starts with the recognition that people can come to be satisfied with a belief about a segment of society for reasons other than because it accords with reason and evidence. But to dismiss that as ignorance is to focus on the least interesting part of the story. There is nothing new about people forming beliefs without the aid of reason or evidence. That’s what made the Dark Ages so dark. Yet, despite the emphasis on rationality and empiricism that we inherit from Enlightenment thinkers, the most important predictors of an individual’s beliefs are still the beliefs of the family and broader culture they happen to be born into. When beliefs are so clearly correlated in this way, we have to suspect that the role reason and evidence play in adopting them is essentially nonexistent and that whatever justifications people find for them are rationalisations after the fact.

Satisfaction in the absence of reason and evidence

The drinking habits of people from the neighbouring cities of Cologne and Düsseldorf are suspicious in this sense. In the former city, the beer of choice is usually Kölsch, and in the latter, Altbier. Given that both brands are available in both cities, it would be perfectly possible for the inhabitants of each to sample both and come to their own independent conclusions about which they prefer, but they clearly don’t. Brand loyalty is one of many manifestations of the friendly rivalry between the two cities and serves to distinguish group identities. People from Cologne not only prefer Kölsch, but also prefer people who prefer Kölsch, with Altbier serving the same function in Düsseldorf. Hence, it would be unsurprising if the pressure to go along with the crowd is what provides that extra incentive to drink the local brand over and above whatever intrinsic value the beers possess. This would all seem unremarkable if it were not for the observation that the people from these cities often seem completely earnest when they declare their preferences, suggesting that they are not merely overriding their personal judgements for the sake of conformity. There may be a few Machiavellian types who prefer the non-local brew secretly but pretend otherwise to reinforce their identity within the group, but this requires consciously distinguishing between the taste of the beer and the benefits of conformity. It seems just as plausible that people are actually being earnest about their preferences but simply conflating the sensation of taste with the positive feeling of belonging, the emotions being indistinguishable in the mix, so that when they say they prefer their local brew, what they are actually evaluating is the whole range of emotions associated with the experience, not least those having to do with the camaraderie of it all. When they drink the rival beer, those positive associations are absent, and this would explain why they fail to hold it in the same regard.

Irrespective of how this pattern of preferences arises, the positive feelings individuals attribute to their local brew can be so strong that it is incomprehensible to many of them that people from the rival city could genuinely hold exactly the opposite opinion. It seems so obvious which beer is better that those who disagree appear insincere or foolish, qualities that ordinarily make a person deserving of scorn. Negative feelings towards drinkers of the rival brand would not be readily explained if drinking preferences were merely self-conscious, Machiavellian displays.


Such preferences are at the trivial end of what we call ‘cultural values’, and these enjoy a kind of sacred status, being virtually immune from criticism. We are told that heritage and tradition are important, that we should be true to our roots, and so on. Strong taboos exist against abandoning cultural values despite them being the set of attitudes and beliefs that we unquestioningly inherit from our parents and peers rather than discover via reason and evidence. Viewed from this perspective, it is interesting to ask why we hold them in such regard, and to what extent prejudices and other unpleasantness are legitimised under this guise.

I don’t understand, therefore it’s stupid

When we are thoroughly convinced of something, it is virtually inevitable that we see those who disagree with us as foolish or dishonest, thereby attributing low status to them, but if our views are not founded on reason or evidence, the unavoidable conclusion is that our downward glances are examples of prejudice. If feelings of belonging are central to understanding how people unconsciously come to be satisfied with their beliefs, it is possible for prejudices to develop without a person being directly taught that their targets are of low status. The targets will seem unworthy simply because they fail to embrace what appear to us to be so obviously the right standards. Even racism, which is generally taken to be about superficial features rather than cultural differences, can be understood in these terms, since a person who inherits a given set of racial characteristics will also very likely inherit a culture specific to it (culture being passed down in families too), and differences between the values of different cultures may account for why the other is taken to be foolish or dishonest, with differences in outward physical characteristics simply being used to identify the groups involved. This is most evident when race is determined by ancestry in the complete absence of any physically identifying features. As for sexism, conformity to separate gender identities would explain why people are often dismissive about the concerns of the opposite sex, simply failing to understand why certain issues are important to them. The leap from failing to understand a person’s motives to concluding that those motives are a product of ignorance is seductive.


Received wisdom

Some opinions may not be based on reason or evidence directly, but happen to be correct nonetheless, either by sheer luck or because they are based on the reasoning or experience of those who have passed their wisdom down to us. Moral lessons that were hard-won by previous generations are often simply passed down in the form of myths and rituals without subsequent generations critically evaluating how they help us get along in the world. It is likely that the cultural values that have survived the ages have done so precisely because they gave strength and stability to the societies that embraced them, while the values that didn’t survive were lost along with the societies they helped destabilise and destroy. This means we shouldn’t dismiss traditional values purely on the basis that they are passed on uncritically rather than discovered, but we shouldn’t trust them unquestioningly either, and for a number of reasons. Firstly, a value might have been desirable under the conditions in which it first appeared in the remote past, but serve no function in modern societies. For instance, a taboo against sex outside of a permanent relationship was quite practical before the advent of contraception because the potential for negative consequences was far greater for single mothers. Another reason to question the values passed down to us from previous generations is that the stability and strength of a society is not necessarily an indication of fairness. History is filled with examples of enduring empires that achieved their strength and stability by brutally suppressing dissent. The value systems that survived in these societies were mostly geared to the interests of those in positions of power, with mechanisms in place to punish disloyalty.


Cognitive dissonance and the suspicious clustering of beliefs

The set of values that identify a group can be quite arbitrary. There is, for instance, no obvious reason why environmentalism and opposition to the Vietnam War were correlated, or why opposition to abortion ought to go with support for tax cuts. We have to suspect that when apparently independent values cluster together more often than chance would dictate, group loyalties have largely overridden rational judgements about the issues involved. The notion that a person’s political views can be summarised as left-wing or right-wing is a paradigm example of this and is indefensible, perversely collapsing every issue of contention into a single dimension of disagreement [though click here for an intriguing attempt to partially explain why the views of liberals and conservatives cluster the way they do]. Rational debate cannot really be sustained under such conditions, and as a consequence, those with party loyalties will be forced to make tortuous justifications that belie the fact that their values have been acquired without the mediation of reason and evidence, justifications that look for all the world like insincerity, but which are plausibly just elaborate forms of self-deception. The ‘proper’ view within a given social sphere is just whatever it is rewarding to believe, with rewards provided in the currency of acceptance and respect from the group that cultivates the individual.


In practice, any given individual will simultaneously be a member of many overlapping groups that have influence over different spheres of life. The influence of these groups on values is illustrated most clearly by looking at the different standards we apply to essentially identical problems that arise in different spheres, but the contradictions have to be pointed out to us for us to even realise they exist. A person might for instance believe that it is humane to put an animal down if it is suffering, but that it is wrong to end the life of a human being under comparable circumstances. Another person might boycott products from multinational companies that engage in questionable trade practices while simultaneously buying recreational drugs from a black market that is blatantly corrupt and murderous. Others might complain that people on benefits are living off the sweat of hard-working people but fail to draw the same conclusion about those whose incomes consist solely of returns on capital investments. When such inconsistencies are pointed out, people often make tortuous justifications to preserve the faith they have in their own rationality, but if, as I and others have argued, value systems are largely acquired without the mediation of critical thought, and merely adopted in accordance with group norms, then this faith is quite unfounded. This is what hypocrisy and cognitive dissonance are – the compartmentalisation of contradictory values that are held simultaneously, but no one intends to be a hypocrite and no one comes to be one by following the dictates of evidence.

The inertia of the status quo and the social function of irrational behaviour

If we dismissed all of this as ignorance rather than as something cultural, we’d miss another of the interesting questions raised here. We’d be forced to conclude that where irrational beliefs endure, it is simply because the truth is too complex for the ignorant masses to grasp. But even if the masses are ignorant, the constant buffeting of even mildly challenging evidence should, given enough time, drive the evolution of a group’s beliefs towards those that are rational and consistent, unless there is something actively preventing this. To the extent that this rationality is lacking, we should consider the counterintuitive possibility that irrational beliefs actually serve a function.


The status quo of a group has a kind of inertia even when its values are irrational precisely because embodying the group’s values is something that has currency for one’s status within it. To question these values would be to bite the hand that feeds you. Only those with high status within a group have the power to influence attitudes, but it is precisely the irrationality of certain traditions that makes them useful for demonstrating loyalty to a group. Rites of sacrifice and initiation embody this quite clearly. Only the truly dedicated are willing to do things that come at a personal cost in order to be accepted. Loyalty can, for instance, be demonstrated by doing things that members of the broader society consider wrong. From this point of view, the dynamics behind peer pressure and adolescent rebellion make perfect sense. Peer pressure wouldn’t be a problem if our peers always had our best interests at heart. We are warned against bowing to it precisely because many of our peers are not looking out for us, but are instead merely seeking expressions of loyalty. Being a ‘cool’ kid is essentially code for being willing to take risks to be accepted. The existence of fashionable vices like harmful drug use is also explainable in these terms. Likewise in religion, the ability to believe things that are strongly at odds with one’s observations is taken to be a mark of strong faith, and the harder something is to believe, the stronger the faith required to believe it and the more virtuous its believer is deemed to be [click here for a discussion of this phenomenon as it applies to Catholicism and Judaism]. Wearing a necktie to work also signals commitment precisely because of the associated discomfort, and so on. In each case, it’s the inherent conflicts associated with doing these things that make them useful for expressing loyalty to the groups that place importance on them, and it is this function that plausibly explains why the relevant values persist. A similar phenomenon is observed throughout the animal kingdom in the process of “costly signalling”, which many animals use to demonstrate commitment to potential mates and other conspecifics.

The unfinished business of The Enlightenment

The Enlightenment raised consciousness about a new set of methods for acquiring knowledge in a reliable way. It was only as recently as the sixteenth century that a word was even coined to label the concept of a fact, appearing first in the context of the law and later being extended to other domains. And it was only four hundred years ago that Galileo was arguing that we need to make observations of the world in order to understand it, rather than simply contemplating it. Prior to this period, the dogma of the church followed Plato’s view that worldly objects were degenerate copies of those to be found in the ‘realm of perfect forms’, only the latter being regarded as the proper objects of inquiry. Beginning with pioneers like Galileo, the seventeenth century saw the birth of the scientific method, which meant testing predictions via observations and experiments. With it, nature began to reveal herself apace, leading directly to the industrial revolution and the associated growth in technology. We take these ideas for granted today, forgetting that there was ever a time in which a case had to be made for embracing them, but there is another question lurking here, one which hasn’t received as much attention, but which is arguably necessary to finally break free from lingering vestiges of the Dark Ages. The question concerns how people were forming beliefs before the Enlightenment, and to what extent these habits have endured alongside our newfound appreciation for the scientific method.


Many of the conclusions we reach without reason or evidence are harmless. We don't come to believe we are hungry via an appreciation of how long it’s been since we’ve eaten. Our bodies just tell us, and with the exception of pathological cases, there’s nothing remotely dubious about believing what our bodies say about such things. Likewise, a mother doesn’t come to love her children because they’re smart or talented or have other qualities that we can appreciate rationally. Nature takes care of this via imprinting processes that are as relevant for humans as they are for hens. We cannot be wrong about how we feel, but we can be wrong about matters of fact and this is where non evidence-based methods of acquiring beliefs are potentially a problem.


I have attempted to argue here that a key factor in establishing unreliable beliefs is the confounding influence of membership within various social groups. This is backed up by studies of social conformity that suggest that what other members of a group say, even about matters of direct experience, can strongly bias not only what individuals report seeing, but also what they think they actually see [source]. Prior to the Enlightenment too, disagreements about matters of opinion must have been settled in an almost entirely ‘tribal’ way, opinions being accepted or rejected principally on the basis of whether those who expressed them were assumed to be on one’s own side or not, ignoring matters of actual substance.


None of this would matter much if it weren’t for the fact that political discourse in places like the United States is virtually devoid of policy debates, elections being fought almost entirely on the basis of the perceived trustworthiness of candidates. Attention is placed on a candidate’s stance on issues like abortion and gay rights, which pale in significance to matters like health and education, but are nevertheless central to marking tribal allegiances. Clever politicians choose language that can be taken as friendly by all sides. So long as they nod in the direction of the ‘correct’ values, they will win votes. All this occurs without discussing matters of actual substance.


When beliefs and values are among the badges people wear to identify themselves as members of a group, it also sets up a dynamic in which people compete for status by adopting extremist positions. Extremism is not just confined to religion, but arises in social groups generally. We can see it in teenagers who drink to excess to demonstrate their party credentials to one another, in musicians who compete amongst themselves by pursuing technical prowess rather than melodic qualities, and so on. These kinds of things are often harmless, frequently pointless, but obviously dangerous when the extremist positions involve advocating murder, terrorism and genocide.


To avoid these problems, people need to become more aware of how group membership impairs our reasoning, and approach with greater scepticism the values and beliefs that are used to mark group identity. In short, we should be suspicious of views that have currency in terms of group kudos. People are generally more credulous about things that are rewarding to believe, especially when that reward involves status within social groups. This ‘kudos credulity’ is worth labelling as a concept so that we can point to it and confront it openly. In time, we can only hope that this observation comes to be taken for granted in the way that the conclusions of Enlightenment philosophy are.

Friday, June 22, 2007

"HIV/AIDS is God's way of punishing gays" (non sequitur)

Homosexual males have a higher risk of HIV infection than heterosexuals, who in turn have a higher risk of infection than lesbians. If HIV is part of God’s purpose, then we would have to infer that He dislikes gay men the most, dislikes heterosexuals somewhat and thoroughly approves of lesbians. HIV/AIDS clearly doesn’t differentiate between homosexuals and heterosexuals in the way that some people would like it to.

Monday, May 22, 2006

“Deafness is not a disability” (argumentum ad consequentiam)

Many deaf and hard of hearing individuals resent being told they have a disability. Passionate opposition to this idea is found, for instance, in the official documents of the National Association of the Deaf in the US:


Many within the medical profession continue to view deafness essentially as a disability and an abnormality and believe that deaf and hard of hearing individuals need to be "fixed" by cochlear implants. This pathological view must be challenged and corrected by greater exposure to and interaction with well-adjusted and successful deaf and hard of hearing individuals. [source]


Extreme proponents of this view regard giving a deaf child a cochlear implant or hearing aids as akin to ‘correcting’ the colour of a black person’s skin by making them white. They argue that deaf people are simply unlike hearing people, and that deafness is only a disadvantage to the extent that deaf people are unfairly excluded from a society engineered around the concerns of the hearing.


From an evolutionary point of view, this position is almost impossible to maintain. It would be to claim that the sense of hearing has no selective advantage in humans. But without it, an entire channel of information about potential threats and opportunities is closed off, information that a hearing person can exploit to great advantage. Hearing people have access to information about events occurring behind their backs, through occlusions and in complete darkness. Without looking, they have access to exceedingly subtle information about their surroundings. They can tell whether they are in a city or in the countryside, indoors or outdoors, in a large room or a small room. They can tell the difference between sounds produced in a room with tiles versus carpet. Even the sound produced by tapping on something reveals subtle information about the kind of material it is made of and whether it is hollow.


Hearing people take this information for granted and tend to focus on the consequences that hearing loss might have for their ability to communicate. But in the context of language, a stronger case can be made that the disadvantages associated with deafness are basically arbitrary, a result of the broader culture communicating via a different but equally expressive code. If hearing people communicated using signed languages, the code would be accessible to deaf people and in terms of expressiveness there would be nothing lost, signed languages being just as systematic as spoken languages and capable of conveying meanings that are just as subtle and precise. Indeed, a perfect comparison can be made with signed English, which is English in exactly the same sense that both the spoken and written forms are – just conveyed via yet another medium. In terms of what can be expressed, signed and spoken languages are equivalent, but there are nevertheless advantages and disadvantages associated with each. If you know a signed language, you can communicate through panes of glass (useful for expressing final sentiments through the window of a train as it pulls away from the platform), and in noisy environments such as nightclubs and factory floors. You can converse without waking babies and without alerting enemy soldiers. And you can talk with your mouth full. On the other hand, if you can speak and hear, you can communicate with someone in another room or in the dark. You can also converse while your hands and eyes are occupied – chopping vegetables or whatever. Hearing people can of course learn to sign if they choose, thus availing themselves of all the benefits of both modes of communication, but the reverse is not true. The ability of deaf people to communicate via the auditory channel is incontrovertibly impaired.


Sound carries subtle information about the environment and allows communication in contexts that sign language does not, but sound is also a source of subjective pleasure, the experience of sound having a richness that is very difficult to explain to the congenitally deaf. How can one explain the contagious effect of laughter, the drama of rolling thunder, or the penetrating effect of a baby’s cry? These sounds get into us, affecting our emotional state directly. The same is of course true of music, but its effect on the hearing must seem almost mystical to those who have never heard it.


It is hard to avoid the conclusion that deaf people are genuinely missing out on something that hearing people experience, and there are almost no advantages to being deaf that would compensate for this, the exceptions being situations in which a deaf person is untroubled by sounds that a hearing person would find unpleasant or distracting (noisy neighbours, elevator music, construction sites, and so on). But even in these cases, a hearing person has the option of plugging their ears.


To most hearing people, going deaf would be far from an insignificant event, which is why it is often surprising for them to find that those who are born deaf usually view their condition quite differently. Hearing people and those with acquired deafness usually equate it with suffering, but those who have been deaf all their lives generally don’t think about it in this way. After all, they have never known any other life so have nothing to compare it to. It would be like a person lamenting that they couldn’t fly. We are not filled with bitterness and regret about being unable to fly because we’ve never had any serious expectations of being able to beyond a few bone-breaking experiments in childhood. This doesn’t mean that many of us wouldn’t like to be able to if we had the option, but it is senseless to pine for something that is impossible to obtain. With this in mind, imagine birds looking down at us from the tree tops endlessly offering their sympathy to us – their unfortunate flightless companions – and you may go some way to understanding why the deaf resent the ‘poor you’ attitude of hearing people. For the average hearing person, meeting a deaf person is a novelty, which confronts them with a contrast that fills them with pity. But for deaf people who know nothing else, these expressions of sympathy are jarring and misplaced. They are justifiably frustrated at constantly being made to feel like victims and yearn to be seen as more than just deaf [example]. This may go some way to explaining why there are those who deny that deafness is a disability, but this would be to miss the target of a legitimate complaint. The legitimate complaint is the same as that of anyone who stands out from the crowd for reasons that are beyond their control. People get described as the ‘tall guy’, the ‘fat girl’, the ‘one with the big nose’ and so on, descriptions that obviously overlook virtually everything that’s important about a person. Likewise, that ‘deaf girl’ is never just that. She might also be someone’s sister, love animals, have a keen interest in photography, or whatever. She is a complete person.


There may also be more familiar reasons for people denying that deafness is a disability, having to do with people simply believing what they want to believe, but there is an important difference between dealing with something that we accept is unpleasant and dealing with it by denying that it is. This is, of course, easier said than done. After someone dies, for instance, there is no easy way to reconcile ourselves with the intolerable proposition, which leaves us shaking our heads in disbelief, that we will never be able to talk to our loved one again. Nor is there an easy way to accept, when a relationship ends, the intolerable proposition, which leaves us unable to breathe, that the person who knew us more intimately than anyone else in the world, and who is therefore most qualified to pass judgements about us, has judged that we are no longer worthy of their love. Also intolerable is the proposition that there is something wrong with us, like being deaf, that will put us at a disadvantage for the rest of our lives. It is very difficult to console ourselves when confronted with the unbearable permanence of these things, so it is little wonder we find ourselves concluding – without evidence – that our dearly departed are now “in a better place” where we will someday be reunited with them, that our lover will someday come back into our arms once they realise their mistake, or that the problem lies not with us, but with a condescending society that treats us as though we are broken and need to be fixed.


Most of us see it as profoundly insensitive to challenge the views of those who we think are suffering, which is no doubt why victims get away with saying blood-curdling things on the steps of criminal courts, why there was so little opposition to the US invasion of Afghanistan directly after the 9/11 atrocities, and why there was so little opposition to establishing the state of Israel after the Second World War. But, at least in these cases, the ‘profoundly insensitive’ option may actually be better than the consequences of not debating the issues.


Though at a different scale, the consequence of uncritically denying that deafness is a disability would be to legitimise the attempts of deaf parents to have deaf children by design [example], or to legitimise their attempts to deny their deaf children hearing aids or cochlear implants during the critical period of brain development, when auditory input is necessary for the proper development of auditory processing capacities. Both of these decisions would lead to avoidable suffering, and without the child having any say in the matter. If, when the child grows up, they wish to discard their hearing aids or switch off their cochlear implant, they have that choice, but the reverse is not true – you cannot raise a deaf child without these devices and let them choose whether they want them as adults because, by then, the critical period of brain development will have long since passed and they will never be able to make sense of the sounds they hear.


The consequences of a belief being true frequently account for why people hold it far better than the evidence supporting it does (this is the logical fallacy of argumentum ad consequentiam), and the issue of whether deafness is a disability is no different. For deaf people, the consequences are emotional – denying the disability liberates them from the intolerable proposition that they are physically flawed, while also providing an intellectually convenient way of dismissing the understandably frustrating attitudes held by many hearing people. On the other hand, to deny that deafness is a disability would be to legitimise decisions of deaf parents to have deaf children by design, or to deny a deaf child hearing aids or a cochlear implant when they could otherwise benefit from them. These consequences tell us why the issue matters to those on either side of the debate, but the truth is what it is, good or bad. Only evidence can tell us whether deafness is a disability, and on this basis, it is hard to deny that it presents a disadvantage that, although not as grave as some other disabilities, is a disability nonetheless.

Thursday, December 01, 2005

“Homosexuality is immoral because it is unnatural” (non sequitur)

An unnecessary but common response to this claim is to dispute that homosexuality is unnatural by pointing to the abundant evidence of homosexuality in non-human species. But even if homosexuality were unnatural, this wouldn't make it wrong or unhealthy. If we were to shun everything that is unnatural, we'd be forced to abandon our houses, clothes, medicines, technology, art and much else. Clearly there are some unnatural things we consider good. Likewise, we take earthquakes, tsunamis, snake bites and sunburn to be natural, but not good. And those items on the supermarket shelf proclaiming 'all natural ingredients' like salt, sugar and fat are not good for your health. Whether homosexuality is natural or unnatural tells us nothing about whether it is right or wrong, desirable or undesirable. The argument is a non-starter (and for those interested, an example of an argumentative fallacy called the appeal to nature).


Suppose for a moment that, for some reason, all the other unnatural things that we like are permissible because they have nothing to do with sex, that sex is a special case in which virtuousness always coincides with naturalness (we'd be guilty of a different argumentative fallacy called special pleading, but let's allow it in this case). If this were so, then we might look to the sexual behaviour of other species for examples of how to behave. We could model our behaviour on bonobos, who continually use sex to reinforce social bonds within a group (same sex or not) [source], female praying mantises who eat their mates once they have served their function, ducks who engage in homosexual necrophilia [source], dogs who mount their owners' legs, and so on. Homosexuality and a whole lot of other things would be permitted under this kind of moral philosophy. Not only that, but our guidance would be riddled with contradictions arising from the fact that different species have different sexual practices. For some species, promiscuity is the norm. For others, monogamy is, and so on. We would have to ask what is natural for humans specifically, but if we are contrasting 'natural' with 'man-made', what sense could there be to such a question?


Even if we wish to cling to the view that 'natural' equals 'good' and even if we can ignore the fact that nature seems to make contradictory judgements about what is good and bad, it would still be difficult to argue that homosexuality in humans is any less natural than practices like celibacy, something that few are inclined to condemn. If homosexuality is wrong or unhealthy, this kind of argument simply does not show it.


Note that the claim that homosexuality is immoral because it is unnatural is a part of a more general family of arguments, which also includes justifying promiscuity by arguing that monogamy is unnatural, or eating meat by arguing that vegetarianism is unnatural. As with homosexuality, nature has no consistent attitude towards these things, and even if it did, no moral conclusions could be drawn about these practices (at least on this basis) since 'natural' simply cannot be equated with 'good'.

Saturday, November 26, 2005

"Women are partially responsible for being raped"

CLAIM
A woman is totally or partially responsible for being raped if she “behaved in a flirtatious manner”, “is drunk”, “is wearing sexy or revealing clothing”, “has had many sexual partners”, “has failed to say ‘no’ clearly to the man” or “is alone and walking in a dangerous or deserted area”.

EVALUATION
According to a recent survey commissioned by Amnesty International, a significant proportion of British adults consider a woman partially responsible for being raped in each of these contexts. The details are summarised below.

The publication of these statistics was widely covered in the media on Monday, both in the UK and internationally, generating headlines such as “Women ‘to blame’ for being raped”, “1 in 3 Brits blame rape on women”, and “Rape victims were ‘asking for it’ - Shock Report”. But it is far from obvious that this emphasis on ‘blame’ actually constitutes a fair interpretation of the views of the survey’s respondents. That interpretation was almost universally accepted in the media, and it has its origins in the original press release issued by Amnesty, in which their UK director Kate Allen is quoted as saying “It is shocking that so many people will lay the blame for being raped at the feet of women themselves”, referring to it as a “sexist blame culture”.


The wording of the questions featured in the survey leaves the views of respondents open to a number of interpretations. By holding the victim (partially) responsible in a particular context, some of the respondents might have meant that she deserved to be raped or that the rapist should receive a more lenient sentence. Others might have meant simply that a woman is more likely to be raped if she puts herself in the stated situation. But a respondent is very unlikely to interpret ‘responsibility’ in terms of deserving to be raped if they consider it unthinkable that this depends on the context. Why would the survey be asking about the level of responsibility if it didn’t vary from condition to condition? Given this, it is plausible that a significant proportion of respondents interpreted ‘responsibility’ in terms of whether the woman had placed herself in a situation in which she was more likely to fall victim to rape, regardless of any attribution of blame that might go along with that. There is an empirical question about whether there is actually an increased risk of rape in the contexts examined in the survey. Indeed, there is a distinct question for each scenario. But if there are measures that a victim could have taken to reduce the risk in a particular case, would that justify “lay[ing] the blame for being raped at the feet of women themselves”?


Taking measures to prevent becoming a victim of a crime is normal and uncontroversial in other domains. We take precautions when using our credit cards online, we ensure that valuables aren’t visible through our car windows, and we leave a light on at home if going on a holiday. It is therefore not grossly implausible that there are things that a woman could do to prevent becoming a victim of rape. She can, for instance, refuse to accept a drink from an unknown man under conditions where he could have drugged it (go here for more advice on date rape drugs).


Most of us would of course prefer a world in which there were no criminals so that we wouldn’t have to take such precautions. When we compare this ideal world against the reality, the person who takes no precautions is behaving exactly as we would all like to be able to, but the criminal is not behaving as we’d want. This is why we attempt to deter the criminal with threats of punishment and not the victim even if the victim could have prevented the crime by behaving differently. Our desire to change the behaviour of the criminal is motivated by idealism, whereas our desire to change the behaviour of potential victims is motivated by pragmatism. The notions of blame and punishment are only compatible with the former.


As for the specific claims in the survey, it would be deeply irresponsible for Amnesty to give the impression that these factors do not carry any risk if they in fact do. On the issue of drinking, studies do support a strong link between rape and alcohol use both on the part of the rapist and the victim [source, source]. There is also a correlation with the number of sexual partners a woman has had [source, source]. On the other hand, the risk associated with walking alone in a dangerous or deserted area may not be as large as many assume. According to UK Home Office statistics, 92% of rapes involving female victims are committed by people known to the victim (45% by current partners). Given this, it is unsurprising that more than half of all rapes (55%) actually occur within the victim’s own home, with a further 20% occurring in the perpetrator’s home. 13% occur in public places, often in the vicinity of licensed premises [source].


The risks associated with dressing provocatively, flirting, and sending mixed signals are more difficult to quantify because of their subjectivity. It is often suggested that rape is not carried out with a motive of sexual gratification, but rather out of a desire to have power over the victim. If so, the way a woman is dressed would be irrelevant. To make this case convincingly, however, it would be necessary to show why perpetrators choose sexual means to this end rather than other conceivable forms of degradation. There are also difficulties reconciling this view with many cases involving date rape drugs, where the rapist’s intention appears to be to get away with the crime without the woman having any clear indication that it occurred. There is some sense in which the rapist has power over the victim in these cases, but the victim is not necessarily aware of being subjugated.


While there are certain kinds of behaviour that appear to be correlated with rape, a rational response is not necessarily to avoid them, since a woman may judge these activities to be rewarding enough to warrant exposing herself to the associated risks. Being informed of the risks allows a woman to make this choice. The British Crime Survey estimate for the year 2000 was that approximately nine in every thousand women between the ages of 16 and 59 are victims of some form of sexual victimisation every year, four of these cases involving rape [source].


Amnesty International also used the publication of their report to publicise the fact that the conviction rate in rape cases is extremely low, citing the “sexist blame culture” that the survey allegedly highlights as the reason that juries fail to convict. However, there are genuine difficulties in demonstrating guilt in rape cases that could at least partially account for the low conviction rates. Demonstrating whether consent was given often comes down to one person’s word against another’s. It is also far from obvious what the optimal conviction rate should be, since there will presumably be some small proportion of cases based on fabricated charges that should not result in convictions.


Amnesty’s report could have a number of serious negative consequences aside from giving the false impression that women are powerless to reduce the risk of being raped. Publicising how low conviction rates actually are could plausibly embolden rapists, who may now believe their chances of getting away with it are much better than they previously thought (I won’t repeat the statistics here). It might also discourage rape victims from reporting incidents, if they are now convinced that they will be treated unfairly by the courts. If so, it would be grossly irresponsible for Amnesty International to do this, especially if the low conviction rate has nothing to do with a “sexist blame culture”. It would of course be equally irresponsible for the media to repeat these claims uncritically. Due to the difficulties inherent in convicting rapists, it may be more practical to reduce the incidence of rape by focussing on changing the attitudes of potential perpetrators.


By making casual claims about prejudice, we run the risk of generating cynicism about similar claims that are made in a more measured way. The potential damage that reports like Amnesty’s can do therefore extends well beyond the reputation of any particular organisation.