Sadly I don’t have so much time for writing these days, especially anything too strenuous or research-based, so I think I’ll do a series on organisms that have interested me over the years – or that I’ve just recently been fascinated by, for that matter.
Over at Not Exactly Rocket Science, there’s an article to whet the appetite as well as to apply a corrective to our thinking about everyone’s favourite wild cat, the cheetah (the name derives from Sanskrit, and cheetahs are found in Iran as well as Africa, and were probably more widespread in Asia in earlier times). Cheetahs are a vulnerable species, with about 10,000 of them currently existing in the wild. They’re described as a ‘charismatic species’, meaning that they’re utilised a lot as ‘ambassadors’ to draw attention to environmental and habitat issues for wildlife in general – along with elephants, humpback whales, giant pandas, California condors, grey wolves and such.
Cheetahs are, of course, built for speed in every way, though agility, with an incredible acceleration and deceleration rate, is also a key to their success. They can accelerate from zero to 40mph in just three strides – faster than the most sophisticated racing cars. Claims that their lightning runs leave them half-dead with heat exhaustion much of the time are, however, wildly exaggerated, as are the claims that they lose as much as half of their kills to lions and hyenas. In fact, cheetahs use up far more of their energy seeking out or tracking down potential kills than they do actually chasing them. A cheetah sprint takes up only 45 seconds a day on average – that’s less time than I spend on my high intensity interval training.
The key to maintaining cheetahs in the wild, then, is not to add to their greatest and most energy-sapping problem: finding food. Adding obstacles to their habitat, such as fences and enclosures, and depleting that habitat of their favourite food – gazelle, deer and impala, and the odd young zebra or springbok – would make life that bit more painful for them.
Speed, of course, is the cheetah’s big specialisation, what it’s adapted for. In fact over-specialisation is arguably its main problem, as it doesn’t have the bulk or strength to fight off other predatory mammals, all of which annoyingly compete for the same food. It’s light, with a weight that averages around 50 kg, and its aerodynamically evolved head and body trade strength for speed, meaning that its jaws and teeth don’t have the size or force of other wild cats. It has a flattened ribcage but larger than usual heart and lungs for large intakes of air and fast pumping of blood. It also has a longer and larger tail than most cats, which it uses as a rudder for balance as it sets off on one of its twisting and turning runs. Its claws are only semi-retractable, unlike those of most cats (its genus name, Acinonyx, is Greek for ‘no-move claw’). This gives it extra grip while running. Males and females are much the same size and hard to tell apart from a distance.
Cheetahs don’t roar but they make up for it with a range of other noises, including purring like a – well, a cat, when experiencing domestic plenitude. They also hiss, spit, growl and even yowl when faced with danger. Cubs make a bird-like chirping sound, and the mother makes a similar sound when trying to locate her young. A sound called churring – no idea what that sounds like – is used on social and sexual occasions between adults. Male cheetahs form lifelong partnerships, often but not always with brothers, while females are solitary, bringing the kids up by themselves. They tend to mate with a variety of males – which hardly makes it mating, really. Interestingly, though the females are regular hunters, they’re not territorial, unlike the males, who practise group territoriality, each member of the gang contributing his scent.
Female cheetahs put their kids – or those that survive, as there’s a heavy infant mortality rate – through a tough survival training schedule before abandoning them at around 18 months. At around 2 years of age the females go their lonesome ways and the males hang together, sometimes combining with other blokes. It seems to work for them. In fact I think I read somewhere that males live longer on average than females, which wouldn’t surprise me. Fending for yourself all the time’s a deadly business, even when it’s all laid on in the big smoke, never mind having to chase your meals every day into old age. So spare a thought for the cheetahs, especially the girls, under-appreciated as always.
I’ve written a fair bit about the rise of the ‘no religion’ sector of society, in Australia and elsewhere, which has obvious implications for the role of Christianity in politics in the western world. In Australia some generations ago, Melbourne’s Catholic Archbishop Daniel Mannix, and later his protégé B A Santamaria, were hugely influential political figures. The formation of the Catholic DLP (Democratic Labor Party) by Santamaria, with the support of Mannix, effectively split the left, handing the conservatives political power for decades before Whitlam’s 1972 election victory. Since then, however, there hasn’t been much overt influence on politics from religion, though of course we’ve had religious PMs, including the current mad monk. Nor have we had any major political parties, that I know of, in which Christianity, or any denomination thereof, is part of its name.
Not so in other western countries. So-called Christian Democracy parties are quite common in Western Europe, usually on the centre-right. Belgium has the Christian Democratic and Flemish Party, formerly the Christian People’s Party; Switzerland has the Christian Democratic People’s Party as well as the Evangelical People’s Party; the Netherlands has the Christian Democratic Appeal Party, and Italy has the Union of Christian and Centre Democrats (though better known by its more secular title, the Union of the Centre, UDC).
Probably the most successful and powerful Christian political party in Europe, though, is Germany’s Christian Democratic Union, whose leader, Angela Merkel, has been Germany’s Chancellor for the past nine years. The party has been in power more often than not, though often in coalition, since 1945. In recent times, the CDU has formed a more or less permanent partnership with the Bavarian CSU (Christian Social Union), which is generally more Catholic and conservative.
According to Germany’s 2011 census, its percentage of Christians is almost identical to Australia’s, at a little over 60%, pretty well evenly divided between Catholics and (essentially Lutheran) Protestants. However, as in Australia, the numbers are falling rapidly, and churches are closing and being converted to other uses throughout the country. The ‘no religion’ category has recently attracted more adherents than either the Papists or the Heretics. Interestingly, the eastern part of the country, which was under communist rule for 40 years, is much more atheist than the rest. So for how much longer will Germany’s CDU retain its Christian moniker?
According to its party platform, the CDU derives its policies from both ‘political Catholicism’ and ‘political Protestantism’, whatever that means. The vapidity of such claims, together with the obviously rising secularism of the populace, might explain why Angela Merkel played down any Christian elements in her and her party’s thinking during the 2005 elections. Merkel herself is the daughter of a Lutheran minister but was brought up in the atheist East and is a physicist by training. Recently, though (just prior to last year’s elections) she ‘came out’ for the first time as a Christian, possibly for complex political reasons (the rise of Islam is a much more significant factor in German domestic politics than in Australian). She even claimed, quite nonsensically, that Christianity was ‘the world’s most persecuted religion’. (Actually this is a common view, according to Pew Research, in the USA. It seems many Christians believe that the waning of Christianity’s popularity is a form of persecution). Merkel was elected for another 4-year term in 2013, and her more emphatic public identification with Christianity in recent times means that her party will be stuck with its name as long as she’s at the helm. My guess is she’ll be ripe for retirement in 2017.
Of course, as with most western states, religion in Germany has in recent decades, if not centuries, become a more ‘internal’ matter, and less political, with much ‘encouragement’ from the state. For more detail on that, check out the Peace of Augsburg of 1555 and its newly-defined principle, Cuius regio, eius religio, and also the concept of forum internum. This is definitely a good thing, given the Thirty Years War and all, but it seems that, as a quid pro quo for religious non-interference in politics, Germany’s Grundgesetz (its Basic Law, or Constitution) has been very generous in its delineation of religious freedom, and this may cause problems if Germany continues to play host to more challenging, and less ‘internalised’, religious beliefs. The Grundgesetz came into being in 1949, but many of its statutes pertaining to religion date back to the 1919 Weimar constitution. Unsurprisingly, no religions other than an increasingly emasculated (if that’s not too sexist a term) Christianity would have been considered relevant in those days.
Much of what follows, and some of the preceding, is taken from the article ‘Religion and the secular state in Germany’, by Stefan Korioth and Ino Augsberg. The constitution guarantees freedom of individual religion and philosophical creed (Weltanschauung) – thus also guaranteeing freedom not to have a religion. In article 3 of the constitution it’s stated that ‘no person shall be favored or disfavored because of his or her personal religious opinions’, and in article 33, ‘neither the enjoyment of civil and political rights, nor eligibility for public offices, nor rights acquired in the public service shall be dependent upon religious affiliations’. Other articles guarantee that there shall be no state church, and create a separation of church and state. In fact the German constitution is unusually detailed in its coverage of the status of religious entities vis-a-vis the state. It is above all concerned to emphasise the principle of state neutrality, but this has caused some difficulties in that the state has withdrawn even so far as to be reluctant to define religion for legal purposes. There is, as Korioth and Augsberg point out, no numerus clausus, or fixed number, of religious confessions, and it has been left to religious communities themselves to define their religiosity. Not surprisingly this has led to ongoing issues with regard to the legal status of religious groups. With the inevitable continuing decline in Christianity, and the rise of more challenging religions, and the disaffected youth who choose to identify with a more intolerant version of those religions, this will be a problem in the future. Hopefully, however painful, it will remain a fringe problem for the ongoing secularisation of Germany.
Just to round things off, Merkel’s newly-found public Christianity is a reminder that often changes have to wait until people die off, if that doesn’t sound too morbid or callous. Of course they don’t have to die physically, they may just have to die in terms of power or influence. Merkel’s position reminds me of others, such as Antonin Scalia of the US Supreme Court, and the late Fred Phelps of the Westboro Baptist Church (not that I place these people on the same moral or intellectual plane). The movement towards secularism isn’t so much about changing people’s minds, though that’s always a worthy pursuit. It’s about a changing zeitgeist that feeds those who are brought up within it. Older people die, younger people come to prominence, bringing the newly transformed zeitgeist to the fore. That’s how the flat-earthers, who once filled provincial town halls with their lectures, finally faded from view; they weren’t out-argued or persuaded from their views, they simply died, and their descendants imbibed the new zeitgeist. Not an excuse for complacency, but a reason for hope, and a reason for contributing to that zeitgeist in a positive way.
I have to say, from a very young age, I considered myself a feminist. And then I read (sometime in the seventies, long before Emma Watson was born, bless her cotton socks) that you couldn’t be a feminist as a male, because it was some kind of uniquely female thing, whereas I, like Emma, thought it was a simple matter of believing that females were equal to men in every respect, and that it didn’t matter who did the believing – male, female, androgyne or alien.
Emma Watson’s Hermione is an iconic figure. Perhaps I should say J K Rowling’s Hermione, though millions identify Emma as Hermione. Yet, interestingly, Emma described herself in her speech, self-deprecatingly, as ‘that Harry Potter girl’, inadvertently reminding us of her role as support to the main protagonist.
I don’t in any way want to disparage the Harry Potter novels, which I’m sure would have been just as successful with Harriet Potter as the heroine – at least I hope so. I personally have observed how much Hermione has inspired young girls, as an intelligent, level-headed problem-solver. So it was with great delight that I, along with many others, have been able to see that Emma was not just playing a part as Hermione; that she genuinely wants to use her prominence to push for the recognition of women globally.
I would go further – and I suspect she would agree, though she didn’t go that far in her speech – and say that the world would be better for having more women in prominent positions – that it would be safer, more collaborative, and more congenial. But maybe I’m being a little idealistic…
In any case, the HeForShe initiative is one that I endorse whole-heartedly, because it allows men to have their say without experiencing any of the weird responses from both sides. It’s simply about equality, and respect.
‘Atheism is not a great religion. It has always been for elites rather than for ordinary folk. And until the 20th century, its influence on world history was as inconsequential as Woody Allen’s god. Even today the impact of atheism outside of Europe is inconsequential. A recent Gallup poll found that 9% of adults in Western Europe (where the currency does not trust in God) described themselves as ‘convinced atheists’. That figure fell to 4% in eastern and central Europe, 3% in Latin America, 2% in the Middle East, and 1% in North America and Africa. Most Americans say they would not vote for an atheist for president.’
Stephen Prothero, from God is not one: the eight rival religions that run the world & why their differences matter (2010).
I should admit at the outset that I’ve not read Prothero’s book, and probably never will, as time is precious and there are too many other titles and areas of knowledge and endeavour that appeal to me. However, since, as a humanist and sceptic, I have a passing interest in the religious mindset and in promoting critical thinking and humanism, I think the above quote is worth dwelling on critically.
First, the claim that ‘atheism is not a great religion’. It’s an interesting remark because it can be interpreted in two ways. First, that atheism is not a religion of any kind, great or small; second that atheism is a religion, but not a great one. I strongly suspect that Prothero has the second view in mind, while also playing on the first one. Of course atheism isn’t a religion and it’s tedious to have to play this game with theists (assuming Prothero is one) for the zillionth time, but my own experience on being confronted with the idea of a supernatural entity for the first time at around eight or nine was one of scepticism, though I didn’t then have a name for it. I don’t think scepticism could ever be called a religion. And nothing that I’ve ever experienced since has tempted me to believe in the existence of supernatural entities.
Next comes the claim that atheism has always been for elites rather than ordinary folk. This is probably true, but we need to reflect on the term ‘elite’. I assume Prothero can only mean intellectual elites. The Oxford dictionary succinctly defines an elite as ‘a select group that is superior in terms of ability or qualities to the rest of a group or society’. Generally, therefore, the best of society, or the leaders. It’s broadly true, especially in the West, that you won’t get to the top in business without a good business brain, you won’t get to the top in politics without a good political brain and you won’t get to the top in science without a good scientific brain, and these are all positive qualities. The elites are the best, and the best tend to be society’s movers and shakers.
Yet Prothero doesn’t appear to agree, quite. His juxtaposing of the two sentences intimates that atheism is not a great religion because it has always been for elites. What are we to make of this? My guess is that he’s trying to downplay atheism but has made a bit of a mess of it. And there’s more of this. Before the 20th century, we’re informed, atheism was as influential ‘as Woody Allen’s god’, by which, I presume, he’s referring to Allen’s farce of 1975, God, with which I’m not particularly familiar. I do know, though, that it’s fashionable these days to trash Woody Allen, so the message appears to be that, before 1900 or so, atheism was very inconsequential indeed.
A reasonable person might wonder here why Prothero seems so keen to diminish atheism. A big clue is surely to be found in the subtitle to Prothero’s book. Which raises some questions: What are these eight religions? Are they really rivals? Do they run the world?
The contents page answers the first question: Islam, Christianity, Confucianism, Hinduism, Buddhism, Yoruba religion, Judaism and Daoism make up the Premier League. Presumably Jainism, Sikhism and Zoroastrianism are struggling in the lower divisions. There is some debate amongst authorities as to whether Confucianism and Daoism are recognised religions, and they’re often found blended, along with Buddhism, in Chinese folk tradition – so, maybe not so much rivals.
Surely the most important question, though, is whether these religions ‘run the world’. I have the strong suspicion that Prothero hasn’t given deep consideration to his terms here, but I’ll try to do it for him. What does ‘running the world’ entail? I’ve heard people say that multinational corporations run the world, or that various superpowers do so, or have done so, but the idea that the major religions run the world between them is a novel one to me. Of course, if I want to find out whether Prothero provides evidence for his claim, or sets out to prove it, I’d have to read his book, and I’m reluctant to do so. It’s surely far more likely he’s tossed in the subtitle as something provocative, a piece of unsubstantiated rhetoric.
A lot of ingredients make the human world run, including trade, transport, law, festivals, education, sex, empathy and new ideas. Customs, habits and religious rituals play their part for many of us too. However, there’s no doubt that, for most westerners, global networking, the take-up of higher education, multiculturalism and travel have transformed earlier customs and habits, with religion taking a major hit in the process. The places where religion is holding its own are those where such modern trends are less evident.
Prothero also seems to be downplaying the 20th century when he writes that the influence of atheism was negligible before that time, as if to say ‘setting aside the 20th century, religion has been the most powerful force in humanity.’ Maybe so, but you can’t set aside the 20th century, a century which saw the human population rise from under two billion to around six billion, a century of unprecedented and mind-boggling advances in science and technology, and in the education required to keep abreast of them, and which saw a massive rise in travel and global communication. Continuing into the 21st century, these developments have been transformative for those exposed to them. It is unlikely to be coincidental that the same period has seen ‘the rise of the nones’ as by far the most significant development in religion for centuries – or more likely, since the first shrine was constructed. Of course, correlation isn’t causation, and I’m not going to delve deeply into causative factors here, but the phenomenon is real, though Prothero engages in what seems to me a desperate attempt to minimise it with his data. I’ll examine his statistics more closely later.
Prothero also presents the ‘inconsequential outside of Europe’ argument, which, apart from dismissing Australians like me – where more than 23% professed to having no religion in the last census (2011), with some 9% also choosing not to answer the optional question on religion – seems to dismiss Europe as an aberration in much the same way as he dismisses the 20th century. Yet in the last seventy years since the end of WW2, western Europe has only been an aberration in terms of its stability, its growing unity, its overall prosperity, its high levels of literacy and other positives on the registers of well-being and civility. Surely we should hope that such aberrations might spread worldwide. Many of the western European nations are regarded and valued as ‘elite states’, where religious strife, a problem in the heart of Europe for centuries up to and including the Thirty Years’ War of the 17th century, is now almost entirely confined to its immigrant populations. These are now among the least religious countries in the world.
So let’s look at Prothero’s data. He states that 9% of Western European adults are ‘convinced atheists’. Why, one wonders, does he choose this category? Most atheists wouldn’t describe themselves as ‘convinced’, if by this is meant some active, proselytising commitment to non-belief in supernatural beings. They don’t go around ‘being atheists’. As I’ve said, I consider myself first and foremost a sceptic, and it’s out of scepticism and a need for evidence and for the best explanation of phenomena that I consider belief in creator beings, astrology, acupuncture, fairies and homeopathy as best explained by psychology, ignorance and credulity.
My view is that Prothero chooses the ‘convinced atheist’ category for the same reason that William Lane Craig does – to minimise the clear-cut ‘rise of the nones’, to reduce non-belief to the smallest category he can get away with.
Prothero cites a website for his figures on ‘convinced atheists’ (9% in western Europe, 4% in eastern and central Europe, 3% in Latin America, 2% in the Middle East and 1% in North America) – the source being a 2005 Gallup poll. I cannot find that 2005 poll, but an updated 2012 Gallup poll is very revealing, as it compares some figures with those from 2005. What it reveals, sadly, is a degree of intellectual dishonesty on Prothero’s part. Prothero claims that atheism is inconsequential outside of Europe, yet the same Gallup poll from which he took his figures – this time in its 2012 version – states that 47% of Chinese self-describe as convinced atheists*. Presumably this was slightly up on 2005 (the 2005 figure for China isn’t given), because almost every nation shows a rise in atheism in recent years, but that huge percentage, together with the 31% of Japanese ‘convinced atheists’, completely discredits Prothero’s ‘inconsequential outside of Europe’ claim.
It’s worth giving more comprehensive data on western Europe here, based on the 2012 poll by Gallup International. The 9% figure for ‘convinced atheists’ is now 14%, with a further 32% describing themselves as ‘not religious’, and 3% ‘no answer or not sure’. The rest, 51%, described themselves as religious. It’s clear that, by the next poll, most western Europeans will not describe themselves as religious. Only 14% of Chinese people currently describe themselves as such – and as we all know, China will soon take over the world.
I was surprised, too, that only 1% of North Americans were convinced atheists, according to Prothero. I can’t confirm this, but according to the 2012 poll, the figure is 6%, with a further 33% claiming to be ‘not religious’. The percentage of the self-described religious is a surprisingly low 57%. Perhaps Prothero combined the North American and African figures to arrive at the 1% mark. Who knows what paths motivated reasoning will lead a person down.
The 2012 poll, if it’s reliable, is revealing about the speed with which religion is being abandoned in some parts. In France, for example, the percentage of ‘convinced atheists’ has jumped from 14% to 29%, an extraordinary change in age-old belief systems in less than a decade.
But beyond these statistics about how people see themselves, the change is most marked, in the west, by the vastly diminished role of religion in public life. It’s precisely Prothero’s claim that religions ‘run the world’ that is most suspect. In virtually every western country, secularism, the insistence that the church and the state remain separate, has become more firmly established in the 20th century. The political influence of the Christian churches in particular has noticeably waned. Of course there are a few theocratic nations, but their numbers are decreasing, and none of them are major world powers. If you believe, as most do, that the world is run by governments and commercial enterprises, it’s hard to see where religion fits into this scheme. In some regions it may be the glue that holds societies together, but these regions appear to be diminishing. Religions these days receive more publicity for the damage they do than for any virtues they may possess. Any modern westerner might think of them as ruining the world rather than running it.
The fact is that, in every western country without exception (yes, that includes the USA), the trend away from religious belief is so rapid it’s almost impossible to keep up with. I’ve already written about the data in New Scientist suggesting that the ‘nones’ are the fourth religious category after Christians, Muslims and Hindus, numbering some 700 million. Wikipedia goes one further, putting the nones third with 1.1 billion. Of course these figures are as rubbery as can be, but it’s indisputable that this is overwhelmingly a modern phenomenon, covering the past fifty or sixty years in particular. It’s accelerating and unlikely to reverse itself in the foreseeable future.
Books like Prothero’s are symptomatic of the change. Remember The Twilight of Atheism (which I also haven’t read)? Deny what’s going on, promote the positive power and eternal destiny of religion and all will be well.
Well, it won’t. Something’s happening here but you don’t know what it is, do you, Mr Prothero?
*To be fair to Prothero, it looks like no 2005 figures for China were available, though the large figures for Japan certainly were. Also, though these figures for China have been uncritically reported by the media, the sample size, as mentioned on Gallup International’s website, was preposterously small – some 500 people, less than one two-millionth of the Chinese population. The survey was apparently conducted online, but no details were given about the distribution of those surveyed. Given the resolutely secular Chinese government’s tight control of its citizens and media, I would treat any statistics coming out of that country with a large dose of salt.
I’ve read at least enough about WW1 to be aware that its causes, and the steps made towards war, were very complex and contestable. There are plenty of historians, professional and amateur, who’ve suggested that, if not for x, or y, war may have been avoided. However, I don’t think there’s any doubt that a ‘force’, one which barely exists today, a force felt by all sides in the potential conflict of the time, made war very difficult to avoid. I’ll call this force the appetite for war, but it needs to be understood more deeply, to divest it of its vagueness. We know that, in 1914, lads as young as 14 sneaked their way into the militaries of their respective countries to experience the irresistible thrill of warfare. A great many of them paid the ultimate price. Few of these lambs to the slaughter were discouraged from their actions – on the contrary. Yet 100 years on, this attitude seems bizarre, disgusting and obscene. And we don’t even seem to realise how extraordinarily complete this transformation has been.
Let’s attempt to go back to those days. They were the days when the size of your empire was the measure of your manliness. The Brits had a nice big fat one, and the Germans were sorely annoyed, having come late to nationhood and united military might, but with few foreign territories left to conquer and dominate. They continued to build up their arsenal while fuming with frustration. Expansionism was the goal of all the powerful nations, as it always had been, and in earlier centuries, as I’ve already outlined, it was at the heart of scores of bloody European conflicts. In fact, it’s probably fair to say that the years of uneasy peace before 1914 contributed to the inevitability of the conflict. Peace was considered an almost ‘unnatural’ state, leading to lily-livered namby-pambiness in the youth of Europe. Another character-building, manly war was long overdue.
Of course, all these expansionist wars of the past led mostly to stalemates and backwards and forwards exchanges of territory, not to mention mountains of dead bodies and lakes of blood, but they made numerous heroic reputations – Holy Roman Emperor Charles V and his son Philip II of Spain, Gustavus Adolphus of Sweden, Frederick the Great of Prussia, Peter the Great of Russia, Louis XIV of France and of course Napoleon Bonaparte. These ‘greats’ of the past have always evoked mixed reactions in me, and the feelings are well summed up by Pinker in The Better Angels of our Nature:
The historic figures who earned the honorific ‘So-and-So the Great’ were not great artists, scholars, doctors or inventors, people who enhanced human happiness or wisdom. They were dictators who conquered large swaths of territory and the people in them. If Hitler’s luck had held out a bit longer, he probably would have gone down in history as Adolf the Great.
While I’m not entirely sure about that last sentence, these reflections are themselves an indication of how far we’ve come, and how far we’ve been affected by the wholesale slaughter of two world wars and the madness of the ‘mutually assured destruction’ era that followed them. The fact that we’ve now achieved a military might far beyond the average person’s ability to comprehend, rendering obsolete the old world of battlefields and physical heroics, has definitely removed much of the thrill of combat, now more safely satisfied in computer games. But let’s return again to that other country, the past.
In the same month that the war began, August 1914, the Order of the White Feather was founded, with the support of a number of prominent women of the time, including the author and anti-suffragist Mrs Humphry Ward (whom we might now call Mary Ward) and the suffragette leaders Emmeline and Christabel Pankhurst. It was extremely popular, so much so that it interfered with government objectives – white feathers were sent even to those convalescing from the horrors of the front lines, and to those dedicated to arms manufacturing at home. Any male of a certain age who wasn’t in uniform or ‘over there’ was fair game. Not that the white feather idea was new with WWI – it had been popularised by the novel The Four Feathers (1902), set during the war in Sudan in the 1880s, and the idea had been in use in the British Empire since the eighteenth century – but it now reached a peak of popularity, a last explosive gasp. Or not quite: it was revived briefly during WWII, but since then, partly as a result of greater awareness of the carnage of WWI, the white feather has been used more as a symbol of peace and pacifism. The Quakers in particular took it to heart as a badge of honour, and it became a symbol for the British Peace Pledge Union (PPU) in the thirties, a pacifist organisation with a number of distinguished writers and intellectuals, such as Aldous Huxley, Bertrand Russell and Storm Jameson.
There was no PPU or anything like it, however, in the years before WWI. Yet the enthusiasm for war of 1914 soon met with harsh reality at Ypres and, later, the Somme. By the end of 1915 the British Army was ‘depleted’ to the tune of over half a million men, and conscription was introduced, for the first time ever in Britain, in 1916. It had been mooted for some time, for of course the war had been catastrophic for ordinary soldiers from the start, and it quickly became clear that more bodies were needed. Not surprisingly, though, resistance to the carnage had begun to grow. An organisation called the No-Conscription Fellowship (NCF), consisting mainly of socialists and Quakers, was established, and it campaigned successfully to have a ‘conscience clause’ inserted in the 1916 Military Service (conscription) Act. The clause allowed people to refuse military service if it conflicted with their beliefs, but they had to argue their case before a tribunal. Of course ‘conchies’ were treated with some disdain, and were less tolerated by the British government as the war proceeded, during which time the Military Service Act was expanded, first to include married men up to 41 years of age (the original Act had become known as the Bachelor’s Bill) and later to include men up to 51 years of age. But the British government’s attitude didn’t necessarily represent that of the British people, and the NCF and related organisations grew in numbers as the war progressed, in spite of government and jingoist media campaigns to suppress them.
In Australia, two conscription plebiscites, in 1916 and 1917, failed by slim margins. In New Zealand, the government simply imposed the Military Service Act on its people without bothering to ask them. Those who resisted were often treated brutally, but their numbers increased as the war progressed. However, at no time, in any of the warring nations, did the anti-warriors have the numbers to be a threat to their governments’ ‘sunken assets’ policies.
So why was there such an appetite for war then, and why is the return of such an appetite unthinkable today? Can we just put it down to progress? Many sceptics are rightly suspicious of ‘progress’ as a term that breeds complacency and even an undeserved sense of superiority over the primitives of the past, but Pinker and others have argued cogently for a civilising process that has operated, albeit partially and at varying rates in various states, since well before WWI, indeed since the emergence of governments of all stripes. The cost, in human suffering, of WWI and WWII, and the increasingly sophisticated killing technology that has recently made warfare as unimaginable and remote as quantum mechanics, have led to a ‘long peace’ in the heart of Europe at least – a region which, as my previous posts have shown, experienced almost perpetual warfare for centuries. We shouldn’t, of course, assume that the present stability will be the future norm, but there are reasons for optimism (as far as warfare and violence are concerned – the dangers for humanity lie elsewhere).
Firstly, the human rights movement, in the form of an international movement dedicated to peace and stability between nations for the sake of their citizens, was born out of WWI in the form of the League of Nations, which, while not strong enough to resist the Nazi impetus toward war in the thirties, formed the structural foundation for the later United Nations. The UN is, IMHO, a deeply flawed organisation, based as it is on the false premise of national sovereignty and the inward thinking thus entailed, but as an interim institution for settling disputes and at least trying to keep the peace, it’s far better than nothing. For example, towards the end of the 20th century, the concepts of crimes against humanity and genocide were given more legal bite, and heads of state began, for the first time in history, to be held accountable for their actions in international criminal courts run by the UN. Obviously, considering the invasion of Iraq and other atrocities, we have a long way to go, but hopefully one day even the most powerful and, ipso facto, most bullying nations will be forced to submit to international law.
Secondly, a more universal and comprehensive education system in the west – one which, over the past century and particularly in recent decades, has emphasised critical thinking and individual autonomy – has been a major factor in the questioning of warfare and conscription, in recognising the value of children and youth, and in loosening the grip of authority figures. People are far less easily conned into going to war than ever before, and are generally more sceptical of their governments.
Thirdly, globalism and the internationalism of our economy, our science, our communications systems, and the problems we face, such as energy, food production and climate change, have meant that international co-operation is far more important to us than empire-building. Science, for those literate enough to understand it, has all but destroyed the notion of race and all the baggage attendant upon it. There are fewer barriers to empathy – to attack other nations is tantamount to attacking ourselves. The United Nations, ironic though that title often appears to be, has spawned or inspired many other organisations of international co-operation, from the ICC to the Intergovernmental Panel on Climate Change.
There are many other related developments which have moved us towards co-operation and away from belligerence, among them being the greater democratisation of nations – the enlargement of the franchise in existing democracies or proto-democracies, and the democratisation of former Warsaw Pact and ‘Soviet Socialist’ nations – and the growing similarity of national interests, leading to more information and trade exchanges.
So there’s no sign that the ‘long peace’ in Europe, so often discussed and analysed, is going to be broken in the foreseeable future. To be sure, it hasn’t been perfect, with the invasions of Hungary in 1956 and Czechoslovakia in 1968, and the not-so-minor Balkan wars of the 90s, and I’m not sure whether Ukraine is a European country (and neither, it seems, are many Ukrainians), but the broad movements are definitely towards co-operation in Europe, movements that we can only hope will continue to spread worldwide.
Oscar Wilde once wrote: ‘As long as war is regarded as wicked, it will always have its fascination. When it is looked upon as vulgar, it will cease to be popular.’
This remark might seem trivial, perhaps because Wilde himself is sometimes seen as a mere wit, and because the word ‘vulgar’ is now no longer popular (it has a certain vulgarity about it), but with different phrasing I’ve often thought along similar lines. In exasperation I describe to myself the current horrors in Palestine and Iraq and Syria as the acts of religious primitives, and fights in bars as the acts of bogans. I’m really talking about what used to be called vulgarity. It’s partly this way of thinking that makes me annoyed about the so-called war on terrorism, as if these were warriors, with their inherent fascination, instead of vulgar criminals.
Take cigarette smoking for example. When I see smokers on the streets these days, I think of sad sacks and the left behind. My zeitgeist-tinted specs see them as wash-outs and losers, adjusting my focus to catch clearly the ever-changing face of the properly vulgar, as it was once termed.
The National School Chaplaincy Program is on the face of it a curiously retrograde program that first came into being in 2006, near the end of Howard’s conservative Prime Ministership. It apparently began its life with a conversation in May of that year between the Victoria-based federal minister, Greg Hunt, and one Peter Rawlings, a member of Access Ministries and a volunteer in primary school religious instruction. Rawlings suggested to Hunt that it would be a great idea to install Christian ‘support workers’ in state schools throughout the Mornington Peninsula area. Hunt, whose religious beliefs are a mystery to me, apparently thought this a great idea, one that should be extended to the whole nation, with federal support. His boss, PM Howard, who claims to be a committed Christian, was also whole-hearted in his support, as were various other conservative MPs.
Given that over 23% of Australians are openly non-religious (a decidedly conservative figure), and that the rise of the non-religious over the last twenty years is the most significant change in religion in this country, and given that every Christian denomination is in decline, some of them spectacularly, and given the fall in church attendances, and the increasing multiculturalism and multi-religiosity of that proportion of society that is still religious, I personally find it unfathomable that this proposal has received such support. I can only suppose that such organisations as the Australian Christian Lobby, Access Ministries and Scripture Union Australia have far more political power and clout than I could ever have thought possible.
In any case, Labor, under its devoutly Christian PM Kevin Rudd, chose not to throw the scheme out when it came to power in late 2007. Labor has long supported the separation of church and state, a position reinforced, one would’ve thought, by the increasing secularisation of our polity in recent years. Rudd himself was keen to reassure people that he supported church-state separation, as did his Education Minister, Peter Garrett (another Christian). So why did they persist with this program? Wouldn’t a financial boost to school counselling have been a simpler, more effective and far less controversial option? Of course the program fits the current conservative government’s agenda perfectly. It has hit schools hard with its recent cost-cutting ‘share the pain’ budget, while at the same time earmarking some $245 million for chaplaincy over the next four years or so. The program will replace the existing School Welfare Program from the start of 2015, thus undermining school counselling and psychological services. In May of this year, a provision to allow for secular ‘chaplains’ was struck out. It was a finicky provision in any case, only allowing for secular welfare workers when all attempts to find an ordained chaplain for the job had failed. However, the striking out of the provision gives a clear indication that this is a federally-backed religious (or more specifically Christian) program. It can also come at a great cost to individual workers and to recipients of their services, as this story from the Sydney Morning Herald of May 21 shows:
Last week’s budget delivered a double blow to youth welfare worker Joanne Homsi. For the past 18 months, Ms Homsi has worked in two high schools in the St George and Sutherland area, supporting students with drug and alcohol issues, low confidence, family problems and suicidal thoughts. As well as talking with students, she has connected them to mental health centres, remedial learning programs and other services. Ms Homsi loves the job, and the schools value her work. But in December she will be looking for a new job – and there will not be a safety net to catch her if she cannot find one. Because she is under 30, she would have to wait six months before she can receive any unemployment benefits under tough new rules for young job seekers.
The federal government, in any case, hasn’t generally been in the business of providing funding for these kinds of services, which are usually a state responsibility, but of course schools will look for funding anywhere they can, and to have that scarce funding tied to a Christian belief system seems wildly anachronistic. This Essential Report poll gives the clear impression that the program is unsupported by the general public, but maybe there’s a larger ‘silent’ public that the conservatives are appealing to, or maybe they simply don’t care about what the public prefers. Their argument would be that take-up of the program is entirely voluntary. In other words, cash-strapped schools are faced with this option or no option as far as federal funding is concerned.
There are plenty of parents who are willing to take a stand on this issue, however. One of them, Ron Williams, Australian Humanist of the Year in 2012, was recently successful yet again in his second High Court challenge to the NSCP, though the government, and the Labor opposition, are perversely determined to find their way around the High Court’s ruling. In a 6-0 result worthy of the German national team the High Court found that the funding model of the government was inappropriate. When Williams’ first High Court challenge was successful in 2012, the then Gillard Labor government rushed through the Financial Framework Amendment Legislation Act to enable it to fund a range of programs without legislation. Some $6 million was provided to the Scripture Union of Queensland. To add insult to injury for Williams, a father of four school-age children, funding was provided to his own children’s school to employ a chaplain.
The recent High Court finding says that the funding mechanism is invalid. This affects many other federal government funding mechanisms too, as it happens, and that’s a big headache for the government, but the most interesting finding in relation to the NSCP is that the funding, in the overwhelming view of the High Court, did not provide a benefit to students – which it must, according to section 51(xxiiiA) of the constitution, in order to be valid. In other words, the High Court overwhelmingly disagreed with John Howard’s claim, made at the launch of the NSCP in October 2006, that ‘students need the guidance of chaplains, rather than counsellors’. I don’t think there’s any doubt that, had the money been earmarked for counsellors, the High Court would indeed have seen that as a benefit to students.
So the government is trying to find new funding models for a variety of programs it wants to hold onto, but it’s got a particular problem on its hands with chaplaincies. The chaplains have to be Christians, but they can’t proselytise; they’re there to give spiritual guidance to students, but this isn’t seen legally as providing a benefit – and how do they give such guidance anyway without proselytising? What a holy mess it is, to be sure.