the new ussr illustrated

assorted reflections from the urbane society for sceptical romantics

he for she



I have to say, from a very young age, I considered myself a feminist. And then I read (sometime in the seventies, long before Emma Watson was born, bless her cotton socks) that you couldn’t be a feminist as a male, because it was some kind of uniquely female thing, whereas I, like Emma, thought it was a simple matter of believing that women were equal to men in every respect, and that it didn’t matter who did the believing – male, female, androgyne or alien.

Emma Watson’s Hermione is an iconic figure. Perhaps I should say J K Rowling’s Hermione, though millions identify Emma as Hermione. Yet, interestingly, Emma described herself in her speech, self-deprecatingly, as ‘that Harry Potter girl’, inadvertently reminding us of her role as support to the main protagonist.

I don’t in any way want to disparage the Harry Potter novels, which I’m sure would have been just as successful with Harriet Potter as the heroine – at least I hope so. I’ve personally observed how much Hermione has inspired young girls, as an intelligent, level-headed problem-solver. So it has been a great delight for me, along with many others, to see that Emma was not just playing a part as Hermione; that she genuinely wants to use her prominence to push for the recognition of women globally.

I would go further – and I suspect she would agree, though she didn’t go that far in her speech – and say that the world would be better for having more women in prominent positions – that it would be safer, more collaborative, and more congenial. But maybe I’m being a little idealistic…

In any case, the ‘he for she’ initiative is one that I endorse whole-heartedly, because it allows men to have their say without copping the weird responses that can come from either side. It’s simply about equality, and respect.

Written by Stewart Henderson

September 28, 2014 at 8:40 pm

Posted in gender, sex


some people really don’t like atheists



‘Atheism is not a great religion. It has always been for elites rather than for ordinary folk. And until the 20th century, its influence on world history was as inconsequential as Woody Allen’s god. Even today the impact of atheism outside of Europe is inconsequential. A recent Gallup poll found that 9% of adults in Western Europe (where the currency does not trust in God) described themselves as ‘convinced atheists’. That figure fell to 4% in eastern and central Europe, 3% in Latin America, 2% in the Middle East, and 1% in North America and Africa. Most Americans say they would not vote for an atheist for president.’

Stephen Prothero, from God is not one: the eight rival religions that run the world & why their differences matter (2010).

I should admit at the outset that I’ve not read Prothero’s book, and probably never will, as time is precious and there are too many other titles and areas of knowledge and endeavour that appeal to me. However, since, as a humanist and sceptic, I have a passing interest in the religious mindset and in promoting critical thinking and humanism, I think the above quote is worth dwelling on critically.

First, the claim that ‘atheism is not a great religion’. It’s an interesting remark because it can be interpreted in two ways: first, that atheism is not a religion of any kind, great or small; second, that atheism is a religion, but not a great one. I strongly suspect that Prothero has the second view in mind, while also playing on the first. Of course atheism isn’t a religion, and it’s tedious to have to play this game with theists (assuming Prothero is one) for the zillionth time, but my own experience, on being confronted with the idea of a supernatural entity for the first time at around eight or nine, was one of scepticism, though I didn’t then have a name for it. I don’t think scepticism could ever be called a religion. And nothing that I’ve ever experienced since has tempted me to believe in the existence of supernatural entities.

Next comes the claim that atheism has always been for elites rather than ordinary folk. This is probably true, but we need to reflect on the term ‘elite’. I assume Prothero can only mean intellectual elites. The Oxford dictionary succinctly defines an elite as ‘a select group that is superior in terms of ability or qualities to the rest of a group or society’. Generally, therefore, the best of society, or the leaders. It’s broadly true, especially in the West, that you won’t get to the top in business without a good business brain, you won’t get to the top in politics without a good political brain and you won’t get to the top in science without a good scientific brain, and these are all positive qualities. The elites are the best, and the best tend to be society’s movers and shakers.

Yet Prothero doesn’t appear to agree, quite. His juxtaposing of the two sentences intimates that atheism is not a great religion because it has always been for elites. What are we to make of this? My guess is that he’s trying to downplay atheism but has made a bit of a mess of it. And there’s more of this. Before the 20th century, we’re informed, atheism was as influential ‘as Woody Allen’s god’, by which, I presume, he’s referring to Allen’s farce of 1975, God, with which I’m not particularly familiar. I do know, though, that it’s fashionable these days to trash Woody Allen, so the message appears to be that, before 1900 or so, atheism was very inconsequential indeed.

A reasonable person might wonder here why Prothero seems so keen to diminish atheism. A big clue is surely to be found in the subtitle to Prothero’s book. Which raises some questions: What are these eight religions? Are they really rivals? Do they run the world?

The contents page answers the first question: Islam, Christianity, Confucianism, Hinduism, Buddhism, Yoruba religion, Judaism and Daoism make up the Premier League. Presumably Jainism, Sikhism and Zoroastrianism are struggling in the lower divisions. There is some debate amongst authorities as to whether Confucianism and Daoism count as religions at all, and they’re often found blended, along with Buddhism, in Chinese folk tradition – so, maybe not so much rivals.

Surely the most important question, though, is whether these religions ‘run the world’. I have the strong suspicion that Prothero hasn’t given deep consideration to his terms here, but I’ll try to do it for him. What does ‘running the world’ entail? I’ve heard people say that multinational corporations run the world, or that various superpowers do so, or have done so, but the idea that the major religions run the world between them is a novel one to me. Of course, if I want to find out whether Prothero provides evidence for his claim, or sets out to prove it, I’d have to read his book, and I’m reluctant to do so. It’s surely far more likely he’s tossed in the subtitle as something provocative, a piece of unsubstantiated rhetoric.

A lot of ingredients make the human world run, including trade, transport, law, festivals, education, sex, empathy and new ideas. Customs, habits and religious rituals play their part for many of us too. However, there’s no doubt that, for most westerners, global networking, the take-up of higher education, multiculturalism and travel have transformed earlier customs and habits, with religion taking a major hit in the process. The places where religion is holding its own are those where such modern trends are less evident.

Prothero also seems to be downplaying the 20th century when he writes that the influence of atheism was negligible before that time, as if to say ‘setting aside the 20th century, religion has been the most powerful force in humanity’. Maybe so, but you can’t set aside the 20th century, a century which saw the human population rise from under two billion to around seven billion, a century of unprecedented and mind-boggling advances in science and technology, and in the education required to keep abreast of them, and which has seen a massive rise in travel and global communication. Continuing into the 21st century, these developments have been transformative for those exposed to them. It is unlikely to be coincidental that the same period has seen ‘the rise of the nones’, by far the most significant development in religion for centuries – or more likely, since the first shrine was constructed. Of course, correlation isn’t causation, and I’m not going to delve deeply into causative factors here, but the phenomenon is real, though Prothero engages in what seems to me a desperate attempt to minimise it with his data. I’ll examine his statistics more closely later.

Prothero also presents the ‘inconsequential outside of Europe’ argument, which, apart from dismissing Australia – where more than 23% of us professed to having no religion in the last census (2011), with some 9% also choosing not to answer the optional question on religion – seems to dismiss Europe as an aberration in much the same way as he dismisses the 20th century. Yet in the nearly seventy years since the end of WW2, western Europe has only been an aberration in terms of its stability, its growing unity, its overall prosperity, its high levels of literacy and other positives on the registers of well-being and civility. Surely we should hope that such aberrations might spread worldwide. Many of the western European nations are regarded and valued as ‘elite states’, where religious strife, a problem in the heart of Europe for centuries up to and including the Thirty Years’ War of the 17th century, is now almost entirely confined to their immigrant populations. These are now among the least religious countries in the world.

So let’s look at Prothero’s data. He states that 9% of Western European adults are ‘convinced atheists’. Why, one wonders, does he choose this category? Most atheists aren’t ‘convinced’ in the sense of proselytising for non-belief in supernatural beings. They don’t go around ‘being atheists’. As I’ve said, I consider myself first and foremost a sceptic, and it’s out of scepticism, and a need for evidence and for the best explanation of phenomena, that I consider belief in creator beings, astrology, acupuncture, fairies and homeopathy to be best explained by psychology, ignorance and credulity.

My view is that Prothero chooses the ‘convinced atheist’ category for the same reason that William Lane Craig does – to minimise the clear-cut ‘rise of the nones’, to reduce non-belief to the smallest category he can get away with.

Prothero cites a website – a 2005 Gallup poll – for his figures on ‘convinced atheists’ (9% in western Europe, 4% in eastern and central Europe, 3% in Latin America, 2% in the Middle East and 1% in North America). I cannot find the 2005 poll, but an updated 2012 Gallup poll is very revealing, as it compares some figures with those from 2005. What it reveals, sadly, is a degree of intellectual dishonesty on Prothero’s part. Prothero claims that atheism is inconsequential outside of Europe, yet the same Gallup poll from which he took his figures – this time the 2012 version – states that 47% of Chinese self-describe as convinced atheists*. Presumably this was slightly up on 2005 (the 2005 figure for China isn’t given), because almost every nation shows a rise in atheism in recent years, but that huge percentage, together with the 31% of Japanese ‘convinced atheists’, completely discredits Prothero’s ‘inconsequential outside of Europe’ claim.

It’s worth giving more comprehensive data on western Europe here, based on the 2012 poll by Gallup International. The 9% figure for ‘convinced atheists’ is now 14%, with a further 32% describing themselves as ‘not religious’, and 3% ‘no answer or not sure’. The rest, 51%, described themselves as religious. On these trends, by the next poll most western Europeans will not be describing themselves as religious. Only 14% of Chinese people currently describe themselves as such – and as we all know, China will soon take over the world.

I was surprised, too, that only 1% of North Americans were convinced atheists, according to Prothero. I can’t confirm this, but according to the 2012 poll, the figure is 6%, with a further 33% claiming to be ‘not religious’. The percentage of the self-described religious is a surprisingly low 57%. Perhaps Prothero combined the North American and African figures to arrive at the 1% mark. Who knows what paths motivated reasoning will lead a person down.

The 2012 poll, if it’s reliable, is revealing about the speed with which religion is being abandoned in some parts. In France, for example, the percentage of ‘convinced atheists’ has jumped from 14% to 29%, an extraordinary change in age-old belief systems in less than a decade.

But beyond these statistics about how people see themselves, the change is most marked, in the west, by the vastly diminished role of religion in public life. It’s precisely Prothero’s claim that religions ‘run the world’ that is most suspect. In virtually every western country, secularism, the insistence that the church and the state remain separate, has become more firmly established in the 20th century. The political influence of the Christian churches in particular has noticeably waned. Of course there are a few theocratic nations, but their numbers are decreasing, and none of them are major world powers. If you believe, as most do, that the world is run by governments and commercial enterprises, it’s hard to see where religion fits into this scheme. In some regions it may be the glue that holds societies together, but these regions appear to be diminishing. Religions these days receive more publicity for the damage they do than for any virtues they may possess. Any modern westerner might think of them as ruining the world rather than running it.

The fact is that, in every western country without exception (yes, that includes the USA), the trend away from religious belief is so rapid it’s almost impossible to keep up with. I’ve already written about the data in New Scientist suggesting that the ‘nones’ are the fourth religious category after Christians, Muslims and Hindus, numbering some 700 million. Wikipedia goes one further, putting the nones third with 1.1 billion. Of course these figures are as rubbery as can be, but it’s indisputable that this is overwhelmingly a modern phenomenon, covering the past fifty or sixty years in particular. It’s accelerating and unlikely to reverse itself in the foreseeable future.

Books like Prothero’s are symptomatic of the change. Remember The Twilight of Atheism (which I also haven’t read)? Deny what’s going on, promote the positive power and eternal destiny of religion and all will be well.

Well, it won’t. Something’s happening here but you don’t know what it is, do you, Mr Prothero?

 

*To be fair to Prothero, it looks like no 2005 figures for China were available, though the large figures for Japan certainly were. Also, though these figures for China have been uncritically reported by the media, the sample size, as mentioned on Gallup International’s website, was preposterously small – some 500 people, less than one two-millionth of the Chinese population. The survey was apparently conducted online, but no details were given about the distribution of those surveyed. Given the resolutely secular Chinese government’s tight control of its citizens and media, I would treat any statistics coming out of that country with a large dose of salt.
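For what it’s worth, the arithmetic behind ‘less than one two-millionth’ checks out – here’s a back-of-envelope sketch in Python, assuming a Chinese population of roughly 1.35 billion at the time of the poll (my assumption, not a figure from the survey):

```python
# Sampling fraction of the Gallup International China survey,
# using the figures cited above; the population is an assumption.
sample_size = 500
population = 1_350_000_000

print(f"one respondent per {population // sample_size:,} people")  # one per 2,700,000
print(sample_size / population < 1 / 2_000_000)  # True: less than one two-millionth
```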

Written by Stewart Henderson

August 25, 2014 at 10:33 pm

1914 – 2014: celebrating a loss of appetite


Statue of Europe (Unity in Peace)

 

I’ve read at least enough about WW1 to be aware that its causes, and the steps made towards war, were very complex and contestable. There are plenty of historians, professional and amateur, who’ve suggested that, if not for x, or y, war might have been avoided. However, I don’t think there’s any doubt that a ‘force’, one which barely exists today, a force felt by all sides in the potential conflict of the time, made war very difficult to avoid. I’ll call this force the appetite for war, but it needs to be understood more deeply, to divest it of its vagueness. We know that, in 1914, lads as young as 14 sneaked their way into the militaries of their respective countries to experience the irresistible thrill of warfare. A great many of them paid the ultimate price. Few of these lambs to the slaughter were discouraged from their actions – on the contrary. Yet 100 years on, this attitude seems bizarre, disgusting and obscene. And we don’t even seem to realise how extraordinarily complete this transformation has been.

Let’s attempt to go back to those days. They were the days when the size of your empire was the measure of your manliness. The Brits had a nice big fat one, and the Germans were sorely annoyed, having come late to nationhood and united military might, but with few foreign territories left to conquer and dominate. They continued to build up their arsenal while fuming with frustration. Expansionism was the goal of all the powerful nations, as it always had been, and in earlier centuries, as I’ve already outlined, it was at the heart of scores of bloody European conflicts. In fact, it’s probably fair to say that the years of uneasy peace before 1914 contributed to the inevitability of the conflict. Peace was considered an almost ‘unnatural’ state, leading to lily-livered namby-pambiness in the youth of Europe. Another character-building, manly war was long overdue.

Of course, all these expansionist wars of the past led mostly to stalemates and backwards and forwards exchanges of territory, not to mention mountains of dead bodies and lakes of blood, but they made numerous heroic reputations – Holy Roman Emperor Charles V and his son Philip II of Spain, Gustavus Adolphus of Sweden, Frederick the Great of Prussia, Peter the Great of Russia, Louis XIV of France and of course Napoleon Bonaparte. These ‘greats’ of the past have always evoked mixed reactions in me, and the feelings are well summed up by Pinker in The Better Angels of our Nature:

The historic figures who earned the honorific ‘So-and-So the Great’ were not great artists, scholars, doctors or inventors, people who enhanced human happiness or wisdom. They were dictators who conquered large swaths of territory and the people in them. If Hitler’s luck had held out a bit longer, he probably would have gone down in history as Adolf the Great.

While I’m not entirely sure about that last sentence, these reflections are themselves an indication of how far we’ve come, and how far we’ve been affected by the wholesale slaughter of two world wars and the madness of the ‘mutually assured destruction’ era that followed them. The fact that we’ve now achieved a military might far beyond the average person’s ability to comprehend, rendering obsolete the old world of battlefields and physical heroics, has definitely removed much of the thrill of combat, now more safely satisfied in computer games. But let’s return again to that other country, the past.

In the same month that the war began, August 1914, the Order of the White Feather was founded, with the support of a number of prominent women of the time, including the author and anti-suffragette Mrs Humphry Ward (whom we might now call Mary) and the suffragette leaders Emmeline and Christabel Pankhurst. It was extremely popular, so much so that it interfered with government objectives – white feathers were sent even to those convalescing from the horrors of the front lines, and to those dedicated to arms manufacturing in their home countries. Any male of a certain age who wasn’t in uniform or ‘over there’ was fair game. Not that the white feather idea was new with WWI – it had been made popular by the novel The Four Feathers (1902), set in the First War of Sudan in 1882, and the idea had been used in the British Empire since the eighteenth century – but in 1914 it reached a peak of popularity, a last explosive gasp. Or not quite: it was revived briefly during WWII, but since then, and partly as a result of the greater awareness of the carnage of WWI, the white feather has been used more as a symbol of peace and pacifism. The Quakers in particular took it to heart as a badge of honour, and it became a symbol for the British Peace Pledge Union (PPU) in the thirties, a pacifist organisation with a number of distinguished writers and intellectuals, such as Aldous Huxley, Bertrand Russell and Storm Jameson.

There was no PPU or anything like it, however, in the years before WWI. Yet the enthusiasm for war of 1914 soon met with harsh reality in the form of Ypres and the Somme. By the end of 1915 the British Army was ‘depleted’ to the tune of over half a million men, and conscription was introduced, for the first time ever in Britain, in 1916. It had been mooted for some time, for of course the war had been catastrophic for ordinary soldiers from the start, and it quickly became clear that more bodies were needed. Not surprisingly, though, resistance to the carnage had begun to grow. An organisation called the No-Conscription Fellowship (NCF), consisting mainly of socialists and Quakers, was established, and it campaigned successfully to have a ‘conscience clause’ inserted in the 1916 Military Service (conscription) Act. The clause allowed people to refuse military service if it conflicted with their beliefs, but they had to argue their case before a tribunal. Of course ‘conshies’ were treated with some disdain, and were less tolerated by the British government as the war proceeded, during which time the Military Service Act was expanded, first to include married men up to 41 years of age (the original Act had become known as the Bachelor’s Bill) and later to include men up to 51 years of age. But the British government’s attitude didn’t necessarily represent that of the British people, and the NCF and related organisations grew in numbers as the war progressed, in spite of government and jingoist media campaigns to suppress them.

In Australia, two conscription plebiscites, in 1916 and 1917, were narrowly defeated. In New Zealand, the government simply imposed the Military Service Act on its people without bothering to ask them. Those who resisted were often treated brutally, but their numbers increased as the war progressed. However, at no time, in any of the warring nations, did the anti-warriors have the numbers to be a threat to their governments’ ‘sunken assets’ policies.

So why was there such an appetite then, and why is the return of such an appetite unthinkable today? Can we just put it down to progress? Many sceptics are rightly suspicious of ‘progress’ as a term that breeds complacency and even an undeserved sense of superiority over the primitives of the past, but Pinker and others have argued cogently for a civilising process that has operated, albeit partially and at varying rates in various states, since well before WWI, indeed since the emergence of governments of all stripes. The cost, in human suffering, of WWI and WWII, and the increasingly sophisticated killing technology that has recently made warfare as unimaginable and remote as quantum mechanics, have led to a ‘long peace’ in the heart of Europe at least – a region which, as my previous posts have shown, experienced almost perpetual warfare for centuries. We shouldn’t, of course, assume that the present stability will be the future norm, but there are reasons for optimism (as far as warfare and violence are concerned – the dangers for humanity lie elsewhere).

Firstly, the human rights movement – an international movement dedicated to peace and stability between nations for the sake of their citizens – was born out of WWI in the form of the League of Nations, which, while not strong enough to resist the Nazi impetus toward war in the thirties, formed the structural foundation for the later United Nations. The UN is, IMHO, a deeply flawed organisation, based as it is on the false premise of national sovereignty and the inward thinking thus entailed, but as an interim institution for settling disputes and at least trying to keep the peace, it’s far better than nothing. For example, towards the end of the 20th century, the concepts of crimes against humanity and genocide were given more legal bite, and heads of state began, for the first time in history, to be held accountable for their actions in international criminal courts run by the UN. Obviously, considering the invasion of Iraq and other atrocities, we have a long way to go, but hopefully one day even the most powerful and, ipso facto, most bullying nations will be forced to submit to international law.

Secondly, a more universal and comprehensive education system in the west – one which, over the past century and particularly in recent decades, has emphasised critical thinking and individual autonomy – has been a major factor in the questioning of warfare and conscription, in recognising the value of children and youth, and in loosening the grip of authority figures. People are far less easily conned into going to war than ever before, and are generally more sceptical of their governments.

Thirdly, globalism and the internationalism of our economy, our science, our communications systems, and the problems we face, such as energy, food production and climate change, have meant that international co-operation is far more important to us than empire-building. Science, for those literate enough to understand it, has all but destroyed the notion of race and all the baggage attendant upon it. There are fewer barriers to empathy – to attack other nations is tantamount to attacking ourselves. The United Nations, ironic though that title often appears to be, has spawned or inspired many other organisations of international co-operation, from the ICC to the Intergovernmental Panel on Climate Change.

There are many other related developments which have moved us towards co-operation and away from belligerence, among them being the greater democratisation of nations – the enlargement of the franchise in existing democracies or proto-democracies, and the democratisation of former Warsaw Pact and ‘Soviet Socialist’ nations – and the growing similarity of national interests, leading to more information and trade exchanges.

So there’s no sign that the ‘long peace’ in Europe, so often discussed and analysed, is going to be broken in the foreseeable future. To be sure, it hasn’t been perfect, with the invasions of Hungary in 1956 and Czechoslovakia in 1968, and the not-so-minor Balkans War of the 90s, and I’m not sure if the Ukraine is a European country (and neither, it seems, are many Ukrainians), but the broad movements are definitely towards co-operation in Europe, movements that we can only hope will continue to spread worldwide.

Written by Stewart Henderson

August 22, 2014 at 9:05 am

perceptions of war and fighting and other things


and believe me, Schopenhauer never looked like that

Oscar Wilde once wrote: ‘As long as war is regarded as wicked it will always have its fascination. When it is looked upon as vulgar it will cease to be popular.’

This remark might seem trivial, perhaps because Wilde himself is sometimes seen as a mere wit, and because the word ‘vulgar’ is now no longer popular (it has a certain vulgarity about it), but with different phrasing I’ve often thought along similar lines. In exasperation I describe to myself the current horrors in Palestine and Iraq and Syria as the acts of religious primitives, and fights in bars as the acts of bogans. I’m really talking about what used to be called vulgarity. It’s partly this way of thinking that makes me annoyed about the so-called war on terrorism, as if these were warriors, with their inherent fascination, instead of vulgar criminals.

Take cigarette smoking for example. When I see smokers on the streets these days, I think of sad sacks and the left behind. My zeitgeist-tinted specs see them as wash-outs and losers, adjusting my focus to catch clearly the ever-changing face of the properly vulgar, as it was once termed.

Written by Stewart Henderson

August 17, 2014 at 4:15 am

the strange case of school chaplaincy


Ron Williams – hero of the High Court challenge

The National School Chaplaincy Program is on the face of it a curiously retrograde program that first came into being in 2006, near the end of Howard’s conservative prime ministership. It apparently began its life with a conversation in May of that year between the Victoria-based federal minister, Greg Hunt, and one Peter Rawlings, a member of Access Ministries and a volunteer in primary school religious instruction. Rawlings suggested to Hunt that it would be a great idea to install Christian ‘support workers’ in state schools throughout the Mornington Peninsula area. Hunt, whose religious beliefs are a mystery to me, apparently thought this a great idea, one that should be extended to the whole nation, with federal support. His boss, PM Howard, who claims to be a committed Christian, was also whole-hearted in his support, as were various other conservative MPs.

Given that over 23% of Australians are openly non-religious (a decidedly conservative figure), that the rise of the non-religious over the last twenty years is the most significant change in religion in this country, that every Christian denomination is in decline, some of them spectacularly, and given the fall in church attendances, and the increasing multiculturalism and multi-religiosity of that proportion of society that is still religious, I personally find it unfathomable that this proposal has received such support. I can only suppose that such organisations as the Australian Christian Lobby, Access Ministries and Scripture Union Australia have far more political power and clout than I could ever have thought possible.

In any case, Labor, under its devoutly Christian PM Kevin Rudd, chose not to throw the scheme out when it came to power in late 2007. Labor has long supported the separation of church and state, a position reinforced, one would’ve thought, by the increasing secularisation of our polity in recent years. Rudd himself was keen to reassure people that he supported church-state separation, as did his Education Minister, Peter Garrett (another Christian). So why did they persist with this program? Wouldn’t a financial boost to school counselling have been a simpler, more effective and far less controversial option? Of course the program fits the current conservative government’s agenda perfectly. It has hit schools hard with its recent cost-cutting ‘share the pain’ budget, while at the same time earmarking some $245 million for chaplaincy over the next four years or so. The program will replace the existing School Welfare Program from the start of 2015, thus undermining school counselling and psychological services. In May of this year, a provision to allow for secular ‘chaplains’ was struck out. It was a finicky provision in any case, only allowing for secular welfare workers when all attempts to find an ordained chaplain for the job had failed. However, the striking out of the provision gives a clear indication that this is a federally-backed religious (or more specifically Christian) position. It can also come at a great cost to individual workers and to recipients of their services, as this story from the Sydney Morning Herald of May 21 shows:

Last week’s budget delivered a double blow to youth welfare worker Joanne Homsi. For the past 18 months, Ms Homsi has worked in two high schools in the St George and Sutherland area, supporting students with drug and alcohol issues, low confidence, family problems and suicidal thoughts. As well as talking with students, she has connected them to mental health centres, remedial learning programs and other services. Ms Homsi loves the job, and the schools value her work. But in December she will be looking for a new job – and there will not be a safety net to catch her if she cannot find one. Because she is under 30, she would have to wait six months before she can receive any unemployment benefits under tough new rules for young job seekers.

The federal government, in any case, hasn’t generally been in the business of providing funding for these kinds of services, which are usually a state responsibility, but of course schools will look for funding anywhere they can, and to have that scarce funding tied to a Christian belief system seems wildly anachronistic. This Essential Report poll gives the clear impression that the program is unsupported by the general public, but maybe there’s a larger ‘silent’ public that the conservatives are appealing to, or maybe they simply don’t care about what the public prefers. Their argument would be that take-up of the program is entirely voluntary. In other words, cash-strapped schools are faced with this option or no option as far as federal funding is concerned.

There are plenty of parents who are willing to take a stand on this issue, however. One of them, Ron Williams, Australian Humanist of the Year in 2012, was recently successful in his second High Court challenge to the NSCP, though the government, and the Labor opposition, are perversely determined to find their way around the High Court’s ruling. In a 6-0 result worthy of the German national team, the High Court found that the funding model of the government was inappropriate. When Williams’ first High Court challenge was successful in 2012, the then Gillard Labor government rushed through the Financial Framework Legislation Amendment Act to enable it to fund a range of programs without legislation. Some $6 million was provided to the Scripture Union of Queensland. To add insult to injury for Williams, a father of four school-age children, funding was provided to his own children’s school to employ a chaplain.

The recent High Court finding says that the funding mechanism is invalid. This affects many other federal government funding mechanisms too, as it happens, and that’s a big headache for the government, but the most interesting finding related to the NSCP is that the funding, in the overwhelming view of the High Court, did not provide a benefit to students – which it must, according to section 51(xxiiiA) of the constitution, in order to be valid. In other words the High Court overwhelmingly disagreed with John Howard’s claim, made at the launch of the NSCP in October 2006, that ‘students need the guidance of chaplains, rather than counsellors’. I don’t think there’s any doubt that, had the money been earmarked for counsellors, the High Court would indeed have seen that as a benefit to students.

So the government is trying to find new funding models for a variety of programs it wants to hold onto, but it’s got a problem on its hands with chaplaincies. They have to be Christians, but they can’t proselytise; they’re there to give spiritual guidance to students, but this isn’t seen legally as providing a benefit – and how do they give that guidance anyway without proselytising? What a holy mess it is, to be sure.

 

Written by Stewart Henderson

August 2, 2014 at 5:45 pm

a brief history of pre-20th century European violence, part 2


 

not quite a game

In my first post in this series I wrote about the 17th century and its wars. In this post I’ll look at the wars of the 18th and 19th centuries.

Britain was more or less at peace in 1700, but it was soon involved in the War of the Spanish Succession (1701-14), a messy dispute, ostensibly about who should succeed the mentally and physically incapacitated Charles II on the Spanish throne. The death toll may have reached a few hundred thousand, but of course little clear data is available. The war divided Spain itself, but essentially it pitted the France of Louis XIV against its neighbours, including the British, the Dutch and the so-called Holy Roman Empire. It brought to an end the Habsburg dynasty in Spain.

Meanwhile elsewhere in Europe the Great Northern War (1700-21) was being fought. The Russian Tsar Peter the Great and his allies were fighting to curtail the power of the Swedish King Charles XII. Sweden had created an empire for itself out of the devastating Thirty Years’ War, but the Great Northern War finally ended Sweden’s dominance and established Russia as a major power. Again the casualties numbered a few hundred thousand – many dying of famine and disease. The Battle of Poltava, won by the Russians, was the most decisive single event.

Queen Anne’s War (1702-13) was arguably not a European war at all – or arguably a fully European war fought on American soil, often with most of the combatants on both sides being American Indians. Casualties, however, were relatively light. Also, during the War of the Spanish Succession there was an internal revolt of Huguenots (Protestants) in the isolated Cévennes region of France. The Huguenots had been persecuted for decades, but in the case of this uprising, known as the Camisard Rebellion, atrocities were committed on both sides. Hostilities began in 1702, and the ‘troubles’ weren’t settled until the death of Louis XIV in 1715.

Further east in Hungary a group of noblemen enlisted the aid of Louis XIV to bring about an end to Austrian Habsburg rule in the region. The consequent conflict is known as Rákóczi’s War of Independence (1703-11), a complex affair which also involved the Ottoman Turks, who had only recently given up all their Hungarian territories. Rákóczi, one of the noblemen, was unsuccessful, but he’s considered a national hero by Hungarians today.

The Russo-Ottoman War (1710-11) broke out largely as a result of Russian pressure on the Ottoman Sultan Ahmed III to hand over Sweden’s Charles XII, who had taken refuge at the Ottoman court during the Great Northern War. The conflict drew in Cossacks on both sides, and the Swedes aligned with the Turks. Some 50,000 were killed.

The Ottoman-Venetian War of 1714-18 was the seventh of that name. The Republic of Venice had, up to that time, been a powerful state for a full millennium, but it was in decline, having lost its greatest overseas possession, the island of Crete, in the late 17th century. It had, however, captured the Greek Morean Peninsula from the Ottomans, who were determined to regain it. They mustered a huge army, and were often savage in victory, but the Venetians were saved from complete humiliation by the intervention of Austria in 1716.

In 1715 the first Jacobite rising saw a number of battles fought in Britain, including Sheriffmuir and Preston. The Jacobites were supporters of the ‘Tory’ James Stuart, son of the deposed King James II, against the Whig George I. The Catholic Jacobites also featured in the War of the Quadruple Alliance (1718-20), in which they supported Spain against the quadruple alliance of Britain, France, Austria (representing the Holy Roman Empire) and the Dutch Republic. Savoy joined this alliance later. This was the only occasion in the 18th century that Britain and France were on the same side. The war was also fought in America. The allies were victorious, unsurprisingly, and Philip V of Spain soon sued for peace.

The Russo-Persian War (1722-3) was a result of the expansionist policies of Peter the Great, and of his concern about Ottoman expansion in Safavid Persia. Thousands died in these ‘manoeuvres’, which today would be a matter of diplomacy. All the territory gained by Peter was ceded back to the Persians by Empress Anna of Russia in 1732, to secure Persian support in the next great conflict with the Ottomans.

A brief Anglo-Spanish War (1727-9) saw the Spanish lay siege to Gibraltar while the British blockaded Porto Bello, both unsuccessfully. The Treaty of Seville at the end of this ‘war’ saw everything returned to the status quo, though of course thousands of lives were lost, mostly, as usual, from famine and disease.

One of the bloodiest wars in the first half of the 18th century was the War of the Polish Succession of 1733-8, not to be confused with the 16th century war of the same name! This war saw France, Spain, the Duchy of Parma and the Kingdom of Sardinia rallying in support of one aspirant to the Polish throne, while the Russian Empire, Habsburg Austria, the Kingdom of Prussia and Saxony supported another. The casualties, which may have numbered around 100,000, were mostly French and Austrian. It resulted in the Treaty of Vienna, the accession of Augustus III to the throne, and various transfers of territories in the endless carving and recarving of the meat of Europe. Meanwhile the Austro-Russian-Turkish War (1735-9), a struggle between the Russians and Habsburgs and the Ottoman Empire, ended largely in a stalemate, with several tens of thousands dead, almost entirely of famine and disease.

One could go on, and on. The War of the Austrian Succession (1740-8) involved, again, most of the European powers and resulted in hundreds of thousands of deaths. Between this and the French Revolutionary Wars at the end of the eighteenth century, which killed an estimated 1 million people (one historian, James Trager, estimated 300,000 to 600,000 deaths in the suppression of the Vendée revolt within France in 1793), there were of course numerous conflicts large and small, including the bloody Seven Years War (1756-63, though many historians argue for a longer period), which also saw at least 1 million dead.

The Napoleonic Wars of the early nineteenth century built on the warfare habits of the French revolutionaries. There are no accurate figures, of course, in any of these wars – any more than there are accurate figures on the deaths caused by the US invasion and occupation of Iraq today – but an estimate of 2 million dead as a result of Napoleon’s campaigns is regarded as moderate. This was certainly the largest death toll of any war or campaign of the nineteenth century. Of the other wars of the century, the Crimean War (1853-6) killed about 300,000, three more of the many Russo-Turkish Wars (1806-12, 1828-9 and 1877-8) each killed about 200,000, a conservative estimate, and the Franco-Prussian War killed anywhere from 200,000 to above 700,000, depending on the historian.

I’ve only included here the major conflicts, civil and international, within Europe, but of course European forces were in conflict in Africa, Asia and the Americas throughout the century. It’s also never been easy to determine the eastern boundary of Europe. The ethnic cleansing of the Circassian peoples on the north-eastern shores of the Black Sea was undoubtedly one of the most murderous campaigns of the 19th century, with wildly varying estimates claiming up to 1.5 million deaths. Whether or not this was a specifically European holocaust is, beside the scale of the killing, a trivial question.

Although there was something of a lull in warfare at the end of the nineteenth century, attitudes towards war, its manliness and its character-building nature, were still dominant, as we can see in much of the rhetoric that preceded and influenced the so-called Great War. It was the carnage of that war, it seems, that first started to change attitudes, reflected in the war poets and in the criticisms aired thereafter. The Second World War, and particularly the horrors of Nazism, had a catalysing effect, and the culturally decisive decade of the sixties (and another war in South East Asia) succeeded in irreversibly changing attitudes to war, manliness and much else besides.

So ends part 2 in this series. Next I want to go back over that 300-year period from 1600 to the beginning of the twentieth century to look at domestic and other forms of violence within European society.

 

Written by Stewart Henderson

July 19, 2014 at 7:02 pm

Posted in history, violence, war


how much damage is synthetic fertiliser doing to soil?


“A nation that destroys its soil destroys itself.” – Franklin D. Roosevelt


In a recent conversation, in which I was accused of being too black-and-white about the positives of conventional agriculture and GMOs, the damaging effects of synthetic fertiliser were mentioned as a negative, as it ‘kills the soil’s organisms, including earthworms’.

So now I’m going to focus on that issue specifically, and follow the evidence where it leads me. There’s no doubt that intensive agriculture and mono-cropping are having a negative impact on soil quality, just as there’s no doubt that intensive agriculture is currently required to feed the world’s human population. So what’s to be done? First, we could reduce or stabilise the world’s population, which we’re trying to do. Second, we can try to find biotech solutions, developing a type of intensive agriculture that’s less damaging to the soil and the environment – and organic approaches might help us here. GMOs also offer promise, with crops being developed which require less in the way of fertilisers and pesticides, and deliver higher yields.

There are other ways of looking at this and so many other problems, as I’ve recently become aware of complexity theory, which I’ll write about soon, but for now I’ll look at the claims being made and the solutions being offered.

So what exactly is synthetic or chemical fertiliser doing to our soil? Needless to say, in order to obtain accurate data in answer to this question we have to negotiate our way through sources dedicated to maximising, or minimising, the harm being done. So I’ll start with a definition. Here’s one from a website called Diffen, dedicated apparently to making unbiased comparisons between rival goods and services, in this case chemical v organic fertilisers.

A chemical fertiliser is defined as any inorganic material of wholly or partially synthetic origin that is added to the soil to sustain plant growth. Chemical fertilisers are produced synthetically from inorganic materials. Since they are prepared from inorganic materials artificially, they may have some harmful acids, which stunt the growth of microorganisms found in the soil helpful for plant growth naturally. They’re rich in the three essential nutrients needed for plant growth. Some examples of chemical fertilisers are ammonium sulphate, ammonium phosphate, ammonium nitrate, urea, ammonium chloride and the like.

Diffen goes on to describe the pros and cons, but there isn’t much detail beyond high acidity and ‘changes to soil fertility’. A 2009 article in Scientific American goes further, describing these mostly petroleum-based fertilisers as having these dire effects:

wholesale pollution of most of our streams, rivers, ponds, lakes and even coastal areas, as these synthetic chemicals run-off into the nearby waterways.

What this article doesn’t mention is that human waste (i.e. faeces), grey water and so on are also getting into our waterways and causing damage, and it’s hard to separate out these many forms of pollution. In any case, I’m confining this piece to direct damage to the soil rather than to waterways, important though that obviously is.

One of the principal causes of soil degradation is leaching, the loss of water-soluble plant nutrients through rains and storms, and irrigation. Fertiliser can contribute to this problem. When nitrate (NO3) is added to the soil to boost plant growth, excess NO3 ions aren’t able to be absorbed by the soil and are eventually leached out into groundwater and waterways. The degree of leaching depends on soil type, the nitrate content of the soil, and the degree of absorption of the nitrates by the plants or crops on that soil.  Again, though, the leaching is caused by water, and the soil degradation is largely a natural process, though over-irrigation can contribute. This is why the older soils, such as those in Australia, are the most lacking in nutrients. They’ve been subjected to eons of wind and water weathering. The richest areas have been renewed by volcanic activity.
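To make that dependence concrete, here’s a toy nitrate mass balance in Python – purely illustrative, with made-up application and uptake figures and a made-up linear ‘retention factor’, not agronomic data:

```python
# Toy model: leached N = surplus N not held by the soil.
# All numbers and the linear retention factor are illustrative assumptions.
def leached_nitrate(applied_kg_ha, crop_uptake_kg_ha, soil_retention):
    """soil_retention: fraction of surplus nitrate held by the soil
    (higher for clay-rich soils, lower for sandy ones)."""
    surplus = max(applied_kg_ha - crop_uptake_kg_ha, 0.0)
    return surplus * (1.0 - soil_retention)

# Same application (120 kg N/ha) and crop uptake (90 kg N/ha), different soils:
print(leached_nitrate(120, 90, 0.6))  # loamy soil: 12.0 kg/ha leached
print(leached_nitrate(120, 90, 0.2))  # sandy soil: 24.0 kg/ha leached
```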

Not all chemical fertiliser is the same, or of the same quality. Phosphate fertilisers commonly contain impurities such as fluorides and the heavy metals cadmium and uranium. Removing these completely is costly, so fertiliser can come in grades of purity (most backyard-gardener fertiliser, the stuff that comes in little pellets, is very pure). Many widely used phosphate fertilisers contain fluoride, and this has prompted research into the effects of a higher concentration of fluoride in soil. The effect on plants has been found to be minimal, as plants take up very little fluoride. Livestock ingesting contaminated soils as they munch on plants could be a bigger problem, as could be fluoride’s effect on soil microorganisms. Fluoride is very immobile in soil, so groundwater is unlikely to be contaminated.

Acidification from the regular use and over-use of acidulated phosphate fertilisers has been a problem in some areas, particularly in Malaysia and Indonesia, where aluminium toxicity has caused severe soil degradation. Acidity of soils is a serious problem in Australia, where in NSW more than half the agricultural land is affected. Most agricultural plants require a pH of 5.5 to 8.0 to grow best, though some plants are much more tolerant than others of lower pH levels. Surface acidity can be corrected with the application of ground limestone, but subsurface acidity is a growing problem and much more difficult to correct. Acidification is generally a slow natural process caused by wind and water weathering, but it can be greatly accelerated by the use of fertilisers containing ammonium or urea. It can also be caused by a build-up of organic matter. As an example of the complexity of all this, superphosphate doesn’t directly affect soil acidity, but it promotes the growth of clover and other legumes, and the resulting build-up of organic matter increases soil acidity.

A comment on fertilisers and worms. No, they don’t kill worms, and because they stimulate plant growth they’re likely to increase the population of worms – but there are worms and worms. Some are highly invasive, having been transported from elsewhere. Some can be damaging to plants. At the same time, new plants and new worms tend to adapt to each other over time. Again, complexity shouldn’t be underestimated.

Another concern about chemical fertiliser, again not directly connected to soil quality, is nitrous oxide emissions. About 75% of nitrous oxide emissions from human activity in the USA in 2012 came from chemical fertiliser use in agriculture, and we are steadily adding to the nitrous oxide levels in the atmosphere. Nitrous oxide is a greenhouse gas which, unit for unit, is around 300 times more damaging than carbon dioxide.
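That ‘unit for unit’ comparison is just a multiplication by the gas’s global warming potential. Here’s a minimal sketch, taking the roughly 300× figure above as the 100-year GWP (the exact value varies between IPCC reports, and the example tonnage is hypothetical):

```python
# Convert an N2O emission to its CO2-equivalent using a GWP of ~300,
# the figure cited above (IPCC 100-year estimates are of this order).
GWP_N2O = 300

def co2_equivalent(n2o_tonnes):
    """Tonnes of N2O expressed as tonnes of CO2-equivalent."""
    return n2o_tonnes * GWP_N2O

# Hypothetical example: 1,000 tonnes of N2O from fertilised fields
print(co2_equivalent(1000))  # 300000 tonnes CO2e
```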

In conclusion, it’s likely that everything you do in agriculture has a downside. There are no free lunches. The key is to obtain as much knowledge as possible, not only about your patch, but about nutrient and resource cycles generally. It’s all connected.

Oh and above all be sceptical of some of the ridiculous claims, and the ridiculous propaganda, out there. Check them out on a reputable, evidence-based site.

Written by Stewart Henderson

July 13, 2014 at 1:42 pm
