I read recently that regular moderate exercise sloshes up the blood, washing immune cells from vessel walls. This brings those cells back into the mainstream, so to speak, where they can be more effective in combating infection. It makes no small difference – a simple study in which 500 adults were tracked for 12 weeks found that those who engaged in regular aerobic exercise suffered considerably less from upper respiratory tract infections – precisely my personal area of concern. Levels of immune cells in the blood double during exercise.
There’s also good news in this for those of us who couldn’t become gym junkies no matter how hard we tried. Too much exercise (but that means quite a lot) can undo all the good by raising levels of cortisol, noradrenaline and other stress hormones, which alter immune cell functioning. Stress, though, is another one of those complex indicators of health. Mild bouts of stress can be healthful, again boosting blood levels of immune cells. So don’t relax too much, but don’t overdo it.
Even so, exercise helps with everything, and that’s something worth promoting because the recommended dose of exercise isn’t being swallowed by the majority of people in the west. Of course we’ve always kind of known about the benefits of exercise, but the hard evidence has really been coming in lately. A really interesting study was published in the Lancet in 1953, at a time when the rising incidence of heart attacks was becoming a worry. It compared bus conductors to bus drivers on London’s busy double-deckers. The conductors, who spent much of their working day running up and down steps, had half as many heart attacks as their driver colleagues. This landmark study has of course been followed by many others that have confirmed the positive effects of exercise in reducing the incidence of stroke, cancer, diabetes, liver and kidney disease, osteoporosis, dementia and depression.
So what exactly is the goldilocks zone for exercise? Well, anything is better than nothing, and most of us know we’re not doing enough. I’m not quite a senior citizen yet, but studies have been done with the elderly requiring them to do 40-minute walks three times a week, which is hardly strenuous. I catch a tram to work, which requires a ten-minute walk each way, and then a five-minute walk each way to my workplace – 30 minutes a day, five days a week, though it would doubtless be better if those 30 minutes were continuous, and if I didn’t dawdle much of the time. The benefits of such a regime have been shown through before-and-after brain imaging: expansion of the hippocampi, either through the growth of new brain cells or greater synaptic connectivity, and a restoration of long-distance connections across the brain.
Mental exercise shouldn’t be forgotten either. It has been known for a couple of decades that intellectual stimulation can provide a kind of ‘cognitive reserve’ which can buffer us against the kinds of physical brain deterioration typical of Alzheimer’s and other forms of dementia, but clearer proofs of this have been gathered recently. Magnetic resonance imaging of Alzheimer’s sufferers has caught the goings-on in the brain while cognitive tasks are being performed. Highly educated people – brain workers if you will – are better able to develop alternative neuronal networks to compensate for damaged areas. I would assume though that it’s not so much about education but about brain usage. Keep tackling new things. Keep using your brain in new ways. And your body for that matter.
Cognitive reserve is now seen as a real thing, and has been pinpointed as residing in the dorsolateral prefrontal cortex, a key area for learning, short term memory, attention and language. Increased activity in this area suggests flexibility in thinking and problem solving. Information processing efficiency is also a key to a healthy brain. Having a high IQ, something I’ve often been sceptical about in the past, is an indication of information processing efficiency, even if the information is often culturally specific. It appears that physical brain deterioration, from Alzheimer’s, stroke and other causes, can be fended off by compensating neural network development and increased information processing efficiency in certain people, until the deterioration becomes too great to be compensated for, after which things tend to go downhill very rapidly. By the time the symptoms of Alzheimer’s appear in such people, the physical damage is already well advanced.
A major message from all this is that you should try to develop lifestyle habits involving physical and mental exercise. Always a work in progress.
I note that one of the ‘in’ terms these days is ‘hat tip’ (h/t), so h/t for this piece to New Scientist: the collection, edition 3: a guide to a better you.
… It may destroy diseases of the imagination, owing to too deep a sensibility, and it may attach the affections to objects, permanent, important, and intimately related to the interests of the human species.
Humphry Davy, on the value of science, in ‘Discourse introductory to a course of lectures on chemistry’, 1802
A great many of us would like to live a long and healthy life, with a greater emphasis on health than length. But both please, if possible, thanks.
I’ve been reading the issues of New Scientist: the collection as they come out. The first issue dealt with the Big Questions, namely Reality, Existence, God, Consciousness, Life, Time, Self, Sleep and Death. Bit of a roller coaster ride, leaving me dizzy, confused, but often enlightened, and sometimes even exhilarated. So, better than a roller coaster. The second issue, entitled The Unknown Universe, took me far out beyond multiverses, quantum loops, energetic dark matter and the eventful horizons of black holes, and essentially taught me that modern cosmology is a mess of competing theories, often competing, it seemed, to be the most egregious ideas that are compatible with mathematical possibility. However, it may be that the studious avoidance of scary maths in these essays/summaries may have made them seem more loopy (or strangulatingly stringy) than they are.
The third issue was more down to earth, and not only earth but me, and you, dear reader. It’s entitled The scientific guide to a better you, and it’s all about longevity, health and success.
So what’s the secret, at least for the first two? Basically, eat healthily, with not too much meat, make sure you have good genes, don’t be too much of a loner (too late for me, I’ve been a loner for 40 years, and that’s unlikely to change, but I’ll try, as I always say), be intelligent, active and exploratory. That’s the message of the first half of this issue anyway.
What interested me, though, was the detail. Measurements. Blood sugar, cholesterol, heart rate and many other factors and parameters, most of which I didn’t know I had to be concerned about. The various essays are peppered with these measures of health or lack thereof, but how does your average Jo like me get a measure of these things without pestering doctors on a weekly basis about wellness instead of sickness?
So, for fun, I thought I’d look into these ways of measuring ourselves and see if we can manage them from home. A sort of practical guide to centenarianism and beyond.
1. Body mass index (BMI)
Your BMI is a very rough-and-ready guide to whether or not you’re a healthy weight for your height. Various websites can calculate this for you instantly if you know your height and weight. My current BMI is 26, according to the Heart Foundation, which it regards as ‘overweight’, though very close to the borderline between ‘overweight’ and ‘healthy’. About three years ago my BMI was 29, well into the overweight category, in fact getting close to obese. I decided to eat less, without fasting or ‘going on a diet’, and to try to up my exercise, and over a two-year period I brought my BMI down from 29 to 23, well into the healthy range. Since then it has crept back up to 26, and I’m struggling to get it back down again. I just need to lose a couple of kilos, and keep them off. The myriad other ways of measuring your health these days might make the old BMI seem outmoded – it doesn’t measure your fat-to-muscle ratio, for example, or the amount of fat around your heart and other organs – but I find it a useful guide for me, and the cheapest available.
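For the record, the arithmetic behind those website calculators is trivial: weight in kilograms divided by the square of height in metres. A minimal sketch (the 86 kg / 1.82 m figures are illustrative only, not mine, and the category cut-offs are the standard WHO adult ones):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres, squared."""
    return weight_kg / height_m ** 2

def bmi_category(bmi_value):
    """Standard WHO adult categories - a rough guide only."""
    if bmi_value < 18.5:
        return 'underweight'
    elif bmi_value < 25:
        return 'healthy'
    elif bmi_value < 30:
        return 'overweight'
    else:
        return 'obese'

# Illustrative values: 86 kg at 1.82 m gives a BMI of about 26,
# just over the healthy/overweight borderline.
print(round(bmi(86, 1.82), 1))
print(bmi_category(bmi(86, 1.82)))
```

As the paragraph above notes, this tells you nothing about fat-to-muscle ratio or where the fat sits, which is why the cut-offs are only a rough guide.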
2. Heart rate/blood pressure
Measured in beats per minute, your heart rate naturally varies with exertion, and also with anxiety, stress, illness, drug use and so on. The normal resting heart rate for an adult human ranges from 60 to 100 bpm. You can measure your own heart rate (your pulse) at any time by finding an artery close to the surface. The radial artery on the wrist, the one you see heading in the direction of the thumb, is commonly used due to ease of location, but don’t try it with your thumb, which has its own strong pulse. I’ve just located my own wrist pulse and measured it as 62 bpm. That’s the first time I’ve ever done it. However, I imagine it would be harder to measure after a bout of HIIT (high intensity interval training), which I sometimes indulge in, or after a moderately strenuous bike-ride. It would be even harder while you’re in the middle of exercise, so that’s where heart rate monitors, including those that can be worn on the wrist, come in handy. A quick google-glance tells me that such wrist devices are selling at $100 to $150. However, caveat emptor, as doubt is being cast on their accuracy. Electrocardiographs (ECGs, or EKGs), which measure the electrical activity of your heart, provide a much more accurate record than heart rate monitors, which are apparently only really effective when you’re at rest. One of the problems is that these optical monitors use light to track your blood, and to get an ‘accurate’ reading, you need to be very still, which sort of defeats the purpose. Reporter Sharon Profis, with the help of cardiologist Jon Saroff of Kaiser Permanente medical center in San Francisco, compared various wrist monitor brands with the gold standard EKG measurements, and found them well off-beam, especially at over 100 bpm. However, the Garmin Vivofit chest strap monitor, which measures electrical activity, was very accurate. This device can be bought for around $150 in Australia.
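The manual wrist method, by the way, is just counting beats over a short interval and scaling up to a minute – a 15- or 30-second count is the usual compromise, since shorter counts magnify any miscount. A sketch:

```python
def bpm_from_count(beats, seconds):
    """Extrapolate beats per minute from a short manual pulse count.

    Count beats at the wrist for `seconds` seconds (15 or 30 is typical)
    and scale up to a full minute. A one-beat miscount over 15 seconds
    shifts the result by 4 bpm, so longer counts are more forgiving.
    """
    return beats * 60 / seconds

# 16 beats counted over 15 seconds extrapolates to 64 bpm
print(bpm_from_count(16, 15))
```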
3. Cholesterol
Cholesterol’s an essential organic molecule, a sterol, a structural component of our cell membranes. It’s biosynthesised, mainly by our liver cells, often as a precursor to such vital entities as steroid hormones and vitamin D, and researchers have tracked the 37-step process of its synthesis. Cholesterol is transported through the blood within lipoproteins, and that’s where you get HDL (high-density lipoprotein) and LDL (low-density lipoprotein) cholesterol, of which the latter is the one that causes problems. Some 32% of Australian adults have high blood cholesterol, the primary cause of atherosclerosis, leading to clogging of major blood vessels. Ways of lowering your LDL levels include not smoking, avoiding transfats, regular moderate exercise, and healthy eating including fruit, veg, grains and pulses and sterol-enriched foods. But of course you know all that. The big question is, can you measure your cholesterol from home? The current answer appears to be no, according to the Harvard Medical School (though I note that their article is 11 years old). The problem is that home testing kits can’t separate the ‘good’ HDL cholesterol from the ‘bad’ (LDL). Measuring your overall cholesterol levels might be useful, but the real issue is the proportion that is LDL, not to mention other blood fats such as triglycerides, which also need watching.
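Incidentally, even pathology labs often don’t measure LDL directly; they commonly estimate it from the total, the HDL and the triglycerides via the Friedewald equation. A sketch, with all values in mmol/L (the example numbers below are illustrative only, not anyone’s actual results):

```python
def ldl_friedewald(total_chol, hdl, triglycerides):
    """Estimate LDL cholesterol using the Friedewald equation.

    All inputs in mmol/L: LDL = total - HDL - triglycerides / 2.2.
    The estimate is considered unreliable when triglycerides
    exceed roughly 4.5 mmol/L.
    """
    return total_chol - hdl - triglycerides / 2.2

# Illustrative figures: total 5.5, HDL 1.3, triglycerides 1.1 mmol/L
# give an estimated LDL of about 3.7 mmol/L.
print(round(ldl_friedewald(5.5, 1.3, 1.1), 1))
```

Which underlines the point above: a home kit reporting only a total can’t tell you the number that actually matters.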
4. Blood sugar/glucose
Glucose is a vital source of energy for the body’s cells, and its levels are associated with the hormone insulin, produced by the pancreas. Blood glucose levels naturally vary throughout the day, and having a level regularly above normal is termed hyperglycaemia. Hypoglycaemia is the term for low levels. Diabetes (technically Diabetes mellitus) is the disease most commonly associated with high blood sugar. General symptoms are frequent urination, hunger pangs and increased thirst. The mean normal blood sugar level is around 5.5 mM (millimolar, i.e. millimoles per litre). That’s the international standard measure – the Americans measure it in mg/dL, which causes the usual confusion. Not surprisingly, considering the global rise in diabetes, blood glucose meters for use at home are readily available, but they’re mostly specially devised for use by diabetics, supervised by healthcare professionals. You can of course buy one and DIY but you must learn to be inured to pricks, and unless you’re at risk, which I’m not, as I don’t have much of a sweet tooth, don’t have particularly high cholesterol, and have never evinced any diabetic symptoms, it’s probably not worth the investment. The essential test associated with ‘pre-diabetes’, or impaired glucose tolerance, is a glucose-tolerance test (GTT).
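For anyone caught between the two conventions: the American mg/dL figure is just the mmol/L figure scaled by glucose’s molar mass (about 180 g/mol), divided by ten to go from litres to decilitres. A sketch:

```python
GLUCOSE_MOLAR_MASS = 180.16  # grams per mole

def mmol_per_l_to_mg_per_dl(mmol_l):
    """Convert blood glucose from the international unit (mmol/L)
    to the US unit (mg/dL).

    1 mmol/L of glucose = 180.16 mg/L = 18.016 mg/dL,
    so the conversion is roughly 'multiply by 18'.
    """
    return mmol_l * GLUCOSE_MOLAR_MASS / 10

# The 5.5 mM mean quoted above comes out at roughly 99 mg/dL
print(round(mmol_per_l_to_mg_per_dl(5.5)))
```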
5. Sequence your genome
According to the Australian government’s National Health and Medical Research Council (NHMRC):
Rapid advances in DNA sequencing technologies now allow an individual’s whole genome to be sequenced. Although this is still relatively expensive, it is likely that in the near future it will become affordable and readily available.
Ah, that other country, the near future. But it is a fact that the price is coming down, from $10 million in 2005 to a mere $1 million in 2007 when James Watson’s genome was sequenced. The going rate in 2012 was under $10,000, and this year (2014) the Garvan Institute of Medical Research in Sydney became one of only three institutes in the world to deliver whole sequenced genomes at under $1000. However, there’s a problem. Your genome will mean nothing to you without expert analysis and interpretation, at a hefty price tag. So what would be the purpose, from a health perspective, of ‘doing your genome’? If you’re already quite healthy, do you want to spend up to $1000 only to find out that you carry a gene which may pre-dispose you to a disease that’s currently non-preventable? Our genome is very complex, so much so that current thinking on the subject, and especially on the introns, the sections that don’t code for proteins, has become more cloudy than ever. We know, or think we know, that the number of introns an organism has is positively correlated with that organism’s complexity, but that’s about all we know for sure, and considering the enormous complexity of the interaction between genetics and environment, together with our lack of knowledge of the role of so much of our genome (over 98% of which is non-coding DNA), the question of whether it’s worth sequencing at this time is a live one. Of course if the price comes down to $100, or the price of a latte (which will soon be up around that figure) then it’d be well worth it; you would have it there awaiting scientific breakthroughs on all that non-coding stuff.
6. The microbiome
If you’ve been paying attention to the world of human health, you’ll know that the microbiome is all the rage at the moment. The term was coined by Joshua Lederberg, who defined it thus, according to Wikipedia:
A microbiome is “the ecological community of commensal, symbiotic, and pathogenic microorganisms that literally share our body space.”
You may well have heard the impressive statistic that you have ten times more bacterial cells (and, most interestingly, archaean cells) growing on or in you than bodily (eukaryotic) cells, though this might become less impressive when you learn that the combined weight of those cells amounts to only a few hundred grams. Still, recent research on the microbiota has turned up some interesting results, especially for health. One finding, which may make it difficult to assess your own microbiome, is that different sets of microbes appear to perform the same function for different people. So you won’t just need to know the genetic content of your microbiome, but its function. Still, we can learn a lot already from our microbiome, according to Catalyst, the ABC science program. For example, we inherit a lot of bacteria from our mothers, via her breast milk, not only directly but because the sugars in breast milk encourage the growth of particular types of bacteria. Most of these gut bacteria do their work in the large intestine or bowel region. They’re anaerobic beasties, so they die when exposed to air. However, recent technological developments (and how often can that story be told) have allowed us to learn far more about them, by sequencing their genes inside the gut. From this we’ve learned that our gut bacteria are vital components of our immune system. And since these bacteria rely on our own diets for their nourishment, the kind of microbiome we have is profoundly related to what we eat. A diverse microbiome results, apparently, from a high-fibre diet, while low-fibre processed food and the ingestion of antibiotics are reducing that diversity, and contributing to multiple health problems. It appears that a less diverse microbiome finds itself under stress, leading to inflammation, an immune response that can damage our own tissue.
As a sufferer from bronchiectasis, a chronic (and incurable) inflammation of the airways due probably to early childhood damage, I’m particularly concerned to limit the extent of inflammation through diet and exercise, so this is probably the aspect of my health I’m most concerned to monitor. And there’s also the relationship between gut bacteria and obesity. Some 62% of Australians are overweight or obese, and I’m one of that majority, and trying not to be.
It has been shown clearly, in mice at least, that a high-fibre diet reduces bronchoconstriction, improving resistance to asthma and other airways conditions such as COPD. This is mainly due to the production of short-chain fatty acids by particular bacteria. The short-chain fatty acids are produced through the digestion of dietary fibre. Interestingly, acetate, found in vinegar, is a short-chain fatty acid, and a natural anti-inflammatory, so that’s something I should include regularly in my diet.
Finding out what your particular microbiome is, and how it might align with your health, is a simple if rather unpalatable and ‘intimate’ process. You can apply for a kit from the American Gut Project, an organisation dedicated to researching microbiota. The kit is for obtaining a sample of your ‘biomass’ as they call it, which you then send back to the AGP for analysis. All of this was spelt out in the above-linked Catalyst program, but since that program was aired two months ago, the AGP has been inundated with more biomass than it can deal with, so there’s been a backlog of logs, as it were. I plan to send for a kit anyway. The AGP sends back the results, apparently, along with, hopefully, an analysis of the microbiome easy enough for a layperson to understand.
So there are six areas to look at, either independently or with the help of your GP or other professionals, to measure how you’re going in terms of overall health, and there are many more aspects of your bodily chemistry and physiology to check up on – hormones, neurotransmitters, bone density, sight, hearing, lung capacity and so forth. Or you can follow the standard advice on diet and exercise, try to avoid stress and hope for the best. And above all don’t stop laughing and dancing, otherwise life would hardly be worth living.
Sadly I don’t have so much time for writing these days, especially anything too strenuous or research-based, so I think I’ll do a series on organisms that have interested me over the years – or that I’ve just recently been fascinated by, for that matter.
Over at Not Exactly Rocket Science, there’s an article to whet the appetite as well as to apply a corrective to our thinking about everyone’s favourite wild cat, the cheetah (the name derives from Sanskrit, and cheetahs are found in Iran as well as Africa, and were probably more widespread in Asia in earlier times). Cheetahs are a vulnerable species, with about 10,000 of them currently existing in the wild. They’re described as a ‘charismatic species’, meaning that they’re utilised a lot as ‘ambassadors’ to draw attention to environmental and habitat issues for wildlife in general – along with elephants, humpback whales, giant pandas, California condors, grey wolves and such.
Cheetahs are, of course, built for speed in every way, though agility, with an incredible acceleration and deceleration rate, is also a key to their success. They can accelerate from zero to 40 mph in just three strides – faster than the most sophisticated racing cars. Claims that their lightning runs leave them half-dead with heat exhaustion much of the time are, however, wildly exaggerated, as are the claims that they lose as much as half of their kills to lions and hyenas. In fact, cheetahs use up far more of their energy seeking out or tracking down potential kills than they do actually chasing them. A cheetah sprint takes up only 45 seconds a day on average – that’s less time than I spend on my high intensity interval training.
The key to maintaining cheetahs in the wild, then, is not to add to their greatest and most energy-sapping problem: finding food. Adding obstacles to their habitat, such as fences and enclosures, and depleting that habitat of their favourite food - gazelle, deer and impala, and the odd young zebra or springbok – would make life that bit more painful for them.
Speed, of course, is the cheetah’s big specialisation, what it’s adapted for. In fact over-specialisation is arguably its main problem, as it doesn’t have the bulk or strength to fight off other predatory mammals, all of which annoyingly compete for the same food. It’s light, with a weight that averages around 50 kgs, and its aerodynamically evolved head and body trade speed for strength, meaning that its jaws and teeth don’t have the size or force of other wild cats. It has a flattened ribcage but larger than usual heart and lungs for large intakes of air and fast pumping of blood. It also has a longer and larger tail than most cats, which it uses as a rudder for balance as it sets off on one of its twisting and turning runs. Its claws are only semi-retractable, unlike those of most cats (its genus name, Acinonyx, is Greek for ‘no-move claw’). This gives it extra grip while running. Males and females are the same size and hard to tell apart from distance.
Cheetahs don’t roar but they make up for it with a range of other noises, including purring like a – well, a cat, when experiencing domestic plenitude. They also hiss, spit, growl and even yowl when faced with danger. Cubs make a bird-like chirping sound, and the mother makes a similar sound when trying to locate her young. A sound called churring – no idea what that sounds like – is used on social and sexual occasions between adults. Male cheetahs form lifelong partnerships, often but not always with brothers, while females are solitary, bringing the kids up by themselves. They tend to mate with a variety of males – which hardly makes it mating, really. Interestingly, though the females are regular hunters, they’re not territorial, unlike the males, who practice group territoriality, each member of the gang contributing his scent.
Female cheetahs put their kids – or those that survive, as there’s a heavy infant mortality rate – through a tough survival training schedule before abandoning them at around 18 months. At around 2 years of age the females go their lonesome ways and the males hang together, sometimes combining with other blokes. It seems to work for them. In fact I think I read somewhere that males live longer on average than females, which wouldn’t surprise me. Fending for yourself all the time’s a deadly business, even when it’s all laid on in the big smoke, never mind having to chase your meals every day into old age. So spare a thought for the cheetahs, especially the girls, under-appreciated as always.
I’ve written a fair bit about the rise of the ‘no religion’ sector of society, in Australia and elsewhere, which has obvious implications for the role of Christianity in politics in the western world. In Australia some generations ago, Melbourne’s Catholic Archbishop Daniel Mannix, and later his protege B A Santamaria, were hugely influential political figures. The formation of the Catholic DLP (Democratic Labour Party) by Santamaria, with the support of Mannix, effectively split the left, handing the conservatives political power for decades before Whitlam’s 1972 election victory. Since then, however, there hasn’t been much overt influence on politics from religion, though of course we’ve had religious PMs, including the current mad monk. Nor have we had any major political parties, that I know of, in which Christianity, or any denomination thereof, is part of its name.
Not so in other western countries. So-called Christian Democracy parties are quite common in Western Europe, usually on the centre-right. Belgium has the Christian Democratic and Flemish Party, formerly the Christian People’s Party; Switzerland has the Christian Democratic People’s Party as well as the Evangelical People’s Party; the Netherlands has the Christian Democratic Appeal Party, and Italy has the Union of Christian and Centre Democrats (though better known by its more secular title, the Union of the Centre, UDC).
Probably the most successful and powerful Christian political party in Europe, though, is Germany’s Christian Democratic Union, whose leader, Angela Merkel, has been Germany’s Chancellor for the past nine years. The party has been in power more often than not, though often in coalition, since 1945. In recent times, the CDU has formed a more or less permanent partnership with the Bavarian CSU (Christian Social Union), which is generally more Catholic and conservative.
According to Germany’s 2011 census, the percentage of Christians is almost identical to Australia’s, at a little over 60%, pretty well evenly divided between Catholics and (essentially Lutheran) Protestants. However, as with Australia, the numbers are falling rapidly, and churches are closing and being converted to other uses throughout the country. The ‘no religion’ category has recently grown larger than either the Papists or the Heretics. Interestingly, the eastern part of the country, which was under communist rule for 40 years, is much more atheist than the rest. So for how much longer will Germany’s CDU retain its Christian moniker?
According to its party platform, the CDU derives its policies from both ‘political Catholicism’ and ‘political Protestantism’, whatever that means. The vapidity of such claims, together with the obviously rising secularism of the populace, might explain why Angela Merkel played down any Christian elements in her and her party’s thinking during the 2005 elections. Merkel herself is the daughter of a Lutheran minister but was brought up in the atheist East and is a physicist by training. Recently, though (just prior to last year’s elections) she ‘came out’ for the first time as a Christian, possibly for complex political reasons (the rise of Islam is a much more significant factor in German domestic politics than in Australian). She even claimed, quite nonsensically, that Christianity was ‘the world’s most persecuted religion’. (Actually this is a common view, according to Pew Research, in the USA. It seems many Christians believe that the waning of Christianity’s popularity is a form of persecution). Merkel was elected for another 4-year term in 2013, and her more emphatic public identification with Christianity in recent times means that her party will be stuck with its name as long as she’s at the helm. My guess is she’ll be ripe for retirement in 2017.
Of course, as with most western states, religion in Germany has in recent decades, if not centuries, become a more ‘internal’ matter, and less political, with much ‘encouragement’ from the state. For more detail on that, check out the Peace of Augsburg of 1555 and its newly-defined principle, Cuius regio, eius religio, and also the concept of forum internum. This is definitely a good thing, given the Thirty Years War and all, but it seems that, as a quid pro quo for religious non-interference in politics, Germany’s Grundgesetz (its Basic Law, or Constitution) has been very generous in its delineation of religious freedom, and this may cause problems if Germany continues to play host to more challenging, and less ‘internalised’, religious beliefs. The Grundgesetz came into being in 1949, but many of its statutes pertaining to religion date back to the 1919 Weimar constitution. Unsurprisingly, no religions other than an increasingly emasculated (if that’s not too sexist a term) Christianity would have been considered relevant in those days.
Much of what follows, and some of the preceding, is taken from the article ‘Religion and the secular state in Germany’, by Stefan Korioth and Ino Augsberg. The constitution guarantees freedom of individual religion and philosophical creed (Weltanschauung) – thus also guaranteeing freedom not to have a religion. In article 3 of the constitution it’s stated that ‘no person shall be favored or disfavored because of his or her personal religious opinions’, and in article 33, ‘neither the enjoyment of civil and political rights, nor eligibility for public offices, nor rights acquired in the public service shall be dependent upon religious affiliations’. Other articles guarantee that there shall be no state church, and create a separation of church and state. In fact the German constitution is unusually detailed in its coverage of the status of religious entities vis-a-vis the state. It is above all concerned to emphasise the principle of state neutrality, but this has caused some difficulties in that the state has withdrawn even so far as to be reluctant to define religion for legal purposes. There is, as Korioth and Augsberg point out, no numerus clausus, or fixed number, of religious confessions, and it has been left to religious communities themselves to define their religiosity. Not surprisingly this has led to ongoing issues with regard to the legal status of religious groups. With the inevitable continuing decline in Christianity, and the rise of more challenging religions, and the disaffected youth who choose to identify with a more intolerant version of those religions, this will be a problem in the future. Hopefully, however painful, it will remain a fringe problem for the ongoing secularisation of Germany.
Just to round things off, Merkel’s newly-found public Christianity is a reminder that often changes have to wait until people die off, if that doesn’t sound too morbid or callous. Of course they don’t have to die physically, they may just have to die in terms of power or influence. Merkel’s position reminds me of others, such as Antonin Scalia of the US Supreme Court, and the late Fred Phelps of the Westboro Baptist Church (not that I place these people on the same moral or intellectual plane). The movement towards secularism isn’t so much about changing people’s minds, though that’s always a worthy pursuit. It’s about a changing zeitgeist that feeds those who are brought up within it. Older people die, younger people come to prominence, bringing the newly transformed zeitgeist to the fore. That’s how the flat-earthers, who once filled provincial town halls with their lectures, finally faded from view; they weren’t out-argued or persuaded from their views, they simply died, and their descendants imbibed the new zeitgeist. Not an excuse for complacency, but a reason for hope, and a reason for contributing to that zeitgeist in a positive way.
I have to say, from a very young age, I considered myself a feminist. And then I read (sometime in the seventies, long before Emma Watson was born, bless her cotton socks) that you couldn’t be a feminist as a male, because it was some kind of uniquely female thing, whereas I, like Emma, thought it was a simple matter of believing that females were equal to men in every respect, and that it didn’t matter who did the believing – male, female, androgyne or alien.
Emma Watson’s Hermione is an iconic figure. Perhaps I should say J K Rowling’s Hermione, though millions identify Emma as Hermione. Yet, interestingly, Emma described herself in her speech, self-deprecatingly, as ‘that Harry Potter girl’, inadvertently reminding us of her role as support to the main protagonist.
I don’t in any way want to disparage the Harry Potter novels, which I’m sure would have been just as successful with Harriet Potter as the heroine – at least I hope so. I personally have observed how much Hermione has inspired young girls, as an intelligent, level-headed problem-solver. So it was with great delight that I, along with many others, have been able to see that Emma was not just playing a part as Hermione; that she genuinely wants to use her prominence to push for the recognition of women globally.
I would go further – and I suspect she would agree, though she didn’t go that far in her speech – and say that the world would be better for having more women in prominent positions – that it would be safer, more collaborative, and more congenial. But maybe I’m being a little idealistic…
In any case, the ‘he for she’ initiative is one that I endorse whole-heartedly, because it allows men to have their say without experiencing any of the weird responses from both sides. It’s simply about equality, and respect.
‘Atheism is not a great religion. It has always been for elites rather than for ordinary folk. And until the 20th century, its influence on world history was as inconsequential as Woody Allen’s god. Even today the impact of atheism outside of Europe is inconsequential. A recent Gallup poll found that 9% of adults in Western Europe (where the currency does not trust in God) described themselves as ‘convinced atheists’. That figure fell to 4% in eastern and central Europe, 3% in Latin America, 2% in the Middle East, and 1% in North America and Africa. Most Americans say they would not vote for an atheist for president.’
Stephen Prothero, from God is not one: the eight rival religions that run the world & why their differences matter (2010).
I should admit at the outset that I’ve not read Prothero’s book, and probably never will, as time is precious and there are too many other titles and areas of knowledge and endeavour that appeal to me. However, since, as a humanist and skeptic I have a passing interest in the religious mindset and in promoting critical thinking and humanism, I think the above quote is worth dwelling on critically.
First, the claim that ‘atheism is not a great religion’. It’s an interesting remark because it can be interpreted in two ways. First, that atheism is not a religion of any kind, great or small; second that atheism is a religion, but not a great one. I strongly suspect that Prothero has the second view in mind, while also playing on the first one. Of course atheism isn’t a religion and it’s tedious to have to play this game with theists (assuming Prothero is one) for the zillionth time, but my own experience on being confronted with the idea of a supernatural entity for the first time at around eight or nine was one of scepticism, though I didn’t then have a name for it. I don’t think scepticism could ever be called a religion. And nothing that I’ve ever experienced since has tempted me to believe in the existence of supernatural entities.
Next comes the claim that atheism has always been for elites rather than ordinary folk. This is probably true, but we need to reflect on the term ‘elite’. I assume Prothero can only mean intellectual elites. The Oxford dictionary succinctly defines an elite as ‘a select group that is superior in terms of ability or qualities to the rest of a group or society’. Generally, therefore, the best of society, or the leaders. It’s broadly true, especially in the West, that you won’t get to the top in business without a good business brain, you won’t get to the top in politics without a good political brain and you won’t get to the top in science without a good scientific brain, and these are all positive qualities. The elites are the best, and the best tend to be society’s movers and shakers.
Yet Prothero doesn’t appear to agree, quite. His juxtaposing of the two sentences intimates that atheism is not a great religion because it has always been for elites. What are we to make of this? My guess is that he’s trying to downplay atheism but has made a bit of a mess of it. And there’s more of this. Before the 20th century, we’re informed, atheism was as influential ‘as Woody Allen’s god’, by which, I presume, he’s referring to Allen’s farce of 1975, God, with which I’m not particularly familiar. I do know, though, that it’s fashionable these days to trash Woody Allen, so the message appears to be that, before 1900 or so, atheism was very inconsequential indeed.
A reasonable person might wonder here why Prothero seems so keen to diminish atheism. A big clue is surely to be found in the subtitle to Prothero’s book. Which raises some questions: What are these eight religions? Are they really rivals? Do they run the world?
The contents page answers the first question: Islam, Christianity, Confucianism, Hinduism, Buddhism, Yoruba religion, Judaism and Daoism make up the Premier League. Presumably Jainism, Sikhism and Zoroastrianism are struggling in the lower divisions. There is some debate amongst authorities as to whether Confucianism and Daoism are recognised religions, and they’re often found blended, along with Buddhism, in Chinese folk tradition – so, maybe not so much rivals.
Surely the most important question, though, is whether these religions ‘run the world’. I have the strong suspicion that Prothero hasn’t given deep consideration to his terms here, but I’ll try to do it for him. What does ‘running the world’ entail? I’ve heard people say that multinational corporations run the world, or that various superpowers do so, or have done so, but the idea that the major religions run the world between them is a novel one to me. Of course, if I want to find out whether Prothero provides evidence for his claim, or sets out to prove it, I’d have to read his book, and I’m reluctant to do so. It’s surely far more likely he’s tossed in the subtitle as something provocative, a piece of unsubstantiated rhetoric.
A lot of ingredients make the human world run, including trade, transport, law, festivals, education, sex, empathy and new ideas. Customs, habits and religious rituals play their part for many of us too. However, there’s no doubt that, for most westerners, global networking, the take-up of higher education, multiculturalism and travel have transformed earlier customs and habits, with religion taking a major hit in the process. The places where religion is holding its own are those where such modern trends are less evident.
Prothero also seems to be downplaying the 20th century when he writes that the influence of atheism was negligible before that time, as if to say ‘setting aside the 20th century, religion has been the most powerful force in humanity.’ Maybe so, but you can’t set aside the 20th century, a century which saw the human population rise from less than two billion to around 7 billion, a century of unprecedented and mind-boggling advances in science and technology, and in the education required to keep abreast of them, and which has seen a massive rise in travel and global communication. Continuing into the 21st century, these developments have been transformative for those exposed to them. It is unlikely to be coincidental that the same period has seen ‘the rise of the nones’ as by far the most significant development in religion for centuries – or more likely, since the first shrine was constructed. Of course, correlation isn’t causation, and I’m not going to delve deeply into causative factors here, but the phenomenon is real, though Prothero engages in what seems to me a desperate attempt to minimise it with his data. I’ll examine his statistics more closely later.
Prothero also presents the ‘inconsequential outside of Europe’ argument, which, apart from dismissing Australians like me – more than 23% of whom professed to having no religion in the last census (2011), with some 9% also choosing not to answer the optional question on religion – seems to dismiss Europe as an aberration in much the same way as he dismisses the 20th century. Yet in the seventy years since the end of WW2, western Europe has only been an aberration in terms of its stability, its growing unity, its overall prosperity, its high levels of literacy and other positives on the registers of well-being and civility. Surely we should hope that such aberrations might spread worldwide. Many of the western European nations are regarded and valued as ‘elite states’, where religious strife, a problem in the heart of Europe for centuries up to and including the Thirty Years’ War of the 17th century, is now almost entirely confined to its immigrant populations. These are now among the least religious countries in the world.
So let’s look at Prothero’s data. He states that 9% of Western European adults are ‘committed atheists’. Why, one wonders, does he choose this category? Most atheists aren’t ‘committed’ if by this is meant proselytising for non-belief in supernatural beings. They don’t go around ‘being atheists’. As I’ve said, I consider myself first and foremost as a sceptic, and it’s out of scepticism and a need for evidence and for the best explanation of phenomena that I consider belief in creator beings, astrology, acupuncture, fairies and homeopathy as best explained by psychology, ignorance and credulity.
My view is that Prothero chooses the ‘committed atheist’ category for the same reason that William Lane Craig does – to minimise the clear-cut ‘rise of the nones’, to reduce non-belief to the smallest category he can get away with.
Prothero cites a website – a 2005 Gallup poll – for his figures on ‘committed atheists’ (9% in western Europe, 4% in eastern and central Europe, 3% in Latin America, 2% in the Middle East and 1% in North America). I cannot find the 2005 poll, but an updated 2012 Gallup poll is very revealing, as it compares some figures with those from 2005. What it reveals, sadly, is a degree of intellectual dishonesty on Prothero’s part. Prothero claims that atheism is inconsequential outside of Europe, yet the same Gallup poll from which he took his figures – but this time the 2012 version – states that 47% of Chinese self-describe as committed atheists*. Presumably this was slightly up on 2005 (the 2005 figure for China isn’t given), because almost every nation shows a rise in atheism in recent years, but this huge percentage, together with the 31% of Japanese ‘committed atheists’, completely discredits Prothero’s ‘inconsequential outside of Europe’ claim.
It’s worth giving more comprehensive data on western Europe here, based on the 2012 poll by Gallup International. The 9% figure for ‘committed atheists’ is now 14%, with a further 32% describing themselves as ‘not religious’, and 3% ‘no answer or not sure’. The rest, 51%, described themselves as religious. It’s clear that, by the next poll, most western Europeans will not describe themselves as religious. Only 14% of Chinese people currently describe themselves as such – and as we all know, China will soon take over the world.
I was surprised, too, that only 1% of North Americans were committed atheists, according to Prothero. I can’t confirm this, but according to the 2012 poll, the figure is 6%, with a further 33% claiming to be ‘not religious’. The percentage of the self-described religious is a surprisingly low 57%. Perhaps Prothero combined the North American and African figures to arrive at the 1% mark. Who knows what paths motivated reasoning will lead a person down.
The 2012 poll, if it’s reliable, is revealing about the speed with which religion is being abandoned in some parts. In France, for example, the percentage of ‘committed atheists’ has jumped from 14% to 29%, an extraordinary change in age-old belief systems in less than a decade.
But beyond these statistics about how people see themselves, the change is most marked, in the west, by the vastly diminished role of religion in public life. It’s precisely Prothero’s claim that religions ‘run the world’ that is most suspect. In virtually every western country, secularism, the insistence that the church and the state remain separate, has become more firmly established in the 20th century. The political influence of the Christian churches in particular has noticeably waned. Of course there are a few theocratic nations, but their numbers are decreasing, and none of them are major world powers. If you believe, as most do, that the world is run by governments and commercial enterprises, it’s hard to see where religion fits into this scheme. In some regions it may be the glue that holds societies together, but these regions appear to be diminishing. Religions these days receive more publicity for the damage they do than for any virtues they may possess. Any modern westerner might think of them as ruining the world rather than running it.
The fact is that, in every western country without exception (yes, that includes the USA), the trend away from religious belief is so rapid it’s almost impossible to keep up with. I’ve already written about the data in New Scientist suggesting that the ‘nones’ are the fourth religious category after Christians, Moslems and Hindus, numbering some 700 million. Wikipedia goes one further, putting the nones third with 1.1 billion. Of course these figures are as rubbery as can be, but it’s indisputable that this is overwhelmingly a modern phenomenon, covering the past fifty or sixty years in particular. It’s accelerating and unlikely to reverse itself in the foreseeable future.
Books like Prothero’s are symptomatic of the change. Remember The Twilight of Atheism (which I also haven’t read)? Deny what’s going on, promote the positive power and eternal destiny of religion and all will be well.
Well, it won’t. Something’s happening here but you don’t know what it is, do you, Mr Prothero?
*To be fair to Prothero, it looks like no 2005 figures for China were available, though the large figures for Japan certainly were. Also, though these figures for China have been uncritically reported by the media, the sample size, as mentioned on Gallup International’s website, was preposterously small – some 500 people, less than one two-millionth of the Chinese population. The survey was apparently conducted online, but no details were given about the distribution of those surveyed. Given the resolutely secular Chinese government’s tight control of its citizens and media, I would treat any statistics coming out of that country with a hefty grain of salt.
I’ve read at least enough about WW1 to be aware that its causes, and the steps taken towards war, were very complex and contestable. There are plenty of historians, professional and amateur, who’ve suggested that, if not for x, or y, war may have been avoided. However, I don’t think there’s any doubt that a ‘force’, one which barely exists today, a force felt by all sides in the potential conflict of the time, made war very difficult to avoid. I’ll call this force the appetite for war, but it needs to be understood more deeply, to divest it of its vagueness. We know that, in 1914, lads as young as 14 sneaked their way into the militaries of their respective countries to experience the irresistible thrill of warfare. A great many of them paid the ultimate price. Few of these lambs to the slaughter were discouraged from their actions – on the contrary. Yet 100 years on, this attitude seems bizarre, disgusting and obscene. And we don’t even seem to realise how extraordinarily complete this transformation has been.
Let’s attempt to go back to those days. They were the days when the size of your empire was the measure of your manliness. The Brits had a nice big fat one, and the Germans were sorely annoyed, having come late to nationhood and united military might, but with few foreign territories left to conquer and dominate. They continued to build up their arsenal while fuming with frustration. Expansionism was the goal of all the powerful nations, as it always had been, and in earlier centuries, as I’ve already outlined, it was at the heart of scores of bloody European conflicts. In fact, it’s probably fair to say that the years of uneasy peace before 1914 contributed to the inevitability of the conflict. Peace was considered an almost ‘unnatural’ state, leading to lily-livered namby-pambiness in the youth of Europe. Another character-building, manly war was long overdue.
Of course, all these expansionist wars of the past led mostly to stalemates and backwards and forwards exchanges of territory, not to mention mountains of dead bodies and lakes of blood, but they made numerous heroic reputations - Holy Roman Emperor Charles V and his son Philip II of Spain, Gustavus Adolphus of Sweden, Frederick the Great of Prussia, Peter the Great of Russia, Louis XIV of France and of course Napoleon Bonaparte. These ‘greats’ of the past have always evoked mixed reactions in me, and the feelings are well summed up by Pinker in The Better Angels of our Nature:
The historic figures who earned the honorific ‘So-and-So the Great’ were not great artists, scholars, doctors or inventors, people who enhanced human happiness or wisdom. They were dictators who conquered large swaths of territory and the people in them. If Hitler’s luck had held out a bit longer, he probably would have gone down in history as Adolf the Great.
While I’m not entirely sure about that last sentence, these reflections are themselves an indication of how far we’ve come, and how far we’ve been affected by the wholesale slaughter of two world wars and the madness of the ‘mutually assured destruction’ era that followed them. The fact that we’ve now achieved a military might far beyond the average person’s ability to comprehend, rendering obsolete the old world of battlefields and physical heroics, has definitely removed much of the thrill of combat, now more safely satisfied in computer games. But let’s return again to that other country, the past.
In the same month that the war began, August 1914, the Order of the White Feather was founded, with the support of a number of prominent women of the time, including the author and anti-suffragette Mrs Humphry Ward (whom we might now call Mary) and the suffragette leaders Emmeline and Christabel Pankhurst. It was extremely popular, so much so that it interfered with government objectives – white feathers were sent even to those convalescing from the horrors of the front lines, and to those dedicated to arms manufacturing in their home countries. Any male of a certain age who wasn’t in uniform or ‘over there’ was fair game. Not that the white feather idea was new with WWI – it had been made popular by the novel The Four Feathers (1902), set in the First War of Sudan in 1882, and the idea had been used in the British Empire since the eighteenth century – but it now reached a crescendo of popularity, a last explosive gasp. Or not quite: it was revived briefly during WWII, but since then, partly as a result of the greater awareness of the carnage of WWI, the white feather has been used more as a symbol of peace and pacifism. The Quakers in particular took it to heart as a badge of honour, and it became a symbol for the British Peace Pledge Union (PPU) in the thirties, a pacifist organisation with a number of distinguished writers and intellectuals, such as Aldous Huxley, Bertrand Russell and Storm Jameson.
There was no PPU or anything like it, however, in the years before WWI. Yet the enthusiasm for war of 1914 soon met with harsh reality in the form of Ypres and the Somme. By the end of 1915 the British Army was ‘depleted’ to the tune of over half a million men, and conscription was introduced, for the first time ever in Britain, in 1916. It had been mooted for some time, for of course the war had been catastrophic for ordinary soldiers from the start, and it quickly became clear that more bodies were needed. Not surprisingly, though, resistance to the carnage had begun to grow. An organisation called the No-Conscription Fellowship (NCF), consisting mainly of socialists and Quakers, was established, and it campaigned successfully to have a ‘conscience clause’ inserted in the 1916 Military Service (conscription) Act. The clause allowed people to refuse military service if it conflicted with their beliefs, but they had to argue their case before a tribunal. Of course ‘conshies’ were treated with some disdain, and were less tolerated by the British government as the war proceeded, during which time the Military Service Act was expanded, first to include married men up to 41 years of age (the original Act had become known as the Bachelor’s Bill) and later to include men up to 51 years of age. But the British government’s attitude didn’t necessarily represent that of the British people, and the NCF and related organisations grew in numbers as the war progressed, in spite of government and jingoist media campaigns to suppress them.
In Australia, two conscription plebiscites, in 1916 and 1917, were narrowly defeated. In New Zealand, the government simply imposed the Military Service Act on its people without bothering to ask them. Those who resisted were often treated brutally, but their numbers increased as the war progressed. However, at no time, in any of the warring nations, did the anti-warriors have the numbers to be a threat to their governments’ ‘sunken assets’ policies.
So why was there such an appetite then and why is the return of such an appetite unthinkable today? Can we just put it down to progress? Many skeptics are rightly suspicious of ‘progress’ as a term that breeds complacency and even an undeserved sense of superiority over the primitives of the past, but Pinker and others have argued cogently for a civilising process that has operated, albeit partially and at varying rates in various states, since well before WWI, indeed since the emergence of governments of all stripes. The cost, in human suffering, of WWI and WWII, and the increasingly sophisticated killing technology that has recently made warfare as unimaginable and remote as quantum mechanics, have led to a ‘long peace’ in the heart of Europe at least – a region which, as my previous posts have shown, experienced almost perpetual warfare for centuries. We shouldn’t, of course, assume that the present stability will be the future norm, but there are reasons for optimism (as far as warfare and violence are concerned – the dangers for humanity lie elsewhere).
Firstly, the human rights movement – an international movement dedicated to peace and stability between nations for the sake of their citizens – was born out of WWI in the form of the League of Nations, which, while not strong enough to resist the Nazi impetus toward war in the thirties, formed the structural foundation for the later United Nations. The UN is, IMHO, a deeply flawed organisation, based as it is on the false premise of national sovereignty and the inward thinking thus entailed, but as an interim institution for settling disputes and at least trying to keep the peace, it’s far better than nothing. For example, towards the end of the 20th century, the concepts of crimes against humanity and genocide were given more legal bite, and heads of state began, for the first time in history, to be held accountable for their actions in international criminal courts run by the UN. Obviously, considering the invasion of Iraq and other atrocities, we have a long way to go, but hopefully one day even the most powerful and, ipso facto, most bullying nations will be forced to submit to international law.
Secondly, a more universal and comprehensive education system in the west, which, over the past century and particularly in recent decades, has emphasised critical thinking and individual autonomy, has been a major factor in the questioning of warfare and conscription, in recognising the value of children and youth, and in loosening the grip of authority figures. People are far less easily conned into going to war than ever before, and are generally more sceptical of their governments.
Thirdly, globalism and the internationalism of our economy, our science, our communications systems, and the problems we face, such as energy, food production and climate change, have meant that international co-operation is far more important to us than empire-building. Science, for those literate enough to understand it, has all but destroyed the notion of race and all the baggage attendant upon it. There are fewer barriers to empathy – to attack other nations is tantamount to attacking ourselves. The United Nations, ironic though that title often appears to be, has spawned or inspired many other organisations of international co-operation, from the ICC to the Intergovernmental Panel on Climate Change.
There are many other related developments which have moved us towards co-operation and away from belligerence, among them being the greater democratisation of nations – the enlargement of the franchise in existing democracies or proto-democracies, and the democratisation of former Warsaw Pact and ‘Soviet Socialist’ nations – and the growing similarity of national interests, leading to more information and trade exchanges.
So there’s no sense that the ‘long peace’ in Europe, so often discussed and analysed, is going to be broken in the foreseeable future. To be sure, it hasn’t been perfect, with the invasions of Hungary in 1956 and Czechoslovakia in 1968, and the not-so-minor Balkans War of the 90s, and I’m not sure if the Ukraine is a European country (and neither are many Ukrainians it seems), but the broad movements are definitely towards co-operation in Europe, movements that we can only hope will continue to spread worldwide.