Sunday, 4 September 2011

Dunbar's Number

You are probably vaguely aware that Dunbar’s Number is 150. It’s the maximum number of people that we can socially interact with, not the number of our close friends — that’s much smaller.
It’s the maximum number of people where peer pressure will keep the group intact. The maximum size of a group that we can describe as ‘us’ rather than ‘them’.
If an organisation has more than 150 members, it needs a managerial structure.
The basic fighting unit of the Roman army — the maniple — had around 130 to 150 soldiers.
A British Army company has around 150 soldiers.
And there are many more such examples. Dunbar correlates this to the size of the neocortex. Even if we wanted to have groups larger than 150, we aren’t adequately equipped.
There are a couple of other examples, but I don’t think they are related to the neocortex.
I remember in a physics lesson in school, that we were introduced to the idea that Adam and Eve were 150 feet tall. I really don’t know if this was apocryphal or not, but the idea was easily disproven: their bones would not have been strong enough.
And in another peculiar piece of early theology, the question was: how many angels can dance on the head of a pin?
And yes, someone worked it out. It was 150.
So now you know.

Friday, 2 September 2011

The Seven Colours of the Rainbow


Newton originally said there were five colours in the rainbow, but later revised this upwards to seven. I don’t know about you, but I find it very difficult to see the differences between blue, indigo and violet.
So why the revision upwards?
Newton had a finger in many pies; he was into alchemy, which was then a mainstream science, and religion. He tried to reconstruct Solomon’s Temple from the details in the Bible.
Now, I don’t quite follow why 10 is a perfect number, but to people like Newton it was. As part of this ‘perfection’, there was the Trinity of Father, Son and Holy Ghost. If their number is removed from 10, we get seven. So all God’s creations, if they are to be perfect, must contain seven parts.
So, the rainbow has to have seven colours.
And no, I don’t really follow the theology either.

Friday, 5 August 2011

Translation III

The translation of scientific papers and the like is pretty straightforward, if rather formulaic. But I think there is a major step change in the level of difficulty when it comes to literature. Getting the right tone, understanding the subtle nuances of meaning and symbolism must be really taxing, and would need a deep understanding of both languages and their literature. The translator really has to get into the author’s head, to fully understand him or her, before attempting a translation if the work is to be faithfully reproduced.
In part, this must be because English has so many words; there are about 500,000 of them — more than in German and French combined. It’s partly a result of English being a mixture with ancient Germanic roots, to which was added some Viking and lots of Norman French; and to which lots of words were appropriated from other languages, and from its colonial exploits.
Not all English words are in current use; coney was replaced by rabbit, and tharmes by intestines or guts or bowels. Reflecting its ‘mixed-race’ origins, English often has synonyms or near-synonyms for many things, and sometimes the shades of meaning are subtle. Jack and Jill took a ‘pail’ to get the water, not a ‘bucket’, yet a pail is a bucket. And while you might measure garden manure in bucketfuls, you couldn’t really measure it in pailfuls — it just wouldn’t be ‘right’. (It’s curious, too, that Jack and Jill’s well was on top of a hill, not at the bottom, the obvious place.)
Not all translators stuck rigidly to the idea of an accurate, faithful-to-the-author translation. Edward FitzGerald’s Rubaiyat of Omar Khayyam was as much his own work as Omar’s. FitzGerald, too, revised his ideas several times, adding more verses, and changing previous ones substantially. A comparison with a literal translation shows just how inventive he could be.
There’s another interesting take on translation. Orhan Pamuk writes in Turkish, though I gather that he understands English. The primary translation of his (recent) works is into English; great care is taken to get this as accurate to his text, idioms, meanings and so on as possible. This English translation is then used for further (re)translations, rather than going directly from the Turkish.



PS For a more extensive treatment of translation (and a translation) have a look at:

Is That a Fish in Your Ear?: Translation and the Meaning of Everything, by David Bellos

Translation II

There is a rotunda in the city of Thun in Switzerland that houses a panorama of the city. This was painted a couple of hundred years ago by Marquard Wocher — so it’s called the Wocher Panorama. And very impressive it is too. Panoramas and dioramas were common enough in the nineteenth century, but many of them have since disappeared, so they are relatively rare now.
There was a special exhibition a few years ago to celebrate the bicentenary, with the usual ‘merchandise’, including a book. This described the history of panoramas in general, and how this painting was done, where it had been and exhibited, and how it got back to Thun.
Reading the book, I discovered that these sorts of painters were called ‘little masters’. I didn’t realise at first what was meant by this — I didn’t imagine that whoever did the painting was a dwarf, nor did I think they were somehow second rate. Fortunately, the text was in both German and English, and I found that the ‘little master’ was a Kleinmeister. And yes, the literal word-for-word translation of Kleinmeister is little [or small] master. 
A Kleinmeister is a miniaturist in English, not a very common word. A miniaturist is someone who paints scenes on very small objects. A master or Meister is one who has been admitted to this grade in the guild; to do this, he submits his Meisterstück or masterpiece to the Guild’s technical committee for inspection, assessment and approval. A masterpiece isn’t necessarily the best work the master will ever produce, even if this is the common meaning nowadays.
A quibble over a single word? Well, yes, but the translation had been done by a professional service, though clearly one without any special expertise in painting techniques. And it was a German service, so I don’t know whether they used a native English speaker for the translation. The rest of the translation conveyed the meaning adequately, even if the phraseology was a bit wooden and stilted at times.
There’s also a tendency to simplification in translations such as this, to leave out things that a German reader would immediately understand, but which an English reader probably would not. I’m thinking of references to things in German culture — the average English reader simply would not get the message in the way that a German reader would.

Translation I

Simplistically, translation is converting one written language into another, and interpretation is converting the spoken word into another (spoken) tongue. Transliteration, strictly, is converting one script into another, letter by letter, though it’s often used loosely to mean a word-by-word conversion.
A translation ideally reads as if it was the original, but getting the niceties of language and the idioms across isn’t as easy as you might think. It’s straightforward to get the meaning more or less right — we have all seen examples of DIY instructions where the meaning is apparent, even if a bit mangled.
I’ve done some translations from German to English; these were scientific, medical academic papers. This is usually simple enough, provided you know the conventions in the two languages. In other words, you need to be familiar with how such a paper appears in German and how it would appear in English. Such papers usually follow the IMRAD standard — introduction, methods, results and discussion. It’s only the discussion which can be troublesome at times, as it’s opinions and meanings that you are trying to get across.
I sometimes got the papers in German, sometimes as a first draft in English. The papers in English were interesting; there were times when I had to translate back into German to get an idea of what the authors were trying to say, before I could put down what I thought they were really trying to say.
I was asked at times how I would put a German phrase into English, just a phrase in a sentence. I often couldn’t — it wasn’t that I didn’t understand it, it was because the structure of the sentence wasn’t what would be expected in English, the translated phrase simply did not fit in.
In these papers and articles, there are standard forms of expression in both languages. Thus, Darstellung der Gallenblase in German becomes The gallbladder was exposed in English — a literal translation would be exposure [or exposing] [of] the gallbladder. Now, this latter is quite understandable, but it is not how it would be written in English. This is something that the translator just has to know — it’s not enough to know what the words and phrases mean, they have to be rendered into the equivalent words and phrases in English. And the only way that this can be done is through knowledge of the phraseology of both languages as used in the specific academic articles.
After a few goes with translating the academic papers, my solution was simple; I would more or less completely rewrite them. I made sure that all the facts were there, but I would re-arrange the order of things, convert long sentences into two, re-paragraph. In short, I did whatever it took to make it look like an English ‘original’ and not a translation. But then, I did know what the authors were saying, I understood the background and the technical terms; I’d guess that non-specialised translation services would struggle with this.
You might say that it’s the meaning that’s important, the infelicities don’t really matter. Perhaps this is correct, but it’s so much easier to read something that is written fluently, where you just know what it means without having to struggle through it. And if it’s a struggle, you are more likely to give up, or, if you are the peer reviewer, to give up and suggest that it needs total revision. 
Human nature, really.

Wednesday, 6 April 2011

A New Year

So, it’s the sixth of April and the new fiscal year begins today. Or, to you and me, a new tax year. Strange date to choose for a new tax year, don’t you think? Why not be logical and rational and start at New Year? But it’s 6 April, and has been for a long time now. I’ve never yet met an accountant or banker who could tell me why this arbitrary date had been chosen.
English taxation goes back, like so many things, to Magna Carta in 1215; taxation had to be renewed annually, with taxes being due at New Year.
There were actually two New Years, a secular one on 1 January, and a liturgical one on 25 March. This is the Feast of the Annunciation, the traditional date on which Mary was told that she was ‘with child’.
Tax was due at the liturgical New Year.
By 1752, the calendar in Britain was 11 days out of synch with the sun and the seasons. A change from the old Julian calendar to the (new) Gregorian calendar had been resisted on the grounds that it was ‘Popish’, but the discrepancy was too big to be tenable any longer.
A commission recommended that Britain change to the Gregorian calendar, and legislation provided for almost all legal things to be accommodated to this — the lengths of prison sentences, for example. And New Year was fixed at 1 January. 
They reckoned without the views of the taxpayers.
It was expected that a full year’s worth of tax would be paid 11 days early, and the taxpayers simply revolted, and didn’t pay. They waited 11 days, and then paid up. Work it out, and you’ll find it was 5 April. Curiously, the tax year continued to use the Julian calendar. The year 1800 would have been a leap year in the Julian calendar, but wasn’t a Gregorian leap year, so 6 April became the start of the tax year from then.
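The date shift is easy to check with Python’s datetime module (which uses the Gregorian calendar throughout, as Britain did from 1752): the old liturgical New Year plus the 11 skipped days lands on 5 April.

```python
from datetime import date, timedelta

# Old-style New Year (25 March) plus the 11 days the taxpayers waited:
old_new_year = date(1753, 3, 25)
tax_day = old_new_year + timedelta(days=11)
print(tax_day)  # → 1753-04-05
```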
It wasn’t until the early 1800s that the position was formalised, and 6 April became legally established as the start of the tax year. 
So who’s to thank (or blame) for this? Julius Caesar, Cleopatra and the Pope. 

Thursday, 31 March 2011

The Shame of the Apple


You’ll recall, if only vaguely, the story of the Garden of Eden, the forbidden fruit, the fall from grace, the shame, the pain of labour and having to toil in the fields.
Some parts of this are curious; the fruit isn’t defined, but is often taken to be an apple. Other theories have been advanced, it was a pomegranate or a fig for example, but an apple seems the favourite. Why an apple?
Toiling in the fields implies settled agriculture, starting around 5000 BCE; before then our ancestors were hunter gatherers, moving around and basically foraging and living in groups, communally. Most things seem to have been shared; not only responsibility for child care but also their procreation.
Settlement brought new problems; the creation of ‘wealth’ and ‘property’ and ‘inheritance’. Who would inherit on the patriarch’s death? It had to be a son that he was certain was his son and not some other man’s. So, to ensure this, the patriarch had to expect his spouse to be faithful, and perhaps to be faithful himself; monogamy. To sustain this, marriage was ‘invented’: I say ‘invented’ because it seemed that it served a need, to formalise the relationship, to make it ‘legal’. Organised religion has had lots to say about the pros and ’sanctity’ of marriage and the cons of fornication and adultery. But, basically, it seems to be about property.
The pain of labour: perhaps it was ever so, at least for Homo sapiens.
Physicians often had a very religious, moral tone to their activities. There were those who fulminated against Queen Victoria when she had chloroform during one of her deliveries. The counter to this was that Adam ‘fell into a deep sleep’ when his rib was excised to make Eve, and this was a biblically approved ‘anaesthesia’. Alas, this story seems to be apocryphal. 
The shame: to anatomists, it is the pudendum muliebre for girls, and pudendum virile for boys (though this is now archaic). Pudendum derives from the Latin pudere, meaning to be ashamed (of).
And the apple? Well, if you cut an apple in half, the cut surface is supposed to resemble the pudendum, the vulva. Our forebears had good imaginations, and sex on the brain.

Friday, 25 March 2011

Calendars

The Roman calendar was pretty mucked up in Julius Caesar’s time, and he was getting fed up with it. He realised that the Greeks were the clever ones who could sort it out for him, and asked Cleopatra to do something about it. It helped that Cleopatra was his mistress at the time. So, Cleopatra asked her ‘chief scientific officer’ Sosigenes to sort it out. His calculations produced the Julian calendar, used for the next 1600 years or so. Months of 31 and 30 days alternated, except for poor February, which had a few lopped off. And when Julius was designated a god, he had a month named after him; and likewise his successor Augustus, who got the month after Julius. (The Roman year originally began in March, which is why September, literally the ‘seventh month’, is nowadays the ninth.) But, as Augustus couldn’t be seen to have fewer days than Julius, Augustus’s month had to be made up to 31 days. And guess where the extra day came from!
The Roman calendar started at the founding of Rome — the exact date varies with the source. ‘Little Dennis’ (Dionysius Exiguus) was given the task of dating the Christian calendar from the year of Jesus’s birth, which would have been 1 AD — he seems to have got this a few years out. And in the sixth century, when he was doing the sums, the concept of zero or ‘0’ was unknown; so 1 BC was immediately followed by 1 AD. We call these BCE (before the common era) and CE (common era) these days.
By the end of the sixteenth century, the calendar was in trouble again; it was 10 days out. This was important as it made the calculation of Easter problematical. So Pope Gregory got his scientific chaps to revise and correct the Julian calendar, which they did. They had to shift the date by 10 days, but, in an attempt at perpetuity, altered leap years. Only centuries divisible by 400 were to be leap years, rather than every century — the other leap years remained. Thus entered the Gregorian calendar, the one that your computer uses.
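The revised leap-year rule is simple enough to put into code. A minimal sketch in Python (the function name is mine):

```python
def is_gregorian_leap(year: int) -> bool:
    """Every fourth year is a leap year, except century years,
    which only count when divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 1600 and 2000 were leap years; 1700, 1800 and 1900 were not.
print([y for y in (1600, 1700, 1800, 1900, 2000) if is_gregorian_leap(y)])  # → [1600, 2000]
```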
Copernicus published his ‘theory’ that the earth went around the sun earlier in the sixteenth century, though this went against orthodox theology. Nonetheless, Gregory’s scientists used this concept in their calculations — the sums were easier. And by describing the ‘revolution’ of the earth, Copernicus unwittingly added a new meaning to the word — a successful ‘regime change’.
Sosigenes’ error of 10 days in 1600 years is about 1:58,000. Not bad going for someone without the benefit of a telescope, a calculator or a computer. 
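The figure is easy to verify: 1,600 Julian years is about 584,400 days, so a drift of 10 days is an error of roughly 1 part in 58,000.

```python
days_elapsed = 1600 * 365.25        # Julian years, in days
drift = 10                          # days accumulated by the 1580s
print(round(days_elapsed / drift))  # → 58440, i.e. about 1:58,000
```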

Thursday, 24 March 2011

More Modern Myths

You might think that pregnancy (and childbirth) is a natural condition, and for around 80% of women it is. The other 20% have ‘problems’ of varying degrees of seriousness, and it’s true that some complications are really serious, and, sadly, there are still maternal deaths. Whether this is enough to ‘force’ all women to have a hospital confinement is a debatable question, and one that I’m not in a position to discuss, though I can flag it up.
Until quite recently, there was another problem with pregnancy. On the maternity chart there were fields for the mother’s name, date of birth and so on. And there was one for ‘date of marriage’.
This field was often left blank, but it was previously important. If there was less than nine months between the two dates, there might be some discreet sniggering, but not much more. You might find it hard to believe, but women who could not fill this in were treated differently from those who could. Nowadays, labour shouldn’t go on for more than eight hours or so without some sort of intervention. Not so long ago, up to 24 hours was acceptable before intervention. But for some women (or, perhaps they were girls) it was acceptable to let them labour for 48 hours. Nothing like a good, if disguised, dose of moralising to bring home to you just what the consequences of your actions were.
You don’t believe me? Just ask a senior, retired obstetrician. If necessary, ply him — it will always be a him — with strong drink.
Girls have been riding bicycles for well over a century, yet, strangely, it’s only recently that they have reported problems. It seems that some of them feel that their labia minora are being traumatised, are being made painful by this activity. And so, they request what can only be seen as a cosmetic procedure to reduce the ‘excessive’ length of their nymphae. Unusually, this ‘disease’ is one that patients have discovered, not one ‘invented’ by their medical practitioners.
Circumcision was, a century ago, seen as a very valuable prophylactic against syphilis for which there was little in the way of medical treatment, though mercury preparations might help to alleviate some of the worst of the tertiary symptoms. Syphilis was then an incurable disease, and while abstinence was to be desired, this was qualified by the reality.
There’s now a movement for prophylactic circumcision to control HIV transmission. HIV is another disease that, as I understand it, can’t be cured, though progression (to AIDS) can be controlled. Those who advocate circumcision would be well advised to remember the previous efforts to control syphilis — it didn’t work.

Modern Myths

It would be good to think that we don’t live in an age of ‘medical myths’, that life was logical and scientific, all things proven, substantiated. Dream on.
We should all drink 3 litres of fluid (or water) a day, shouldn’t we? The figure of ‘3 litres’ seems to have come from fluid balances in kidney patients. Obviously, we all have to drink some fluid a day, and there is an upper limit to how much we can pee out, but ‘3 litres’ is more like a number plucked from a hat than anything else. I have yet to find any reputable scientific support for this.
And while we’re on about healthy living, we should all eat ‘5 portions’ of fruit a day, shouldn’t we? Unless we live abroad, when we should eat 8 or more. And sometimes potatoes are included, and sometimes they aren’t. Oh, I did discover that a ‘portion’ is 80 grams.
So where did the ‘5’ come from? As far as I can discover, it started with the fruit growers in California who wanted to improve their sales. So they suggested doubling the amount of (their) fruit that people ate, and lo! The magic ‘5’ was born. And this was subsequently taken over by the World Health Organisation. The cynics amongst you will recall how shampoo manufacturers doubled sales — by telling consumers that they should shampoo twice. I’m sure that fruit is good, as part of a ‘balanced diet’ (whatever that might mean), yet the rationale for 5 — or any other number — is more belief than something clearly proven.
Munchausen’s syndrome was first described by Richard Asher in the early 1950s; it’s typically characterised by patients seeking medical attention with strange neurological, abdominal or ‘bleeding’ complaints. Baron Munchausen was famous for his fabulous adventures, as recounted by his ‘biographer’ Rudolf Erich Raspe. Richard Asher ‘talked sense’ on many topics — and was also the father of Jane.
Anyhow, the idea of ‘Munchausen’s by proxy’ developed; patients with psychiatric problems whose children presented with strange symptoms and problems — due to abuse by their mothers. Professor Sir Roy Meadow is credited with its identification. What became known as Meadow’s law is: “unless proven otherwise, one cot death is tragic, two is suspicious and three is murder”, though he seems not actually to have said this.
You may recall the trial of the solicitor, Sally Clark, prosecuted after the death of two of her children. Sir Roy gave evidence, she was convicted, though this was much later overturned on appeal.
The chance of a cot death¹ is roughly 1:8500, that is, there will be one cot death for every 8500 births. Sir Roy gave evidence to the effect that the ‘chance’ of a second cot death was about 1:73,000,000. He seems to have squared 1:8500 to arrive at this figure.
However, he did not seem to recognise that ‘chance’ here refers to a random finding, something that cannot be predicted; and that chance has no memory. If you flip a coin, and it comes down heads, the next time you flip it will be either heads or tails — equally likely. That it came down heads the first time is totally irrelevant. So, if you have had one cot death, your chance of a second remains exactly the same.
Or not: the cause or causes of cot death are unknown, which is not the same as saying that there are no causes, that it’s entirely a random, chance event. Moreover, you could well argue that whatever the cause or causes are, if you have had one cot death, these same causes could well make you more likely to have a second.
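The two views are easy to contrast in a few lines of Python. The first calculation reproduces the squaring that gives roughly 1:73,000,000; the second shows how different the answer becomes if a first death raises the risk of a second (the 1-in-100 conditional figure is purely illustrative, not a real statistic):

```python
p_first = 1 / 8500                 # quoted rate of a single cot death

# Squaring treats the two deaths as independent events:
independent = p_first ** 2
print(round(1 / independent))      # → 72250000, about the quoted 1:73,000,000

# If shared causes made a second death far more likely, say 1 in 100:
conditional = 1 / 100              # illustrative only
print(round(1 / (p_first * conditional)))  # → 850000
```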
And here I must hang my head; I remember the original trial well, and I remember thinking then that the evidence, as reported in the newspapers, was flawed; yet it is to my continuing shame that I did nothing.
Munchausen’s syndrome by proxy may well be a real problem; but, as so often in the past, those who believe blindly in it (a syndrome of their own discovery) are unable or unwilling to see and accept other explanations.

1. It is conventional to declare an interest; I do so declare.

Recently deceased Myths

There are a few myths which disappeared within my working life. ‘Myths’ is perhaps too harsh a description, they were more concepts and hypotheses, though some of them were the accepted paradigm. Of course, progress depends on being unreasonable, and being unreasonable means challenging what’s accepted. For those conditions where the cause or causes are unknown, it’s not unreasonable to allow flights of fancy, to fly a kite, to see what the reaction is.
As students, we were taught some paradoxical physiology of the heart; so strange, that we didn’t believe it, though we had to spout it for the exams. Whatever this strange idea was — I’ve long forgotten — it had disappeared by the time of the post-graduate exams.
Crohn’s disease is a nasty inflammation of the bowels, cause unknown. Both toothpaste and corn flakes were suggested as causes, both fell early on.
The sugar content of cigarettes was thought to be the most important marker of badness; this idea didn’t catch on.
The mantra of ‘no acid, no ulcer’ was the orthodoxy to explain peptic (stomach and duodenal) ulceration, and this was reinforced by the discovery of the Zollinger-Ellison syndrome, where pathological secretion of the hormone gastrin is associated with intractable ulceration. Surgical methods concentrated on acid reduction, either by resection of most of the stomach, or more scientifically by various techniques of vagotomy — division of the vagus nerve to the stomach.
Vagotomy didn’t always work, often because the procedure was ‘incomplete’, and perhaps because of other, associated pathology. When I attempted to make sense of this, and of the investigations, I could find no correlation; I could only conclude that ‘something was missing’, though I didn’t know what it was.
A further orthodoxy was that germs could not exist in the acid stomach, it just wasn’t possible. So it came as a shock to discover that a miserable germ could indeed exist in this hostile environment, and more of a shock to find that if it was eliminated, so was the ulceration. 
The discovery of Helicobacter pylori, which could be treated with pills, led to the very rapid disappearance of surgery for peptic ulceration. I’d learnt all the fancy techniques for nothing (and, if I’d been a bit cleverer, I might have got a Nobel prize).
More recently, there was a major scare over the MMR (measles, mumps, rubella or German measles) vaccination. MMR seemed to cause autism, the concept was heavily promoted, and vaccination rates fell dramatically. It did take some time, but the ‘research’ was subsequently discredited.

Wednesday, 23 March 2011

Even More Medical Myths

While it’s hard to be certain, it seems reasonable to assume that acute appendicitis has been around for a long time, certainly from before its recognition towards the end of the nineteenth century. The natural history of (untreated) acute appendicitis is equally uncertain, though one can guess that roughly a third of cases resolve spontaneously, one third progress to a local abscess (which may resolve, or rupture) and the last third burst, resulting in generalised peritonitis which, without treatment, is invariably fatal.
The cause of surgery for acute appendicitis was greatly advanced when King Edward VII underwent an operation on the eve of his coronation — but only after being told that if he didn’t have the operation he would go to his coronation in his coffin. Actually, the King had an appendicular abscess drained, and his appendix wasn’t removed, but this mere detail did nothing to prevent the rise of the procedure.
Because of the danger of generalised peritonitis, it was totally acceptable that around one-fifth of all removed appendices should be normal — better to remove the normal than let the inflamed burst.
With acute appendicitis established as a real and treatable disease, there came the slightly later invention of ‘grumbling’ or ‘chronic’ appendicitis. For patients with this ‘condition’ (ongoing bellyache, specifically in the lower right-hand part), appendicectomy was the ‘ideal’ operation. Even if the bellyache didn’t resolve, there was no longer the fear of acute appendicitis and generalised peritonitis. In any case, the appendix was clearly a ‘vestigial’ organ whose removal was only really hastening the change to be expected from evolution (though not by natural selection).
Like the ‘redundant prepuce’ and the appendix, the tonsils were clearly organs in need of extirpation. They enlarged during childhood, and this was clearly a ‘bad thing’. That their enlargement was normal and ‘necessary’ was a minor detail, easily ignored. And like the prepuce, no family was properly hygienic unless the tonsils of the offspring were removed, with the kids lined up, waiting for the guillotine. 
The prepuce, clitoris, uvula and tonsils are all easily accessible; what does that say about the ‘need’ for their removal?
Investigations progressed during the early twentieth century, and it’s no surprise to find that previously unknown diseases were uncovered. One of the best of these was nephroptosis, or sagging or droopy kidney. Patients would present with aches over a kidney, and investigation would show that the organ sagged when they stood up: post hoc, ergo propter hoc. At this time, anatomy was learnt on preserved cadavers placed supine, and the normal sag when people stood up wasn’t understood. 
While there might have been a few brave souls who denied the existence of nephroptosis, the most heated argument was between those who thought that the offending organ should be removed and those who thought it should be fixed in position. Even today, there is a residue of belief in nephroptosis — and the article in Wikipedia is utter rubbish.
The kidney wasn’t the only droopy organ; the caecum, that blind-ending part of the large bowel beside the appendix, was found to sag when people stood up. And, unsurprisingly, the argument was between the ‘fixers in place’ and the ‘loppers out’.

More Medical Myths

The Victorian physician thought that removal of the prepuce by circumcision was an appropriate and useful operation for ‘nervous complaints’. As the clitoris was ‘clearly’ the equivalent of the foreskin, would not its removal be helpful for female masturbation and nymphomania?
The Victorians seem not to have regarded women as sexless, incapable of sexual enjoyment, but rather, placing them on a pedestal, praised them for their ‘natural purity’ — in contrast to men who were simply ‘beastly’. Of course, within marriage, women could fulfil their natural role, find satisfaction and bear children, preferably annually.
There was a short-lived vogue for clitoridectomy in the 1860s, its justification employing the same type of logic as that used for circumcision: ‘nervous complaints’, masturbation and nymphomania. Despite apparent initial successes with the procedure, however, the view that the clitoris was the equivalent of the male prepuce was questioned and found wanting. The proponent of the operation was vilified and expelled from the Obstetrical Society, and his nursing home closed.
Unhappily, clitoridectomy (or simpler excision of the clitoral hood) found more favour in the United States, and continued there well into the 20th century. There are enthusiastic reports from the 1970s about its efficacy in improving sexual response. Even within the last couple of years there have been reports of surgery to reduce the ‘oversized’ clitoris.
There’s a curious aftermath to this. ‘Hysteria’ became a fashionable disease, one found only in women, and having the usual multitudinous symptoms — ‘nervous attacks’, loss of appetite, loss of libido etc. The word ‘hysteria’ has been expunged from medical texts, though ‘conversion disorder’ has entered them. Hysteria has the same origin as the Greek hystera, the womb. (Think of ‘lunacy’ and the moon or ‘luna’.)
Hysteria was treated in an entirely different way, one that was profitable for the physician because it needed to be repeated quite frequently, and one that was entirely risk free.
The physician, under cover of a sheet, provided a ‘vaginal massage’ until the patient experienced a ‘hysterical paroxysm’. In other words, the physician masturbated the patient to orgasm.
This ‘procedure’ was unpopular with (some) physicians, for it could apparently take several hours ‘work’ to obtain the ‘paroxysm’, time that could be better (and more profitably) spent with other patients. Cynically, you might say that the physicians’ technique was in need of improvement.
Help for the physician soon appeared, at first driven by clockwork. The electric vibrator was one of the first domestic electric devices, though advertisements were somewhat coy about its real intended use. And the physician could return to more ‘useful’ work, such as the monthly change of ring pessaries for the control of uterine prolapse (at two guineas a time).

Medical Myths

Shaw’s attack on doctors who performed useless operations by removing the ‘Nuciform Sac’ may have had its origin in a contemporary vogue procedure for removal of the uvula, the dangly thing at the back of the palate. At least one surgeon seems to have made a very good living from this ‘ideal operation’, and was able to purchase a country estate, and live the life of a gentleman.
Such an ideal operation needed to be easy to perform, have no obvious side-effects, and be generally safe. It didn’t matter so much that patients derived any actual benefit from it, as long as they believed that they had. And they had to believe in the ability of their surgeon; and the more experience the surgeon had of the efficacy of his procedure, the more his patients could believe in him and it; a virtuous vicious circle.
Today’s gold standard is the ‘randomised double-blind trial’, in which neither the doctor nor the patient knows what treatment has been given. It’s harder to do this for surgical procedures; there was a trial of gastric cooling as a treatment for bleeding from the stomach, and this showed that it didn’t work. A trial of knee arthroscopy showed that sham surgery (where the arthroscopy hadn’t actually been done) was just as effective as the real intervention. This (apparent) improvement is called the placebo effect — crudely put, just by doing something, a significant benefit can be expected. The placebo effect made it hard to determine whether the patient was really any better off. And of course, before the full extent of the placebo effect was known, pretty well any operation could be expected to bring improvement. So it’s no surprise that 19th century operations were mostly successful; and remember that, at that time, there were practically no pills with any real benefit. If you had a problem, the only way it could be properly fixed was by having an operation.
The sexual mores of the 18th century seem to have resembled today’s; but gradually attitudes ‘hardened’, and medical opinion (at least in the UK) was conflated with religious orthodoxy. Several diseases, previously unrecognised, were described: spermatorrhoea, masturbation¹ and redundancy of the prepuce or foreskin². None of these are diseases, but they all exercised the mind of the Victorian medical man — there were no medical women in those days. The consequences of masturbation were legion: debility, blindness, insanity, pimples, impotence, pain in the hand (!), you name it, and ultimately death. Fortunately, a simple remedy for all three was available: circumcision, which was also advised for epilepsy.
The logical basis for advocating circumcision seems, at least in part, to have come from the Jewish practice, whereby newborn males are ritually circumcised by the mohel on the eighth day — in a reference back to the time of Abraham and sacrifice. Since Jewish boys apparently did not suffer from the fearsome effects of these (non-existent) diseases, clearly it was simply because they were circumcised. A similar logic recognised the benefits in the followers of the Prophet.
To all of this, there was the rise of ‘hygiene’ and the recognition of the ‘filth’ that lay under the foreskin; and this ‘filth’ naturally produced ‘irritation’, and irritation clearly caused masturbation. And, as Jews had a lower rate of syphilis than Gentiles, clearly removal of the foreskin could act as a preventative.
From roughly mid-Victorian times to the second World War, circumcision was heavily promoted. If it was not done soon after birth — without anaesthetic, as clearly ‘infants did not feel pain’³ — then it was done when symptoms (of masturbation) became apparent. Doing the procedure without anaesthetic was felt to provide a salutary lesson.
A simple, safe operation? Not exactly; in 1940, sixteen boys died directly after circumcision; the number who suffered significant physical side effects before then is unknown.
And don’t think that any side-effects were purely physical. These men were all circumcised: AE Housman, John Maynard Keynes and his brother Geoffrey, Tom Driberg, WH Auden and Alan Turing. All of them resented having had it done, forced on them. You might like to reconsider some of what they said and wrote in this light.


1. Onanism was invented in 1710 as an advertising device to sell patent medicine, was reinforced half a century later in an influential text, and became an English epidemic early in the 19th century. 
2. I was dismayed to find that one of my surgical textbooks still clung to the concept of ‘redundancy of the foreskin’.
3. Likewise, another textbook suggested that circumcision could be done in infants without anaesthesia.

Saturday, 12 March 2011

The Nuciform Sac, an Anatomical Question

“Congratulations!” I said. Some of the juniors had just passed their examinations.
“What were the questions like?”
“They were very reasonable, clinically relevant sorts of things,” I was told.
“Nothing too far out, then?”
“Oh no, quite straightforward. Why do you ask?”
“Well,” I replied, “the examiners could be buggers when they wanted to in my day.”
“That wouldn’t be allowed now. What sort of things did they ask you?”
I thought for a moment. “I seem to remember I was asked about the nuciform sac,” I said. Of course, I was asked no such thing. There was silence.
“I didn’t know a lot about it, but I could tell them something,” I said.
There was silence in theatre, and more than a few rather puzzled faces.
“It’s one of these vestigial bits, no known function, though if diseased, removal may be helpful.”
The silence continued, the faces remained puzzled. I wasn’t very surprised, though I’d been hopeful that perhaps one of them had heard of it; or perhaps the anaesthetist. But none of them had. I had to give in and tell them.
“It’s fictional,” I said, “it’s in George Bernard Shaw’s play The Doctor’s Dilemma. One of the characters makes a good living removing it, having invented an illness that can only be cured by his operation. It’s not the dilemma in the play, but a sideswipe at doctors doing unnecessary surgery — especially in private practice.”
Shaw was writing in the early 20th century, but such practices continued for a long time afterwards — I have seen a patient who had had a nephropexy (fixation of the kidney) as a cure for her ‘droopy kidney’. As there is no such illness, the operation didn’t help her at all. I was even more surprised to learn that her private insurance had funded the procedure.

Proving it

“Explain this, then; the phrase ‘it’s the exception that proves the rule’. How can the exception show that the rule is true?”
Not for the first time, there was silence in the theatre.
“It doesn’t make any sense,” said one of the juniors. “If there is an exception, the rule cannot be true.”
“It makes perfect sense,” I replied, “though I did try to mislead you just a little.”
“I can’t see how it can be correct.”
“But it is, and you know that it is,” I said. “You just don’t know that you know.”
Another silence.
“Do any of you like whiskey?” I asked. There was a murmur of assent. 
“I don’t, but haven’t you looked at what it says on the label? Beyond the name and all the usual guff.”
Clearly, they hadn’t, or if they had it hadn’t registered.
“OK, do any of you make bread with yeast then?” There was another murmur, alas only from the girls.
“And after you’ve mixed it all up, what do you do next?”
“I leave it to prove,” answered one of the girls.
“Exactly. So what are you doing?”
“Waiting to see that the dough rises. I usually put it in the airing cupboard.”
“Yes,” I replied, “and when it rises, what does that mean?”
There are times when you have to drag answers kicking and screaming out of people. It can be bloody.
“It means that the yeast is healthy, that it’s alive.”
“So, could you give me another word instead of ‘prove’ for what you are doing?”
“I suppose you’re testing the yeast.”
“Say that word again, please,” I asked, “slowly.”
“Testing.”
At last; but why is it so often the nurses who can work it out?
“So, the word ‘prove’ actually means ‘to test’ then?”
“Yes, but I thought it meant showing something was true.”
“It’s an example of how words can change their meanings over time,” I said. “Originally, it meant ‘to test’, but now we think of it as ‘showing something is true’.
“So, go back to ‘it’s the exception that proves the rule’ and replace ‘proves’ with ‘tests’ and it makes sense, doesn’t it?”
“Ah, I see. But what about the whiskey? What were you on about there?”
“It used to say on the label something like ‘40° Proof Spirit’,” I said, “meaning that it had been tested and found to have the correct amount of alcohol.
“But now, can anyone tell me how they used to test whiskey?”

Blue

“Can you show me something blue here?” I asked.
We were in the operating theatre, and I was using the look-around-and-be-creative tool again.
“What about the nurses’ caps?”
“They are not really blue, and though you might call them light blue, that’s not the proper description either.” 
Actually, it was rather hard to find something blue. Properly blue, not light blue, or blue mixed with another colour. We did eventually find something, I forget what.
Blue, technically, is what you might think of as ‘dark blue’. What you would call light blue is really cyan.
“So, where would you see a cyan spot?” I asked. There were blank looks all round.
“Ask the anaesthetist. He’s bound to have something, even if it’s Sudoku these days rather than the crossword.” And indeed, the anaesthetist was doing the Sudoku in the paper.
Newspapers are printed in colour these days, using a process called CMYK — cyan, magenta, yellow and black (or key). And somewhere you will find patches of CMYK, the printer’s controls.
This is a colour checker:


Blue is on the second row up, on the left; cyan is at the right end of the row.
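As an aside, the relationship between a screen’s RGB colours and the printer’s CMYK inks can be sketched in a few lines. This is a naive sketch (Python, purely for illustration), ignoring real-world ink profiles and colour management:

```python
def rgb_to_cmyk(r, g, b):
    """Naive conversion of 8-bit RGB values to CMYK fractions (0.0 to 1.0)."""
    if (r, g, b) == (0, 0, 0):
        return 0.0, 0.0, 0.0, 1.0  # pure black: all 'key', no coloured ink
    r, g, b = r / 255, g / 255, b / 255
    k = 1 - max(r, g, b)           # black ink covers the common darkness
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return c, m, y, k

print(rgb_to_cmyk(0, 255, 255))  # 'light blue' is pure cyan ink: (1.0, 0.0, 0.0, 0.0)
print(rgb_to_cmyk(0, 0, 255))    # blue needs cyan plus magenta: (1.0, 1.0, 0.0, 0.0)
```

Which makes the point rather neatly: cyan is a primary ink in its own right, while ‘proper’ blue has to be mixed.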
Now, why do we have blues music, why are some jokes blue, and why do we sometimes feel blue — and when we are very blue, why are we then black? And if we think of blue as being cold and red as hot, why is it that as things get hotter they go from reddish to bluish?

Friday, 11 March 2011

Another use for the Diathermy Machine

“So, what exactly is diathermy?” 
As I expected, I got a five minute mini-lecture, most of which I didn’t understand, and all of which I’ve forgotten. Yet, I wasn’t told exactly what it was.
I should explain: diathermy or electro-coagulation uses a high-frequency current to heat, and thus coagulate, small blood vessels. It spreads from the ‘active’ pole, usually whatever is being held in the tweezers, and returns to the ‘collecting plate’. It’s been around since before the second World War.
“Just what sort of a current is it?” Again, another mini-lecture.
“So, why don’t we just use ordinary mains electricity? It’s an alternating current, after all, like diathermy is.”
Now, that stumped them for a while. Actually, that is quite simple; ordinary electricity at 50 Hz (cycles per second) can interfere with the heart’s pacemakers, and cause sudden death — one way that people can be electrocuted.
“It’s a radio frequency current.”
“That’s what I said.”
“Without using the words radio frequency, or explaining why it must be.”
“A radio frequency alternating current won’t interfere with the pacemakers. That’s why it’s safe — provided all the correct precautions are observed.”
They looked suitably smug. Sir clearly hadn’t followed the lectures.
“So, can you imagine any other use for diathermy machines?” Now, this was a bit naughty of me, but Sir does have to be at least one step ahead of the young chaps, to keep them in their place.
They couldn’t. I wasn’t surprised.
“This is a strange story. In the early days of the Blitz, the Germans used to fly along a directional radio beam; and when they crossed a second beam, they knew they were over the target, and dropped their bombs. The British initially thought that a directional radio beam was impossible, but sent up a plane to check it out. It was true. So there was the problem of jamming those radio beams. A senior officer went round the surgical suppliers, gathering up diathermy machines. These were then sent to village policemen on the south coast, and when directional beams were found, the policemen got instructions as to how to change the settings on the machines. It actually worked; the directional beams were jammed.”

Why Green?

Having painted on the antiseptic, the assistant and I were putting on the drapes. These are paper nowadays, rather than the more traditional balloon cotton — paper has a lot of advantages — though the green of the cotton is now more of a bluish-green in the paper.
Of course, in countries where the value of the surgeon is properly recognised, all of this is done for you by the hand-maidens. They even hold up the gown for you to walk into it after scrubbing — a rather different way of getting close and personal with someone. But I digress.
“So,” I said, “why are the drapes green?” There was silence. “And what is the physiological reasoning behind the choice of green?” There was an even longer silence.
“Is green not the most restful colour?” was one suggestion.
“Because our eyes are more sensitive to green than any other colour.” Now, this is correct, but I don’t think it’s the right answer here. Actually, I’m not sure if anyone now knows why drapes are green, but it didn’t stop me asking.
“Drapes used to be green-green,” I said, “and now they are this bluish-green. Is this an improvement?”
You really do have to get used to long silences when you ask simple questions.
“I know you know the answer I’m looking for,” I said, “you learned about it in physiology.”
We weren’t getting anywhere. It was as if all physiology had been learned for the examinations, and then forgotten. Perhaps it had been.
“If you look hard at something red for a while, then turn away and look at something white, what happens?” This was akin to telling them the answer.
“You see a sort of red image,” suggested one.
“Red,” I said, rolling my eyes heavenward. “Not red.”
“Don’t you lot remember after-images? After staring at something, and you look away, you see the opposite colour. So, what is the opposite to red?”
A light-bluish green is technically the opposite colour to red. The idea is that after looking at something red — and most people’s innards are red — when you look away, perhaps to find an instrument, the after-image will be lost in the colour of the drapes.
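In simple RGB terms, the ‘opposite’ (complementary) colour is just each channel inverted (a crude simplification of the eye’s opponent processing, but it makes the point):

```python
def complement(r, g, b):
    """Complementary colour in simple RGB terms: invert each channel."""
    return 255 - r, 255 - g, 255 - b

print(complement(255, 0, 0))  # red -> (0, 255, 255): cyan, the colour of the after-image
print(complement(0, 128, 0))  # mid green -> (255, 127, 255): a magenta-ish pink
```

So the complement of pure red is pure cyan — very close to the bluish-green of modern drapes.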
Drapes initially seem to have been white — I guess they were something like sheets — though the real reason for the change in colour eludes me so far — and I’m not empirical enough to go trying to find out. Some surgeons in the past used red drapes temporarily when the operation field was contaminated. A bit like ‘red for danger’ and ‘green means safe’. 
Now, where did the idea for colours for danger and safety come from?

Time

The concept of ‘time’ is rather slippery; there is only the ‘now’, but we know there has been a ‘past’ and we expect that there will be a ‘future’. It’s a bit like drifting downstream in a boat, looking at the banks; we can’t row back upstream, nor can we row faster downstream.
Mechanical clocks used to show the local time, the time where they were. This didn’t matter that much originally, but the development of the railway and the telegraph led to changes. There were two clock faces on the railway station in Basel, Switzerland; one showed Basel time, and the other showed the time in adjacent Germany — only a matter of a few minutes’ difference, but not identical.
Time was initially standardised in individual countries, the railways ran to it, but it was a while before there was international cooperation with the establishment of time zones, and the international date line. Of course, there was considerable national rivalry over this; but with the adoption of the Greenwich Meridian for ships’ charts, the same meridian was used as the basis for time. The French were not amused by this.
You might well wonder where the idea of having 60 minutes in the hour and 60 seconds in the minute came from; it doesn’t seem very logical. Why not a decimal system? The Babylonians didn’t use a decimal system, one with a base of 10; they used a system with a base of 60. Pragmatically, 60 can be divided by far more integers than 10 — by three and four for starters — and the Babylonians were very good at sums. And we’re talking around 5000 years ago. Another little echo from the very distant past.
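You can check the Babylonians’ arithmetic advantage for yourself; counting divisors is a one-liner (Python here, purely for illustration):

```python
def divisors(n):
    """All the positive whole numbers that divide n exactly."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))  # [1, 2, 5, 10] - only four ways to split evenly
print(divisors(60))  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60] - twelve ways
```

Twelve even divisions of the hour against four; no wonder base 60 suited people who had to share things out.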
Newton and Leibnitz took the idea that time only exists in the ‘now’ quite literally, and talked of ‘instantaneous’ events and changes, from which they developed their versions of the calculus. The two methods are similar, and while Newton and Leibnitz argued over who was the first to discover calculus, Leibnitz’s notation is the one we use today. If you never did calculus at school, don’t be alarmed; I’m not going to describe it in any detail. You won’t have learnt, I’m quite sure, that calculus was regarded with suspicion for quite a while after it became public knowledge; it was seen as mathematical prestidigitation — chicanery almost — and as not very ‘pure’.
You might remember that differentiation is about finding tangents to curves, or acceleration, and that integration is about the area under a curve between two points on it. You might recall that there was a strange symbol to represent integration, looking something like this: ∫
Did you know that it is the long form of the letter ’s’ ? It stands for summa — adding all the bits up under the curve. You’ve probably seen the long ’s’ in old books or reproductions of them, though often an ‘f’ is used instead. The best modern version I’ve found is ‘ſ’  (in a different font). It doesn’t have the cross-stroke of the ‘f’.

Thursday, 10 March 2011

Clockwise?

“Rotate the picture a bit clockwise, please,” I asked the assistant. The picture on the monitor turned anti-clockwise.
“No, clockwise.” He tried again, this time successfully.
We were doing a lap chole — a laparoscopic cholecystectomy (translation: key-hole surgery to remove the gallbladder) — and it’s essential to have a standard view of the puddings to get the correct orientation and recognition of the internal clockwork. Sometimes, rotating the camera one way has the opposite effect — it’s just how the optics are designed.
And this little problem gave me an idea. If you do a Creativity course, one of the things you learn is to look around the room, and try to imagine creative uses for whatever is in it. And this is where this series of little questions began. Of course, the chaps soon got wise to the questions, and when they changed rotas, the newcomers were briefed, so I had to keep thinking up new ones.
“So,” I said, “just why do clocks go clockwise?”
There was silence in the theatre. Just a simple question, but no answers were forthcoming. If I’d asked why you shouldn’t drink grapefruit juice when you are taking atorvastatin (a widely used anti-cholesterol drug), I’d have got a screed about enzyme induction in the liver. This simple question wasn’t what they were expecting.
I had to do a little prompting. I asked why clocks didn’t go anti-clockwise, and as they didn’t, there must be some sort of explanation.
The nurses got there in the end. “Sundials,” they said and, “they’re like the movement of the sun.”
“Yes, it must be like an imitation of the movement of the sun in the sky, or the movement of the shadow on an ordinary sundial. So at midday, when the sun is highest in the sky, the hour hand points directly upwards.” And if there’s a better answer, I don’t know it.
A supplementary question: “so, what does this tell you about the origins of clocks?”
Another long silence. I had to remind them of the motion of the sun, and where they had to look for it. And that at midday the sun lies to the south. The penny eventually dropped. “In the northern hemisphere, of course.”
Now, I do know that there are far older versions of clocks, candles with markings, and ancient water wheel like things. But I was thinking of mechanical clocks with the standard clock face.
But — how often is there a but — not all sundials go clockwise. Some, even in the northern hemisphere go anti-clockwise.


Even worse, there are mechanical clocks that go anti-clockwise — there’s one in Münster. There are two clocks on the Old Jewish Town Hall in Prague; the upper is conventional, the lower goes anti-clockwise.


Some of the early mechanical clocks were very complicated; as well as the (local) time, there were rings and pointers for zodiacal signs etc — so complicated that, even with a guide book, it’s hard to work out what’s going on. Yet at least some of the population must have been able to read them when they were built — and this almost innate knowledge is something that has now been largely lost.



A Reasonable Thing to Ask?

In another life, I used to teach junior doctors. Not so much in lecture format — lectures are a pretty useless way of getting ideas and information across — but in small, interactive groups, often informally. I soon learned that they were very clued up about the detail of most things, and could recite it almost parrot fashion. Actually, they generally knew far more about these topics than I did. I thought that while some of their knowledge was useful, a lot of it wasn’t all that relevant, even if it was interesting.

I also learned, and not just from them, that it’s often far harder to answer apparently simple questions — the sort that a child asks. And I learned that we actually know a lot more than we think, we just don’t realise that we know. Things that we learned in school, for example, often seem to be lost, forgotten.

It’s a strange irony, but schooling often seems to destroy our ability to question as a child does. We don’t realise it at the time, but we are being moulded into the form that society deems to be appropriate and reasonable. And there is now the suspicion that the use of computers alters how we think.
Retrieving this forgotten knowledge doesn’t always come automatically. There are tricks — sorry, techniques — that are helpful, the sort of things that you might learn on a management course. SWOT analysis (strengths, weaknesses, opportunities, threats) and Kipling’s list (who, what, where, when, why, how) are simple ones, and there are many hundreds more. And you can learn to think ‘out of the box’ and about ‘lateral thinking’.
Or you can just leave it to Google; someone else is bound to have an answer to your problem, if you can only formulate the question correctly. It’s fair enough to know where to find the information and the answers, rather than trying to remember everything; yet your own understanding won’t be improved by this second hand knowledge. For this you really need to question yourself. If you think you are rational, logical and empirical, then you’ll expect an explanation for (almost) everything. The answers won’t always be rational and logical, though.
Think of map reading and satellite navigation. The sat nav does all the work for you, so why bother to learn how to read maps? Well, you’ve only got to read a few horror stories of where the sat navs have taken people who blindly relied on them to realise that they are not infallible — and if the systems are turned off, you’re lost.
But you are happy to be reasonable, aren’t you? Here’s a quotation from George Bernard Shaw:
The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.