Everyone’s Gran

This could theoretically be about Mitochondrial Eve but in fact it’s about someone who lived many millions of years before her. But just to mention her in passing, this is how it goes. Mitochondria are ancient former bacteria who now live in the cytoplasm (jelly bit) of plant and animal cells, among others, and enable us to release energy from glucose efficiently. They still have some DNA and therefore have a genome. When a sperm fertilises an egg cell, its (his?) own mitochondria are usually abandoned and the only surviving ones are those in the egg’s cytoplasm. Therefore, just as Y chromosomes are only inherited from fathers, mitochondrial DNA is only inherited from mothers – the mitochondria divide along with the cells and are passed on to future generations. They do change slightly with the passage of time, like all living DNA and even more so RNA, so they can be used to trace female lineages. My own female lineage is shared with most of the Libyan Tuareg, but that doesn’t mean they’re my ancestors so much as that my type of mitochondria used to be found all over Europe and North Afrika and has largely disappeared elsewhere. North Afrikan humans have historically tended to be quite isolated because of the natural barriers of the Sahara, the Straits of Gibraltar and the narrow Sinai land bridge, so genes tend to linger there for longer. Anyway, mitochondrial DNA can be traced through its variation back to a woman living in East Afrika up to 230 millennia ago. This woman is of course everyone’s gran, although there would clearly have been plenty of other people around at the time who weren’t descended from her, and she and they would likewise have been descended from a single woman in the more distant past, and so on. This is a bit long-winded but I just wanted to get that out of the way.

But we have much more distant grandmothers. Technically we could and kind of have traced them back thousands of millions of years to the LUCA – Last Universal Common Ancestor – living in the Hadean, soon after the formation of this planet. My focus today is on animals who existed in the final one percent of this planet’s history so far, and it comes out of the question “are we descended from monkeys?” The answer is to do with clades.

A clade is a group of organisms who are all descended from a single ancestor, that ancestor often being considered as an entire species rather than a single individual. Living humans constitute a clade of descendants from Mitochondrial Eve and also from Y-chromosomal Adam. Individual bits of genomes can also be used to define clades, so although they overlap completely nowadays, we are all in a number of different clades at once, including the clade of all descendants of Y-chromosomal Adam and that of all descendants of Mitochondrial Eve, despite the fact that there would have been countless individuals in the stone age and perhaps later who were only in one or the other of those. Cladistics has taken over the practice of taxonomy, which is the science of classifying organisms. Before genomes were routinely fully sequenced, it wasn’t possible to classify organisms as precisely as it is today, leading for instance to the idea of edentates, a group which included anteaters, sloths and armadillos, but also the relatively distantly related aardvarks and pangolins. Nowadays that order has been retired because it turns out that the first three are so distantly related to all other placental mammals that they are more or less a sister clade to them, almost like the marsupials, and aardvarks and pangolins are not closely related to each other at all either. Nonetheless I have a lot of affection for the old system because it usually reflected morphological similarities which were, in my opinion, natural kinds of a sort, just not natural kinds which reflected genetics as such.

A taxon (classification category for organisms) is a clade if it includes all individuals descended from a common ancestor. This is called a monophyletic taxon. The other two types are paraphyletic and polyphyletic. A paraphyletic group is one which includes a common ancestor but excludes some of its descendants. The old division of the apes into hominids, meaning us and our fossil relatives, on one side and chimpanzees, bonobos, gorillas and orang utan on the other is an example: that second group is paraphyletic because it leaves humans out even though we descend from the same ancestor they do. A polyphyletic group, by contrast, lumps together members of separate lineages whose common ancestor isn’t in the group at all, usually on the strength of features they evolved independently – the old edentates are one example, and “warm-blooded animals”, covering both birds and mammals, would be another. Incidentally, I’ve oversimplified here because there are two species of gorilla and at least two of orang utan (a third, the Tapanuli orang utan, was described in 2017), so in fact there are seven or eight surviving species of great ape, and that’s before counting the gibbons, who are also apes. A particularly striking example of a paraphyletic group is the reptiles, which are of course snakes, lizards, crocodiles, alligators, tortoises, turtles, terrapins and the tuatara, and which are not a “real” group – not a clade – because crocodiles and alligators are more closely related to birds than they are to any of the other “reptiles”. This always comes to mind when I consider the reptilian humanoid conspiracy theory, since this claims that we are ruled by alien reptiles when in fact in a sense reptiles themselves don’t really exist as such, so what exactly are reptilian humanoids supposed to be? A slightly more annoying consequence of cladistics is that it wipes out the previous supercilious bit of pedantry where you “knowingly” claim that apes are not monkeys, removing that particular bit of masculinist discourse and information-hoarding, because in fact we are monkeys genetically speaking, and it’s this that finally gets me to the point of this post: omomyids. Well, almost.
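
Before moving on, to make those terms concrete, here is a minimal sketch in Python of the test a cladist applies, using a toy amniote tree rather than real phylogenetic data; the tree, the species names and the is_clade helper are all mine, purely for illustration. A group of living species only counts as a clade if it contains every living descendant of its own most recent common ancestor, which is exactly why the traditional reptiles fail and reptiles-plus-birds pass.

```python
# Toy sketch of the cladistic test, not real phylogenetic data: a group is
# a clade (monophyletic) only if it contains every living descendant of the
# group's own most recent common ancestor.

# Simplified amniote tree as nested tuples.
TREE = ("mammals", ("turtles", (("lizards", "snakes"), ("crocodilians", "birds"))))

def leaves(node):
    """All species names under a node."""
    if isinstance(node, str):
        return {node}
    return set().union(*(leaves(child) for child in node))

def mrca(node, group):
    """Smallest subtree containing every member of the group."""
    if not isinstance(node, str):
        for child in node:
            if group <= leaves(child):
                return mrca(child, group)
    return node

def is_clade(group):
    """True if the group equals all leaves under its common ancestor."""
    return leaves(mrca(TREE, group)) == group

traditional_reptiles = {"turtles", "lizards", "snakes", "crocodilians"}
print(is_clade(traditional_reptiles))              # False: birds are missing, so
                                                   # this group is paraphyletic
print(is_clade(traditional_reptiles | {"birds"}))  # True: the clade Sauropsida
```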

Humans are in fact monkeys because humans are apes and apes are monkeys. Apes are a specialised offshoot of the Old World Monkeys, i.e. the likes of macaques, langurs and baboons rather than capuchins, marmosets and spider monkeys, and share various traits with them such as properly opposable thumbs, but apes lack tails, have wider bodies than other monkeys and, being heavier, tend to be more terrestrial. Cladistically, we’re monkeys in a similar sense to being animals, and more specifically Old World Monkeys. There are also New World Monkeys. It would’ve been interesting if the New World Monkeys had evolved separately from non-simian ancestors, because apart from anything else it would increase the probability that humanoid aliens exist, and in any case it would’ve been an interesting and very close to home example of convergent evolution, like spiny tenrecs and hedgehogs or whales and fish, but in fact all monkeys form a clade. There is, however, an example of convergent evolution in the New World Monkeys in that their physically smallest taxon, which includes the marmosets, contains members who are physically quite similar to prosimians found in the Old World, such as pottos and slow lorises, despite not being descended from the same ancestors. Something slightly similar also happened with apes, where gorillas seem more human than chimps and bonobos but in fact it’s the other way round, probably because gorillas and humans are both larger than chimps and bonobos. But in any case, humans, other apes and other monkeys are all descended from an older group which is almost completely extinct, but is today represented by tarsiers. Tarsiers are the most physically similar primates to the omomyids, who are our common ancestors.

Tarsiers, however, are rather specialised. They’re carnivorous and, being nocturnal, have very large eyes which dominate their heads. Because these eyes are so large, they cannot move them and instead have necks which can turn their heads almost all the way round like owls. They seem to have been around since the Eocene, over thirty million years ago, and I personally suspect they’ve survived longer than their relatives because they’re specialised in a direction which few other primates are, being for instance carnivorous. In the Galactic Association universe, incidentally, all primates native to the planet Athena are carnivorous and I’ve speculated that they’re descended from tarsier-like forms after this specialisation took place. There are other arboreal predatory mammals on Earth, but apart from tarsiers none of them are primates.

If you take away the specialisations tarsiers evolved which seem to have preserved them for so long, you get a bush-baby-like animal who has smaller eyes, can’t turn her head most of the way round, is omnivorous and is sometimes active more by day than by night. This is an omomyid, and is our Eocene ancestor. Omomyids were, slightly surprisingly, native to North America (as well as Europe and elsewhere) although there are no longer any indigenous primates in North America apart from humans and those who spread into Mexico from South America when it collided with its neighbouring continent less than three million years ago (and of course Sasquatches if they exist, which I myself doubt). This complementary distribution also happened to camels, who nowadays only exist where they originally didn’t. Although some omomyids did have smaller eyes, most of them did have rather large ones and therefore were likely to be nocturnal. There’s a rather unreliable principle, sometimes called Dollo’s law, that once a feature evolves or disappears the organism’s descendants won’t retrace its steps, and although it really doesn’t work very well it probably does apply to the omomyids in the sense that we probably descended from the ones who weren’t nocturnal, one of whom lived in Texas. At the time, it seems that most of the land surface of the planet was covered in rain forest. The Eocene was one of the hottest periods of all in the history of Earth, and there were even rainforests in the Arctic at the time. Hence Texas was, like everywhere else, full of thick, impenetrable jungle.

Omomyids weren’t carnivorous. Their teeth suggest they ate fruit like many other primates. Incidentally, while I’m talking about teeth, the presence of canines is not indicative of an omnivorous or carnivorous diet, and like other primates, herbivorous or not, omomyids did have canines. In fact humans have smaller canines than most other apes, including practically herbivorous ones such as gorillas. Getting back to the point, it’s hard to say that humans are descended from monkeys, partly because we just are monkeys, but also because going back to the Miocene gets you a huge thicket of lines of descent, including more and less humanoid forms not all of which are closely related to each other or ancestral to us. Consequently we can say we are descended from monkeys, though it’s unclear which ones, but we can say with much more confidence that all of those forms, including our ancestors, descended from omomyids, who were more like tarsiers than like monkeys as we tend to think of them today.

They did, however, have a sister group, the adapids. These were less similar to us, being entirely non-simian, but are thought to be ancestral to the prosimians such as lemurs, bush babies and lorises. These shared their habitat with omomyids and used to be thought our direct ancestors, but it turns out they are real examples of convergent evolution with monkeys and are only similar to them because their lifestyles were similar. They also lived in North America, for instance what was to become Wyoming. Modern apparent descendants of adapids have a “grooming claw”. Whereas most primate digits have nails rather than claws, one toe is sometimes specialised to enable the animals to groom themselves. This was thought to be confined to the likes of lemurs, but it turns out some monkeys have them too, and some adapids have them while others lack them, which was once thought to be crucial to deciding whether they were our ancestors. In fact it seems that grooming claws evolve sporadically throughout the primates and don’t indicate any particular pedigree.

Pushing back further gets us to the primatomorphs – the common ancestors of the primates and the flying lemurs or colugos, who are like flying squirrels, with a membrane between their limbs allowing them to glide between trees. The primatomorphs used to be thought of as also being ancestral to the tupaias or treeshrews, who have themselves been considered primates in the past, but very recently the treeshrews seem to have been revealed as being closer to rabbits and rodents. However, rodents are also fairly closely related to us, and at this point another example of irritating pedantry arises. It’s often said that rabbits aren’t rodents, and in fact this is true – hares, rabbits and pikas are all in the order Lagomorpha, who unlike rodents are completely herbivorous and have four incisors in their upper jaw by contrast with the two of rodents (and I think always eat their own poo in order to finish digesting it, unlike rodents, but I might be wrong there). However, above the level of species there really is no firm definition of any category of organisms, and lagomorphs are closely related to rodents, so as far as I’m concerned it’s nitpicking, and another example of masculinist information-hoarding and gatekeeping signalling if that’s a thing, to point out that rabbits are not rodents. What is interesting about all this is that rabbits, rats, colugos, tupaias and humans are all in a relatively huge group of relatives called Euarchontoglires. We have little in common not also shared with other mammals, but we are unusual among placental mammals in often being born with a vermiform appendix – some marsupials also are, so it’s not a diagnostic trait. The earliest known euarchontan, and also the earliest known mammal with fingernails and toenails, was Purgatorius, who lived almost immediately after the extinction of the non-avian dinosaurs. The problem, incidentally, with the palaeontology of small mammals is that often they can only be identified by dental remains because that’s all that survives. Hence you will see a lot of the oldest mammals referred to as something-odon or something-odonts, because that’s all anyone has to go on, initially, and this is true of Purgatorius, who was initially identified by bits of jaws with teeth attached to them.

The basic purpose of tracing our ancestors back this far was just to explain my reference to tarsiers in yesterday’s rant about veganism, i.e. who they are in relation to us. It’s possible to glimpse from their own ancestors who we were before the impact which ended the Cretaceous, and one of the interesting things about that is that not only do they lead us into the age of dinosaurs, but in the case of Purgatorius they might even have been around when it ended.

Veganism and Depressive Realism

Suppose you have a palette of paints which concentrates on dark blues, black and greys, but has hardly any warm, “sunny” colours like oranges, yellows and reds or the lighter shades of other colours.  And also suppose that you can paint representational scenes quite convincingly.  If you try to paint a flower meadow on a warm sunny day, you might run into problems although for all I know you could either “dither” dots of different colours to create a more convincing gamut or mix paints together to do so, which, since paint mixes subtractively – each pigment absorbs more of the light – would probably result in a darker picture than you’d be able to produce using a wider range of warmer colours.  You would struggle to produce a convincing or cheerful picture, though even in this situation you might do well with cornflowers, darker clouds in the sky or shadows.  Imagine a version of «Un dimanche après-midi à l’Île de la Grande Jatte» with more emphasis on the shadow in the foreground and by the trees, where the water looks dark and threatening and the sky gloomy.

However, should you be asked to paint a stormy seascape at night, you’d be in your element.  You’d be able to use the many dark colours at your disposal to create all sorts of detail and accurate portrayal of the foundering ships, the waves and the rain slashing down.  A scene, in fact, like the painting depicted on the back wall of René Magritte’s «La traversée difficile».  In such a painting, you would perhaps have problems with the room, with its yellows, pinks and white, but the shade on the chesspiece-like object would be much starker, as would that under the hand trapping the bird, and of course the seascape would be particularly well done.  In other words, in both cases you would find it easier to bring out the darker aspects of the paintings but struggle to portray the sunnier side of things.

One theory of depression is that people who are depressive – and incidentally I would be astonished if I wasn’t diagnosable as depressive, but I’ve never bothered to have it diagnosed because I don’t consider it a problem – have a problem with the neurotransmitter serotonin which would otherwise enable them to adopt a sunnier disposition more easily.  Evolutionarily speaking, serotonin seems to have originated in the enteric nervous system of the gut, where it’s associated with vomiting, among other things.  As the central nervous system evolved, it took up an important role in the brain, where, on this theory, it enables one to see the brighter side of things.

Note the racist connotations of associating darkness with negativity and brightness with positivity.  Had I written the sequel to ‘Replicas’, I would have subverted the idea of brightness as positive, but nobody did it as well as Orwell in ‘1984’ when he gave O’Brien the line “we shall meet in the place where there is no darkness”, which turns out to be a solitary confinement cell flooded with blinding light 24/7 to prevent the inmate from sleeping.  In my limited experience of mainstream literature, this is maybe the most inspired line I have ever read, which probably means I should read more widely.  The racism of associating darkness with negativity can, though, be subverted by noting the more positive aspects of acknowledging that side of reality.

Experiments have established that depressive people are more likely to cotton on to situations where they lack control.  Two groups of people were asked to switch a light on and off where the operation of the light was in fact (pseudo-)random, not controlled by the switch at all.  Half were diagnosed as depressive, the other half not.  The former realised much sooner than the latter that they weren’t really controlling the light at all.  The problem is, however, that the illusion of control, along with various other illusions which non-depressive people entertain, is probably adaptive in that it helps them cope by shielding them from the stark realities of how awful life really is.

I have a mini-hypothesis about how depression works on a brain cell level.  I think memories supervene upon pathways between neurons which can be strengthened or weakened by other inputs to those networks.  A repeated noxious stimulus leads to a strengthening input to such a circuit which makes it more memorable.  In depressive people, this happens more quickly than in people who aren’t depressive.  It’s also noted that anxiety and depression share many features and are probably substantially the same problem, insofar as they are problems at all.  This idea of noxious stimuli being reinforced more easily would also work for anxious people, although this common pathway isn’t necessarily associated with other aspects of the two situations.
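
Purely as an illustration of the shape of that mini-hypothesis, and emphatically not of any established model, here is a toy sketch in Python; the trace_strength function and the two rates are invented for the purpose, the only point being that a higher reinforcement rate makes the same repeated noxious stimulus stick sooner.

```python
# Toy illustration of the idea above, not an established model: a repeated
# noxious stimulus strengthens a memory trace towards a ceiling of 1.0, and
# a higher reinforcement rate (standing in for the hypothesised depressive
# or anxious tendency) makes it stick after fewer exposures.

def trace_strength(n_exposures: int, reinforcement_rate: float) -> float:
    """Strength of a memory trace after repeated noxious exposures."""
    strength = 0.0
    for _ in range(n_exposures):
        strength += reinforcement_rate * (1.0 - strength)
    return strength

# Entirely made-up rates, purely to show the difference in shape.
for label, rate in [("baseline", 0.05), ("depressive", 0.15)]:
    print(label, [round(trace_strength(n, rate), 2) for n in (1, 5, 10, 20)])
```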

Antidepressants do focus pretty much on serotonin, aiming to increase the concentration of the substance at the synapses of nerve cells using it, and this seems to work although much of the research focussed on dominance in chimpanzees, which brings up the issue of veganism for the first time.  Other apes are, I have to admit, in a different category than most other animals for me, since I am myself an ape as are, in all probability, you.  Hence my speciesism leads me to judge the idea of much experimentation without informed consent on apes as completely beyond the pale, although I can’t say I’m exactly keen on nasty things being done in the name of science to the likes of oysters either.  Besides this, I’m not sure the emphasis on dominance is particularly healthy.

The idea of depressive realism shouldn’t be taken too far because depression is more about selective than global perceptiveness.  That said, reducing the situation to brain chemistry is misleading, and the accuracy aspect here comes into play because someone who is, for example, thrust into a depressing situation such as poverty, physical ill-health or living in an oppressive régime is likely to get depressed because of the accuracy of their perception rather than inaccuracy.  There’s a disturbing tendency along the lines of “there’s no such thing as society” going on here, where the mental problems one might have are centred more on the internals of the individual than on the circumstances foisted on them by external forces or people.  It isn’t either/or of course, but accurate perception can be depressing, and if one becomes aware of how bad things are, one may become depressed.  This isn’t, by the way, supposed to diminish the reality of depression or anxiety states as real conditions:  assuming I have myself been depressive, I could definitely characterise it as a tangible, physical issue like a leaden millstone pulling me down into a pit of despair, just as trying to run while having the ‘flu would be pretty hard and ill-advised.  That said, the realities of living in a carnist world are depressing and worrying, and that’s not my attribution as far as I’m concerned but an observation about the external world along the lines of “a stormy night at sea is dark, a sunny day in the park is bright”.  In a sense those qualities too are subjective – a tarsier or an owl wouldn’t see the night as dark as I do, for example – but there are two things going on here.  Firstly, being depressive or anxious, even if it is a form of preëxisting neurodiversity, could prime one to see these negative aspects of non-human animal abuse and lead to one taking the rational step of becoming vegetarian or vegan.  Secondly, it’s pretty depressing and worrying to be confronted with the sheer scale of cruelty and murder involved in maintaining carnism, so even if you’re not depressed and anxious before, it wouldn’t be surprising wert thou so after thou hadst dispensed with the rationalisations thitherto shielding thee therefrom.

At this point I want to insert a caveat regarding judgementalism.  In spite of the fact that veganism is clearly the way to go, in the sense that it’s a moral imperative, that doesn’t mean I think badly of carnists.  We are constantly surrounded by slaughter and suffering on an unimaginable and horrifying scale.  Badgers eating chicks, cuckoo chicks throwing their foster siblings out of the nest to die, spiders paralysing flies and gradually eating them at their leisure, our own willingness to kill our parasites, our immune response wiping out bacteria in their billions – the whole biosphere is an aeon-long slaughter fest.  Next to this, anything the human population can do to reduce this simply by pursuing a plant-based diet (which is of course not what veganism is) is like the fairy tale of the bird sharpening her beak on the diamond mountain in Pomerania once a century.  Consequently, even if someone eats nothing but dead flesh for that same century, it makes practically no difference to the death and agony characterising existence.  What matters to me is my agency and ability not to be part of the problem, not what other people do.  That said, I do believe ego defences come into play when people try to justify the unjustifiable.

An example of this rationalisation was brought to my attention this morning in the form of a research study from the University of Alabama purporting to demonstrate that vegetarians are more likely than carnists to suffer from depression and anxiety states and to practise easily observable self-harm, and therefore that carnism is good for mental health.  The context of this is the advocacy of veganism and vegetarianism as healthy diets, which to me is a complete red herring because it has nothing to do with being vegan except insofar as one should be a good example to carnists.  Before I go on to cover the issues with their idea, I want to examine the possibility that this is correct, and it may well be, even though this is tangential.

Plant-based diets tend to be lower in easily absorbable iron than those including red meat, as I understand it, and of course an unsupplemented clean plant-based diet will have no cyanocobalamine (vitamin B12) in it at all as well as no dietary sources of vitamin D.  There is in fact an Afrikan plant which does contain vitamin D at a dangerously high level – all substances are toxic, just some are more toxic than others, and vitamins D and A are particularly poisonous.  It’s been alleged that vitamin D is antidepressant although in fact as I understand it this can’t be disentangled from the lifestyles of people who are deficient in the hormone (vitamin D is in effect a steroid hormone, made in the skin when it’s exposed to sunlight and then activated elsewhere in the body), who may be stuck indoors and rather reluctant to exercise because they’re depressed.  That said, the deficiencies associated with a poorly-designed plant-based diet may well lead to anaemia, which is associated with depression and anxiety if emotions are labelled in that way.  Moreover, serotonin is itself synthesised from the amino acid tryptophan, which is of course found in animal protein (as well as vegetable protein).  There may also be issues regarding essential fatty acids, which seem to influence the electrical activity of the brain.  Therefore there are indeed firm reasons for supposing that someone who simply stops intentionally consuming animal products, without the diet having been planned rationally, may well become depressive, or more depressive.  However, and I can’t emphasise this too strongly, this way of looking at things is completely skewed ethically.

It’s a basic principle of applied ethics that unnecessary suffering and death should be avoided, and this is the basis of veganism.  Vegetarianism is a diet, more or less. Veganism is more akin to pacifism.  It’s the attempt to minimise violence and killing, and it’s a moral imperative.  Ultimately there are no excuses for not being vegan if you are human and have a conscience, no matter when or where you are.  If you can’t be vegan where you are or given your current lifestyle, that lifestyle or location must change, quite possibly by changing the political situation which has forced you into a food desert.  On the whole this is not racist, but even that possibility should not stand in the way because veganism is an absolute ethical priority and trumps racism.  There’s an easy analogy with cannibalism here.  If it were discovered that human flesh was the only source of an essential nutrient, global capitalism would of course start farming people and murdering them to provide it, but from the current perspective of most people as yet uninfluenced by pro-cannibal propaganda in the mass media it probably seems wrong to do that rather than find a way to synthesise it or get it from other sources.  The absence of cyanocobalamine and perhaps the low level of tryptophan in ethical food sources does not imply that one should get it from unethical sources but that there is a moral obligation to remedy the situation by finding ethical ways of providing it.

Therefore this piece of research, which appears to establish a correlation between vegetarianism and mental illness, has its priorities wrong and comes close to attempting to justify carnism, which is never permissible.  If the confounding variable – the depressing and anxiety-provoking reality of animal abuse on a vast scale, which is either perceived more clearly by depressive and anxious people as a motive for not wanting to be part of it, or borne in upon them after the fact once they’ve dropped their excuses for eating animal products – can be removed and a difference can still be demonstrated, the appropriate response is to find a way to remedy it without compromising the obligation to be vegan.

It’s so very common for arguments against veganism to do two things.  One is to cast veganism as if it’s a diet.  It isn’t, and since the animal with whom we have the most conscious interaction is Homo sapiens, most of it involves us behaving compassionately towards each other, individually and en masse.  The other is not to take the moral necessity to be vegan seriously, and try to find reasons why it’s bad in some non-ethical realm of discourse, usually nutritional.  I personally find it a little suspicious that this study was conducted at the University of Alabama because of its location in a former Confederate State which previously had a decidedly non-vegan interest in certain human animals of Afrikan extraction, regarding them as chattels, and I can’t help thinking that this might have a bearing on their attitude to other species, so the question of racism does indeed arise here, though perhaps not in the direction which some anti-vegans might like it to.  But maybe I’m wrong.  Maybe the University of Alabama is a bastion of liberalism in other areas.  This study, however, is not an example of liberalism.

11:51 am GMT, Thursday 11th September 1986, Rickmansworth

Some people have too much time on their hands. Other people know when to stop. I am in the former category. Then again, maybe we all need to escape sometimes and preserve our precarious mental health.

Spoilers for ‘The Hitch-Hiker’s Guide To The Galaxy’. I sometimes think I should just succumb to the inevitable and write a couple of hundred blog posts on the thing, but maybe not yet. But I will write at least one at this point.

“And then, one Thursday, nearly two thousand years after one man had been nailed to a tree for saying how great it would be to be nice to people for a change, a girl sitting on her own in a small café in Rickmansworth suddenly realized what it was that had been going wrong all this time, and she finally knew how the world could be made a good and happy place. This time it was right, it would work, and no one would have to get nailed to anything. Unfortunately, before she could get to a phone to tell anyone, the Earth was unexpectedly demolished to make way for a hyperspace bypass, and so the whole idea was lost forever.”

This is of course the opening narration from Fit the Second of ‘The Hitch-Hiker’s Guide To The Galaxy’, foreshadowing the big revelation a while later that the purpose of the Earth was to calculate the Ultimate Question to the Ultimate Answer of Life, The Universe And Everything. There are some problems with it. For instance, we’re told that it was right and it would work, even though it turns out the computer program has been messed up by the arrival of the Golgafrincham B Ark two million years previously with a hold full of frozen middle management people and telephone sanitisers (incidentally one of my aunts was one of the latter), so we might expect it not to work and to be wrong, but apparently it did. However, if we drop all this and just stick to the idea that Fenchurch, for it is she, is the final nexus of the computer program about to provide the readout of a successfully executed procedure to calculate the Ultimate Question, onto whom all events in the Earth are converging, the question arises in my mind as to whether there are any clues about what the content of that revelation is.

In order to calculate the significance of Fenchurch’s life, it might be instructive to calculate the exact time this is supposed to be happening. We know the end of the world is on a Thursday because the narrator says “On this particular Thursday” just after Arthur says “I never could get the hang of Thursdays”, even though he doesn’t himself seem to be aware that it definitely is a Thursday. There are some problems with this. For instance, the conversation between Ford and the publican of the Red Lion in Cottington includes a reference to a football match involving Arsenal, which makes my poorly sports-educated brain think it’s more likely to be on a Saturday. This issue has unsurprisingly been discussed elsewhere. Another sports-related incident is the attack of the Krikkitmen on Lord’s Cricket Ground during the Ashes, which had taken place two days previously. Oddly, this is not mentioned by anyone on that “terrible, stupid Thursday” but this is along the lines of stuff that happens in the Whoniverse being glossed over by the general public, who only see what they want to see. After all, there is such a thing as a Somebody Else’s Problem (SEP) Field, an issue to which I shall return shortly.

The Ashes match took place some time in the 1980s. This is mentioned in ‘Life, The Universe And Everything’, chapter two I think. Every Ashes series England hosted in the 1980s began in June, which contradicts other data. At the start of the first episode of the TV series, the sun rises in southwestern England at 6:30 am and the Earth is destroyed at 11:46 am. This can be assumed to be GMT rather than BST because the pubs are open and crowded and Arthur refers to the time of day as “lunchtime”. It places the date as one of four possible dates through the year, which drift somewhat during the 1980s because of the leap years 1980, 1984 and 1988. In fact the sun never rises at precisely 6:30 am in Greenwich, but at 6:29 or 6:31 instead, but in the West Country it will rise at a slightly earlier time because it’s further west. Taunton is three degrees, six minutes of arc west of Greenwich and Salisbury one degree, forty-seven minutes west. Arthur must live near Taunton because when he gives Fenchurch a lift, he almost takes her there, and Ford bought a towel from the Salisbury branch of Marks And Spencers, so the fictional village of Cottington is likely to be somewhere between the two of them. Consequently, the reported sunrise will be slightly earlier than the official sunrise at Greenwich, so the reported time is still possible. The four possible dates are 7th or 8th March or 11th or 12th September. Thursday falls on one of these dates five times during the 1980s: 11th September 1980, 8th March 1984, 7th March 1985, 12th September 1985 and 11th September 1986. It clearly isn’t in March according to the television series because it’s a warm sunny day, so it can only be 1980, 1985 or 1986. The Ashes took place in England in 1985, but as I said that series began in June so this is unhelpful because it clearly conflicts with reality, so I’m going to put that down as a variation between our universe and this particular Hitch-Hiker’s universe, whereof there are of course many. One problem with the earliest date, 11th September 1980, is that it occurs before the TV series was broadcast, and Marvin appeared on the radio in 1981 to claim that the world might end that year, so the only possibility remaining is Thursday 11th September 1986.
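
As a quick sanity check on the calendar arithmetic, though not on the sunrise reasoning, a few lines of Python (using only the standard library) confirm which of those four candidate dates fell on a Thursday during the 1980s:

```python
# Which of the four candidate dates (7th/8th March, 11th/12th September)
# fell on a Thursday at some point during the 1980s?
from datetime import date

candidates = [(3, 7), (3, 8), (9, 11), (9, 12)]
thursdays = [
    date(year, month, day)
    for year in range(1980, 1990)
    for month, day in candidates
    if date(year, month, day).weekday() == 3  # Monday is 0, so Thursday is 3
]
for d in thursdays:
    print(d.strftime("%A %d %B %Y"))
# Prints exactly the five dates listed above, the last being Thursday 11 September 1986.
```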

As for the time, it can also be calculated that the crucial moment of readout was 11:51 am GMT. Slartibartfast states that “five minutes later, it wouldn’t’ve mattered so much” if the Earth had been destroyed, because by then the readout would already have happened, so we know that it would have happened by five minutes after 11:46 am, in other words by 11:51 am GMT.

That, then, is established. Fenchurch realised the Ultimate Question at 11:46 am GMT on 11th September 1986, and it would’ve been revealed to the world by 11:51, except that by that time there kind of was no 11:51 except on Arthur’s digital watch, where it was pretty meaningless because very few other planets would have similar orbits or rotations to ours, and then there’s the whole “nailed to a tree” thing, which didn’t happen anywhere else either as far as anyone’s been told. But what’s happening in the café? Presumably her presence, demeanour, state and life story all have a bearing on this, as have the general configuration of objects and timing of events in that place. It’s been pointed out that the significance of a young woman suddenly having an idea in a café is different today, because J K Rowling famously wrote in cafés in the early 1990s, taking her baby along to help her sleep so that she could get on with writing. I was doing something similar at the time in parks with my coursework for herbalism training. However, Rowling didn’t have an “aha” moment like Fenchurch’s in a café, but on a train, so it doesn’t quite work, and Fenchurch isn’t a parent.

At this point I should return to the subject of the SEP Field. This was erected around the Starship Bistromath, which is of course powered by the Bistromath Drive. Bistromath is an apparently rather unpopular joke considered by some to be a failure, which capitalises on the idea that when you try to pay a restaurant bill and divide it fairly between members of a party, it never seems to work out properly. There are problems in the writing here, I think, in that there’s no direct connection between that idea and being able to relocate a spaceship, unlike the Infinite Improbability Drive which kind of works on something a bit similar to quantum mechanics. That said, if the Heart of Gold can use that kind of fake maths, why shouldn’t the Starship Bistromath? More to the point of this post, the rocket science possibilities of Bistromathics are not important to the situation I’m about to apply it to.

Numbers are not absolute but depend on the observer’s movement in restaurants. In reality this is probably due to the effects of the likes of the Pan-Galactic Gargle Blaster, but passing swiftly over that as an anachronistic celebration of a substance abuse syndrome and concentrating on the actual “maths”, there are three non-absolute numbers: the number of people for whom the table is reserved, the recipreversexclusion and the most relatable relationship between the number of items on the bill, the cost of each item, the number of people at the table and what they are each prepared to pay for. Clearly these can be applied elsewhere, as the SEP Field involves the use of the recipreversexclusion. It’s a number which can only be defined in terms of what it isn’t.

Consider now the Ultimate Question/Answer problem. The two cannot exist in the same universe. You can either formulate the question correctly and never get an answer or find the answer, i.e. forty-two, and always fail to formulate the question correctly. This is remarkably similar to the recipreversexclusion and is even a number in one case. It’s also possible to conclude that Fenchurch herself would have to have a blind spot for the number forty-two or the two would coexist in her mind at the same time, which is impossible. In fact it’s even possible that she’s completely dyscalculic, and I don’t think there’s any evidence she isn’t.

Bistromath is shown working in the starship via a simulation of a meal at an Italian restaurant, which is so precisely defined that even moving a breadstick would mess things up. Fenchurch is sitting alone in a small café in Rickmansworth, another catering establishment, drawing a conclusion relating to what seems very like a recipreversexclusion. At this point I’m going to make a leap of faith and decide that Bistromath is going on in the small catering establishment concerned, just as it did two days previously at Lord’s Cricket Ground.

Probably the most relevant data regarding the events in that establishment are available in the opening to the second episode of the TV series. It can be seen from this that the café concerned is called the Silver Slipper, is at number 286, on the left-hand side of its road and is red on the outside with blue and yellow walls and the same shade of red for its furniture as the outside. Fenchurch herself is seated at the second table on the right, next to the counter, facing the window, and has a black coffee with sugar and the remnants of a slice of toast at the table. During the shot, she stirs her cup twenty-six times before realising the meaning of life, pausing once after thirteen stirs. I would conjecture that she has in fact stirred her coffee forty-two times exactly by that point. Initially I thought her insight was connected to the turbulence of the milk but in fact the coffee is, unusually, black with sugar. She’s also wearing a Newnham College, Cambridge University scarf. This is a women’s college, of which the famous suffragist Millicent Fawcett was an alumna along with a very large number of other famous and successful people. The other two occupants of the café are a man in a donkey jacket looking at page three of The Sun in the window and the man who has apparently just served her, behind the counter. There are eight cups on the counter, a further cup being filled from the coffee machine, an unaccompanied cup on the table behind her and one sugar bowl per table. A poster behind her has the word “star” in red on it. All of these can probably be interpreted as clues. The total number of coffee cups is probably twelve. Fenchurch has red trainers, red nail polish, a fawn trenchcoat, long straight fair hair and is white. Incidentally, the actor playing her in the TV series is not credited and generally unknown. It’s later revealed that she was conceived in the queue at Fenchurch Street Station, which judging by the probability that she’s an undergraduate student in 1986 would have been in the 1960s, calling to mind the “early ’60s sitcoms” of which Arthur spoke. She also has a dismissive and annoying brother called Russell who thinks she’s insane.

The recipreversexclusion has already been accounted for: it seems to be participating in the thoughts currently occurring to Fenchurch. Her possibly specific number-blindness for the number forty-two may have been triggered if she was counting the number of times she was stirring her coffee. I think she was also expecting to meet someone there who hasn’t turned up, because that would be the first non-absolute number. It’s also possible that that person was about to arrive because that would also be a recipreversexclusion and perhaps that person would’ve been the first recipient of the Ultimate Question. It’s also remarkable that she didn’t think to share the information with the person behind the counter or the guy sitting nearby, and this too might provide a clue as to what it is. She never receives a bill, of course, what with the world ending and all, so that aspect of Bistromath doesn’t come into play and is in any case after the fact of the readout. It might even be that money itself would cease to have any meaning precisely because of the Ultimate Question, which makes a kind of sense because the relationships between the number of people at the table, the number of items on the bill (either one or two) and the cost of each item are all fundamentally uncertain when combined. The street number, 286, seems to be entirely unremarkable, which chimes with the integer forty-two, which is also unremarkable and “the kind of number you could introduce to your parents”. So far as I can tell there is nothing whatsoever significant about that number or its relationship to forty-two, which from a real world point of view is a little surprising.

I haven’t been able to draw much of a conclusion from this castle in the air, but if I were to take it any more seriously I could practically found a religion on it. It is interesting that the date has to be September 11th though.

African-Americans And The North Afrikan Buffer

It’s inevitable that a privileged person blundering into a topic regarding discrimination against less privileged groups will get something wrong, perhaps even offensively so. Hence there’s an argument that I, the world’s whitest woman, shouldn’t even be writing this but I’ve also found that when I hold back from doing something, other people will wade in and do a much worse job because they don’t suffer from my scrupulosity. Consequently, I just am going to say this, despite my ignorance on the matter.

“African American” is a term used widely in the States to refer to the descendants of formerly enslaved Black people born in the USA. I must reveal at this point the first bit of ignorance I’m aware of: I presume that the Afrikan origins of these people’s ancestors cannot be pinned down easily to specific ethnic groups in Afrika (I’ve gone into the K spelling before), and they usually seem to have lost their names, hence “Malcolm X”. The diversity of people originating from south of the Sahara in that continent, culturally and genetically, is considerable. For instance, at least thirteen language families are spoken on the continent, excluding Indo-European but including Austronesian in Madagascar, most of whose languages are spoken in the Pacific and Southeast Asia. Genetically, ten types of human Y chromosome dominate indigenous populations over the whole continent compared to the six of the whole of Europe, the three of the Americas, the one of Australia and the nine of Asia. Regarding mitochondrial DNA, inherited from the mother, there are forty-three groups outside Afrika and eighteen within it, again in terms of indigenous population. I don’t know much about Afrikan cultures but I would expect similar diversity. The genetic element of this diversity is, as I’ve said previously, a good a priori argument for the idea that if genetics are an important factor in human intelligence, Afrikans can be expected to be more intelligent because they tend to be less inbred than the rest of us. Therefore one of the problems with being African-American, I imagine, is not knowing your heritage beyond the awareness that your ancestors were from Afrika south of the Sahara. However, it would be unfair to suppose that an African American would simply leave it at that all of the time and I’m also aware that most enslaved Afrikans outside the continent would have been from West or Central Afrika. Hence the term “African American” is somewhat inaccurate. It does not in fact refer to all people whose ancestors originated from Afrika. There are forty million African Americans in the restricted usage of the term, but also eight hundred thousand Americans whose ancestors originated in North Afrika. Of those, thirteen hundred identify as Berber. But why should this matter? Surely the important thing about African Americans is that their ancestors were enslaved and disenfranchised, and that they’ve been forcibly wrenched away from their own history, isn’t it? Well, yes and no.

There is a problem with North Afrika being left out of the picture. North Afrikans themselves tend to identify as Arabs more than Afrikan, and historically as Roman or French citizens, among a number of other things. Egyptians have had a slight tendency to regard themselves as something apart too, and of course there are the Berbers and the Tuareg. A different kind of imperialism is involved here. Firstly, there was a time when the provinces of Africa, Aegyptus, Creta et Cyrenaica and the two Mauretaniae were part of the Roman Empire, as was much of Great Britain and all of Asia Minor. A telling aspect of this grouping is that Creta et Cyrenaica includes both a stretch of coastal North Afrika and the nearby island of Crete, which might be more often thought of as part of Greece nowadays, and therefore European. Likewise, Carthage is not thought of as an Afrikan power even though it was. There’s also a tendency for Ancient Egypt to be considered separately from the rest of Afrika even though it was one of the first and most durable post-Neolithic civilisations of all. It’s almost like, if something is influential and long-lasting in terms of Western history but happens to be in Afrika, it automatically becomes an honorary European culture because “that kind of thing doesn’t happen in Africa”. Of course it does, in other parts of the continent as with Timbuktu, the Songhai Empire and Great Zimbabwe, but if it’s close enough to Europe, somehow it no longer counts as Afrikan. North Afrika has also been subject to European colonialism and imperialism in recent history, although it may have been spared some of the atrocities committed by Europeans further south. There has also been Arab imperialism in North Afrika, but this also affected some of the rest of the continent such as the Swahili coast, the Sudan and Abyssinia, and there was also an Arab slave trade from East Afrika.

The reason I think this is important is cosmopolitanism in the Ancient Greek sense. I don’t want to deny the importance of Afrikan identity in terms of a common history, ongoing, of external imperialism and slavery, but there isn’t enough emphasis on the fact that the human race forms a continuum, genetically and socially. “Blackness”, like gender, is substantially related to how others perceive one’s physical form and has major consequences for one’s social status. Hence having darker skin may be the most important characteristic used to justify racism. If, however, the transition between Europeans and “Africans”, and between “Africans” and Middle Eastern people, is included, it makes it harder for white people of Northern European origin to “other” Afrikans. Thus it might on the one hand seem that making this kind of claim about Afrikan identity is a form of erasure, but it’s also true pan-Africanism. Another aspect of this is that pan-Africanism would recognise the various peoples in North Afrika who don’t identify as Arab, such as the Tuareg and Berber.

The Tuareg are particularly significant here in terms of skin tone, but before I get to that I want to point out a parallel between the social construction of ethnicity and that of gender. These are both externally defined by factors over which one has little control. Just like gender, ethnicity in terms of oppression and privilege is dominated by the visual feature of skin tone, but although white people do form a relatively homogeneous genetic group, Black people don’t. All the latter have in common genetically is the few alleles of about three or four genes related to the inheritance of skin colour. White people are likely to perceive Melanesians and Australian Aboriginals as Black and therefore place them in the same category as African Americans even though Melanesians and Australian Aboriginals on the one hand and White people on the other may have more in common with each other genetically than with one of the many genetic groupings of individual Afrikans south of the Sahara. Back to the Tuareg though. These people, with whom, like many other white Europeans, I share one of the most popular versions of mitochondrial DNA, H1n, vary in skin tone between what probably most white people would think of as white and what they would definitely think of as black. I’m not clear about whether they themselves attach any significance to this, but just as in the white Irish and British population hair and eye colour are generally seen as workaday variants without much significance, so among the Tuareg this variation may be considered just as trivial, although there is a geographic distribution with the Malian Tuareg being much darker than, say, the Libyans to whom many Europeans are most closely related.

There is also a problem with the idea that all Caucasians are white. Caucasian “ethnicity”, such as it is, is based on measurements of skulls from the eighteenth century, and the idea was that because Noah’s Ark had settled at Mount Ararat, all humans alive had descended from that area and also that the Caucasians were the peak of human beauty, so the very concept is deeply racist in its origins. It also corresponds to the idea that the world is running down from a golden age where those who originate from further away from the Caucasus are inferior because of degeneracy. The idea of Caucasian beauty is descended from the Mediaeval idea of Circassian beauty – the notion that the Circassians of the Northwest Caucasus were the most beautiful people in the world. I suspect that the white European mind at the time associated whiteness with beauty. The only thing is, Caucasians have only been white for the last few thousand years.

There are at least eight genes in the human genome strongly associated with skin colour, the most significant of which are SLC24A5 on chromosome fifteen and SLC45A2 on chromosome five. The others are OCA2, HERC2 and the apparently less important TYR, TYRP1, LRMDA and MC1R. Analysis of Mesolithic and Neolithic European skeletons from modern-day Spain, Luxembourg and Hungary has shown that SLC24A5 and SLC45A2 would both have been of the allele (variety) which conferred dark skin eighty-five centuries ago. The fair-skinned allele of SLC45A2 was rare in Europe until only fifty-eight centuries ago, around the time decipherable writing such as cuneiform and Egyptian hieroglyphics was being invented and not long before the Pyramids were built. After that, fair skin became rapidly more common because of the advantage conferred by being able to synthesise vitamin D from weak sunlight, and other characteristics thought of as Caucasian were brought by the people I think of as the Aryans from the area north of the Black Sea. However, by seventy-seven centuries ago there were people in today’s southern Sweden, which is incidentally the original Germanic homeland, who did have the pale-skinned allele and also the relevant alleles of HERC2 and OCA2. The relevant HERC2 allele is the main determinant of eye colour and is associated with fair hair, blue eyes and, to a lesser extent, fair skin. I personally have this allele homozygously, as have our children, and Sarada has it heterozygously. It’s also common in North Afrika. In my case it’s on a segment of my genome originating from Sweden. OCA2 is also on chromosome fifteen and is implicated particularly in one form of oculocutaneous albinism, which affects melanin in the skin, hair and eyes, including the retina, and therefore interferes with sight. I have to admit that I know very little about TYR, TYRP1, LRMDA or MC1R.

SLC45A2 codes for a transporter protein necessary for melanin synthesis and is also relevant to coat colour in some placental mammals other than primates. SLC24A5 codes for a protein used in the melanocyte Golgi apparatus, a kind of “conveyor belt” in cells which packages proteins into lipid envelopes and is currently very busy in a lot of humans churning out a certain Coronavirus. Melanocytes, by the way, are the cells which confer skin colour. One interesting thing about the protein concerned, which is called “sodium/potassium/calcium exchanger 5”, is that the versions causing light or dark skin differ by a single amino acid, encoded by three base-pairs on the relevant stretch of DNA. This single amino acid – alanine or threonine depending on the allele – has vast social consequences for our species, as it helps to decide skin colour.
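
To make that concrete, here is a tiny illustrative sketch. The codons below are generic alanine and threonine codons rather than necessarily the exact ones at the SLC24A5 site (the variant usually labelled rs1426654); the point is simply that every alanine codon begins GC and every threonine codon begins AC, so changing a single base is enough to make the swap.

```python
# Illustrative only: how one base change can swap alanine for threonine.
# Generic codons, not necessarily the exact ones at the SLC24A5 locus.
CODON_TABLE = {
    "GCA": "alanine", "GCC": "alanine", "GCG": "alanine", "GCU": "alanine",
    "ACA": "threonine", "ACC": "threonine", "ACG": "threonine", "ACU": "threonine",
}

ancestral = "GCA"              # a hypothetical "dark skin" alanine codon
derived = "A" + ancestral[1:]  # the same codon with its first base changed from G to A

print(ancestral, "->", CODON_TABLE[ancestral])  # GCA -> alanine
print(derived, "->", CODON_TABLE[derived])      # ACA -> threonine
```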

Melanin is the only mammalian pigment unless you count haemoglobin. Even red hair contains a form of melanin which is more liable to be altered by ultraviolet light, and in fact the apparently ginger Egyptian mummies probably wouldn’t have been ginger in life because the “gingerification” of melanin will always occur eventually. The same applies to ginger mammoths, who weren’t ginger before they got frozen. Other vertebrates also have melanin but other pigments too. Technically this is one reason why there are so few mammals who could be described as green in colour, unlike many lizards and amphibians. It also means that genuine mammalian albinos, with pink eyes, white hair and skin, are often unlike albinos in other vertebrate classes, who are often coloured but “washed out” – albino birds, for example, often have coloured plumage as well as many white feathers. Now the question arises in my mind of whether white humans could be seen as on some kind of albino spectrum. There is a “milder” condition called “leucism”, strictly a different mechanism from albinism, seen in many species, in which an animal is paler but not completely devoid of melanin. Having no melanin at all seriously interferes with your eyesight because the inside of the eye is blacked out with it, allowing it to function as a camera and therefore enable you to see detailed scenes, and this blacking out is also necessary for healthy development of the optic nerve and light-sensitive cells, which kind of grow towards the light like plants do but can’t do that in the “glare” caused by albinism. Therefore most white Caucasians are not really albinos, but we could be thought of as leucistic. Oddly, although Homo sapiens sapiens herself is originally black, the chances are that our earlier ancestors weren’t, because chimpanzees are pale-skinned, and it’s thought that we only developed dark skin on leaving the rainforests. I’m not sure about this idea because bonobos are just as closely related to humans as chimpanzees are and have black skin, as do gorillas, although orang utan aren’t and in fact can even have blue eyes. Also, Neanderthals were fair-skinned and possibly also blond and blue-eyed, and had straight hair, so to some extent would’ve looked like Britt Ekland or Agnetha Fältskog. The fun thing about this is that it seriously undermines the racist idea of today’s Afrikans being more primal or primitive. Afrikan humans, unlike the rest of us, are also genetically the “purest” humans, with little or no Neanderthal or Denisovan DNA.

This brings up the question of hair texture, about whose genetics I know little. Straight hair has circular follicles and is therefore cylindrical. The more elliptical a hair strand’s cross-section is, the curlier it tends to be, and this is linked to the shape of the follicles. There are some unusual hair textures such as beaded and triangular, the latter being “uncombable hair syndrome”, but leaving aside those rarities, hair can be classified in various ways, though the most common seems to be the FIA system, which specifies twelve variants from 1a to 4c along with thickness and volume. My own hair is 3b, and this is where it gets contentious.

Hair texture seems to be ultimately defined by dozens of genes as well as by what happens to the hair afterwards. Because the relevant genes are scattered widely across all the non-sex chromosomes (autosomes), the inheritance of hair texture is very complex. I have slightly more genes associated with straight hair than with hair of other textures, but because of this wide distribution the stretch of DNA on one of my chromosomes which originated from North Afrika does, unsurprisingly, include hair texture genes. My elder brother, incidentally, has type 4 hair and the last time I saw him he had an Afro, although the relevance of this is questionable and may be misleading. All this means that my hair has a slight tendency to dread and a strong tendency to go frizzy, even though I am, as I’ve said, the world’s whitest person who isn’t actually an albino, and my hair bleaches so easily in sunlight that in Spain I’ve been referred to as “rubio” – “blond”. I can’t really comb my hair with the kind of comb designed for straight hair. Consequently, if I’m not careful I could stand accused of cultural appropriation because of the “natural” state of my hair. Dreads are historically of course not confined to Black people and may even be commanded by the Bible, but a white person with dreads is generally perceived as culturally appropriating, so it just ain’t gonna happen to my hair, although I have to put quite a lot of effort into preventing it.

There is a wider point to this than just obsessing about the state of my locks. The reason my hair is like this is my ancestry. Stray genes have wandered from their homelands, as it were, into this Northwestern European person, because people are not in sealed boxes genetically, and even if parents only have children with near neighbours there can be spread across great distances. My North Afrikan ancestry is still somewhat mysterious to me, although I suspect it’s to do with eighteenth century Barbary piracy and its associated white slave trade, which is another difficult subject to raise because of the enormous importance of the Atlantic slave trade in the family history of African Americans and Afro-Caribbeans. I can’t “go” there right now and there’s a more nuanced and mediated story to be told which is currently unknown to me.

Genes have wandered into North Afrika as well. Besides having been part of the Roman Empire, it was also subject to Vandal colonisation. As the Empire collapsed, the Vandals and Alans took Corsica, Sardinia and the North Afrikan coast, including the former Carthage, before being conquered themselves by the Byzantines. The Vandals were a Germanic people once native to what’s now Southern Poland, and the Alans were from, of all places, Central Asia. These people will have left genetic traces in the area too. The Vandals had earlier settled in Iberia, which indicates the connection between parts of Europe and the Roman provinces further south.

Speaking of “further south”, I’ve rather clumsily avoided using the term “sub-Saharan Afrika” in this post. This is because the only place “sub-Saharan Afrika” is “sub”, i.e. “below”, the Sahara is on a map on a wall with North at the top, and the reason North is at the top is that the people who made the maps concerned were from the North. The “top” of the planet, if anywhere, is the outside, i.e. the bit we all live on. Thus it’s Afrika south of the Sahara and not the other thing, any more than Transylvania is on the other side of the forest.

Another scrappy and difficult topic here is the idea that we are all Afrikan. There is obviously a sense in which we are indeed all Afrikan, in the same sense that I’m Scottish even though I was born in England, as were my parents. It’s true that Homo sapiens originates from that continent and it’s fair to acknowledge that most of what makes us human happened there. Just as a two-year-old child is said to have learnt most of what she will ever learn in her life already, by the time our ancestors emerged from that continent we could be thought of as having invented and discovered most of what we ever will. In another sense, most of human history took place there. However, it doesn’t seem fair to define being truly Afrikan out of existence that way. We need to admit that it’s the cradle of civilisation and humanity, to be sure, but not to the extent that we pretend it means nothing to have Afrikan ancestry. Nor is it true that most of us outside Afrika can trace all of our ancestors back to the continent throughout the time we’ve been members of the genus Homo, because non-Afrikans are distinctive in having Neanderthal and Denisovan ancestry and those groups were never there. I often wonder whether, if we were to travel back in time and witness the behaviour of Palaeolithic humans, we would have a kind of “aha” moment when we realised where a lot of our problems today started.

This is getting a bit scrappy, but there’s still something I haven’t covered regarding skin tone. I’ve already mentioned human albinism and the question of whether white people are albinos. What I haven’t mentioned is the rather surprising distribution of albinism in the global human population. Caucasian albinos are in fact very rare. Only one person in twenty thousand with mainly European ancestry is albino, in spite of the fact that we’re all leucistic. In Afrika as a whole, the incidence is one in five thousand, and in Southern Afrika it’s as high as one in a thousand, which is as common as the global incidence of Down Syndrome, twice as common as Turner Syndrome and half as common as having an XXY karyotype. The life expectancy of Afrikan albinos is significantly lower for various reasons, notably because of ultraviolet radiation from the tropical sun but also, most unfortunately, because of prejudice against them and the fact that they are sometimes murdered for religious reasons so their body parts can be used in rituals. I’m all for cultural relativism but this doesn’t seem right to me.

The question is, then, why are there so many albinos in Afrika? Why is it more widespread in populations which are mainly dark-skinned, and why is it more likely to occur in places where it’s more threatening to life for a number of reasons? One would expect the reduced likelihood of having children to create selection pressure against the characteristic persisting, unless it resulted from fresh point mutations and was therefore genetic but not inherited. Albinism is a recessive characteristic: you can only be albino if both your parents carry the allele. Doing the maths, this means that in Southern Afrika the allele would have a frequency of about one in thirty, so roughly one person in sixteen would be expected to carry a single copy. It’s thought that the reason it’s so common is that the parents are more likely to be fairly closely related than elsewhere, although I’m not sure about this because of the situation in South Asia, where albinos aren’t at all common. It doesn’t explain why the allele exists, and I’m wondering if there’s a selective advantage to being heterozygous in this respect, like sickle cell trait, which confers greater resistance to malaria, or red-green colour blindness, which may give XX carriers better colour vision. Apart from that, I think it reflects the greater genetic diversity of the continent.
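
Here’s the arithmetic spelt out, as a quick sketch assuming simple Hardy-Weinberg proportions and a single recessive allele, which is of course an idealisation:

import math

incidence = 1 / 1000        # frequency of albino (aa) individuals in Southern Afrika
q = math.sqrt(incidence)    # frequency of the recessive allele, about 0.032
p = 1 - q
carriers = 2 * p * q        # frequency of unaffected carriers (Aa)

print("allele frequency: about 1 in", round(1 / q))          # about 1 in 32
print("carrier frequency: about 1 in", round(1 / carriers))  # about 1 in 16

In other words, for a one-in-a-thousand condition the allele itself is carried by a surprisingly large fraction of the population, which is part of why recessive conditions are so hard to select away.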

Just to finish, I’m aware of the fact that I haven’t mentioned the Covid-19 situation with regard to Afrika or for that matter anywhere else here, but I just thought it might be good to give people something else to think about for a change.

G-d Will Shorten The Way

Contrary to the very good advice I was given, and which I’ve given myself, once again I’m typing this into the cruddy WordPress app rather than doing it in the text editor and copypasting it into the web interface. This is because I’m trying to be spontaneous and I’ve got a lot of other writing to do today, based of all things on Strawberry Switchblade. You probably don’t remember them.

Anyway, you will presumably be aware of the story about Ha-Nukkah and the cruse of oil for lighting the menorah which was only enough for one night but lasted eight days. Whether this is a true story or not, it’s reflected in the reported experiences of many other people. For instance, one of my friends was a very poor single parent who had only a few grains of instant coffee left in the bottom of a coffee jar and couldn’t afford to buy any more, but claimed (and I have faith that this did happen) that it lasted her ages. This is very similar to the miracle of the cruse of oil, and it makes me wonder whether that miracle was just the most prominent event of its kind, in which case it’s still worth celebrating for bringing to public attention the fact that such things happen.

I used to do a lot of washing up at church breakfasts, which incidentally were for the whole neighbourhood including the homeless. The area we had to do it in was quite cramped at the time, although it’s since been extended, but in spite of the fact that people were cooking, serving, washing up and carrying dishes in and out of a place which literally had hardly any elbow room, so far as I know nobody ever got in anyone else’s way, a fact which someone else brought to my attention. This is not my peculiar observation, then, although I can confirm it was true. It was one of several such phenomena which took place in our old church.

The immediate question arises of why, if G-d can do something like that, some of her beloved children are worms who eat babies’ eyes alive and so on. I don’t have complete answers to that although I do have some, but I won’t be going into that here. You’re also at liberty to scoff of course. I personally won’t be doing that either.

But I want to make the observation that this is a space-related miracle. For some reason the space between the bar and the cupboards, which I seem to recall would measure about sixty centimetres, was adequate for seven adult bodies to occupy, and it didn’t even look particularly crowded or odd. This is the peculiarly British “dimensional transcendentalism” of Doctor Who fame – the kitchen area was bigger on the inside. Unlike the TARDIS, though, this area – actually a volume, this being 3-space – was fully exposed to the rest of the nave. So yes, my claim is that G-d can warp space. Parallel lines did not stay the same distance apart behind the bar.

Another way in which space is said to be divinely warped is the apparently little-known miracle of Qephitzat Ha-Derekh, קְפִיצַת הַדֶּרֶךְ, or the “shortening of the way”, which I’m going to call “Kefitzat Haderech” for reasons which shall become clear later. This term has interesting translations from a science fictiony perspective, as the first word can be translated as “jump” and is related to the word for “clench”. This is a situation where there is a divine need for a person to reach a destination faster than is usually possible, so they simply get there. The way between their location and the place they’re supposed to be shrinks. According to the Book of Genesis, Abraham sent his chief servant Eliezer (this is the name given him later rather than in the Bible itself) to find a wife for Isaac, and within a day he reached a well where he met Rebecca, a journey which would be expected to take considerably longer. In this case he travelled on a camel. It’s also said that Eliezer actually wanted Isaac to marry his own daughter, meaning that he wasn’t particularly keen on this errand, so it may be that the way was shortened not because he was a particularly blessèd individual but to stop him having second thoughts and acting upon them, simply because it was divine will that it happened. Presumably Eliezer remained free to change his mind, which is interesting because it suggests that free will is more fundamental to existence than geometry. One possible answer given to the problem of free will in a deterministic Universe (or rather one in which the non-deterministic aspects are unhelpful to the problem) is that since omnipotence is the ability to do anything, one of those things is the apparently impossible existence of free will.

Someone mentioned another incident yesterday which interested me and which also seems to count as Kefitzat Haderech: they saw something happening nine metres away which was about to cause a child serious injury, too late to stop it, and found themselves instantly standing in the way, thereby saving the child. This and the other incident begin to build up a picture of the circumstances in which Kefitzat Haderech is likely to take place, or rather the conditions necessary for it: it occurs when there’s a need in accordance with the divine will. It isn’t a parlour trick or something which can be easily brought about, but happens when there’s a need.

Spoilers for ‘Dune’ follow in the next paragraph

Frank Herbert’s ‘Dune’ series (let’s pretend the later novels weren’t written) is remarkably good, although I lack the stamina and attention span to read them all nowadays. I have however read the first book, ‘Dune’ itself, which deals with the arrival of a Messiah who was planned to be female by a religious order of women who have been conducting a human breeding programme for centuries, but when it comes to it the mother chooses to conceive a boy. This Messiah is referred to as the Kwisatz Haderach. It’s pretty much obvious that this term is a modified version of Kefitzat Haderech. Rather strangely, although Herbert doesn’t use the phrase to refer to the Shortening Of The Way, the shortening of the way does exist in the ‘Dune’ universe: it is carried out, through a process of mental discipline, by Guild Navigators on Spice (the original spice Melange, not the twenty-first century street drug, which I presume is named after it), who move spacecraft and their occupants between star systems by folding space and travelling without moving. This just is Kefitzat Haderech, but for some reason it isn’t called that; instead, a similar term is used for something completely different.

It’s also been claimed that a time warp exists by divine fiat. In the gospels, the faithful are said to be able to endure the tribulation even though it’s unbearable for the period it lasts, and this has been looked on as a form of time dilation, which of course exists uncontroversially according to relativity.

I have to say this whole thing reminds me once again of ‘The Hitch-Hiker’s Guide To The Galaxy’, in two ways. One is the spacecraft powered by bad news, because nothing travels faster, although they “proved to be very unpopular when they arrived”. The other is the proverb “no matter how fast the body travels, the soul travels at the speed of an Arcturan Megacamel”, which is in fact based on the real Arabic proverb “the soul travels at the pace of a camel”. Sadly, I have very occasionally travelled by plane, to Paris and Madrid, and having also covered the same distances by road, though not mainly on foot, I found the sense of disorientation and suddenness very disconcerting; it made me feel like I hadn’t earned it. It definitely isn’t to be recommended on an emotional level, completely aside from the environmental aspect, and the Rastafarian principle “best is pure foot”, i.e. that one should walk barefoot everywhere one wishes to go, is rather appealing, though I don’t honour it as much as I should. However, this clearly isn’t always a problem even for camels, because Eliezer’s camel seemed to move rather fast, perhaps faster than the speed of light. But there was a spiritual need to do so.

There are a number of alleged cases of teleportation. One of these involves an incident in 1593 when a man seems to have been teleported from Manila to Mexico City and was able to report on an assassination in the Philippines, news of which hadn’t yet reached Mexico. A definite case of bad news travelling fast, then. In 1629, the Roman Catholic Church sent a mission to the Jumano people of present-day Texas, only to find that they’d been visited repeatedly several years before by a nun called the Venerable Mary of Agreda and converted to Christianity. They were able to describe her habit and its colour. The only thing was, although the Venerable Mary of Agreda was a real person, she lived in a monastery in Spain and never walked out of its entrance. This confirmed her claim that she was regularly teleporting to North America to convert the Native Americans. So good news travels fast too, apparently. Teleportation has also been offered as an explanation for Agatha Christie’s temporary disappearance, although I have to say that seems pretty elaborate and outlandish.

The issue of the nun leaves me in a bit of a quandary, because it really seems to me that converting indigenous people to Christianity is not necessarily a good thing. I can think of two reasons why that might have happened. One is that in this particular case it enabled them to protect their nation against the Apache by becoming allied with the Spanish, so maybe that’s what it was about. The other is that whereas it seems to us that it occurs only in cases of divine need, maybe it actually happens when the person concerned wills it strongly enough, and that sometimes becomes rationalised in the former way. Maybe it’s just as well it does get rationalised like that, because otherwise it could lead to pride. In fact it reminds me of my experience of stigmata. I don’t consider stigmata to be miraculous at all. They’re more like a psychosomatic process where wounds appear on your body in a slightly similar manner to eczema and psoriasis exacerbating under emotional stress. They’re just the result of empathising with the Crucifixion and that manifesting as sympathy symptoms, or in this case signs. Incidentally, they didn’t feel like they were a major spiritual experience.

There are said to be incidents of women lifting vehicles off their children to save their lives. I’m sure this happens but it’s very hard to study. I’m also confident that a man who was stuck on a desert island with a small baby would be able to lactate without any kind of external hormonal manipulation, from plants or otherwise. The point here is that these things happen when there’s a need. These examples, though, are not as startling as Kefitzat Haderech.

I don’t wish to claim even for a second that teleportation or hyperspace jumps definitely happen, but I think they need to be put in context. Before the apparent support of Donald Trump’s presidency by evangelical Christians, I believed in the virgin birth for the following reason. If it was necessary to bring about the messianic age by a human being born without conception by a human father, that’s just what would’ve happened. If it wasn’t, then it didn’t happen. The reason I no longer believe that is that Trump supporters calling themselves Christian are clearly not doing a very good job of avoiding sin, strongly suggesting that Christ has no power to save them. I have faith in their honesty that they are Christian, but it isn’t helping them, which is sad for them. But the point is that the reason I don’t currently believe in human parthenogenesis is nothing to do with science.

To finish then, the question of whether it will ever be possible to travel to other star systems faster than light is not determined by whether we will have the technology to do it, although of course the chances are we would have the technology if we did do it, but by whether it fulfils G-d’s plan for us to do so.

The Multiverse And Hyperspace

Those two words up there are commonly bandied about in popular culture and discourse and may or may not be anything to be afraid of. The first is entirely Latin in origin. The other is a mixture of Greek and Latin via French. Words of such origins in English reflect the elitism of mediaeval times when you could be burnt at the stake for reading the Bible in English, as it would challenge the authority of the establishment and gatekeepers belonging to the higher echelons of Church and state, as depicted recently on ‘Wolf Hall’. Any phobia you might have of Latinate and Greek words is the result of ancient rulers trying to keep you in what they think is your divinely preordained place at the bottom of society. Ignore this aversion.

Having said all that, neither word is particularly high-falutin’ in terms of intellectual braininess-signalling, because both have occurred a lot in pop culture by now. Typing them into a popular lyrics website yields several pop songs called ‘Multiverse’ or ‘Hyperspace’. They bring disco or acid house to mind when I see them, and The Shamen’s ‘Destination Eschaton’ in particular. I haven’t bothered to explore further though, so maybe not. But this is not some arcane mathematical treatise. I am a member of no academic communities and don’t pursue such things with rigour. Always bear in mind that I am never more than a pseudo-intellectual and that insofar as I have any strengths, the main one is being able to sound clever even though I’m not.

First of all, what are they? The Multiverse is the collection of all possible worlds. Gottfried Wilhelm Leibniz, in his attempt to justify the ways of God to us, seems to have come up with the idea of possible worlds, along with the idea that God, being good, would ensure that creatures would reside in “the best of all possible worlds”, defined as the most varied world with the greatest simplicity. That doesn’t seem to follow at all of course – it’s easy to imagine a very simple ultimate torture chamber, for instance, where every mote of dust experiences eternal infinite pain in a unique manner. Voltaire satirised the whole idea in ‘Candide’. Nonetheless it’s possible to imagine that if the Universe were a simulation, it could be procedurally generated from a numerical seed to allow the greatest variation from the simplest principles. I should probably dilate on this with an example.

One genre of computer game is the Roguelike. This consists of a series of either isometric or two-dimensional depictions of floors in a practically infinite descending series of dungeons, caverns, cellars or whatever, containing various hazards, powerups and exits. The original Rogue is descended from a game called pedit5, found on the PLATO mainframe system in 1975 and intended to mimic Colossal Cave Adventure without the relatively massive memory overheads that game needed. I’m confident that I could implement a Roguelike in well under 16K. They’re very simple to write, but I’ll compare and contrast first.

Suppose you want to implement a map for a series of dungeons which would naïvely occupy 32K. Each dungeon consists of a thirty-two by thirty-two grid and there are thirty-two floors. Each square of that grid has one of two hundred and fifty-six possibilities in it, including the position of the player, an exit downstairs, a hundred or so special potions or other powerups, a hundred or so monsters or hazards, a wall, blank floor and so on. Now you could just make a three-dimensional 32K array and plonk all that stuff into it, and that would work fine except that it would take up half of a sixteen-bit complement of memory, which in older eight-bit computers is rather too much. Another approach would be to start with that array and work out how to compress it. For instance, instead of storing every location in a blank expanse of flooring, you could just store two five-bit coördinates for one corner of that expanse plus two five-bit values for its width and height, and you’ve stored up to an entire kilobyte of data in less than three bytes. You would of course have to write code to “decompress” this, at least into a display on the screen, which would slow things down a bit, but in theory you could ensure that the data remained compressed even while being stored and interacted with by the player. At this point I could start going off on a tangent about generative adversarial networks but I won’t.
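
Here is a minimal sketch of that rectangle trick in Python; the field layout and names are my own choices, just to make the arithmetic concrete:

def pack_floor_rect(x, y, w, h):
    # Pack a blank-floor rectangle into three bytes: four five-bit fields,
    # twenty bits used, four spare.
    assert all(0 <= v < 32 for v in (x, y, w, h))
    bits = (x << 15) | (y << 10) | (w << 5) | h
    return bits.to_bytes(3, "big")

def unpack_floor_rect(blob):
    # Recover (x, y, width, height) from the three-byte record.
    bits = int.from_bytes(blob, "big")
    return (bits >> 15) & 31, (bits >> 10) & 31, (bits >> 5) & 31, bits & 31

def paint_rect(grid, rect):
    # "Decompress" the record onto a 32 x 32 grid of cell codes (0 = blank floor).
    x, y, w, h = rect
    for row in range(y, min(y + h, 32)):
        for col in range(x, min(x + w, 32)):
            grid[row][col] = 0

grid = [[255] * 32 for _ in range(32)]   # 255 = solid rock, say
record = pack_floor_rect(4, 4, 24, 24)   # a 24 x 24 expanse of floor
paint_rect(grid, unpack_floor_rect(record))
print(len(record), "bytes describe", 24 * 24, "cells")

Three bytes for up to 1,024 cells is the sort of saving that lets a 32K map fit comfortably into an eight-bit machine, at the cost of a little decompression work whenever the screen is redrawn.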

A much more compact alternative to this (am I assuming it’s the only other option?) is to generate the hazards, powerups and exits randomly. That way the actual data are compressed down to practically nothing, except that if it’s truly random you get a different set of maps every time and you can never play the same game twice. This is fine for most purposes, but in fact, because classical computers are so deterministic, true randomness is practically impossible to achieve using purely digital circuits. It does exist out there in the messy world, but inside the computer it’s really difficult to come by. I used to get at it by reading the refresh register of the Z80 CPU, which was responsible for maintaining the contents of dynamic RAM and changed from moment to moment, but that wasn’t truly random, and if I remember correctly it only changed seven of its eight bits, so it needed to be masked using some kind of Boolean function. Another quasi-random approach I took was to take consecutive values from the code of the system software, which presumably can’t be done any more because of the tendency of modern computers to hide their internal gubbins from potential crackers for security reasons.
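
A rough modern illustration of that contrast, in Python rather than Z80 machine code: os.urandom draws on entropy the operating system collects from the messy outside world, so its output differs from run to run, whereas a seeded generator is a purely deterministic algorithm, so the same seed reproduces the same dungeon every time.

import os
import random

print(list(os.urandom(4)))       # draws on OS entropy: different on every run

rng = random.Random(42)          # a deterministic generator with a fixed seed
print([rng.randrange(256) for _ in range(4)])   # identical on every run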

However, it may not be desirable to have true randomness. A pseudorandom sequence of numbers can be generated using a straightforward deterministic algorithm. For instance (and I haven’t checked this; it’s coming off the top of my head), it could start with the eight-bit unsigned number 170, rotate it right three times, reverse it and XOR it with 22, then use that value to generate the next number and so on. In fact I shall try this. 170, 1, 26, 44… seems to work, but may end up in a short cycle. 170 was the seed. If the seed were something different, a different sequence of “random” numbers would result, and this is key because it amounts to the entire Roguelike map being compressed to that single seed number, which in this case is a whole number from zero to 255 inclusive. Hence the seed for a certain algorithm amounts to a compression of its results to a remarkable degree.
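
Here’s a rough sketch of that recipe. I’m not certain this interpretation of “rotate” and “reverse” reproduces the exact 170, 1, 26, 44… sequence above, so treat the numbers as illustrative; the point is simply that the same one-byte seed always regenerates the same sequence, so the whole map compresses down to that single byte.

def rotr8(x, n):
    # Rotate an eight-bit value right by n places.
    n %= 8
    return ((x >> n) | (x << (8 - n))) & 0xFF

def rev8(x):
    # Reverse the order of the eight bits in x.
    out = 0
    for _ in range(8):
        out = (out << 1) | (x & 1)
        x >>= 1
    return out

def next_byte(x):
    # One step: rotate right three places, reverse the bits, XOR with 22.
    return rev8(rotr8(x, 3)) ^ 22

def pseudorandom_bytes(seed, count):
    values = [seed & 0xFF]
    for _ in range(count - 1):
        values.append(next_byte(values[-1]))
    return values

print(pseudorandom_bytes(170, 8))   # the same seed gives the same level every time
print(pseudorandom_bytes(171, 8))   # a different seed gives a different level

Because the state is only one byte, any generator like this must fall into a cycle within at most 256 steps, which is why real pseudorandom number generators carry much more internal state than a single byte.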

Imagine, therefore, a computer which generates a whole universe in this way, and that you want the best of all possible universes generated from such a seed. What would that seed be? Might it be forty-two?

Taking this back to the Big Bang Burger Bar, clearly a process does occur whereby the Universe unfolds from initial conditions, and these could easily have been different. However, in a deterministic universe things could only ever turn out one way according to those conditions. This is not in fact how things do proceed, though. Take the bomb dropped on Hiroshima. This consisted of uranium-235, each atom of which had a set probability of decaying, and the process might in principle have been set off by a cosmic ray or other external ionising radiation, though I suspect it wasn’t because the core was sealed in a thick metal shell. The initial events would therefore have been random, i.e. they could have involved any atom in the uranium used. Which atoms split first is of no consequence to the ending of the Second World War, but what if it hadn’t gone off at all? This is, perhaps surprisingly, entirely possible. There is a certain absolutely minute probability that the atom bomb could’ve been built exactly as it was and then not have worked at all. The same applies to nuclear reactors, radioactive tracers in medicine and other situations. Such outcomes are fantastically improbable but the probability is not zero, and the result of an apparently dud bomb dropping on Hiroshima would not only have, perhaps temporarily, saved the lives of over a hundred thousand people immediately, plus however many are going to die as a result of the irradiation, which has no limit – people born thirty years from now could still die as a result of Hiroshima – but possibly have prevented the Cold War and everything which followed from that. This particular kind of “what if” is not the same as “what if Hitler had died in the First World War?”, because most of the processes which involved his survival were set in place at the Big Bang, ignoring the probabilistic nature of nucleosynthesis. The Cold War is not a deterministic event at all, from the viewpoint of the events of 6th August 1945. The important question, though, is whether the Big Bang was, and to the extent to which I believe in the Big Bang, I don’t believe it was, although I have had a problem with this which is now resolved. I’ll come to that.
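
To give a feel for the sort of probabilities involved, here’s a toy Poisson calculation of how unlikely it would be for a lump of uranium-235 to show no spontaneous decays at all in a single second. This is not a model of the weapon itself, which depended on a chain reaction rather than on individual decays, and the 64 kg figure is just the commonly quoted approximate amount of uranium-235 in that bomb:

import math

half_life_s = 703.8e6 * 365.25 * 24 * 3600   # half-life of U-235, in seconds
decay_const = math.log(2) / half_life_s      # decay probability per atom per second
atoms = 64e3 / 235 * 6.022e23                # roughly 64 kg of U-235 at 235 g/mol

expected_decays = atoms * decay_const        # around five thousand million per second
print("expected decays per second:", f"{expected_decays:.2e}")

# The chance of seeing none at all in one second is exp(-expected_decays),
# far too small to represent as an ordinary float, so print it symbolically.
print(f"P(no decay in one second) = exp(-{expected_decays:.2e})")

So “fantastically improbable but not zero” is exactly the right description; the number is unimaginably small, but it is a number.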

At an early point in the Universe’s history, a seemingly random distribution of matter and energy came to be from a previous homogeneous state. Hence there were regions where matter was more concentrated, from which more galaxies and stars formed, and others where it was less so, which may have become the voids. Within those galaxies and stars, other variations in concentration and other conditions occurred, ultimately leading to Adolf Hitler and to Rubik’s cubes becoming popular in the early 1980s. However, given that these conditions were not determined, or don’t appear to have been, ab initio, I think a probabilistic element exists at the Planck Epoch. The Planck Epoch is roughly the first 10⁻⁴³ seconds, which is not determined because of the scale of space and time involved, meaning that cause and effect didn’t operate. I suspect that the initial conditions of the Universe were probabilistic at this stage, which means that in fact there wasn’t a Universe at all at this point but a multiverse, because all the parallel universes related to changes in initial conditions started then and there, which at the time also happened to be everywhere.

It’s possible to think of the timeline of each universe as a literal line running through time and to think of this particular universe as a line with other lines either side of it, but there’s a problem with seeing it in this way. In this model, the distance of a timeline from one’s own is related to its probability compared to the actual state of affairs. This would mean that two simultaneous 50% probable events could be either side of the “real” timeline at the same distance from it. However, this can’t account for more than two events of exactly the same probability, and for this reason the model has to be at least two dimensional. After this point too, various branches occur which need to be arrayed in the two-dimensional space of the diagram, although the whole multiverse is not a tree but a forest. An example would be Hiroshima, with the branch where the bomb didn’t work being way off to one side somewhere. There’s a branch even further off where radioactivity hasn’t been discovered because no atomic decay phenomenon has ever been recognised and has not occurred in a place where it could be detected as itself, which is a world diverging from ours in 1896.

Two questions arise at this point. Is the model in which the universes branch or start off rooted in the Big Bang literally a space consisting of at least two dimensions? And, if it is, is it coherent to see the Big Bang as occurring at the edge of time? I’ll start with the first one. If the two dimensions of probability are literally dimensions, all the space between timelines has to be “filled in”, and probability is quantised like everything else in time and space because it is itself a property of space-time. The timelines aren’t separated by anything but form a continuum like the more familiar space and time. This is one candidate for conceiving of probability even if it needn’t be like the spatiotemporal dimensions. In fact the option of gaps does not exist, because a timeline which cannot happen, which is what those gaps would amount to, would have a probability of zero and therefore be at the edge of the probability dimensions. There are other kinds of space which are not literally “space”. One example is isotopic spin space, which is basically an abstract space expressing symmetries among varieties of particles, so that, for example, the proton and neutron can be treated as two states of the same underlying particle. It doesn’t seem to be literally space in the geometrical sense. To be honest I can’t tell whether probability space is literally two or more extra dimensions of space-time or not, but I always think of it that way. Thus in my ‘Here Be Dragons’, the Yates-Leason device swaps two masses diagonally not only through space and time but also through probability, meaning that they both enter each other’s timelines while being lost to their own – it’s a plot device which enables travel to parallel universes, in other words. It could be that something about the properties of probability rules out its being a kind of space, but if it is one, that makes the multiverse hyperspatial – it has more than just the dimensions of space and time. I’m not unusual in thinking of it in this way. The title of Murray Leinster’s story ‘Sidewise In Time’ refers to this, and it’s clear that Douglas Adams’ Hitch-Hiker universe also works like this. Whether it’s true or not is another matter.

The other question regards the edge of such a space. Space itself has no edge but is not infinite, which is also an answer to the irritating question “what is space expanding into?”, provided you don’t believe in ‘Brane Theory. Space is not expanding into anything because it isn’t a “thing” but a relationship between physical items, and what the idea of space expanding means is that the maximum possible distance between two locations is constantly increasing. The maximum possible distance, incidentally, is the distance at which the directions the two objects lie in relative to each other swap over. This combination of distance and direction is what space is. Time is apparently simpler than that, at least in classical terms, because it’s conventionally conceived of as being just one-dimensional and therefore just what stops everything happening at once, with the extra feature of having a direction. This assumes one thing and ignores another, respectively that there is only one time dimension (there might be more than one) and that things don’t really happen simultaneously. It also ignores the fact that how space-time divides up into space and time relative to a given object depends on how fast that object is moving and how strong gravity is, but I don’t want to over-complicate things.

Nonetheless, there is a problem with the idea of space-time having an edge, which would be the beginning and end of time, space itself being non-Euclidean and finite without an edge as described above. It might be easier to explain using space. Suppose space is like a piece of fabric and there’s a rip in it. Where is that rip? If an object moves into that rip, it either skips over it and carries on on the other side or disappears completely. I can’t say “disappears into it” because there is no “into”. “Into” implies a destination, and destinations are in space. Hence there can’t be a rip in space, although there can be a pit or possibly a bump. If there is a bump as opposed to a pit, that would make faster-than-light travel possible, and it isn’t, because that would be too good to be true and various other things would then also be possible, such as gravity control, tractor beams and practically limitless energy as a consequence. The same problem emerges with space having an edge. There can’t literally be a “wall” off which things “bounce” or on which they get stuck just because of space as such. There can be a maximum distance, partly due to gravity, but if there were a limit as described there would simply be empty, inaccessible space beyond it, and a relationship of direction (“beyond”) and distance would still exist – in other words space would still be there with nothing in it. Such a situation would be more like the Jain cosmos, which is infinite except for a region six hundred light years high in the middle, filled with a subtle substance allowing movement. That’s a conceivable set of circumstances, but probably not one which actually exists.

This raises a problem with time. The model of the multiverse I’ve just outlined does seem to have an edge, namely the beginning of time. The solution to this issue in the case of space is that space kind of “loops round”, although I prefer to express that in terms of space being non-Euclidean and simply having a maximum distance at any one time, if indeed there is such a set of circumstances as “one time”. In temporal terms that would make time cyclical of course, which is quite popular as an idea and was even respectable in cosmology when it was thought by some that the Universe would ultimately contract to a point and start “again”. However, there isn’t enough mass in the Universe for that to happen, and in fact it will continue to expand forever if the Big Bang is all there is to the truth about its origins. This also seems to place a limit on the nature of other universes, because it means none of them can both expand at the same rate as this one and have more mass, or they would in fact be looping and the Big Bang wouldn’t be the start of every universe, plus it would lead to the same “ragged border” which would be involved in discontinuous parallel universes. However, the recent suggestion, for which there is some evidence, that time flows in two directions from the Big Bang, making the Universe symmetrical in terms of parity, space, and matter and antimatter, does away with the problem of a beginning to time, because the time before the Big Bang is an identical sequence of events happening backwards. Hence this does away with the idea of an edge to time, because there is no end to it and there was no beginning – the Big Bang is in the “middle”, though not literally, because of eternity.

There is another way in which parallel universes can be made sense of which can coëxist with this model, but I won’t go into this now.

So it’s quite simple really isn’t it?

The Gift Greek Bears

This is a pretty spontaneous post, typed directly into the woefully inadequate WordPress app which gives me a ridiculously narrow column width, possibly so it can fit into portrait format on a mobile device. You will note also that I haven’t said anything of significance before the fold.

I think both Sarada and I have been employing ourselves well during the lockdown. I wrote a seven thousand word story yesterday which has already been read by ninety-seven people since I submitted it late yesterday afternoon and have had an idea for another which I’ll begin and possibly finish today. I’ve also had some dealings with my publisher about ideas to promote my novel and Sarada will be glad to hear that they will now be putting the Hugo nomination on the front cover. Meanwhile, Sarada has been learning Ancient Greek. I know I’ve already blogged about Greek but this is a new angle, because it’s not so much about the internal gubbins of the Greek language as its uses and the experience of learning it.

For some reason it’s made me inordinately happy that Sarada is learning Greek. She’s picked up the alphabet now and is presumably making further progress. I think maybe the fact that, unlike most other European languages, it doesn’t use the Latin alphabet daunts people. However, it only has twenty-four letters nowadays, and eleven of the capitals have practically the same form and pronunciation as their Latin counterparts. Some of the others are false friends because they look like our own letters but are pronounced very differently. The lowercase letters are squiggly and a little harder to grok, and one of them, sigma, has a different form at the end of a word than elsewhere, but the same was true of the Latin alphabet up until the beginning of the nineteenth century with the “long S”, which was employed as a joke in ‘The Vicar Of Dibley’. What is it about esses which makes them behave this way? Arabic has at least two forms for every letter, but that’s because it’s cursive, and Hebrew also uses final and medial (as they’re called) forms for several of its letters, but Greek just does it for sigma.

Another oddity of Greek writing is the presence of rough and smooth breathings at the start of words beginning with vowels. The rough breathing looks like an opening inverted comma placed before a capital initial vowel and above a lower-case vowel or the letter rho. It represents what we think of as an H sound, although in fact an H in English is always a whispered version of the vowel following it, which I presume is why modern English never has an H sound after a vowel or before a consonant. It seems odd to contemporary literate speakers of a language using the Latin script that Ancient Greek has no letter for H but does have a way of writing it, because it seems like a proper sound which ought to have a letter, but it isn’t. What seems less odd to speakers of languages like English which elide vowels at the start of words is that Greek also has the smooth breathing, which is the apostrophe-like symbol put in the same places. This arguably represents the glottal stop, as in “wo’ a lo’ of li’le bo’les”, which some people pronounce at the beginning of words generally understood to start with vowels. Hebrew and Arabic both represent the start of such a word using a letter which indicates a glottal stop, but there are other languages, such as Samoan, where some words begin with vowels and others with glottal stops. It happens in most Germanic languages, with the exception of English, meaning that I once said “das ist ein Problem” to my ex’s mother but she thought I said “das ist dein Problem”, which was inappropriate. In Greek the rough and smooth breathings were originally respectively the left and right halves of an H-shaped letter, and nowadays Greek always drops its aitches.

Sarada’s assiduous attention to learning the script revealed a gap in my knowledge, or rather confidence, which I’d never got round to addressing. Some vowels in Greek have a sign like the Spanish tilde (~) above them, which represents stress, or rather accent, because Ancient Greek had a pitch accent and even today Greek has words which differ in meaning according to stress, something which as far as I know only occurs in English in words derived from each other. In the tilde’s case the pitch accent is a quick raising and lowering of pitch. Similar phenomena occur in other ancient Indo-European languages including Sanskrit, and I’m aware that Punjabi did the same until quite recently, although I don’t know if that’s a remnant of the Sanskrit practice, but it probably means that Proto-Indo-European itself had a pitch accent too. Latin lacks this feature.

Now for the notorious issue of dead white males.

One criticism made of classical studies is that it’s about dead white males, more specifically written documents by wealthy slave-owning men, often in fact the paterfamilias who had the power of life and death over his family and slaves. On the whole I agree with this claim but I also think it isn’t that simple. First of all, Roger Scruton, with whom I generally violently disagree, came up with the concept of οικοφοβία (there’s a smooth breathing missing there due to the way I typed it, over the ιωτα (and over that too!)) – “home fear”, a word which had been used before but in Roger’s case applied for the first time, as far as I know, to an aversion to the familiar. Scruton would of course have taken this up and run with it to justify white supremacism, but there is nevertheless a point to it. For instance, democracy is a predominantly Western idea and so are civil liberties, and they’re both good things. There are also apparent universals in human experience, so the chances are that Greek artists and scholars will have managed to uncover and express those too. For instance, most of what Aristotle says seems to be balderdash but his biology and ethics both work pretty well. It’s said that Aristotle’s zoology mentions things which scientists didn’t realise were so until the twentieth century, and his account of vice and virtue, where the happy medium lies between two vices, usually closer to one of them, certainly seems true to me as well, and neither of those depends on a rich white male perspective. Nor is it true that they were all male. Ψάπφω (Psappho, usually known as Sappho) is of course widely reputed to be a lesbian poet, living in fact on the isle of Λέσβος (Lesbos), so she was literally a Lesbian even if she wasn’t literarily one.

This last aspect of aversion to Classics is, I think, cause for concern. One of the big advantages the Old Right has over the New Left is their classical education, and that crucially includes rhetoric. Rhetoric is of course the art of persuasion, and I know a little about it but not enough. The practice of spin has been going on since the Bronze Age in written form, and doubtless much earlier in speech, and although we have propaganda, marketing and advertising nowadays, we’ve also had three millennia of experience which we on the Left tend to ignore. This is the crucial thing about Boris, I think. Because he has a degree in Classics, he is able to use it to run rings round people and persuade us of whatever is expedient at the time. Unlike many Lefties, he’s not an amateur in this respect. When I compare the internal machinations of the Conservative and Labour Parties, I do see much in common, which is fascinating, but one big advantage the Tories have over us is their long schooling in the art of persuasion, at which they are professionals. Labour members tend to be amateurs at this, and their lack of expertise is often very glaring. I can remember sitting in the hustings of my local constituency being “persuaded” by someone on the Labour candidate’s team, and it was transparently obvious what she was trying to do, and offensively manipulative with it. This is because she wasn’t good at it, and the reason she wasn’t good at it was that, like other people on the left – not all by any means, but too many – she had a disdain for the traditions of rhetoric and was having to rely on a recent reinvention of the technique which had turned its back on the fine tradition of emotional manipulation to which the Tory Party is heir. And this really bothers me, because I know I’m on the right side. I’m absolutely confident of that fact.

To finish off, Greek and Latin and their associated cultural paraphernalia are passports to the world of the intelligentsia, and I’m proud that Sarada has chosen to avail herself of this advantage.

On a final note, I seriously wish I hadn’t used the Android WordPress app to type this, because I’ve lost the whole blog post three times.