Popping over to Cynthia’s

Fifty years ago today, Apollo 11’s Saturn V rocket was launched from Cape Canaveral towards what most English speakers call “the Moon”. The US government had earlier proposed including the Soviet Union in the mission, but Khrushchev turned the offer down. I can’t remember the first landing at all, although I do remember listening to the reel-to-reel tape recordings my father had of the whole mission. It was very much in the air of my early childhood, and I do remember later ones, in particular the last one, Apollo 17, and I’ve had the experience I expect many people share of looking up and thinking: there have been people walking around up there.

I need to get a couple of things out of the way first. One is the issue of nomenclature. I call “the Moon” Cynthia and refer to her in the feminine, just as I call Earth Earth as a proper name, not “the Earth”, and use “she”. I also use gendered pronouns to refer to all planets in the Solar System. First the names. Although the word “moon” usually refers to the body most closely associated with Earth in her orbit around the Sun, she’s arguably not really a moon at all, and the word “moon” is also used for any natural satellite of a planet. Using it specifically as a proper name is parochial, and makes her seem special in a way which restricts our perspective on the Universe, and a restricted perspective contributes very much to our predicament as a species. Every other moon in the Solar System has a name or a serial number. Serial numbers are still in use for moons which haven’t yet been given names, and others were retired once objects were named: for example 1979J1, the first moon of Jupiter discovered in 1979 by Voyager 2, is now known as Adrastea. Therefore I call “the Moon” Cynthia, which is one of several options in the Greco-Roman tradition, the others including Selene, Diana and Artemis. In fact there’s a planned mission called Artemis right now, which aims to return humans there by 2024. I chose the name Cynthia because I have a slight preference for Greek over Latin, which I realise is not reflected in my own name, but in any case it’s an epithet of Artemis, referring to her birth on Mount Kynthos on Delos in the Cyclades. Artemis and Diana are hunting goddesses, presumably because it was easier to hunt by the light of the full “moon”. It really grates on me, incidentally, to call her “the Moon”. All this does reflect a Western bias, but then we don’t use the Chinese terms to refer to planets in English either, so I don’t really see it as a problem.

And also I refer to her as “she”. I do this with all planets and moons in the Solar System, using the gender of the associated deities. This means that of the oft-mentioned objects, Venus, Earth and Cynthia are “she” and the rest “he”. This is not because of personification, although being a panpsychist I believe that consciousness is present in all matter and therefore that in some sense they are all conscious. I realise it makes me sound like Francis of Assisi. The reason I use gender to refer to celestial bodies is to subvert the idea that pronouns are specifically associated with people of a particular gender, and I also do it with other referents such as ducks, cats and dogs, because the unmarked noun for each is, in these cases, female, female and male respectively. I don’t use the word “bitch” at all, which means it can sound like I misgender female dogs, but the real reason I do this is to restore grammatical gender to the English language and reduce its human significance. Hence there’s no astronomical significance to it; it just tends to be more noticeable with celestial bodies.

The other issue that comes up a lot regarding Cynthia is whether humans really went there. The answer is, of course, that they did, six times, and that twelve people walked on her. The website clavius.org is the usual place I direct people to when they ask, because it does an excellent debunking job, but I would put the doubters in the same category as flat-earthers and creationists, and of course flat-earthers are more or less constrained to deny the visits too. I’m not going to spend much time on this except to observe a few things. There are laser reflectors there, placed by Apollo astronauts, which observatories use to measure the distance from Earth, although there are also two from the Lunokhod automatic rovers put there by the Soviet Union, so in theory the Apollo ones could’ve been placed without humans doing it. In general the astronomical community is aware of these reflectors and some of its members will have used them, which I think points to one issue which might explain why people doubt the landings: they feel excluded from academia and perhaps envy the supposedly well-educated, and are therefore unlikely to know any professional astronomers. Then there’s the Dunning-Kruger effect, whereby the less one knows about something, the less one realises how much there is to know about it. There are a few other claims I find somewhat baffling. These include the claims about photographs having crosshairs disappearing behind objects, the absence of stars and the presence of letters on rocks. I’m not by any means an expert photographer. In fact, I take fewer photographs than most other sighted people in the world because I don’t use mobiles much and don’t own a working camera nowadays, but even I know the answers to those.
You wouldn’t expect to be able to see stars in a fully sunlit scene like those in the Apollo photographs, because if the exposure, aperture or whatever (see, I told you) was sufficient to show them, the glare from the surface would bleach all the details out. Similarly, you can’t see crosshairs in front of brightly-lit rocks because of the glare, and the likes of the letter C on them is pareidolia: seeing patterns where there are none of significance. The Van Allen belt argument is also easily answered by the route the spacecraft took and the short time they spent in the belts, and in any case there would have had to be a huge conspiracy involving hundreds of thousands of people, at a time when people didn’t trust authority, with no whistleblowers. But I don’t want to go on too much about this because it’s been allowed to dominate things already, except to say one more thing: many doubters believe there was only one apparent landing rather than six yet are for some reason also aware of Apollo 13, so they’re not that reliable.

I mentioned above that “moon” may be an inadequate word for Cynthia, which is a bit unfair given that she was the word’s original referent. She has various oddities which don’t apply to the other bodies associated with planets in this solar system. Her mass is about 1/81 of Earth’s which, if Pluto is not counted as a planet, is far larger than any other moon-to-planet mass ratio. The next largest is about 1/4200, for Titan and Saturn; Triton and Neptune come out at roughly 1/4800. Among the inner planets only Mars has moons, and those are captured asteroids, possibly only temporary ones, about the size of the Isle of Wight. This worries me.
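Those ratios are easy to sanity-check with rounded published masses. The figures below are approximate textbook values, so treat this as order-of-magnitude arithmetic rather than anything authoritative:

```python
# Rounded masses in kilograms (approximate published figures)
M_EARTH   = 5.97e24
M_CYNTHIA = 7.35e22
M_SATURN  = 5.68e26
M_TITAN   = 1.35e23
M_NEPTUNE = 1.02e26
M_TRITON  = 2.14e22

# Cynthia to Earth: roughly 1/81
print(round(M_EARTH / M_CYNTHIA))    # 81

# The runner-up moon-to-planet ratios are about fifty times smaller
print(round(M_SATURN / M_TITAN))     # about 4200
print(round(M_NEPTUNE / M_TRITON))   # about 4800
```

Whichever pair you take as the runner-up, the gap is stark: no other moon comes within a factor of fifty of Cynthia’s share of her planet’s mass.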

I’ve said before that one thing which would make it difficult for me to worship God would be if it could be demonstrated beyond reasonable doubt that there was no life elsewhere in the Universe, because I would then have to contend with a deity who had created a vast, empty Cosmos with us on just one planet in spite of all the countless others which exist. If it does turn out there’s anything special about Earth, this could lead to that conclusion. It doesn’t mean I’d stop believing in God, just that her ways would be not only beyond my own understanding but also absurd to me, so it’s more like a deal-breaker. Cynthia may be such an anomaly, because she may be partly responsible for the Van Allen belts. The magnetic field of our planet is generated by motion in the liquid iron-nickel core, possibly stirred in part by the tides Cynthia raises there, and that field traps charged particles radiating from space, mainly the Sun, in belts around us. If we were to go to Mars or elsewhere, one way of protecting the astronauts would be to generate such a magnetic field to keep such particles away from them. The significance for living things generally is that the field prevents organisms from being killed by hard radiation. There are several things I’m unclear about with this. I don’t know, for example, if photosynthesis would be impossible on a planet without Van Allen belts because the upper layers of the ocean would be too irradiated, or whether organisms could survive underground running their metabolism on geothermal energy, or if ice would protect them. It’s also possible that a roughly Earth-sized moon orbiting what I might call a “warm Jupiter”, that is, one at the right distance to be within the “Goldilocks Zone”, could be protected by the planet’s magnetic field. As far as Jupiter himself is concerned, his radiation belts constitute a problem for any humans wishing to land on Io, Europa and Ganymede, all of which orbit within them.
The second-hand optimism I’ve encountered about the idea of life on Europa in particular suggests to me that a thick layer of ice ought to be enough to keep life safe from such a threat.

Another oddity about Cynthia is that the gravitational pull of the Sun on her is stronger than Earth’s, which is not true of any other body associated with a planet in the Solar System. Therefore, in a sense she isn’t so much orbiting us as the two of us are weaving around each other in our shared orbit around the Sun; her path is everywhere concave towards the Sun. This is another reason for not referring to her as “the Moon”, because strictly speaking she arguably isn’t one.
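The claim checks out with a couple of lines of Newtonian arithmetic. The masses and mean distances below are rounded, so the ratio is approximate:

```python
# Ratio of the Sun's pull on Cynthia to Earth's pull on her.
# F = G*M*m/r**2, and G and Cynthia's own mass cancel in the ratio.
M_SUN   = 1.989e30   # kg
M_EARTH = 5.97e24    # kg
D_SUN   = 1.496e11   # m, mean Earth-Sun distance
D_MOON  = 3.844e8    # m, mean Earth-Cynthia distance

ratio = (M_SUN / D_SUN**2) / (M_EARTH / D_MOON**2)
print(round(ratio, 1))   # roughly 2.2: the Sun pulls more than twice as hard
```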

I expect you know this already, but I’m going to mention it anyway. It appears that Earth was hit by a Mars-sized body now called Theia soon after being formed, which chipped off the outer layer of this planet, leaving it to form into a separate globe. The density of Cynthia is only 60% of this planet’s, possibly because the lighter materials were nearer the surface. Theia was possibly originally a Trojan of Earth, meaning that she formed an equilateral triangle with us and the Sun. I personally wonder if she actually is Mars. Mars has a similar density to Cynthia’s, incidentally, unlike Venus and Mercury, whose densities are similar to ours. If this hadn’t happened, Earth wouldn’t be much smaller than she is today, so life wouldn’t be ruled out for that reason, but if the Van Allen belts are essential, these would at least be weaker if they existed at all. There would be a weak pull from Theia but I don’t know how much difference that would make.

Tidal forces pull apart fluid or loosely-bound objects orbiting closer than about 2.44 radii from a planet’s centre, a figure known as the Roche limit, which places a minimum orbital radius for a genuine moon orbiting this planet of about 15562 kilometres. For a moon at that distance to exert the same gravitational pull as Cynthia, its mass would only need to be about 1/600 of hers. This is only a fifty thousandth of Earth’s mass, which places the ratio below that of Triton for Neptune. This makes the prospect of life on an Earth-like planet more feasible, but it still leaves a mystery: why have we got such a large associated celestial body?
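The arithmetic in that paragraph can be reproduced directly. Gravitational pull scales as mass over distance squared, so equating the pull of a hypothetical moon at the Roche limit with Cynthia’s pull gives the mass ratio; all figures are rounded:

```python
# Roche limit for Earth, fluid-body approximation of ~2.44 planetary radii
R_EARTH_KM = 6378                    # equatorial radius
roche_km = 2.44 * R_EARTH_KM
print(round(roche_km))               # 15562

# Pull goes as m/r**2, so a moon at the Roche limit matching Cynthia's pull
# needs only a small fraction of her mass:
D_CYNTHIA_KM = 384400
fraction_of_cynthia = (roche_km / D_CYNTHIA_KM) ** 2
print(round(1 / fraction_of_cynthia))    # about 600

# Cynthia is about 1/81 of Earth's mass, so as a fraction of Earth:
print(round(81 / fraction_of_cynthia))   # about 50000
```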

There’s another mystery about Cynthia, which is that because she’s a four hundredth of the diameter of the Sun and four hundred times closer, solar eclipses are possible. This is apparently a coincidence, but it’s a very odd one because it means this may well be the only planet in the Milky Way where there are such eclipses. Elsewhere, moons will either blot out their suns completely or show a wide ring of the photosphere, but on Earth, although there are annular eclipses where some of the Sun’s surface shows, there are also total eclipses where only the corona, the solar atmosphere, is visible. This would make Earth a good tourist destination and it’s even been suggested that solar eclipses would be a good time to look for alien spacecraft!
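The coincidence can be put in numbers with rounded diameters and mean distances. Because Cynthia’s orbit is elliptical, her apparent size varies around this mean, which is exactly why both total and annular eclipses occur:

```python
import math

D_SUN_KM     = 1_392_000    # solar diameter
D_CYNTHIA_KM = 3_474        # lunar diameter
DIST_SUN_KM  = 149_600_000  # mean Earth-Sun distance
DIST_MOON_KM = 384_400      # mean Earth-Cynthia distance

print(round(D_SUN_KM / D_CYNTHIA_KM))     # about 400 times the diameter
print(round(DIST_SUN_KM / DIST_MOON_KM))  # about 390 times the distance

# So the apparent (angular) diameters nearly match, at about half a degree:
for d, dist in [(D_SUN_KM, DIST_SUN_KM), (D_CYNTHIA_KM, DIST_MOON_KM)]:
    print(round(math.degrees(2 * math.atan(d / (2 * dist))), 2))
```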

As I mentioned before, the Artemis project plans to send more people there in the next decade. This opens up a further quandary for me. I’ve previously mentioned that the Doomsday Argument seems to establish a 50% probability of human extinction by about 2130. I’ve written about this elsewhere, so I’ll just go through a few highlights of my argument. It partly depends on when you decide something is able to wonder whether it’s one of the last people to be born, and it could also be reinterpreted as a measure of whether the thought of human extinction is going to disappear rather than actual humans. The Singularity might be one way this thought could vanish without necessarily causing us to disappear, or a drastic increase in longevity, perhaps combined with a very slow reproduction rate, could do the same. However, if I take the Doomsday Argument at face value, I have to conclude that Artemis will not lead to wholesale settlement of the Solar System or the construction of large space habitats, because the longer that went on for, the less likely it would be that the random sample of human existence which is my own life would fall this early in human history. This thought opens up a new avenue of Artemis Hoax conspiracy theory.

There’s little doubt that if Artemis happens, there will be internet conspiracy groups claiming that it’s faked. Strangely, there’s a corollary of the Doomsday Argument which leads to the expectation that it will be a hoax, or that it will be half-hearted or abortive, or even that it predicts the imminent passing of our species before it can happen. This is how that thought works. If Artemis goes ahead, it could lead to lunar bases and a jumping-off point for human exploration and eventual settlement of other planets in this solar system. If this involves any large-scale construction of space habitats, settlements on Mars or the eventual terraforming and settlement of Venus, it would have to be a short-term project. The “simple” act of rendering Venus habitable to an eventual population of a thousand million, less than today’s, with a generous generation time of four decades at mere replacement, would only give the human race about three thousand years more of history, assuming Earth’s population quickly falls to zero. Projecting that backwards only takes us to the start of the Iron Age, so it’s not long, and it still means we’re near the end of human history. Space habitats have greater potential than terraforming or settling planets and moons, so this argument would effectively rule out their existence altogether.
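The three-thousand-year figure follows from simple arithmetic, given the assumptions above plus one more which I should flag loudly as an outside estimate: the commonly quoted guess that somewhere around a hundred thousand million humans have been born so far, which the Doomsday Argument (at 50% confidence) roughly matches against the number still to come:

```python
POPULATION       = 1_000_000_000    # steady-state population of a settled Venus
GENERATION_YEARS = 40               # generous generation time, mere replacement
BORN_SO_FAR      = 100_000_000_000  # rough estimate of all humans born to date

births_per_year = POPULATION / GENERATION_YEARS
years_remaining = BORN_SO_FAR / births_per_year
print(round(years_remaining))   # 4000: the same order as the three millennia quoted
```

Tweak the assumed birth count or generation time and the answer moves around, but it stays in the low thousands of years, which is the point.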

Early plans for visiting our nearest neighbour in space were somewhat different from what ended up happening. One idea was to send one astronaut who would spend a year or so building a lunar base which would then be inhabited by others. This would have given humanity a toehold on the place, but it wasn’t put into practice.

The trip there is only equivalent to about ten circumnavigations of the globe. There are cars which have been driven further than the round trip. Although I don’t want to talk down the achievement, I do want to emphasise that it might not be that difficult to go back. But there is one thing in particular which does make it harder than it seems.
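For scale, using the mean distance and Earth’s equatorial circumference (both rounded):

```python
DIST_CYNTHIA_KM        = 384_400   # mean Earth-Cynthia distance
EARTH_CIRCUMFERENCE_KM = 40_075    # equatorial circumference

# One-way trip expressed in circumnavigations of the globe
print(round(DIST_CYNTHIA_KM / EARTH_CIRCUMFERENCE_KM, 1))   # 9.6
```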

Imagine you have a Betamax video recorder today which you want to repair. You’re unlikely to be able to find anyone easily who could fix it, or even find the necessary parts. Now suppose that repair person had only mended seventeen video recorders in their whole career, because only seventeen video recorders were ever made. Suppose also, and this is probably true, that all the Sony employees who worked on designing and manufacturing Betamax recorders had moved on to other projects, retired or died. Your task would then be to manufacture a new Betamax recorder from scratch, and when you’d done that you still wouldn’t have a new TV set to hand that you could plug it into, any video cassettes, or even any TV signals which would enable you to record onto the non-existent video cassettes. This is basically the problem with going back. It’s really not that easy, because so much time has passed that people need to start from scratch, and from a different starting position.

To conclude, I share the general frustration that nothing much happened after Apollo 17 and I see it as a general malaise of humanity, or a symptom of it, that we haven’t done anything since. But I would welcome going back, perhaps as the first step to something more.

An End Note

I word processed this one!


Polls and Replication

News stories tend to have certain features which don’t reflect reality particularly well. Firstly, they have to be stories. They have to abstract something from the mish-mash of daily events and turn it into a tale, often with a sensationalist climax. They also tend to be negative, with a few exceptions such as ‘Positive News’. That said, there are problems with the idea of positive news because not everyone will agree on what’s positive.

Scientific reports and research also have their share of problems, partly due to journalistic impingement. For instance, I find it more than a bit suspicious that the genus Homo apparently has so many species compared to other primate genera, and wonder if it’s to do with the kudos which “discovering” a new species of human carries with it, along with publicity for the associated institution. There are now said to be seventeen species and subspecies, which given that there are two species of chimp, two of gorilla, three of orangutan and so on, seems a bit excessive.

The rest of this post has been lost. Sorry.

Let’s Change The World With Music

I can’t leave the idea of alternate Beatles timelines alone. It keeps bugging me. Because ‘Yesterday’ was a fun, light film, it didn’t go various places which it kind of should’ve done. The world it showed was an alternate timeline without the Beatles and I think four other, separate, differences, which we can presume had nothing to do with them, although two of them would’ve made a difference. What it failed to do, or rather, chose not to explore, is what other consequences would’ve followed from a world without the Beatles, and this bugs me because without the Beatles, our children would never have been born.

Let me take you down where I’m going to. The Ruhrgebiet in the 1960s. My ex was a child living in Germany, having been born in 1960 (yesterday was her birthday actually). Her interest in English was piqued by wanting to understand English Beatles lyrics, and this grew until she eventually chose to read English Literature and Philosophy at the University of Tübingen, in around 1978. In the mid-’80s, she took advantage of the Erasmus scholarship scheme to come to England and study for a Master’s in Modern English Literature at the University of Leicester, my alma mater, and met me. In 1989, we got together and stayed together for around eighteen months. During that period, she met Sarada, who was teaching English at Moat Community College, and we were invited around to her and her partner’s flat for dinner. Sarada was nervous because she’d never cooked for a vegan before and got some mushroom pâté, which I hated but pretended to like, so it could be said that our whole relationship was founded on a lie! This was in April 1990. In September, while in Invermoriston in the Highlands, my ex and I split but remained friends. Sarada also split up with her partner. In early 1993, Sarada, back from teaching in Madrid, and I got together and we married that year and had our two children, in 1994 and 1997, both now in their twenties, and we’re due to have a grandchild in a few weeks if all goes well.

None of this would’ve happened without the Beatles. Or would it? Is it possible that there would’ve been another stellar British band with such talent and creativity which my ex-to-be would’ve got interested in? Also, how could it have been that the Beatles hadn’t happened? I’m going to deal with this question first.

I don’t know much about the lives of John, Paul, George and Ringo. I do, however, know that Paul McCartney got his girlfriend Dot Rhone pregnant in 1962, but she then had a miscarriage. By that time, the Beatles had been to Hamburg, but if the pregnancy had continued to term it’s probable that McCartney wouldn’t have stayed in the group. The question then would be whether it would have been a Pete Best-type scenario, with him being replaced by someone else, or whether it would just have meant that they fizzled out and split. If they had, it’s possible that the Rolling Stones wouldn’t have succeeded either, because of the links between the two bands: Lennon and McCartney wrote the Stones’ second single, ‘I Wanna Be Your Man’, and it was also through the Beatles that the Stones and their manager were brought together. Nonetheless, I think it’s feasible that the Stones could’ve been successful without the Beatles, and their image might also have been different, as they were marketed as a contrast to them, so perhaps they would’ve been the alternate version. Or another band. Who knows?

Incidentally, there’s an odd anomaly regarding the Stones and Sarada’s and my tastes. They’re perfectly complementary. You can guarantee that if I like a particular Stones track, she’ll dislike it and vice versa. I don’t know why.

There was a period during which the youth of the day thought that music could change the world. If so, certain absences or presences could alter the course of history on a global scale. However, one of the problems with this idea, at least in terms of popular music, is that it’s dominated by capitalism and marketing. There’s also the principle of ars gratia artis: although music might alter minor details on the scale of the oecumene, it shouldn’t have to do anything, because it’s worth something in itself. That said, protest songs do exist, and it’s in the area of protest that the value of music for social change is most significant.

I no longer bother to go to demonstrations. My initial reason for going on them, consciously at least, was that they would make a difference to government policy. I felt guilty about not going on the CND demonstrations in the early 1980s because I thought they could make a difference to the Cold War and the prospect of a nuclear Holocaust. Ironically, the person who made the most difference to that outlook was probably Stanislav Petrov, who saved the world in September 1983. This might actually make him the most influential person in the twentieth century, more so than Hitler or Lenin.

As I grew up, I started going to protest marches and the like, and came to conclude that they aren’t actually about changing the world in that way at all. Nevertheless, they do change the world and are worthwhile in other ways. First of all, it would make sense for the establishment to deny that they’re influential, because admitting it would make them vulnerable to direct action against policies to which they’re committed. Second of all, there’s the final-straw argument that a slight push could at some stage change things. However, neither of these is really the point of demonstrating to me.

Demos do the following:

  • Encourage people by making them feel less isolated.
  • Allow people to express their feelings.
  • Provoke protestors into going back into their communities to make changes by grassroots action.
  • Publicise issues.

Hence they do make a difference. However, I see them mainly as a form of street theatre, and as such completely valid. I don’t mean to detract from their value in anything I say here. It does, though, often mean that attending a demo or not is a matter of personal taste and not a moral decision.

Music, even pop music, can have similar functions. Songs allow us to express our feelings and make people feel less isolated. ‘Saving All My Love For You’ by Whitney Houston is, in my opinion, a pretty naff song, but it does attempt to express the experience of being the mistress, immoral though that situation may initially seem. Less mainstream examples would be Tom Robinson’s ‘Glad To Be Gay’, and Bronski Beat. There’s a sense in which such songs are sung by young dudes carrying the news. Incidentally, I always used to wonder where they plugged in their electric guitars in an apocalyptic wasteland. In fact, pop music does all of these things, and I’m not even talking about Crass or Chumbawamba, so it is in fact possible for music to change the world, although like demos that change is intangible and not quantifiable. On a smaller, individual scale, which amounts to a second set of intangibles, it makes a difference on an “our tune” level, with people meeting at gigs and bonding over music at parties or in bedsits. It does happen, in both these ways. It happened to us, in a very roundabout way.

Then there’s the “art vs science” aspect of all this. Popular music is in a sense an art, although with a considerable portion of conveniently forgotten dross which gives us the impression of a golden age which may never have existed. Sometimes I even like the dross, mainly for its associations in my own life. Other things are science and technology, but creative even so. It may or may not be the case that Edison and Swan both invented the filament lightbulb. In fact they didn’t, because that honour belongs to Humphry Davy, who made one using platinum wire, or possibly to someone else, in the early nineteenth century. The idea of the filament light bulb was out there to be plucked, and artifice and genius were needed to perfect it, for example in the removal of soot and the development of better vacuum pumps. If Edison had never been born, someone else would’ve invented a good light bulb and no-one would be any the wiser. If Lennon and McCartney had never existed, it’s unlikely that any Beatles music would, but possibly the Stones would, and if not them some other equally successful band. If Shakespeare hadn’t existed, the situation is hard to imagine due to the remoteness of the sixteenth and seventeenth centuries, but there would undoubtedly have been other good and successful playwrights.

In political terms, although I tend to think in terms of impersonal forces influencing world events in a similar way to technological and scientific change, that would suggest that politics is more a science than an art. If politics is an art, though, the great “man” theory of history could be true, because there are then elements of personal creativity in politics. People don’t just happen to be in certain places and times, although this surely helps. Henry VIII’s establishment of the Church of England comes to mind as a counter-example. Although this was a manifestation of the Reformation in England, Martin Luther’s actions had similar results in the Holy Roman Empire at a similar time, though the details differ, as they would. This would also mean that Winston Churchill’s premiership during the War was irrelevant and that Chamberlain could’ve achieved a victory. Although I find that hard to believe, that may be the result of propaganda. Nonetheless I don’t accept it.

It could be that whether individual figures are important depends on the political leaning of the party. The Conservative Party, which portrays itself as supporting the freedom of the individual and almost certainly does support it in the case of the wealthy, might depend more on the personality of the leadership or other individual members than Labour. However, many people right now would consider Jeremy Corbyn’s leadership to be a significant factor in electoral success, if nothing else, and it could also be that the Tories set too much store, at least publicly, in personality. But this may be to focus too much on what happens in Parliament, and as my mention of demonstrations as a single example of many different processes happening in society at large shows, this may turn out to be less significant than it seems, regardless of whether one is left or right wing.

It’s been noted that the rich tend to attribute success to their own personal qualities and hard work, whereas the poor tend to ascribe it to luck. Both biases are the result of one’s background, and it’s not possible to step outside that and comment from outside society on which is more valid. It would follow, even so, that the party of the rich, if that’s what the Tories are, would believe in individual effort and something like noblesse oblige, and the party of the poor, if that’s what Labour is, would believe in less identifiably individualistic influences. I’m also aware of missing out a lot of other parties with other perspectives, and of more “extremist”, if that’s what they are, positions.

Nonetheless, I maintain that music does change the world. It changes how you see your life, for good or ill, and it changes how able you are to do something about the world. In this respect it’s similar to other aspects of public life such as political and social activism, and I’m not just talking about demos, although they’re included in that. However, it shouldn’t have to be any of those things to have value, because it has intrinsic worth.

Future Englishes

English is currently the most successful human language and, in terms of users, the most successful human language to date. This thought raises another: was there a time when everyone used the same language, i.e. the ancestor of all spoken languages? Or did spoken language appear several times in different places, so that there are, or have been, languages completely isolated from each other?

This is not what I’m going to go on about today, although it’s interesting to consider the related matter of how English will disappear. This may come up. I’ve written a novel on that very subject, after all, so you’d think it might. One day English must die out, maybe because it develops into something else, or maybe because the community using it disappears. It could, for example, fall from grace for political reasons, but if that happens it’s likely to become detached from the original Anglosphere and be driven by a more global impetus. Or it could just become incomprehensible to present-day speakers and other users, as it has done in the past.

Linguistic communities are defined by mutual comprehensibility. That is, if two people with no knowledge of other languages are guaranteed to understand each other, they are using the same language. One-way comprehension isn’t enough. This is a 1989 translation of the Lord’s Prayer into Dutch:

Onze Vader in de hemel,

uw naam worde geheiligd,

uw koninkrijk kome,

uw wil geschiede,

op aarde zoals in de hemel.

Geef ons heden ons dagelijks brood

en vergeef ons onze schulden

zoals ook wij anderen hun schulden hebben vergeven,

en stel ons niet op de proef

maar verlos ons van de duivel.

…and this is a translation of the same into Afrikaans:

Onse Vader wat in die hemele is,

laat u naam geheilig word.

Laat u koninkryk kom,

laat u wil geskied,

soos in die hemel net so ook op die aarde.

Gee ons vandag ons daaglikse brood,

en vergeef ons ons skulde,

soos ons ook ons skuldenaars vergewe.

En lei ons nie in versoeking nie,

maar verlos ons van die bose.

A Dutch speaker won’t have any problem understanding either, but an Afrikaner might well struggle with the Dutch version, although both are in fairly simple language because of the nature of the prayer. Getting back to English, this can mean that speakers (I’m going to use that word for now, although there are good reasons not to in some contexts) of certain registers would find it easier to understand the English of other eras, and speakers of the past would be more likely to understand later examples of the language than the other way round.

If I approach English naïvely, and this has to be a guess because I’ve reached the “unconscious competence” stage in German and in earlier phases of English itself, I would guess that the cut-off point for spoken English comprehensible to someone who learned it in the late 1960s would probably be about 1500. The written version is misleading because our spelling is notoriously conservative, and earlier writings are easier to follow, although they also contain many false friends which give the reader the illusion of understanding. For me, the iconic feature of English pronunciation which would obscure the language is the Middle English long A. There are still accents which pronounce the long A as /ɛ:/, close to the vowel in RP “air”, but in 1500 that pronunciation would’ve been /æ:/, and to my mind that’s too big a difference for it to be readily understood by twentieth-century ears, let alone today’s. Nor would it be the only difference. However, someone in 1500 probably wouldn’t have the same difficulty understanding how we speak today, particularly away from the English home counties and the Southern Hemisphere, although the omission of “thou”, the use of the present continuous and the incessant use of “do” would be hard to handle. A speaker in 1400, though, wouldn’t be able to understand how we speak today. This takes us practically back to Chaucer’s time, when we’d have to handle something like this:

Whan that Aprille with his shoures soote,

The droghte of March hath perced to the roote,

And bathed every veyne in swich licóur

Of which vertú engendred is the flour;

Whan Zephirus eek with his swete breeth

Inspired hath in every holt and heeth

The tendre croppes, and the yonge sonne

Hath in the Ram his halfe cours y-ronne,

And smale foweles maken melodye,

That slepen al the nyght with open ye,

So priketh hem Natúre in hir corages,

Thanne longen folk to goon on pilgrimages,

And palmeres for to seken straunge strondes,

To ferne halwes, kowthe in sondry londes;

And specially, from every shires ende

Of Engelond, to Caunterbury they wende,

The hooly blisful martir for to seke,

That hem hath holpen whan that they were seeke.

That looks pretty close to present-day English written down, but reading it aloud reveals how much the language has changed. Thus, if the average rate of change in English over the next half-millennium is the same as it was over the past one, we could expect people to be speaking a different language by about 2600. But it’s quite an assumption to suppose that the average rate of change will stay the same.

The history of English, like that of many other languages, is divided into three periods: Old, Middle and Modern. However, languages have different lifespans and their rates of change vary over their history, so these divisions fall at different times for different languages. For our tongue, the boundaries are placed at the fairly arbitrary dates of 449, when the West Germanic tribes arrived in Britain, 1066, when the Norman invasion led to the oppression of English, and 1485, the date of the Battle of Bosworth Field, which is often used to mark the end of the Middle Ages in this country. Given that each period lasts about five centuries, we’re due for another phase in the history of our language.

Three different processes can be identified in the change of the English language. The first can be attributed to language change in general, so for example we can expect initial H to be dropped because that happens often in other languages. A Spanish dictionary I had as a child referred to older Spanish speakers pronouncing the H weakly and younger speakers having dropped it, and the initial H of Latin was also dropped many centuries before that. The second concerns distinctive English trends. The Great Vowel Shift comes to mind, and I’ve already alluded to that. Finally, there are the external influences on the language, such as the fact that it ceased to be the official Crown language for several centuries after 1066.

The first trend is easy to anticipate. The grammar of languages tends towards isolation. That is, they go from complex inflections of words – amo, amas, amat; amamus, amatis, amant – to simpler ones – j’aime, tu aimes, elle aime; nous aimons, vous aimez, elles aiment. In this case the spelling lags somewhat behind the pronunciation, but even in writing the number of distinct forms has fallen from six to five, at the cost of introducing an obligatory subject pronoun. Likewise in English, forms which used to differ have levelled to fewer, notably among strong verbs, so that for example the past participles of “help” and “climb” are now “helped” and “climbed” rather than “holpen” and “clomben”, and this process is continuing today with, for example, “thrived” rather than “throve” and “thriven”. Afrikaans has taken this trend further and now has no strong verbs at all; its only present-tense form of “to be” is “is”, a development I can easily imagine happening in English too. Most verbs in the English present indicative have little left to lose, since they vary only in the third person singular, and the loss of that “-s” – “she take” rather than “she takes” – is a development which could well occur.

As I’ve said, the Great Vowel Shift is the most obvious distinctively English trend, although similar processes have occurred in other languages. It has been blamed on the Black Death and the subsequent movement of people from Northern to Southern England. For this reason, vowels in English accents are often described in terms of their Middle English ancestors. The biggest changes are in the long vowels: A, open E, closed E, I, open O, closed O and U. Of these, long I and U (now spelt “ou” or “ow”) have changed the most, now being pronounced “eye” and “ow” as opposed to “ee” and “oo”. Their changes left a vacuum into which the other vowels were then able to move without creating ambiguity and confusion, A becoming “ay”, closed E and open E merging as “ee” in most accents, Irish being the main exception, closed O becoming “owe” and open O “aw”. The shift is most pronounced in Australia and New Zealand, and less so in North America than in RP. The short vowels are less affected. These changes are somewhat paralleled in High German, where long I became “ei” and long U “au”, although the other vowels haven’t been affected as much as in English; in French, in the opposite direction, where “an/am” and “en/em” merged into the more open form and “in/im” then moved into the old position of “en/em” (though with a consonant); and in Greek, where eta came to be pronounced with an “ee” sound.

Initial consonant clusters have also tended to simplify, “kn-” and “wr-” losing their first consonant, and a similar process has merged “wh-” with “w-”, which has happened in my lifetime.

The third process is external influence on English. One such influence is the spread of literacy, which has led to pronunciations falling somewhat more into line with spelling. For instance, “often” no longer has a silent T and “again” tends to be pronounced with a diphthong rather than a short E. Another possible change could be wrought by the use of English as an international auxiliary language between people whose mother tongues are not English. In the Far East, it’s possible to encounter a form such as “I hear a smell”, which I imagine is influenced by a language which uses a more general verb for certain sensations than our separate “smell”, “taste” and “hear”, although for some reason I’d still expect “see” to be different. This might come to alter two particular idiosyncrasies of English, namely its two separate verbs “do” and “make” but its single verb for “know”, which tend to be the other way round in at least many European languages if not others.

A couple of things going on right now could influence the future of this language. One is Brexit. English, in its Commonwealth variant, is used as a lingua franca in the rest of Europe. If we leave the EU and stay outside it, it’s possible that we will fall more under the influence of American English, which has been making itself felt since at least the War, and end up using at least an American idiom if not the actual General American accent itself, while the rest of Europe maintains “British” English, perhaps more influenced by Irish and maybe also Scottish English than before. Another is the rise of powers other than the English-speaking United States, which may cost English its prestige as an international language. A third issue, which we’ve only encountered for rather over a century so far, is that since about 1877 it’s been possible to record the voice, meaning that we hear older ways of speaking more than we used to. I would expect this to slow the change in English pronunciation, although of course even the pronunciation of English English speakers in 1960 and before can sound quaint to us today.

There have been numerous fictional attempts to imagine the future of English, and I plan here to focus on six: Russell Hoban’s ‘Riddley Walker’, Will Self’s ‘The Book Of Dave’, Anthony Burgess’s Nadsat, George Orwell’s ‘Newspeak’, David Mitchell’s ‘Cloud Atlas’ and the attempt made in ‘The Dune Encyclopedia’. There will obviously be spoilers for all of these, although it could be said that it’s impossible to spoil a literary work because the plot is subservient to other aspects of the novel.

Russell Hoban imagines Kent two thousand years after a nuclear holocaust, and includes a map of the county as he imagines it then. I expect there’s a lot that goes over my head with this book, although it interests me because it’s set near my birthplace. One of the language’s distinctive features is its use of folk etymologies. It tends to analyse and reconstruct words based on presumed popular ideas of their origin, and these etymologies are often bawdy or risqué, such as “Dunk Your Arse” for Dungeness above and “Sam’s Itch” for Sandwich. This reflects the salvaging of the remnants of advanced technology for new purposes that now constitutes the characters’ way of life. There are some surprising grammatical survivals, such as “et” as the past of the strong verb “eat” rather than a levelled “eated”, as part of a general inconsistency. Viewed realistically, it couldn’t be expected that the vernacular of two millennia in the future would be even remotely comprehensible to us today, particularly if all records have been destroyed.

Will Self’s ‘The Book Of Dave’ has much in common with Hoban’s novel, but Self clearly has an axe to grind about sacred texts having unforeseen consequences, and his novel is largely satirical. An embittered cabbie called Dave has angrily written down his feelings about women in the wake of a messy divorce, printed them on metal and buried them. After a flood destroys much of Southern England, these plates are discovered, and five hundred years later they have become a sacred text written in a language now known as “Mokni”, largely based on Cockney and text speak. The religion is of course highly misogynistic. The language includes quite inventive terms such as “befansemis” (Elizabethan semis) for “houses”, “childsupport” for “dowry”, “cloakyfing” for “burqa” and “dashboard” for “Milky Way”. The general idea seems to be of a restricted view of the world, where the unwitting founder of the religion can’t look beyond his own restricted life, such as the dashboard of his own car, to see the wider world or the possibility of other perspectives.

‘The Book Of Dave’ and ‘Riddley Walker’ form a kind of pair, and although I haven’t heard it from Will Self himself, I’d expect him to acknowledge openly that Hoban was a major influence. In both, the form of the language is linked to a general idea which extends beyond the usual changes one might expect in English, although the use of F and V for “TH” is to be expected, and both are clearly spoken in future worlds which are no longer globalised, so the influence of other countries is absent. The same is not true of Anthony Burgess’s “Nadsat”, the youth argot used in ‘A Clockwork Orange’.

I’m not sure when the events of Burgess’s novel are supposed to take place, but it’s clearly supposed to be in the near future of the publication date of 1962. The most distinctive feature of Nadsat is its use of Russian vocabulary, such as “horrorshow” for “good”, from the Russian “хорошо”. The name of the slang itself is from the Russian equivalent of the “-teen” morpheme. “Droog” for “friend” is another instance, but there are also other techniques of word formation such as eggy-peggy-type codes and rhyming slang. The idea was partly to create futuristic-seeming slang which wouldn’t quickly seem dated, a common problem with coinages and the use of slang in fiction. I can only suppose that Burgess chose Russian as a reference to the Cold War, and I wonder if it was meant as a sign of naïveté on the part of the youth subculture, a kind of mindless rebellion against the establishment as seen in the later real world in the quasi-punk adoption of Nazi symbolism.

By far the most famous example of a future version of English is George Orwell’s Newspeak as used in ‘1984’. Although the main influence on this language’s creation was polemical in a political sense, Orwell also chose to include features he disliked in English as spoken at the time of writing. The role of Newspeak is to restrict thought by reducing the flexibility and variety of language. I used a Newspeak-derived version of English in my short story ‘Kibuco’, partly because the narrator’s first language was Esperanto, which had been used for a similar purpose in the story, and Newspeak and Esperanto share many features, although I don’t know whether the resemblance is intentional. It’s said that an Esperanto dictionary will only be about a tenth the size of a similarly comprehensive dictionary in another European language because it relies so heavily on affixes. The idea behind this feature of Zamenhof’s language is to make it easier to learn and acquire a useful vocabulary, but if Orwell is right about the thought-restricting capacity of such an approach, it could also be used for that purpose. Each new edition of the Newspeak dictionary is smaller than the last because words are being destroyed. This is all so well-known it’s probably not worth mentioning.

‘Cloud Atlas’ depicts two different stages in the development of English. The earlier example, in ‘An Orison Of Sonmi~451’, is more sterile and restricted, perhaps like Newspeak although not deliberately constructed, using many genericised trademarks such as “nikon” for “camera”. This is mid-twenty-second century, and dystopian. It’s part of the drift from the high-flown language of the chronologically earlier sections of the novel into the impoverished and commercialised vocabulary of the penultimate setting. After the fall comes the second, later version of English, in ‘Sloosha’s Crossin’ an’ Ev’rythin’ After’, and although it could be seen as a still more degenerate version of the language, it manages to be much more vibrant and expressive than its predecessor. The commercialised elements are gone. Unfortunately, I found it impenetrable and quite trying, all the more so because it was the longest section of the book.

All of the above examples could be said to be distorted by the points their authors want to make rather than being attempts to portray the actual future of English. The same doesn’t apply to the same extent to the languages of Frank Herbert’s ‘Dune’ trilogy, of which there are two: Fremen and Galach. Most of the neologisms in the trilogy are from Fremen, a conservative descendant of Arabic, but Galach is a development of English and Slavic, mainly the former, although in the narrative parts of the novels it’s hardly explored at all. Fortunately, there is a de-canonised encyclopedia associated with the books which does explore it, and in this case professional linguists have had a go at it in considerable detail. Five stages are distinguished, dating from about the year 9000 CE up until seventeen thousand years after that. The first stage involves the change of TH to F and V, as with several other examples above, and “-ing” in gerunds becoming “-in”, along with the Second Vowel Cycle, a process similar to the Great Vowel Shift but applied to the long pure vowels of present-day English. One of the crucial grammatical changes is the evolution of “of X” into “əX”, replacing the Saxon genitive “-’s”, the idea being that it’s similar to “man o’ war”. This sets a precedent for the joining of other prepositions to nouns and the development of an extensive inflectional prefixing case system. In the meantime, pleonastic pronouns begin to be used and are similarly appended to nouns. An example given of the language is the Galach for “a bird in the hand is worth two in the bush”: “baradit nehiidit beed gwarp tau aubukt”. However, it’s still quite sketchy.

In conclusion, all of this could be seen as rather optimistic, because it isn’t at all guaranteed that there will even be people around to speak the language from the mid-twenty-second century onward, or, if there are, what kind of world they’ll be living in. This post-apocalyptic thought has been employed in the creation of some of these visions. On the other hand, there’s a strong theme of using changes in language to illustrate certain points, which is to be expected and not problematic. My own guess is that future English, if there’s anyone left to speak it, is likely to have been influenced by non-European languages and to sound rather like an exaggerated version of the New Zealand/Aotearoan accent, with a strongly creolised flavour to the grammar like Jamaican patois. That is, if it exists at all.

Writing And Depression

There’s said to be a correlation between writing and depression, in that people who have a diagnosis of clinical depression are more likely to keep a journal. One response to this finding is to advise depressive people not to keep diaries, on the assumption that the correlation is causal. As far as I know, although the connection has been established, it’s unclear whether journalling, if that’s the word, and depression have a common cause, whether depressed people write diaries as therapy, or, as seems to be the assumption, whether writing a diary helps make you depressed.

All three could be true to some extent. Just because you think something is therapeutic, it doesn’t mean it is. One thing I learnt from being a herbalist is that in terms of health, people tend to be their own worst enemies, and in particular that some people have a dynamic where they seem to be prepared to change absolutely anything about their lifestyle except for the one simple thing which would make the most difference. People are driven to self-harm. But self-harm itself isn’t simple, and can be a coping strategy and a form of therapy for some. It can be motivated by numbness or the need to express outwardly pain one feels inwardly, and it can also be very subtle, as with Lesch-Nyhan Syndrome for example. This is the inherited X-linked absence or deficiency of an enzyme which results in the build-up of uric acid in the tissues, leading to physical damage and death before the age of thirty in most cases. It also involves compulsive self-harm: severe lip- and finger-biting, head-banging, gouging of the eyes and scratching of the face, and people with it are often physically restrained to prevent them from doing this. But from a psychiatric point of view, one of the interesting things about Lesch-Nyhan is that among people with two X chromosomes, who are less severely affected and usually asymptomatic, some develop emotionally self-sabotaging behaviour instead of physical self-harm, tending to exclude themselves from socialising in spite of wanting to, and pushing people away emotionally. I say “interesting”, but I could equally well have said “tragic”. It’s fair to say that those who are wont to self-harm physically are familiar with more subtle self-sabotage, and this could be carried out through writing.

Some people troll themselves. They make up sock-puppet accounts on social media and fora and comment on their own stuff in a negative way, in order to make themselves feel bad. This is very obviously self-harm, in this case in public. They tell themselves that they’re ugly and stupid, that nobody will ever love them and so forth. This is a kind of writing, of course, which is not therapeutic as far as I can see, although I shouldn’t make assumptions from the outside. All I can say is that it seems unlikely that it helps people feel better about themselves, although they may sometimes be trying to elicit sympathy from others.

But this is not especially novel; the difference now is how easily public this approach makes it. A rather less public and much older form of self-trolling might occur in writing a diary, and this wouldn’t seem to be therapeutic. A diary could consist of a series of passages where the diarist is trying to make themselves feel bad, but there could also be less overt ways in which the entries are harmful, because they may brood over things and pull the writer down into the abyss.

I’m portraying this as if there’s a choice about it, but there may not be. I don’t wish to label and pathologise everything I do, but along with being practically certain I’m diagnosable as depressed, I’m also pretty sure I exhibit hypergraphia, the compulsion to write. There’s a case on record of a neurologist who wrote a textbook based on her compulsive note-taking and went into overdrive when she lost twin babies shortly after they were born. There’s an association with temporal lobe epilepsy and bipolar disorder, and it responds to anti-depressants. It’s said to be rare, but I’m extremely doubtful about that. Maybe it’s unusual for someone to write all over their walls and ceiling and proceed to cover every blank piece of paper they can find with writing, but that’s an extreme case, and judging by my own internal state, and I suspect also Sarada’s, the urge to write is constantly there and needs to be sat upon to stop it from happening, although the existence of writer’s block suggests that this is not always so. Isaac Asimov, Sylvia Plath and Danielle Steel are examples of people with hypergraphia whose output has turned out to be publishable and popular. However, even when it’s helpful, writing brings problems because it can stop you from doing other things like earning a living, so if you are hypergraphic, you’d better hope you’re also lucky, because your stuff may well not be publishable, or not noticed as such. It just spools out with no inner critic. The inner critic is something to do with the temporal and frontal lobes, or maybe their interaction, and we don’t always have the luxury of the right degree of connection between the two.

Not everything is hypergraphia. Sometimes there’s just diary-writing, but even therapeutic self-examination can later turn out to be problematic. When you write a journal, you may well be helping yourself work through stuff, but that could also be the stuff you need to work through at the time rather than later. If you do a good enough job of putting your feelings meaningfully down on paper, re-reading it can pull you back into that state of mind when you’ve got past it, and if like me you tend to dwell on the past and have difficulty letting go of things, this is quite a hazard. The same could apply to more creative writing such as short stories and novels.

There are reasons why writing itself as a profession or pastime could predispose one to depression. It’s a solitary activity, and it can take a long time, if ever, to receive validation for one’s work. Your success depends more than usual on the approval of others, possibly lots of them. It can also be very hard to deal with rejection, although to be honest I can’t personally see a distinction between that and failed job applications. You might be writing indoors and depriving yourself of daylight, or you might be writing late into the night or find that your sleep is interrupted by ideas or conversations which you have to get down. If that happens, and they won’t leave you be unless you do something about them, you start to lose sleep, which seriously risks depression and other mental health problems. You might also not exercise much, although I’ve found that exercise stimulates creativity, for better or worse.

Finally, back to the issue of writing as self-harm. This is where it gets complicated. If one is wont to exercise self-sabotage, it can get hard to tell whether pathologising one’s output is a sign of self-sabotage or the output itself is self-sabotage. This is the kind of thing one might want to write about, but then again, should I?

. . .

Nothing here today because I felt I was pressurising myself too much to churn out material and it was reducing the quality. But I have blogged on transwaffle.

Get Unknotted!

I don’t really hate many people at all. Michael Howard was one person I genuinely did hate, and on the whole the people I’ve hated have been distant from me rather than in my social circle. It’s probably telling to consider who the people I hate actually are. They tend to be either politicians or philosophers, which probably means the things which touch me most personally are philosophical and political issues.

Surely I’m not alone in hating politicians. This is probably entirely normal; the difference lies in the specific politicians I hate. I know, for example, that a lot of people hated Tony Benn. The only animosity I felt towards him concerned his continued loyalty to the Labour Party long after it had apparently become a force for evil, and even then I didn’t hate him. I just didn’t respect him any more. As it turns out, I was provisionally wrong, because it’s clearly possible for the party to become left wing, also known as rational, again. But there’s something about politics which makes people hated. Looking back on Michael Howard, he accepted what became Section 28, which made it compulsory for teachers to condone hatred, and was responsible for introducing the Criminal Justice Act 1994, which removed the right to silence, among many other things. But is a politician to be judged on what they do in office? Jeremy Corbyn, for example, is not going to cancel Trident even though he believes it should be cancelled.

My hatred for philosophers is probably more like a sports fan’s hatred for a rival team, but it’s based on the idea that philosophers have a responsibility to the world. This could be opposed quite easily by a philosophical position, and sometimes is, but to me the point of everything is to make the world a better place, or at least to encourage people to do good, and an important role and duty of a philosopher is to criticise the way things are and come up with a different way of seeing and doing things. Many philosophers abdicate this responsibility, either because they’re in denial about it or because they lack the determination to stay on the side of the angels. Since coherence and integrity are important parts of philosophy, a position such as Heidegger’s support for the Nazis is untenable and throws all of his thought into doubt. It’s not enough to claim that he was living in fear in a totalitarian regime, because he actually supported them with considerable enthusiasm, and even if he hadn’t, his whole system of thought would still be suspect if it failed to give him the fortitude to have the courage of his convictions and stand against the Nazis regardless of reprisals. Not that it wouldn’t have been understandable, but it means something is wrong. Philosophy needs martyrs.

But I don’t hate Heidegger. My unease with his writing is more to do with what seem to be its implications – that it doesn’t impel one to oppose evil, or at least didn’t impel him to do so, which leads me to find his philosophy hard to trust. Is there something about its implications which makes the Holocaust seem okay? But I wouldn’t think this about the medical research done by Josef Mengele if it turned out to be useful, so I have to ask what kinds of study are damned by their association and what kinds aren’t. Maybe it’s just that Heidegger’s work is in the second category.

In other words, I tie myself up in knots about philosophy and philosophers.

There are, though, philosophers I do hate, two of whom are Jean Baudrillard and Jacques Lacan, the latter also a psychoanalyst. It hasn’t escaped my attention that both are French, and I wonder about the significance of this. I generally reject the idea that language shapes thought, but I do believe culture taken as a whole influences it, so Baudrillard’s and Lacan’s dodginess is interesting from this perspective. Regarding Baudrillard, the main problem is that he’s playful with serious ideas and seems to be completely callous about real suffering and death, as with his ‘The Gulf War Did Not Take Place’. Denying atrocities that really happened is the role of the likes of neo-Nazis and child molesters, not philosophers. This callousness extends to other continental philosophers in their denial that other species suffer or are even conscious, which is suspiciously convenient.

When I say I hate Lacan, I don’t just mean I disagree violently with his thought and work; I mean I actually hate him as a person. His writing style is deliberately and self-consciously opaque. Here’s a randomly chosen paragraph, translated into English:

The notion of an incessant sliding of the signified under the signifier thus comes to the fore – which Ferdinand de Saussure illustrates with an image resembling the wavy lines of the upper and lower Waters in miniatures from manuscripts of Genesis. It is a twofold flood in which the landmarks – fine streaks of rain traced by vertical dotted lines that supposedly delimit corresponding segments – seem unnatural.

Jacques Lacan, Ecrits. Translated by Bruce Fink.

This is out of context, and might therefore be thought of as leading on from and to other things, but in fact wherever I quoted from, it would leave the reader with the same impression. There is an ongoing crisis of style in continental philosophy, which includes Lacan’s writing but started much earlier, perhaps with Hegel, and it leads me to wonder if the reason they write like that is that they have nothing important to say. On the other hand, I also wonder if the point is to act like random patterns for the reader to project meanings into. Yet when I read David Hume, whose work is over two centuries old by this point, I can’t help but be impressed by his clarity of style:

There are some philosophers, who imagine that we are at any moment conscious of what we call our self; we feel its existence and its continuing to exist, and are certain—more even than any demonstration could make us—both of its perfect identity and of its simplicity. The strongest sensations and most violent emotions, instead of distracting us from this view ·of our self·, only focus it all the more intensely, making us think about how these sensations and emotions affect our self by bringing it pain or pleasure. To offer further evidence of the existence of one’s self would make it less evident, not more, because no fact we could use as evidence is as intimately present to our consciousness as is the existence of our self. If we doubt the latter, we can’t be certain of anything.

Although the passage from Lacan is a translation and the one from Hume isn’t, I still think Hume’s style is way less knotty than Lacan’s. There’s much more about Lacan to hate than that though, for instance his contempt for his clients and his uncaring attitude towards them killing themselves, and it should also be borne in mind that Freudian psychoanalysis generally involved fleecing unhappy people for years on end at exorbitant rates while being sexist and homophobic and using a florid style of theory which probably made their problems worse.

By Matemateca (IME USP) / name of the photographer when stated, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=68828200

Sometimes you can straighten out a tangle and end up with a straight length of whatever it was that was tangled up. Sometimes this even applies to a closed loop, and we don’t always realise the simplicity underlying these tangles. We might find that a tangle consists of one loop, two loops, three or more, joined or not joined. Knot theory classifies these in various ways, such as in the above exhibit. Although it isn’t included, the rings at the top of this post, Borromean Rings, obsessed Lacan in his later years. I should probably put another picture of them here:

The crucial thing about Borromean rings is that although they’re linked, removing any one of them leaves the other two unconnected. Lacan used this to illustrate the nature of the mind, namely the relationships between the real, the imaginary and the symbolic. None of these is more important than the other two, and where one is missing, various forms of abnormal psychology result. For instance, psychosis identifies the real with the imaginary without recognising the symbolic, so to a psychotic person the imagination simply is real, and the possibility that it’s symbolic of something else is not present in their mind. My immediate problem with this is that it casts psychosis as a defect, with no apparent motivation to reclaim madness as positive, although madness is often negative too, and I don’t think that should be ignored either. One reason Lacan shifted to using the knot is that he no longer considered language adequate to describe the human mind, which in fact is fair enough. He also considered adding an extra ring, the Sinthome, to the knot, and saw James Joyce as doing this. But there’s another problem with Lacan and the Borromean Rings (which, having more than one component, are strictly a link rather than a knot in the mathematical sense).

This is one of those occasions when I agree with Roger Scruton. It does happen sometimes. Scruton sees Lacan as a fool. Chomsky is also hostile to him, seeing him as a charlatan, and he’s also been described as leaving behind a wasteland of damaged people who were unfortunate enough to come into contact with him. Just on the subject of the Borromean knot, though, the issue is that whereas it might superficially look like Lacan is trying to link mathematical knot theory and psychoanalysis, he doesn’t actually seem to know what he’s talking about. He’s merely using it as an extended metaphor, or perhaps trying to give his work a patina of wisdom and respectability without really understanding what he was doing. In the end he just seems to be performing pretentiously, and has taken psychoanalysis so far from the actual emotional problems people have that it’s useless.

Even so, knots are used by R D Laing to describe complicated, serious games which really seem to ring true to me. To my surprise, his ‘Knots’ is sometimes described as a series of poems, which given my previous hang-ups presumably means I can’t understand them, which is odd because I thought I did up until now. Here’s a quote which illustrates the kind of thing Laing considers to be a knot:

There must be something the matter with him

because he would not be acting as he does

unless there was

therefore he is acting as he is

because there is something the matter with him

He does not think there is anything the matter with him

because

one of the things that is

the matter with him

is that he does not think that there is anything

the matter with him


we have to help him realize that,

the fact that he does not think there is anything

the matter with him

is one of the things that is

the matter with him

there is something the matter with him

because he thinks

there must be something the matter with us

for trying to help him to see

that there must be something the matter with him

to think that there is something the matter with us

for trying to help him to see that

we are helping him

to see that

we are not persecuting him

by helping him

to see we are not persecuting him

by helping him

to see that

he is refusing to see

that there is something the matter with him


for not seeing there is something the matter

with him

for not being grateful to us

for at least trying to help him

to see that there is something the matter with him


for not seeing that there must be something the

matter with him

for not seeing that there must be something the

matter with him

for not seeing that there is something the

matter with him

for not seeing that there is something the

matter with him

for not being grateful

that we never tried to make him

feel grateful

This kind of text makes sense to me in a way Lacan’s never could, and it’s also an example of a psychological knot of a different kind from Lacan’s Borromean knot. It also seems to be helpful in a way none of Lacan’s stuff is.

We do get ourselves tied in knots, and it might even be possible to draw lines between stages in that process which indicate exactly how we’ve got tangled. Given that, it does make sense to me to link knot theory with how we understand our own psyche even though it’s defiled by Lacan’s attention. It also seems to me that looking at my own prose style as a tangle might lead me to straighten it out and make it clearer in a way Lacan took pleasure in not doing with his own.

Mathematical knots are not the same as knots in string with loose ends. A mathematical knot is made of one or more tangled closed loops which cross over and under themselves and each other, or tangled loops which are fused at certain points. There are practical applications. One is in molecular biology. A DNA strand is coiled into coils of coils, and as it’s read to produce proteins or replicated in a process such as cell division, those coils would bunch up until the strand became too tangled to do anything with. This is solved by enzymes called topoisomerases, which cut the strands and glue them back together again, adjusting the coiling. Much DNA is coiled in the opposite direction to the turns of the double helix itself. Organisms living at very high temperatures instead have DNA coiled in the same direction as the turns of the “staircase”, because this reinforces the strands and stops them from melting apart. Some antibiotics work by stopping topoisomerases from working, which means pathogens can no longer reproduce or produce toxins because their DNA gets too tangled. There’s also a disease called scleroderma, which involves the loss of elasticity in skin and other connective tissue, can be fatal, and ironically can be caused by exposure to chemicals used to manufacture PVC. It can involve the production of antibodies to a topoisomerase. Understanding knot theory better might open up approaches to dealing with scleroderma and to the production of new antibiotics. It also occurs to me, and this is just me so I may well be wrong, that Alzheimer’s disease involves tangles of tau protein in the brain, a protein which normally stabilises the microtubules in neurons and which can also be released by physical trauma to the brain. I don’t know for sure, but I wonder whether a drug which dealt with these tangles could be used to prevent dementia and the damage caused by head injury, and since one of the consequences of head injury in childhood can be the development of paedophilia, this might turn out to be extremely practical. But that’s just my personal guess and I could be totally wrong.
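The arithmetic behind DNA supercoiling can be sketched with a few textbook numbers. The figures below, the pBR322 plasmid, a helical repeat of about 10.5 base pairs per turn and a supercoiling density of about −0.06, are standard illustrative values rather than anything from a specific study:

```python
# Back-of-envelope DNA supercoiling arithmetic, using standard
# textbook numbers purely for illustration.
N_BP = 4361          # length of the pBR322 plasmid in base pairs
BP_PER_TURN = 10.5   # helical repeat of relaxed B-form DNA

# Linking number of the relaxed circular molecule: one "link" per helical turn.
lk0 = N_BP / BP_PER_TURN

# Negative supercoiling density typical of bacterial DNA: the molecule is
# undercoiled relative to its relaxed state.
sigma = -0.06
delta_lk = sigma * lk0   # the linking-number deficit

# White's theorem says Lk = Tw + Wr, so this deficit is shared between
# local untwisting (Tw) and the writhing supercoils (Wr) you would see
# in a micrograph; topoisomerases change Lk by cutting and resealing strands.
print(round(lk0), round(delta_lk))   # 415 -25

# A type II topoisomerase changes Lk in steps of 2 per catalytic cycle,
# so fully relaxing this plasmid takes roughly this many cycles:
print(round(abs(delta_lk) / 2))      # 12
```

This is the sense in which knot theory is literally at work in the cell: the antibiotics and autoantibodies mentioned above interfere with the enzymes doing this accounting.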

You can also use it to tie bows faster and untangle things. One interesting question for me right now is whether I can use it to clarify my writing, although some of that is easier to address through the likes of turning nouns back into verbs, the passive voice into the active, and analysing the logical structure of what I’m saying: for example, I could say “the fact that I think is incompatible with it not being the case that I exist”, or I could say “I think, therefore I am”. This scene from ‘Blackadder Goes Forth’ is an excellent example of what I do wrong, and to be honest, if I could get that sorted, I would myself be sorted. One of the reasons I write like this is that I’ve done a Masters in continental philosophy, so in a way it’s Lacan’s fault.

The other important possible application for me, though, would be to help sort out my own head and work out whether I have any unnecessary tangles in how I think and feel, like R D Laing’s version of knots. Whether knot theory can be applied to that is quite imponderable, but wouldn’t it be great if it could? And if it couldn’t, you could always use it as an obfuscating theory to start a cult, as seems to have happened with Lacan.