Nineteen Eighty-Four and 1984

There you go: don’t say I don’t listen to my readers! I don’t want this to seem self-indulgent, so before I start I want to point out that this is a response to a comment: someone said they’d like me to do something like this, so that’s what I’m doing.

Without tinkering with HTML, it seems difficult to provide links within a document in WordPress, so for now I’ll just give you a table of contents in order to prevent you being overwhelmed with the length of this post:


1. The Eternal Present

2. The Never-Ending. . . December?

3. George Orwell Is Better Than War-Warwell

4. My Secret Diary, Aged 16¾

5. A Collision With The Great White Whale

6. Armageddon

7. The Stereophonic Present

8. Harvest For The World

9. The Ending Story

10. Life Off The Fast Lane

11. Green Shoots

1. The Eternal Present

To me, the year 1984 CE is a kind of eternal present. I sometimes joke about this, saying that all the years after that one were clearly made up, and someone pointed out to me that that was highly Orwellian, but in fact it really is the case that all years are made up and we just have this arbitrary numbering scheme based on someone’s incorrect guess about the birthdate of Jesus, and yes, here I’m assuming there was an historical Jesus, which considering I’m Christian is hardly surprising.

2. The Never-Ending. . . December?

There is a fairly easy if absurd way to make it 1984 still, which is just to have a never-ending December. It’s currently Hallowe’en 2025, in which case it’s the 14945th December 1984. This wouldn’t be a completely useless dating system and I sometimes think we can conceive of time (in the waking sense: see last entry) differently according to how we choose to parcel it up. Another way of making it 1984 would be to date years from forty years later, and no that’s not a mistake as there was no year zero in the Julian or Gregorian calendars. There was one in a certain Cambodian calendar of course, from 17th April 1975, where it was inspired by the French revolutionary Year One, the idea being that history started on that date because everything that happened before that was irrelevant, being part of capitalism and imperialism I presume. My insistence that it’s always 1984 is the opposite of that, as I’m affectedly sceptical about anything happening afterwards. Coincidentally, I use a day-based dating system starting on 17th July 1975 in my diary, and I don’t actually know why I do this, but it’s only ninety-one days after the start of Year Zero (there are other things to be said about Pol Pot which would reveal the over-simplification of this apparent myth). It’s based on the first dated entry in any notebook and my mother’s suggestion that I keep a diary which I didn’t follow. It’s actually the second dated entry, as the first one is of a series of measurements of a staircase, which isn’t really about anything personal. I’ve also toyed with the idea of Earth’s orbit being a couple of metres wider, which would make the year very slightly longer but which would add up over 4.6 aeons (Earth’s age) to quite a difference, but if that were so, asteroid impacts and mass extinctions which did happen wouldn’t’ve, and other ones which didn’t might’ve, so it totally changes the history of the world if you do that. If the year were a week longer, it would now be 1988 dated from the same point, but a lot of other things would also be different such as the calendar. It’s quite remarkable how finely-tuned some things are.
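
In case anyone wants to check that arithmetic, here’s a minimal sketch in Python, assuming day 1 of the never-ending December is 1st December 1984:

```python
from datetime import date

# A quick check of the never-ending December arithmetic, assuming day 1 is
# 1st December 1984 and that "today" is Hallowe'en 2025.
day_number = (date(2025, 10, 31) - date(1984, 12, 1)).days + 1
print(day_number)  # 14945, i.e. the 14945th December 1984
```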

3. George Orwell Is Better Than War-Warwell

Although I could go on in this vein, I sense it might irritate some people, so the rest of this is going to be about my feeling of the eternal present, how 1984 actually was to me and thoughts about George Orwell. I’m just telling you this in case you feel like giving up at this point.

I have habitually said that “George Orwell is better than War-Warwell” as a reference to Harold Macmillan’s paraphrase of Winston Churchill, and I wonder if Churchill is one of those figures who is always having quotes misattributed to him, like Albert Einstein. The trouble is, of course, that this is a practically meaningless phrase which I can’t do anything with, although Sarada has published a story with that title. I’ve read a lot of Orwell, although unlike most people who have that doesn’t include ‘Animal Farm’. It’s been suggested that if he’d lived longer, he would’ve gone to the Right and become a rather embarrassing figure like David Bellamy or Lord Kelvin, but of course we don’t know and I don’t know what that’s based on. He was known to be quite keen on the idea of patriotism though, so maybe it’s that.

Within the universe of his novel ‘Nineteen Eighty-Four’, we don’t actually know that it is that year. It does seem to be about that time, because Winston Smith was a small boy just after the end of World War II. The Party is constantly revising history and is now claiming that Big Brother invented the steam engine, so it seems easily possible that it isn’t exactly 1984 and that either new years have been written into history or removed from it, and just maybe it’s always 1984 and has been for many years by that point. Maybe they just want to save on printing new calendars or are trying to perfect the year by repeating it over and over again, for example. Maybe ‘Nineteen Eighty-Four’ is like ‘Groundhog Day’, and what we read is merely one iteration among many of that story. I’ve heard, although appropriately maybe this can’t be trusted, that Orwell simply came up with it by transposing the last two digits of the year he wrote it. While it’s possible to play with this, the truth is probably simply that he needed to give Winston enough time to grow up and reach his forties so he could tell the story.

It interests me that there was a somewhat jocular, artsy attempt to claim that a period called the 19A0s existed between the late ’70s and early ’80s which has been edited out of history, which is similar to the Phantom Time Hypothesis. Just to cover these: I’ve written about both the 19A0s and the Phantom Time Hypothesis before, so if you want you can read about them there.

A slightly puzzling aspect of ‘Nineteen Eighty-Four’ is why its title is spelt out rather than written as figures, but it seems that this was common practice at the time. It’s one thing that everyone gets wrong about the book, as it’s almost always referred to as ‘1984’. I should point out that one reason I didn’t get any further than A-level with English Literature is that I experience an impenetrable thicket of associations whenever I consider mainstream creative works which make it difficult to respond meaningfully to them. In the case of Orwell’s novel, though, since it’s arguably science fiction it might be more appropriate than usual to do so, as that’s also how I respond to that genre, and I find it more in keeping with that kind of imagination. I’m not alone in this, it seems: Orwell’s novel is analysed in such a manner by the YouTube channel ‘1984 Lore’. I myself used Newspeak to write a short story about a kibbutz-like community on another planet where everyone actually spoke Esperanto to explore whether language restricts thought, portraying it in terms of the idea that it does.

4. My Secret Diary, Aged 16¾

My personal experience in the year 1984 represents a peak in my life. Note that it’s just one peak, neither the biggest nor the only one. It doesn’t overshadow the year of my wedding or the births of our children, grandchildren or anything like that. ’82 and ’83 are also significant in their own ways. ’82 I thought of as the “endless summer” characterised by the nice pictures of young people in yellow T-shirts and long blond hair on the envelopes you got back from the chemists with the photos in them, and ’83 had been particularly poignant, but the year after those had been highly focussed on for a long time in various circles by many people. 1984 opened for me hiding under a table in a suburban living room in Canterbury whispering to my friend about when midnight came. I was wearing a navy blue M&S sweatshirt whose inner flock was worn on the inside of the left elbow, a blue and white striped shirt with a button-down collar which I was only wearing because she liked it, and jeans which annoyed me by not having any bum pockets, and she was wearing jeans which did have bum pockets and a white blouse with yellow check lines on it, but it was completely dark so neither of us could see anything. I was sixteen and had had a lot to drink considering my age, naughtily, as had she. We eventually conjectured that midnight must have passed and I rang my dad, who came to pick me up and whom I immediately told I’d had some alcohol (Martini, Cinzano and a Snowball), which my friend saw as not only typical of my impulsiveness and indiscreetness but also liable to get me in trouble, but it didn’t. The street lights looked rather blurry on the way home. Thus opened my 1984. A few days later I was back in the sixth form and my friend Mark Watts, who was later to go on to found an investigative journalism agency and uncover a number of cases of child sexual abuse, informed me that it was vital that we didn’t fall for whatever spin the media were likely to put on it being the year named after that novel and that whenever he referred to George Orwell it would be under the name Lionel Wise (Eric Blair – Lionel Blair; Eric Blair – Eric Morecambe – Ernie Wise), which was quite clever if also rather adolescent, which is what we were. We were all very conscious that it was 1984 at last. Anne Nightingale played David Bowie’s ‘1984’ and Van Halen’s ‘1984’ on her request show on the evening of New Year’s Day. I didn’t have a hangover, because I don’t get them. I asked my brother to record something off Anne Nightingale because I was about to go out again to see my friends, and it happened that the next track was Steve Winwood’s ‘While You See A Chance, Take It’, which I’d wanted to get on tape for years but he cut it off halfway through the first verse. The machine on which that was recorded was a rapidly failing mono Sanyo radio cassette recorder which my mum was annoyed was deteriorating so fast seeing as it was less than four years old and I’d got it for my thirteenth birthday. Incidentally, I’m writing all this without reference to diaries or any other kind of record. I just remember it, plainly, clearly, in great detail, and I don’t know how this compares to others’ memories. My memories of much of the ’80s are as clear as flashbulb memories because they occur within my reminiscence bump. There are errors, such as the exact name of the Steve Winwood record, but also a lot of clarity.
Anyway, later that year on my seventeenth birthday, 30th July, I got a stereo boom box possibly from Sony, on which I first recorded on 8th August, namely Tracey Ullman’s ‘Sunglasses’, followed by ‘Smalltown Boy’. In September, I got my first job, as a cashier at the new Safeway, which looked enormous to me at the time but, on returning to the Waitrose which it now is, seems really tiny nowadays, and lost it after eleven weeks due to being too slow on the till, not assertive enough to turn people away from the “Nine Items Or Less” (now “fewer” apparently) queue, and £2 out on the cashing up on two occasions. Apparently this was a lot stricter than other places, such as Lipton’s, where my sister worked; she was much further out than I was on many occasions when she first worked there. I could say more about her situation there but probably shouldn’t. Anyway, I got £1.41 an hour from Safeway which I saved up to buy the first big item I’d ever got for myself, which was a Jupiter Ace microcomputer. Which brings me to computers.

I was very into computers in the early to mid-’80s, but also deeply ambivalent about them. At the start of the year, the family had owned a ZX81 for a year and a bit. I found this annoying because it was such a low-spec machine, but restrictions fuel creativity so it was in fact not a bad thing. I was spending a lot of my time reading computer magazines and wishing I had a better computer, which I resolved late in that year, and also writing software, mainly graphically-oriented, which was difficult considering that our computer only had a resolution of 64×48, although I was later able to increase this to 192 on the Y-axis by pointing the I register on the Z80A somewhere else than the character set, so I could make bar graphs which looked quite good. I did also write a computerised version of Ramon Llull’s ‘Machine That Explains Everything’, a couple of primitive computer viruses and an adventure game. Later on, after I got the Jupiter Ace, I got it to display runes and produce screeds of nonsense words in Finnish. As I said though, I was ambivalent. I’ve never been comfortable with my interest in IT for several reasons, and for more reasons at this point. One reason was that at the time I was communist, and also kind of Stalinist, and felt that the use of IT and automation as fuelled by the microchip boom would create massive unemployment and reduce the power of the workers to withdraw their labour. However, it isn’t clear to me now why me not having a ZX81 would’ve made any difference to that. In the middle of the year, I decided that communism was over-optimistic and there was a brief period during which people were very eager for me to adopt their views, but I quickly opted for Green politics. I was not yet anarchist and believed in a Hobbesian state of nature. Besides this perspective, I was also uncomfortable about my interest in computers because it seemed nerdy, something very negative at the time, and unbalanced – obsessive and not “humanities” enough to my taste. It felt too much like my comfort zone and not challenging enough. It did, however, become apparent that I had spent so much time studying computers, with text books as well as mags and experimentation, that I could’ve easily aced the O-level, which was another example of how my educational focus at the time lay outside formal institutions, and it was also suggested that my aforementioned friend, with whom I hid under the table and who was trying to learn BASIC at the technical college, would’ve welcomed me teaching her. This got to the point where I helped her with her homework. On another occasion, an acquaintance was trying to write a FORTH programming language interpreter in Z80 assembler and I had a look through it with interest. One of my other friends later went on to write parts of the major GNU text editor “religion” Emacs, already almost a decade old by ’84, which I still use today. However, I found my interest in computers made me feel embarrassed and self-conscious and I felt somewhat ashamed of it. I think I found a lot of my interests at the time to be very personal and not something I felt comfortable sharing with others.

It was also the year of my perhaps most significant cultural shift. I entered the year enthusiastic about mainstream literature and poetry. I had been warned, though, by my O-level English teacher, that A-level English Lit was likely to spoil my appreciation of reading, and this did in fact happen. Early in the year my enthusiasm continued and I came to enjoy reading poetry and literature. I planned to continue my writing on the works of Samuel Beckett as part of my A-level and the fact we were studying Joyce gave me optimism in that regard. We had a fair bit of freedom to do that kind of thing. In the summer exams, my practical criticism of a particular poem was chosen as a model answer for others to emulate and I was able, for example, to uncover themes in poetry which my teacher hadn’t noticed, which was mainly due to my insistence on maintaining a wide education. I was applying to university in the later part of the year, having researched them in the earlier part, and having opted for degrees in English and Psychology or Philosophy and Psychology, I was clearly sufficiently committed to English at the time to consider it as a first degree. However, all of that was about to go to shit.

5. A Collision With The Great White Whale

It may be worth analysing what went wrong in some depth, but the simple facts of how it happened were as follows. My A-levels were in English, RE and Biology, which I want to stress is a very popular combination. At the end of the first year, around June, there was a marine biology field trip which was in itself quite formative for me because I didn’t relish getting stuck in the stinky, sticky black tarry mud encouraged by the anaerobic respiration in Pegwell Bay, an estuary on the edge of Thanet. It was cold and wet, and the water was of course salty, and I thought I’d ruined that sweatshirt I’d mentioned earlier which I was once again wearing. My dissatisfaction was palpable. Anyway, it was assumed by the English department that those who were off on the field trip would, possibly from their friends, learn their summer reading assignments, which were to read James Joyce’s ‘Dubliners’ anthology and Herman Melville’s ‘Moby Dick’. I didn’t get that information, didn’t talk about the assignments with my friends because it wasn’t a priority for us and consequently was confronted with reading an absolute doorstep of a book plus much of the Joyce one, which was less problematic because being short stories it was easy to catch up with that one. I was then confronted, on reading Melville’s novel, with a load of American men murdering whales for a living. Right then, I wasn’t even vegetarian but I did, like a lot of other people, believe in saving the whale. Over my childhood, I’d read a lot of story books about animals, like ‘Ring Of Bright Water’, ‘All Creatures Great And Small’, ‘Incredible Journey’, ‘Bambi’, ‘Watership Down’ and ‘A Skunk In The Family’. Of course there was peril in these and also horrible deaths on occasion, not to mention sad endings, but the focus was on the otter, the bovines, dogs, cats, deer, rabbit and skunk. There is no problem with depicting them being treated badly, suffering and so forth. But in ‘Moby Dick’, there is never any sympathy or focus on the experience of the whales or acknowledgement of them as victims, in a similar manner to the people who had lived in North America before White colonisers turned up. It was all about something else, and there wasn’t just an elephant in the room but a whale. I was unable to bring myself to step into Ishmael’s or anyone else’s shoes. The only bits I could tolerate were the encyclopaedic sections. I could go into more depth here. I think Melville was probably trying to make a whale-sized book, was using the whale as a metaphor for the intractable and incomprehensible nature of, well, nature and the world in general and as a tabula rasa, them being white like a piece of paper, and there’s the angle that the whale is in some way a phallic symbol. Ahab also anthropomorphises the whale, seeing them as a rival in a battle with him when in the end the whale is just the whale and doesn’t even realise that the tiny figures above lobbing harpoons at them are conscious beings. From the novel’s perspective, the whale probably isn’t even a conscious being. Hence I was confronted with what I read as a hostile, nasty and animal-hating, actually animal-indifferent story where I couldn’t work out whether any of the characters were supposed to be sympathetic and, moreover, the only chapters I could actually garner any interest in were dismissed as mere padding by my teachers. I also found, for some reason, that the same approach I’d been taking to poetry up until the summer no longer seemed to work.
It probably didn’t help that one of my teachers was a frustrated Classics teacher who later left and taught that at the King’s School, although I was interested in the classics she managed to shoehorn into the lessons such as Oedipus Tyrannus, the Oresteia and Antigone. I would say, though, that I really didn’t get on with the Oresteia because I felt very much that it lacked universalism. None of that was in the exams of course, but I wasn’t ever very oriented towards those. I was more just interested or not.

The autumn of the year was marked mainly by anxious procrastination about submitting my UCCA form, which I handed in a month later than I was supposed to due to indecision about what to put in my personal statement, which wasn’t up to much partly because of not wanting to admit what I was interested in, and partly because of not pursuing it in a public way due to the shame I felt about admitting it. I also got annoyed with universities insisting on being put first, so rather than selecting places I actually wanted to go to, although my first choice, Keele, I was very keen on due to the balanced and eclectic nature of their educational approach, I deliberately listed Nottingham, Reading and Exeter, followed by Sheffield, which I was in fact fairly interested in. I got rejected by all of them except Keele and Sheffield, Exeter apparently by return of post. Among the polys, I applied to Hatfield, Oxford and NELP, and would’ve got into NELP in fact. I liked the modular nature of the course at Oxford, which appealed to me for the same reason as Keele did.

6. Armageddon

Another association which arrived in 1984 and which has been with me ever since is the idea of “proper Britain”. I may have mentioned this before, but the notorious nuclear holocaust drama ‘Threads’ was broadcast on 23rd September 1984, notable for being the first depiction of nuclear winter in the mass media, and I remember being edgelordy about it by saying to my friends that it was over-optimistic. I was ostentatiously and performatively depressive at the time. I did not in fact feel this, but my takeaway from it was probably unusual. There’s a scene at the start where Ruth and Jimmy are canoodling on Curbar Edge above Hope Valley which really struck me. It was grey, drizzly and clearly quite cold, even though I think the action begins in May. There’s also the heavily built up large city of Sheffield, where I might be going in a year or so, and it suddenly crystallised my image of what Britain was really like. Not the South with its many villages and small towns densely dotted about with relatively dry and sunny weather, which I was used to, but the larger block of large post-industrial cities with redbrick terraced houses, back-to-backs, towerblocks and brutalist municipal architecture set against a background of rain, wind and greyness. I relished that prospect, and it felt like real Britain. This is how the bulk of the British population lives, and it becomes increasingly like that the further north you get, hence my repeated attempts to move to Scotland, which in a way I feel is more British than England because of many of those features. By contrast, if you go from Kent to France it’s basically the same landscape and climate with different furniture. Maybe a strange reaction to a depiction of a nuclear war, but there you go.

I did, however, also feel very much that it would be strange and foreign to move away to an area dominated by Victorian redbrick terraced houses. I couldn’t imagine that they’d ever feel like home to me and I couldn’t envisage settling down there. I was still very much a Southerner at that time. I was also, however, fully aware of the privileged bubble I was living in and it made me feel very awkward.

Nor am I ignoring the actual content of the film. The Cold War and the threat of nuclear destruction were very prominent in many people’s minds at the time and it almost seemed inevitable. This made even bothering to make plans for the future seem rather pointless and almost like busy work. We all “knew” we were going to die horribly, as was everyone around us, so doing the stuff I’ve mentioned, like applying to university, seemed more like something I did as a distraction from that worry than something with an actual aim sometimes, depending on my mood. This had a number of consequences. One is that I wonder if a lot of Gen-Xers underachieve because they missed out on pushing themselves into things in their youth, expecting the world to end at any time. Another is that as the ’80s wore on, pop music and other aspects of popular culture began to reflect that anxiety. Ultimately even Squeeze (basically) ended up producing an eerie and haunting post-nuclear song in the shape of ‘Apple Tree’. Alphaville’s ‘Forever Young’ particularly captures the attitude and is widely misunderstood. The reason we’d be forever young is that we’d never get a chance to grow up and live out full lives. That single was released a mere four days after ‘Threads’ was broadcast.

7. The Stereophonic Present

Speaking of music, there were something like four bands in the Sixth Form at that point, the most prominent being The Cosmic Mushroom, clearly influenced by the Canterbury Scene even in the mid-’80s. My own attitude to music was to concentrate on cassettes because I didn’t trust myself to take care of vinyl properly. The advent of proper stereo in my life was on my birthday at the end of July, and there’s something vivid and recent-sounding about all stereo music I own for that reason. This is in fact one factor in my feeling that 1984 is current rather than in the past. The present is characterised by clear, stereophonic music, the past by lo-fi mono, and that switch occurred for me in summer that year. This is actually more vivid than the earlier shift between black and white and colour TV. Incidentally, CDs were out there for sure, but only for the rich, having been first released two years previously. Like mobile ‘phones, they were a “yuppie” thing, like jug kettles. Back to music. Effectively the charts and my perception of them that year were dominated by ‘Relax’, by Frankie Goes To Hollywood. This was released in November the previous year and entered the charts in early January. This got banned as it climbed the charts, which boosted its popularity enormously and got it to number 1. It stayed in the Top 100 until April the next year. We played it at the school discos, the other standard being ‘Hi-Ho Silver Lining’, which we all used to sing along and dance to. My personal preferences included The The, Bauhaus and The Damned at the time, although the ongoing appreciation of the likes of Kate Bush continued.

8. Harvest For The World

On 24th October, the famous Michael Buerk report on the famine in Ethiopia was broadcast. This led in the next couple of years to Live Aid and Run The World, but from that year’s perspective it only just began. There’s been a lot of justified criticism of media framing of the famine, but as a naive teenager I didn’t have much awareness of that and simply saw it as a disaster which required a response from me, which was initially in the form of a sponsored silence for the whole school in the sports hall, then later a sponsored 24- or 36-hour fast supervised by one of my biology teachers in which I also participated. Although I can’t really mention this without pointing out that the whole thing was dodgy, it did start a ball rolling which continued in much later political activism on my part and a passionate youthful idealism to make the world a better place, which I felt confident had to come soon and meant action from me. ‘Do They Know It’s Christmas’ was a further effort in that campaign, satirised by Chumbawamba as ‘Pictures Of Starving Children Sell Records’ and roundly criticised by the World Development Movement, but at the time I knew nothing of this. By the way, it’s remarkable how the unpopular Chumbawamba cynicism managed to get from the political fringe into the mainstream in just a few years, with the Simpsons parody ‘We’re Sending Our Love Down The Well’ only eight years later, although it seems that was also linked to a Gulf War song, which is in the same tradition anyway, a tradition I first became aware of, superficially, that year. In fact I can’t overestimate the importance of this sequence of events, even with its grubby and cynical connotations, and my support of it has a simplicity and innocence which I wish in a way I still had. I want the world to be one in which something like that works straightforwardly and simply. As I’ve said before, nobody is Whiter or more middle class than I am.

A rather different aspect of this is that I and someone called Louise almost got the giggles during the sponsored silence and we both spent most of our time doing it, which was I think a whole hour, trying not to laugh. A while after that the same thing happened with the two of us in an English class, though on that occasion we gave in to it and there was actually nothing provoking it at all. It then spread through the whole class. Once again, in an English class shortly after that, the teacher, discussing Moby Dick of course, took out a model of a sperm whale on wheels unexpectedly and rolled it up and down the desk, which again led to uncontrollable laughter. This was Thatcher’s Britain, yes, and most of us hated her, but it wasn’t grim or joyless, at least for seventeen-year-olds, and I actually managed to get some pleasure out of Herman Melville’s writing!

CND was very active at the time. I, however, was not, for a couple of reasons. I was slightly uncomfortable with the idea of unilateral disarmament, and in fact that was the last of the standard lefty/Green causes I committed to, but I had a feeling they were right and wanted to go on the demos but never actually did. This is by contrast with the Miners’ Strike. Kent, like Northern France, was a coalmining area and the strike was very close to us because several of my friends were in coal miners’ families. I asked what I could do but nothing really came to mind. I was also aware of hunt sabbing but was unable to work out how to find out about it. Had I got involved in that, I might’ve gone vegan years earlier than I did.

9. The Ending Story

Then there was cinema. My aforementioned friend under the table rang me up one day and just said we should go and watch ‘Champions’ at the ABC. That cinema, incidentally, was managed by someone I later got to know when he and I both coincidentally moved to Leicester. I was surprised that my friend just spontaneously bet on the horses, something I’d never dreamt of doing at the time, because it was gambling. The film, in case you didn’t know as it may be quite obscure, was based on a true story about a famous jockey who has cancer and survives. One impression I got from it was that he looked like Lionel Blair, which is the second time I’ve mentioned him today. At this time it was still possible to sit in the cinema for as long as you wanted while the same films, yes, films plural, played over and over again. This was actually the last year it was possible. The year after, I’d just finished watching ‘Letter To Brezhnev’ and the ushers chucked us all out. It was a real shock, and you don’t know what you’ve got till it’s gone. It meant that parents could use cinemas as babysitting services, though this may have been somewhat reckless by today’s standards. They did the same with swimming pools: Kingsmead had this going on, although specifically in ’84 I didn’t exercise much apart from walking eight miles, to school and back, every day. This lazy year ended immediately with my New Year’s resolution to go running every morning from 1st January 1985.

‘Ghostbusters’ was also quite memorable. I took my younger brother to see it and I wasn’t expecting the whole audience to shout the song when it came on. It’s a good film, with a memorable scene involving a fridge and an unforgettable line which is usually cut towards the end. It also mentions selenium for no apparent reason, and has Zener cards at the start. At the time, rather surprisingly, it seemed to be generally accepted even in academia that some people were psychic. I often wonder whether it’s really good-quality research which has led to received opinion on this changing or whether it’s just a reputational thing that psi is now widely rejected by academic researchers. The other major film I remember watching was ‘Star Trek III’, which is also very good, and at the time there was no plan to bring Star Trek back. It was considered a sequel too far by one of my friends, so at the time it looked like the show was completely defunct and they were trying to revive it beyond all reason. I also saw ‘2010’, which I liked for incorporating the new findings about Europa, but it definitely lacks the appeal of the original. Incidentally, the long gap between Voyager visits to Saturn and Uranus was underway and the remaining probe wouldn’t get there for another two years. The original ‘Dune’ also came out this year, and although I wanted to see it, I don’t think it came to Canterbury. I wouldn’t’ve liked it at the time, having seen it since, and oddly I had the impression it was in a completely different directing style and that it was also a 3-D film. It may also have been the most expensive feature film ever made at the time. ‘1984’, of course, also came out then, but that deserves its own treatment. As other people I’ve since got to know of my age have commented, ‘Neverending Story’ marked the first time I perceived a film as definitely too young for me, and in a way that realisation reflected the twilight before the dawn of adulthood to me.

10. Life Off The Fast Lane

Speaking of marks of adulthood, many of my peers were learning to drive and passing their tests at this point. Although I got a provisional licence that year and my parents strongly suggested I learn, I refused to do so for environmental and anti-materialistic reasons. Although I’ve had lessons since, I’ve never in fact got there and I’ve also heard that an ADHD diagnosis can bar one from driving in any case, if it affects one’s driving ability. I’m not sure mine would but I do think my dyspraxia is a serious issue there. 1984 is in fact the only year I’ve independently driven any motorised vehicle, namely one friend’s scooter and another’s motorbike. Like the underage drinking, it’s apparent that we didn’t take certain laws particularly seriously at the time and I’m wondering if that was just us, our age or whether that’s changed since. I was dead set against learning to drive, and this was probably the first thing which marked me as not destined to live a “normal” adult life. It has on two occasions prevented me from getting paid work.

Television didn’t form a major part of my life at the time. We couldn’t get Channel 4 yet, so the groundbreaking work done there was a closed book to me. ‘Alas Smith And Jones’ started in January and incredibly continued to run for fourteen years. I’d stopped watching ‘Doctor Who’ two years previously when ‘Time Flight’ was so awful that I decided it was a kid’s show and put it away. Tommy Cooper died on stage. The second and final series of ‘The Young Ones’ was broadcast. ‘Crimewatch UK’, which would eventually become compulsive but guilty viewing for Sarada and me, started. In a somewhat similar vein, ‘The Bill’ started in October, which I used to enjoy watching years later due to the handheld camera work, which made it seem very immediate and “real” somehow. NYPD Blue is like that for other reasons incidentally. ‘Casualty’ was still two years in the future and ‘Angels’ had just ended, so I was in a wilderness of no medical dramas.

11. Green Shoots

Also, of course, the Brighton hotel bombing took place, and many of my friends felt very conflicted because on the one hand there was the general sympathy and empathy for people being attacked, injured and killed, but on the other hand the Tories were very much hated for what they were doing. I’m sure this was a widespread feeling, and there is of course the band Tebbit Under Rubble, which very much expresses one side of that sentiment. The Greenham Common peace camp was in progress and a major eviction took place in March. Although I was later to become heavily involved in the peace movement, at the time I was still very much on the sidelines although some of the people I knew were connected, and I do remember thinking that computer and human error were major and unavoidable risks which meant that the very existence of nuclear arsenals was too dangerous to be allowed to continue.

Then there was the Bishop of Durham, and since I was doing an A-level in RE at the time, his stance was highly relevant. The Sea Of Faith Movement was in full swing, which promoted a kind of secularised Christianity which was largely non-theistic or even atheist in nature, and the foundations were being laid in my mind which I’d later extend but allow the high-control group I became involved in to demolish, almost inexplicably. Over that whole period, I was expected to read a newspaper of my choice and take cuttings from it on relevant religious and moral issues to put in a scrapbook, so my long-term readership of ‘The Guardian’ began a few months before this and persisted through the year. It was either 25p or 30p at the time, and this was before colour newspapers had come to be. I had also been an avid Radio 4 listener since 1980, but unlike later I also listened to Radio 3 a bit, never really managing to appreciate classical music to the full.

This was also the year I finally decided I wanted to become an academic philosopher, and I still think I could’ve followed that through though it didn’t happen. This was the end of a kind of winnowing process, probably connected to my dyspraxia, in which I became increasingly aware of practical things which I simply couldn’t do; I’d been put off biology by the griminess and unpleasantness of field work, and therefore philosophy was the way forward. That said, like many other people I was also very motivated to study psychology in an attempt to understand myself, and as you probably know a lot of psychology undergraduates begin their degrees by being concerned about major issues in their own personalities, so in that respect I’m not unusual. I also presented two assemblies, one on existentialism and the other on the sex life of elephants as a parable of romantic love.

I feel like this could go on and on, so I’m going to finish off this reminiscence in a similar way to how I started. My emotional world revolved around the friend I was hiding under the table with at the beginning of the year and our significance to each other was important to both of us. About halfway through the year, when I had just visited her, she became concerned that she and I were going to be found together alone in the house by her parents, who were coming back unexpectedly, so I left the house by the back door and crept surreptitiously over the front garden, only to be stopped and “citizen’s arrested” by their next door neighbour. This turned out to make the situation more embarrassing for her and me than it would’ve been if I’d just left when they came back. I don’t know if anything can be made of who she or I was at the time, or a picture drawn, by putting those two incidents together.

I’m aware that I haven’t talked about Orwell’s book and its adaptations as much as I’d like, so that’s something I’ll need to come back to, and there are huge things I’ve missed out, but I hope I’ve managed to paint a portrait of my 1984 and possibly also yours. I may also have portrayed someone who peaked in high school, but I do also think tremendous things happened afterwards. 1984 is, though, the first foothill of my life, which makes it significant. It’s sometimes said that the reminiscence bump is only there because fifteen to twenty-five is the most eventful period of one’s time here, but maybe not. It’s hard to say.

Wordle

You must surely know Wordle, but just in case you don’t, it’s a daily game where you have to guess a five-letter word in six goes. I won’t post an example in case it’s one you haven’t done. If you get the right letter in the wrong place, you get a yellow tile. The right letter in the right place gets you green. The wrong letter completely yields a grey tile.
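
Just to make the colouring rule concrete, here’s a rough Python sketch of how I understand it, including the slightly fiddly case of repeated letters (greens are claimed first, then yellows only while unmatched letters remain); this is my reading of the rules rather than anything official:

```python
from collections import Counter

def score(guess: str, answer: str) -> list[str]:
    # Pass 1: mark exact matches green and count the remaining answer letters.
    result = ["grey"] * len(guess)
    remaining = Counter()
    for i, (g, a) in enumerate(zip(guess, answer)):
        if g == a:
            result[i] = "green"
        else:
            remaining[a] += 1
    # Pass 2: mark yellows only while unmatched copies of that letter remain.
    for i, g in enumerate(guess):
        if result[i] != "green" and remaining[g] > 0:
            result[i] = "yellow"
            remaining[g] -= 1
    return result

print(score("CRANE", "SHIRE"))  # ['grey', 'yellow', 'grey', 'grey', 'green']
```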

This is very similar to the old computer game Moo and the board game Mastermind. I’d even go so far as to say it basically is Moo, but with words rather than numbers or patterns of colours. Mastermind works like this. One player sets up four pegs of various colours, possibly including blanks, and this arrangement is hidden from the other player. The other then has to guess by putting down their own sets of pegs along something like twenty holes, depending on the size of the board, and the first player places black or white pegs in a 2×2 grid at the side of the guess row indicating right pegs in the right place (black) and right colours in the wrong place (white). The colours of the guesses seem to have changed, as indicated by this illustration:

Photo taken by User:ZeroOne

Completely wrong guesses are indicated by blanks. Mastermind came out in 1970, and in my own child mind there was a clear association with the quiz programme but in fact the two have little in common. I actually thought it was a tie-in at the time, but I no longer think it was and it seems to have been coincidental. I have a tendency to be silly with games, and even more so as a child, and I remember one of my “patterns” being completely empty. I won with that one.

Moo is kind of the same game. Invicta, the company which made the Mastermind “board” game, actually branded a calculator-like device in 1977 on which one could play that game:

By MaltaGC – Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=114645559

Moo was written for the TITAN computer system in the 1960s. I first came across it in a BASIC programming language primer in 1981 CE. This version has the computer generate a four-digit positive integer which the user then has to guess. Right guesses in the right place are “bulls” and right ones in the wrong place are “cows”. It’s been proven that any four-digit sequence needs at most seven goes to get it right. Before being computerised, Bulls & Cows was a paper and pencil game.
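
The scoring for Moo, Bulls & Cows and Mastermind is all the same calculation, so a single rough sketch covers the lot: bulls or black pegs for a right symbol in the right place, cows or white pegs for a right symbol in the wrong place. In Python, roughly:

```python
from collections import Counter

def bulls_and_cows(guess: str, secret: str) -> tuple[int, int]:
    # Bulls (or black Mastermind pegs): right symbol in the right place.
    bulls = sum(g == s for g, s in zip(guess, secret))
    # Shared symbols regardless of position; subtracting the bulls leaves the
    # cows (or white pegs): right symbol in the wrong place.
    shared = sum((Counter(guess) & Counter(secret)).values())
    return bulls, shared - bulls

print(bulls_and_cows("1234", "1342"))  # (1, 3): one bull, three cows
```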

Wordle is a little different, but not very. There is a four-letter version which was apparently also a pencil and paper game and that variant is out there online too. I imagine the issue with that is that a lot of profanities would be used.

Now, I could make myself out to be an expert on Wordle but I’m really not. I’m sure there are strategies but so far I haven’t worked one out much. Because you only get one go a day, it’s hard to practice. Certain combinations of letters are more likely in English generally. For instance, if there’s a Q there will very probably be a U, meaning that Q is only likely to be found in the first three positions. A while back I actually studied form for the first hundred or so Wordle answers, tallying which letters were most likely to occur in which positions. This is not completely random because five-letter English words will inevitably have certain features. For instance, no word in English, so far as I know, begins with “MK” or “TK”, but many begin with “SL” and “TH”. Based on this selection, it came to appear that the most likely word was “SRORE”, which is plainly wrong. A more likely word is “SHIRE”. However, there is actually a definitive list of all Wordle answers and that word was used on 22nd January 2022. No, I haven’t looked closely at that list although I do know what the last word is.
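
For anyone who fancies repeating that “studying form” exercise, the tallying is only a few lines of Python. The list here is just a placeholder sample, not the real answer list, and the output needn’t be an actual word, just as “SRORE” wasn’t:

```python
from collections import Counter

# Placeholder sample; substitute whatever past answers you've collected.
past_answers = ["SHIRE", "CRANE", "POINT", "SOARE"]

# Tally how often each letter occurs in each of the five positions...
tallies = [Counter(word[pos] for word in past_answers) for pos in range(5)]
# ...then read off the most common letter per position, which needn't be a word.
most_likely = "".join(tally.most_common(1)[0][0] for tally in tallies)
print(most_likely)
```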

Wordle has a finite lifespan. It will end on 20th October 2027 and there are a total of 2 314 words. However, there is a much larger number of words allowed in the sequence leading up to the answer. I have the number 12 000 in my head but maybe not. This is the biggest difference between Wordle on the one hand and Moo and Mastermind on the other. The other two permit any combination, meaning that nothing needs to be stored. To use modern jargon, the patterns in the other two are procedurally generated. There is an algorithm which determines which symbols occur where, or rather, it’s completely random. In computerised versions it’s more likely to be pseudo-random. Not so with Wordle, which needs a series of stored words. Or does it? Is there a way to determine meaningful English five-letter words? It’s already clear that very few if any of them end in Q, but my intuition tells me that at best there would be a substantial number of nonsense words if you tried to do this, which rules out that approach.

It might be thought that the game’s reliance on a set number of words would make it a creature of the age of cheap information storage, but this is only partly true. The ASCII version of an uncompressed list of 2314 five letter words is only 11K plus a couple of hundred bytes, and even that’s a lazy way of storing them because only two dozen and two characters are in use, which is only five bits per character, reducing it to just over 7K. This is with no real compression algorithm, but of course there can easily be one because of common sequences of vowels and consonants, or both, and the non-occurrence of letters in particular places. For instance, it’s rare for a Y to occur immediately before another letter in the middle of a word and rare for an I to occur at the end. However, because of the list of permissible words this is not the whole story, and if that is 12 000 the storage needed will be several times larger at around 37K.
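
Here is that arithmetic spelt out, for anyone who wants to check my figures:

```python
# Storage estimates for the word lists discussed above.
answers, letters, bits_per_letter = 2314, 5, 5

raw_ascii = answers * letters                              # 11,570 bytes: "11K plus a bit"
packed = answers * letters * bits_per_letter // 8          # 7,231 bytes: "just over 7K"
big_list_packed = 12000 * letters * bits_per_letter // 8   # 37,500 bytes: "around 37K"

print(raw_ascii, packed, big_list_packed)
```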

I will now appear to digress.

In 1987, the big hit computer game was of course Tetris, also known, a little irritatingly, as TETЯIS. At the time the kind of home computer you might find which was somewhat up-market but nonetheless just about affordable for some people was the Amiga 500, with a 68000 CPU, 512K of RAM and a screen resolution comparable to that of a PAL TV. Nonetheless, there is now a version of Tetris for the ZX81 and it would’ve been feasible to write one for the very first mass-market microcomputers, particularly the Apple in low-resolution graphics mode. This brings to mind the oddity that whereas some inventions depend heavily on a series of predecessors leading up to shortly before their own appearance, others are just waiting for someone to think of them. The form factors of PC cases are, er, a case in point. It would’ve been entirely feasible for a Georgian cabinet maker to have churned out wooden and metal PC cases although it would’ve been a while before anything suitable could’ve been put in them, so there would’ve been no market.

End of apparent digression.

Wordle is an example of this low-dependency type of game. In 1755, a couple of people could’ve taken Samuel Johnson’s dictionary and used it to play the game. Even in computer form it could’ve existed quite a long time ago. The limiting factor is the storage space needed for the list of possible intermediate words. There are a total of 12 972 words in its dictionary, whereof only a small fraction are permissible as answers. It’s possible, using modern compression algorithms, to get this down to 43K but that doesn’t mean an old computer would have had the storage space to process such algorithms, which might also be very slow. However, even without working very hard it’s feasible to get this down to 40K, meaning that the entire dictionary could be held in the RAM of an eight-bit computer. That computer would of course have to do other things than just hold words in its memory. Dispensing with that requirement, it would also be possible to take the same approach as early spell-checking algorithms and have the machine check words for feasibility. For instance, it could disallow any instance of five identical letters or unpronounceable consonant clusters, or, as Commodore 64 Scrabble used to do, simply trust the user not to use nonsense words. With the latter approach, only 12K of RAM would be required for the list of right answers.
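
As a sketch of what that five-bit packing looks like (not how the real game stores anything, just the scheme described above):

```python
def pack(words: list[str]) -> bytes:
    # Squeeze each letter A-Z into five bits and concatenate them into one bit string.
    bits, bitcount, out = 0, 0, bytearray()
    for word in words:
        for ch in word.upper():
            bits = (bits << 5) | (ord(ch) - ord("A"))
            bitcount += 5
            while bitcount >= 8:
                bitcount -= 8
                out.append((bits >> bitcount) & 0xFF)
    if bitcount:  # flush any leftover bits, left-aligned in a final byte
        out.append((bits << (8 - bitcount)) & 0xFF)
    return bytes(out)

print(len(pack(["CRANE", "SHIRE"])))  # 7 bytes; 12,972 words come to 40,538 bytes, about 40K
```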

Here’s a possible 16K ZX81 implementation of Wordle. There are 2314 words in the dictionary, compressed to a packed five-bit per letter form, taking up less than 12K. There is no error-checking for forbidden words. A correct letter in the correct place is shown as white on black, a correct letter in the wrong place as black on white flashing with white on black, which would have to be done through software, and a completely incorrect letter is black on white. The program asks for the date when you start, converts it to a position value for a compressed series of strings (maximum string length on the ZX81 was 4096, so three strings would be needed) and loads that four-byte value into a short string literal, clearing the last seven bits which would belong to the next word. Every word the user inputs is appropriately compressed and compared five bits at a time with the string. This is then displayed on screen with only the flashing letters stored by position in the display file. If all letters are correct, the appropriate response is generated from an array depending on how many turns the user has had.

I think it’s clear that if this is feasible on a 16K ZX81 it would also be feasible on practically any computer (except maybe the MC-10) manufactured since 1982, and in most cases colour could be used. This is not a difficult game to implement, even in BASIC, although it seems to lend itself more to BCPL, C, FORTH or Assembler. It’s just eminently doable, and it even existed in some form on computers back to about 1968.

As to strategy, I have little idea. There’s little opportunity to practice with only one word a day, and in that way it’s a bit of a leveller. I have developed very rudimentary tricks. For instance, I tend to move a letter one place in either direction if it’s the right one in the wrong place, I start with a small list of possible words (CRANE, ADIEU, POINT or SOARE) and avoid impossible consonant clusters, which I have to anyway because they aren’t in the dictionary. I can’t actually implement this algorithm because it would involve looking at the word list and therefore cheating, but obviously it can be done quite straightforwardly.

Bye for now.

Living In The Past One Day At A Time

In this blog, I’ve made occasional references to what I call my “Reënactment Project”, which is a long-term ongoing thing I’ve been doing since about 2017. The idea is that every day I make an at least cursory examination of the same day thirty-nine years previously. The reason for choosing thirty-nine years is that, for the initial year I planned to do it, all the dates were on the same days of the week, meaning that the years concerned were substantially similar. The very basic arithmetic involved is of some interest and I’ll be returning to that later in the post. A side-effect of the thirty-nine year difference is that I am thirty-nine years younger than my father, so he would’ve been the age I am now back then, which focusses me on ageing, life stages and how to stay as young as possible by doing things like addressing my balance through Yoga so it doesn’t deteriorate as fast as it has for him. I can see the end result and know some of the things to avoid, which means that if I do reach his current age I’ll probably have a completely different set of health problems from which my own hopefully not estranged descendants will in turn know what they should avoid. And so on.
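
The arithmetic, for what it’s worth, assuming the initial pair of years was 1978 and 2017, which is what the rest of this post suggests:

```python
from datetime import date

# Both years start on the same weekday and neither is a leap year, so their
# calendars line up date for date.
print(date(1978, 1, 1).strftime("%A"), date(2017, 1, 1).strftime("%A"))  # Sunday Sunday

# Why: a 365-day year shifts the weekday by one place, a leap year by two, and
# this particular 39-year gap contains ten leap days, so the total shift is
# 39 + 10 = 49 places, a multiple of seven.
leap_days = sum(1 for y in range(1978, 2017) if y % 4 == 0 and (y % 100 != 0 or y % 400 == 0))
print((39 + leap_days) % 7)  # 0
```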

My motivation for doing this stems from the disconcerting awareness that we edit our memories, and are also only able to experience things as we are at the time. Also, various media and popular misconceptions lead us to forget and mutate the memories we do believe ourselves to have, and this was particularly important for 1978 as it included the famous Winter Of Discontent, also the Winter Of Discotheque, and I feel we may have been usefully manipulated into seeing this particular season in a particular way to justify everything that came after it. I also want to know how I was as a child and adolescent and pay attention to things which are the seeds of how I am now, and also that which was in me which I didn’t end up expressing. There is of course a bit of a risk here because I’m living in the past and to some extent dwelling upon it, but I do have a life outside this project and find it quite informative and enriching for today’s experiences. However, in general it’s just interesting.

I’ve now reached 1982, and am in the depths of the Falklands War, which was a significant historical event in securing Margaret Thatcher a second parliamentary term. Well, I say “in the depths”. In fact an end to hostilities was announced on 20th June and the Canberra was almost home by 7th July, which is when I’m writing this. I more or less stand by the position I had by the mid-’80s on this subject, which is that Galtieri and Thatcher were both aware that a war would be likely to boost their popularity, although at the time I thought it was an actual conspiracy between them whereas now I just think they were both aware of its expediency. It came as something of a shock to me, a year later, when I realised we didn’t have fixed-term parliaments and therefore the Tories could take advantage of their victory by calling an election whenever they wanted. ‘Shipbuilding’ is redolent of the time.

Although I know Elvis Costello wrote and performed the song, the Robert Wyatt version is the one I associate most closely with the incident. Robert Wyatt was part of the Canterbury Scene and an early member of Soft Machine, so I’m obviously more likely to associate it with him. Just in case you don’t know, Wyatt got drunk and fell out of a window in 1973, paralysing himself from the waist down. Jean Shrimpton, my second cousin once removed, gave him a car and Pink Floyd raised £10 000 for him in a benefit concert. Tommy Vance once described him as “a man who has had more than his share of bad luck in life”.

Another association I make with the Falklands from the time is a play about an Irish barman who was accepted as a member of his community in London until the outbreak of the war. He finds himself sandwiched between Irish Republicans and his customers, with racism growing against him which culminates in his murder. This was originally a radio play but later appeared on TV. Although the Troubles were significant and also a spur to creativity, there was a long period during which practically every new play was about them, and it became tedious and annoying. This wasn’t yet the case in ’82 though. There’s also the 1988 BBC TV drama ‘Tumbledown’.

1982 was probably the last year there was really any hope that the previous pattern of alternating Conservative and Labour administrations we were used to would continue into the decade. In fact, this had been a relatively recent development. The first Labour government after the Second World War had been followed by thirteen years of Tory rule, and it was only after that that an alternation of parties in power had begun, lasting only fifteen years. Nonetheless, up until 1982 that’s what most people seemed to expect, and that alternation had held policies and the general timbre of the country in the political centre because the next government could be expected to come along and undo much of what the previous one had done, and so on. This was satirised on the Radio 4 comedy programme ‘Week Ending’ which depicted the future of privatisation and nationalisation as permanently oscillating ad infinitum every five years, which was probably one reason I thought we had fixed terms.

I was communist in ’82, and when I say “communist” I mean Stalinist. I took it seriously enough that I attempted to learn Russian and listened regularly to Radio Moscow, and I was very upset when Leonid Brezhnev died. I was completely convinced that what the Soviet Union was saying about us and themselves was accurate and that the BBC and the like was nothing more than propaganda. I was also very concerned indeed about unemployment, racism and homophobia. I considered being called racist to be the worst insult imaginable, which of course misses the point. I was, however, still a meat eater and was, as you can probably tell, quite naïve. I was also a lovesick teenager in love with the idea of being in love.

However, this isn’t just about 1982 and the events of that year, for me or the world, but also the value of the exercise. It’s often been suggested that I have autistic tendencies and I imagine that this kind of meticulous rerun of the late ’70s and early ’80s is going to come across as confirmatory evidence for that. Clearly people do do things just because they want to and then come up with reasons for doing so to justify themselves to other people. My novel ‘1934’ covers a community where they have chosen to relive the mid-twentieth century over and over again in an endless loop because the leaders think everything has gone to Hell in a handcart ever since, and this would not be a healthy attitude. I made the mistake, a few years ago, of re-reading my diary in a particular way and found myself falling back into the mindset I had at the time in a way which felt distinctly unhealthy. Nonetheless, I consider this activity to be worthwhile because our memories are re-written, and history is written by the winners, in this case the winners of the Falklands War, so our memories are re-written by the winners.

It’s been said that films set in the past usually say more about the time they were made than the period they’re supposed to have happened in. Hence ‘Dazed And Confused’ is really about the 1990s, for example. We generally have a set of preconceptions about a particular period within living memory which turn into a caricature of the time which we find hard to penetrate to reach the reality, and it isn’t the reality in any case because it’s filtered through the preconceptions of the people at the time, even when those people were us. This much is almost too obvious to state. However, there’s also continuity. Time isn’t really neatly parcelled off into years, decades and centuries. People don’t just throw away all their furniture at the end of the decade, or at least they shouldn’t, and buy a whole new lot. We’re all aware of patterns repeating in families down the generations. It isn’t really possible to recapture the past as if it’s preserved in amber. But it is possible to attempt to adopt something like the mindset prevalent at the time, or the Zeitgeist, to think about today, and the older you get the more tempting it is to do so. Since the menopause exists, there must be some value in becoming an elder and sharing the fruits of one’s experience, even when one is in cognitive decline. And of course the clock seems to have been going backwards since 1979, making this year equivalent to 1937. World War II was so 2019.

How, then, does 2021 look from 1982? On a superficial level, it tends to look very slick and well-presented, although airbrushing had a slickness to it too. The graphic at the top of this post is more ’87 than ’82, but it does succeed in capturing the retro-futurism. Progressive politics was losing the fight with conservatism at the time, but the complete rewrite of how we think of ourselves had not yet happened. Nowadays, people are wont to parcel up their identity and activities into marketable units because they have no choice but to do so. The fragmentation there is as significant as the commodification. The kind of unity of experience which existed in terms of the consumption of popular culture back then is gone, although it was gradually disintegrating even then. We were about to get Channel 4 and video recorders were becoming popular among the rich, although manufacturers were still insisting at the time that there was no way to get the price below £400, which is more like £1 400 today. It’s hard to tell, but it certainly feels like the mass media, government and other less definable forces have got better at manipulating public opinion and attitudes. This feels like an “advance” in the technology of rhetoric. However, we may also be slowly emerging from the shadow of the “greed is good” ethic which was descending at the time, because we’ve reached the point where most public assets have been sold off and workers’ rights have been so eroded that reality tends to intrude a lot more than it used to, and I wonder if people tend to be more aware of the discrepancy between what they’re told and what their experience is. Perhaps the rise in mental health problems is related to this: people are less able to reconcile their reality with the representation of “reality”, and are therefore constantly caught in a double bind.

It isn’t all bad. It’s widely recognised now that homophobia, sexism, racism, ableism and other forms of prejudice are bad for all of us and people seem to be more aware that these are structural problems as well. Veganism is better understood but also very commercialised, taking it away from its meaning. Social ideas which are prevalent among the general public today may have been circulating in academia at the time and their wider influence was yet to be felt. This is probably part of a general trend. There was also a strongly perceived secularisation trend which has in some respects now reversed. The West was in the process of encouraging Afghan fundamentalists and they may also have begun arming Saddam Hussein by this point, although that might’ve come later. CND was in the ascendancy, and the government hadn’t yet got into gear dissing them.

Another distinctive feature of the time was the ascendancy of home microcomputers, although for me this was somewhat in the future. I’ll focus more on my suspicions and distrust here. To me, silicon chips were primarily a way to put people out of work and therefore I didn’t feel able to get wholeheartedly into the IT revolution with a clear conscience. I had, however, learnt BASIC the previous year. I don’t really know what I expected to happen as clearly computers were really getting going and it seemed inevitable. There was also only a rather tenuous connection between a home computer and automation taking place in factories. However, by now the usual cycle of job destruction and creation has indeed ceased to operate, as the work created by automation is nowhere near as much as the work replaced by it, or rather, done by computers or robots in some way. My interest in computers was basically to do with CGI, so the appearance of a ZX81 in my life proved to be rather disappointing.

1982 was also the only year I read OMNI. Although it was interesting, and in fact contained the first publication of ‘Burning Chrome’ that very year, it also came across as very commercialised and quite lightweight to me compared to, for example, ‘New Scientist’. It was also into a fair bit of what would be called “woo” nowadays, and it’s hard to judge but I get the impression that back then psi was more acceptable as a subject of research for science than it is today. This could reflect a number of things, but there are two ways of looking at this trend. One is that a large number of well-designed experiments were conducted which failed to show any significant psi activity. The other is that there is a psychologically-driven tendency towards metaphysical naturalism in the consensus scientific community which has little basis in reason. I would prefer the latter, although the way the subject was presented tended to be anecdotal and far from rigorous. From a neutral perspective, there does seem to be a trend in the West away from belief in the supernatural, and the fact that this was thirty-nine years ago means that trend is discernible on this scale.

Then there’s music, more specifically New Wave. For me, because of my age and generation, New Wave doesn’t even sound like a genre. It’s just “music”. This may not just be me, because it’s so vaguely defined that it seems practically meaningless. It’s certainly easy to point at particular artists and styles as definitely not New Wave though, such as prog rock, ABBA, disco and heavy metal, but I perceive it as having emerged from punk, and in fact American punk just seems to be New Wave to me. It’s also hard for me to distinguish from synth-pop at times. British punk could even be seen as a short-lived offshoot of the genre. By 1982, the apocalyptic atmosphere of pop music around the turn of the decade was practically dead, although I still think there’s a tinge of that in Japan, The Associates and Classix Nouveaux. The New Romantics had been around for a while by then. I disliked them because I perceived them as upper class and vapid. I was of course also into Art Rock, and to some extent world music.

In the visual arts, for me 1982 saw a resurgence in my interest in Dalí, who had interested me from the mid-’70s onward, but this time I was also interested in other surrealists such as Magritte and Ernst, and also to some extent Dada. As with New Romantics, Dalí was a bit of a guilty pleasure as I was aware of his associations with fascism. This was all, of course, nothing to do with what was going on in the art scene of the early ’80s, although I was very interested and felt passionately positively about graffiti. I felt that the destruction of graffiti was tantamount to vandalising a work of art. To be honest, although I’m concerned that people might feel threatened by it and feel a lot of it is rather low-effort and unoriginal, I’m still a fan of it, although I wouldn’t engage in it myself.

1982 was close to the beginning of the cyberpunk æsthetic. I’ve already mentioned William Gibson’s ‘Burning Chrome’, which first appeared in OMNI this month in 1982, and there was also ‘Blade Runner’, which was already being written about, again in OMNI, although it wasn’t released until September. The influence of the genre can be seen in the graphic at the top of this post. To a limited extent even ‘TRON’, from October, was a form of bowdlerised cyberpunk, with the idea of a universe inside a computer. Cyberpunk is dystopian, near-future, can involve body modification, does involve VR and has alienated characters and anarcho-capitalism, with a world dominated by multinationals. ‘Johnny Mnemonic’ had been published, also in OMNI, the year before. The question arises of how much today’s world resembles that imagined by cyberpunk, and to be honest I’d say it does to a considerable extent, and will probably do so increasingly as time goes by.

On a different note, although the days and dates match up between 2021 and 1982, this will only continue until 28th February 2023, after which a leap day for 1984 will throw them out of kilter again. It can almost be guaranteed that years twenty-eight years apart will have the same calendar. One thing which can’t be guaranteed is the date of Good Friday and the other days which are influenced by it. This means that there is almost always a difference between calendars even when the days of the week match up. Note also that I said it can only “almost” be guaranteed. Because the Gregorian calendar skips leap days when they occur in a ’00 year whose century is not divisible by four, we are currently in a long run of matching twenty-eight-year cycles which began in 1900 and will end in 2100. Hence up until 1928 the years of the twentieth century don’t match up on this pattern, and likewise from 2072 onward there will be another disruption of the pattern down into the future. There are also other periods which match between leap days, such as the thirty-nine year one I’m currently exploring, which began last year and includes two complete years as well. This also divides up the years a little oddly, because since I was in full-time school at the time, academic years were also quite important to me, and in fact continued to be so right into the 1990s. This makes a period between 29th February 1980 and the start of September 1980 and will also make a further period between September 1983 and 29th February 1984. Finally, astronomical phenomena don’t line up at all really. Solar and lunar eclipses, and transits of Venus and Mercury, for example, won’t correspond at all.
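
To make the calendar-matching rule concrete, here’s a little sketch in Python (not a language I’ve otherwise mentioned, but it’s compact for this sort of thing), which treats two years as sharing a calendar when New Year’s Day falls on the same weekday in both and they agree about being leap years:

```python
import calendar
import datetime

def same_calendar(year_a, year_b):
    """Two years share a calendar if 1st January falls on the same weekday
    in both and they agree about whether they are leap years."""
    return (datetime.date(year_a, 1, 1).weekday() == datetime.date(year_b, 1, 1).weekday()
            and calendar.isleap(year_a) == calendar.isleap(year_b))

print(same_calendar(1982, 2021))   # True: both start on a Friday, neither is a leap year
print(same_calendar(1984, 2023))   # False: only 1984 is a leap year
# Within the long run mentioned above, the pattern repeats every twenty-eight years:
print(all(same_calendar(y, y + 28) for y in range(1901, 2072)))   # True
```

The range in that last line stops where it does precisely because of the 1900 and 2100 exceptions.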

So anyway, that’s one of the possibly pointless things I do with my time at the moment. It does bring home to me how slowly time does in fact go, because to be honest doing this seems to have slowed the pace of the passage of time back to how it was when I was fourteen or fifteen. What other effects it has on my mind I’m not sure, although I think there must be both positive and negative influences.

My Hardware History – A Tale of Mass Disempowerment

(c) Gary Cattell

Not gonna lie: this post is inspired by Brian of Brian’s Blog fame, which you should all of course now visit and to which you should subscribe. But his post is about operating systems, whereas mine is about how it’s all gone to Hell in a handcart.

Ethical decision making in the marketplace often comes down to trust versus self-sufficiency. On the one hand, when you buy something, even from a sole trader, you are kind of out-sourcing your ethical choices to a third party, whom you may not know and in whom your trust may be misplaced. To some extent, it may not even be their fault that they have been forced to take negative and harmful paths in what they do, because it’s the system, not the people. The very fact that we live in a monopoly capitalist society forces us to harm others. To an extent, leaving the larger political picture out of the equation for a bit, one can take control of one’s life to varying degrees on the ethical front, but ultimately this usually amounts to self-sufficiency. The extent to which one is self-sufficient correlates with the degree of responsibility one is taking for how one’s actions affect others, including the planet. However, even there it’s important to recognise privilege. One may have access to an allotment and be able to grow food, but that depends on various factors: the good fortune of having that access at all, much land formerly devoted to allotments having been sold off to the highest bidder; not living in a food desert; having the time to put into raising that food, and so on. The same applies to gardening for food, to a greater extent. Not everyone has the luxury of a garden where they can grow crops, and foraging only works even to the degree it does because relatively few people do it. There’s also a knowledge, experience and skills base some people can draw on and others can’t. The fact that I’m a herbalist makes it easier for me to forage, for instance.

In the case of growing food, one usually at least has a fairly good chance of having the facilities and skills needed to manage this. In other areas this may not be so. For instance, we can’t make a gate ourselves, or so I tell myself (I do have very limited carpentry skills which I never use and which were pretty poor anyway, so in theory I could), but we could buy a gate and fix it onto the wall using the abilities available in this household. This costs about £50. Presumably buying the timber and using the tools we already have would cost us less, but when we asked someone else to do it for us we were quoted a figure of many hundreds of pounds.

The opposite end of this kind of self-sufficiency is found in the likes of computers and allied devices. The degree of skill and automation, and the combined effort of innumerable people, are what make devices like laptops, tablets and smartphones, and their associated storage media and support items such as chargers, cables and Wi-Fi, possible. That huge morass of people doing things for you is difficult to replace with your own skills because there is only so much that can be fitted inside a single head. Given a few lengths of wire and some relays, I could probably put together an arithmetic and logic unit which worked on a very short binary word length, and it isn’t often appreciated that one major reason I can do this is that I’m a philosophy graduate. I’m also dyspraxic though, so it would be a struggle for reasons other than mere knowledge of how to do it. Consequently I rely on others for the digital electronics I use, and that reliance means that, as usual, I’m handing over responsibility for ethical choices to other people, whose own choices are compromised by working within a capitalist system.

I need to get down to specifics.

In the 1970s, the microprocessor was seen as a threat to manual labour. Since I was entering a Marxist phase at the time, it simply seemed wrong to have anything to do with microcomputers back then, since it would be supporting companies whose products were putting people out of paid employment. Compared to my peers, our family were relatively late adopters of all sorts of technology, such as colour TV, cassette recorders and stereo record players. I knew one early adopter whose family bought a ZX80 when it came out. My family and others were rather disdainful of this and saw it as kind of “uppity”, I suppose. We got a ZX81 in November 1982, some time after the price had come down owing to the introduction of the ZX Spectrum. After only a week, we purchased a 16K RAMpack. However, by the time we had one, I’d known BASIC for about a year, having taught myself from textbooks with no hands on experience. I was still almost superstitiously suspicious of even home micros at the time. After all, Clive Sinclair had claimed that the ZX80 was powerful enough to control a nuclear power station, so taking that at face value even that had the potential to replace human workers aplenty. At the time I had no concept of automation creating jobs to replace the ones that had been destroyed, so all I could see for the future was mass unemployment.

The ZX81, and to an extent its predecessor, has a kind of nostalgic charm to it even for the time. For instance, like older and larger computers it has its own character set, exclusively upper case letters and only sixty-four printable characters, and in particular it uses “**” to denote raising to a power rather than “^”, or actually “↑” for most micros of that vintage if I remember correctly. It also displays black characters on a white background, giving the impression that the output is all coming out of a printer onto the old-fangled paper which looked like it had green musical staves on it and holes up the sides, and was folded with perforations separating the sheets. It was also, in practical terms, silent out of the box. My enduring impression of the ZX81 is that it’s an early ’60s minicomputer trapped in a plastic matchbox, and as such it had a flavour of former glories about it.

To me, the most mystifying thing about this computer was that it somehow seemed to be able to produce sufficiently detailed characters as would appear on a much higher resolution display but could not actually address those pixels directly. Why couldn’t it draw in as much detail as it displayed text? There were 2×2 graphics characters but they only afforded a resolution up to 64×44. It also didn’t scroll unless you told it to. I attempted to combine characters by printing them in the same location, expecting a kind of overstrike effect, but that didn’t work. Then there was the question of machine code. At the time, I assumed that computers directly acted upon the BASIC code they were given. When I saw the table at the back of the manual showing the machine code instruction equivalents to the alphanumeric and other characters, I drew the erroneous conclusion that the microprocessor simply read the BASIC line by line off the display file and executed it to achieve the ends, so for example SIN was a series of three instructions which together could achieve a floating point trig function, and so could COS, and so forth. This is kind of how the human brain operates with language, so I would defend this naïve view.

Another unexpected feature of microcomputers was that they almost all used BASIC. I had expected that the relatively cheap home computers such as the VIC-20 or TI-99/4A would use that language, but that, BASIC being the Beginner’s All-purpose Symbolic Instruction Code, more expensive computers would have built-in PASCAL, FORTRAN or ALGOL. This was, however, only true to a very limited extent.

It was eventually borne in upon me that programming languages were themselves software, written in a more fundamental language called machine code which controlled the microprocessor directly, so I learnt Z80 machine code and programmed in it in a limited way. I discovered there were ways of getting the ZX81 to produce sound and increase its vertical resolution, and even managed to produce a program which played the drum machine bit of ‘Blue Monday’. This suggests that I traversed a very steep learning curve very quickly since we acquired the computer in late November 1982 and New Order’s single was released less than five months later. I felt uncomfortable with the extent to which I seemed to be fixated on computer programming and tried to maintain interest in other topics. My attitude to computers has always had this ambivalence to it. It’s also very likely that my pursuit of this hobby adversely affected my O-level results, and it’s a shame that the knowledge I was acquiring couldn’t have been used to get an O-level in computing. I actually used to help pupils who were studying computing with their homework, and I’ve often wondered what the disconnect is here. It reflects a pattern in my life of not being able to integrate my skills and experience with formal accreditation or recognition, and I suspect it’s linked to neurodiversity but I don’t know how.

Much of the time I spent programming the ZX81 was also spent envying the specifications of more powerful computers, but at the time I think my parents were trying to motivate me to find paid work, which I did in fact do and proceeded to buy, of all things, a Jupiter Ace. This is a fairly famous computer designed by the team who did the ZX Spectrum at Sinclair Research Ltd and then left to form their own company, and is chiefly known for the fact that it had the programming language FORTH in ROM rather than BASIC. This was almost unique. There was an attempt to design a more advanced FORTH-using micro called the Microkey 4500, basically a wishlist fantasy computer which sounded excellent but hardly even got to the drawing board stage. To me, though, the main appeal of the Ace was that it behaved like a “proper computer”. It has the complete ASCII character set, displays white text on a black background and has a Spectrum-style keyboard. It is in fact very similar to the Spectrum even down to the font it uses, but lacks colour and point-addressable high resolution graphics. However, by the time of its release, October 1982, consumers were expecting computers to have high resolution colour graphics and proper sound. For some reason I’ve never fully understood to this day, most British micros at the time had built-in speakers for sound rather than using the speaker of the TV they were plugged into. This sometimes reduced the sound to a barely audible beep and meant adding sound hardware while the audio capability of the TV was just sitting there unused. Presumably the cheap RF modulators fitted to these machines only carried the vision signal, so routing sound through the TV would have needed a more expensive modulator, but it still seems a strange decision. Jupiter Cantab, the company which made the Ace, went bust after less than two years and this enabled me to buy the Ace. This has a different special place in my life because it was the first durable product I ever bought with money I’d earned myself, and I still own it today.

FORTH is sufficiently close to the machine that only a small kernel of it needs to be written in machine code, and much of the language as supplied is defined in terms of more primitive words in FORTH itself. FORTH’s appeal to me was that it enabled me to work very closely with the hardware of the computer without having to use actual op codes. I attempted to design a prefix notation version of the language, but although writing a description of it came quite naturally to me, I never completed it. I noticed also in myself a tendency to attempt to reduce user-friendliness to ease programming: for instance, I opted to use signed sixteen-bit integers as the only data type and express them in hexadecimal with leading zeros.
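
In case the flavour of FORTH isn’t familiar, it can be caricatured in a few lines of Python (a sketch only, nothing like a real implementation): the language is essentially a dictionary of “words”, each one either a primitive or a string of previously defined words, all operating on a single shared stack.

```python
stack = []
words = {
    "DUP": lambda: stack.append(stack[-1]),
    "*":   lambda: stack.append(stack.pop() * stack.pop()),
    "+":   lambda: stack.append(stack.pop() + stack.pop()),
    ".":   lambda: print(stack.pop()),
}

def define(name, body):
    """Add a new word defined in terms of existing ones, as ': NAME ... ;' does in FORTH."""
    words[name] = lambda: run(body)

def run(source):
    for token in source.split():
        if token in words:
            words[token]()              # execute a known word
        else:
            stack.append(int(token))    # anything else is pushed as a number

define("SQUARE", "DUP *")               # : SQUARE DUP * ;
run("5 SQUARE .")                       # prints 25
```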

By this time I’d been aware of operating systems for about two years. I was at first only cognisant of CP/M, which had been devised in 1974, and Unix, which I think dates from 1969 although clearly not in its later form as the dates begin in 1970. MSDOS and PCDOS were also out there somewhere but since IBM PC compatible computers cost upwards of £3000 at the time I regarded them as permanently out of reach. Oddly, we did in fact have a PC clone briefly in our house in 1983, although it was never switched on and was simply being stored there for my father’s workplace. Incidentally, at the time that workplace had been using a confoundingly simple minicomputer they’d bought in the 1960s which appeared to have only four op codes. I found this hard to believe even at the time but extensive study of the technical manuals showed that this was indeed the case. I have no idea what it was now, but it was very strange and sounds like it would’ve been a challenge to program, though an interesting one.

For me, Unix was associated with minicomputers and wouldn’t even get out of bed for what at the time seemed like a ridiculously vast amount of RAM. However, there were also versions for the 32:16-bit 68000 and I think also the failed Z8000, although oddly CP/M was also rewritten in C for the 68000, which seemed ridiculously underpowered to me at the time. It was at that point that an annoying trend became apparent to me, which had been going on since at least the days of the PDP-11 range of minicomputers, where I’m aware it was implemented. There were privileged and user instructions. On the 6809 CPU, there had been a system and a user stack pointer, though both available to the user via machine code and assembler (a direct one-to-one programming language slightly friendlier to the user). On Motorola’s more powerful 68000, the system stack pointer was unavailable to user programs and only the user stack pointer was accessible. Other tendencies also came into play, such as many of the system flags being alterable only by the operating system and whole areas of memory locked out from user access. This is done for security and stability purposes, but to me it felt patronising and hand-holding, and also like an in-built class system. There’s hoi polloi, the likes of us, and there’s the Programmer, who wrote the operating system and has total control. We are merely their minions, second-class computer users, and this was the start of a trend to lock people out of controlling their own devices which continues today. It’s the opposite of self-sufficiency and it means you have to trust whoever wrote and often sells or licenses the operating system.

There was also another trend which drives me round the bend: virtual memory. When I first learned about multiuser systems, I was astonished to find that they would sometimes save the whole program the user was running and switch to another user to load their program and run that, continuing in that cycle depending on how many users were on the system. Since hard drive storage is mechanical, it’s many orders of magnitude slower than solid-state RAM or ROM, and this makes things very slow, so I assumed that this would soon be superseded by cheaper and larger memories. This didn’t happen. What happened instead was that later operating systems were designed to pretend there was more physical memory than there actually was, with the result that it was all too easy for a computer to get lost in its own internal musings and kind of forget there was some person out there trying to use the bloody thing. Fortunately we now have solid state drives and the situation is somewhat better.
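
The scheme which astonished me can be caricatured like this (a toy sketch in Python, with a sleep standing in for the mechanical transfer time and the job names made up):

```python
import collections
import time

DISC_DELAY = 0.01   # stand-in for the slow, mechanical swap to and from disc
jobs = collections.deque(["alice", "bob", "carol"])   # hypothetical users' programs

for _ in range(6):
    job = jobs.popleft()
    time.sleep(DISC_DELAY)                     # swap the job in from disc
    print(f"running {job} for one time slice")
    time.sleep(DISC_DELAY)                     # swap it back out again
    jobs.append(job)                           # round-robin: back of the queue
```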

Way into the late 1980s I would still doodle Z80 assembler programs meant to do various things in notebooks, though without being interested in implementing them. By that time, GUIs were starting to take over, and thereby hangs another tale.

I liked the Xerox Star and the Apple Lisa, which stole the former’s user interface and preceded the Macintosh. That to me did seem to be the way forward with computers at the time. Later on, Microsoft tried to copy it, and it seemed like a pointless thing bolted onto the front of the operating system which slowed it down so much that it wasn’t worth whatever benefits it might bring. The same applied to GEM, its main competitor. To me, a GUI feels like another way computer design is taking control away from the user. This was just the beginning. I am used to imperative and procedural programming. As far as I’m concerned, a computer program is a list of orders or instructions organised into smaller subroutines which tell the computer what to do. Putting it like that, it seems absurd to me that anything else would ever be so. Object-oriented programming began to become very popular, and at no point have I remotely understood it. Every description of what it is seems to use a metaphor which fails to describe what’s actually going on inside the device. It will do something like say that there’s a class of vehicles which have the properties of size, number of wheels and so forth which can be applied to create new concepts of vehicles, such as distinguishing a car, motorbike and lorry. That’s all very well, but I can never make a connection between that metaphor and computer programming, which looks like something completely different. It also uses a declarative paradigm, where you just seem to tell the computer what’s there and leave it be, which baffles me because how can anything actually be happening if you haven’t made it happen? I’ve attempted to describe my understanding in terms of variables and procedures, but people in the know have always told me I haven’t got it right. It’s been said, and I’ve come up against this, that if you’ve learned imperative and procedural programming, it makes it so much harder to learn OOP (Object-Oriented Programming). I also can’t shed the impression that a lot of it is obscurantist technobabble hiding a naked emperor. And if you can’t see what’s going on inside, how can you have control.
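
For what it’s worth, the closest I’ve come to making the connection, and it may well be another case of not quite getting it, is to think of an object as just a record of variables and a method as an ordinary procedure which takes that record as its first argument, which the language then hides. A sketch in Python (the vehicle example is mine, carried over from the metaphor, not anybody’s real code):

```python
# Procedural version: the data and the procedures are kept separate.
def make_vehicle(wheels, top_speed):
    return {"wheels": wheels, "top_speed": top_speed}

def describe(vehicle):
    print(f"{vehicle['wheels']} wheels, top speed {vehicle['top_speed']} mph")

describe(make_vehicle(4, 120))

# Object-oriented version: the same record and the same procedure, but bundled
# together, with the record passed in automatically as "self".
class Vehicle:
    def __init__(self, wheels, top_speed):
        self.wheels = wheels
        self.top_speed = top_speed

    def describe(self):
        print(f"{self.wheels} wheels, top speed {self.top_speed} mph")

Vehicle(4, 120).describe()   # does exactly the same as describe(...) above
```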

Another annoying trend has been created by the easy availability of memory. Back in the old days, computers needed to be programmed efficiently in terms of memory. For instance, a spell checker would be based on the rules of English spelling and object to, say, a Q not followed by a U or the use of “AY” in the middle of a word, but it didn’t have a dictionary. Nowadays, it does, and that takes up many megabytes in its uncompressed form, although I’m sure it is compressed. Likewise, chess games have tended to store whole lists of possible moves and try to find their way up a tree to the best result, using up huge amounts of memory, whereas previously they would’ve used the rules of chess and, I presume, some cleverly-written strategy to win. To me, this seems lazy and disappointing. I want programs to be optimised to the same extent as they were when 64K seemed like impossible luxury. So much processing power is also wasted on running the GUI. We don’t need this because it isn’t really using the computer. It’s just making the computer easier to use by tying up processing power in unnecessary trinkets.
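
A rule-based checker of the kind I’m describing fits in a few lines, which is exactly why it also fitted in a few kilobytes. A toy sketch in Python, using the two rules mentioned above (which of course over-flag perfectly good words like “payment”):

```python
import re

# Each rule is a pattern which should never occur, plus a reason to report.
RULES = [
    (re.compile(r"q(?!u)", re.IGNORECASE), "Q not followed by U"),
    (re.compile(r"\w+ay\w+", re.IGNORECASE), "'AY' in the middle of a word"),
]

def check(word):
    """Return the list of rules a word breaks; no dictionary involved."""
    return [reason for pattern, reason in RULES if pattern.search(word)]

print(check("qat"))       # ["Q not followed by U"]
print(check("payment"))   # ["'AY' in the middle of a word"] - a false alarm
```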

So: you might think I’m a Linux person. I did too for a long time, but I’m not. As far as I’m concerned, you have to take care of your hardware and ensure it remains useful for as long as possible for a number of reasons:

  1. It takes a lot of energy, resources and environmental damage to make a computer and the working conditions of the people who mined the minerals, refined the compounds and put the device together are often questionable. Once all of that is done, you should be able to hang onto it for a long time.
  2. We ourselves haven’t necessarily got that much money and we should expect our devices to be as useful as possible. That means access to hardware and no hand-holding.
  3. When we finally discard our hardware it goes to landfill or some poor Third World community where it poisons the rivers and gives the people disassembling it physical disabilities, cancers and congenital defects, not to mention what it does to the biosphere.

All of this has got to be the number one priority when we consider computers and other devices, and to be fair Linux does go a considerable way to addressing this. But there’s a problem. A lot of the people who are involved in designing and coding for Linux are very focussed on supporting up-to-date hardware. It’s a big community and they aren’t all doing that, but many of them are and it’s often hard to find people who are genuinely concerned about the e-waste and other sustainability aspects of the issue. The other thing, and this may be less problematic today than it used to be, is that Linux people are often not people people, and in the end this amounts to user interfaces which are not very friendly. I’m reminded of the Dilbert cartoon strip in which a computer programmer says “it’s my philosophy that the computer interface should hurt the user”, and employs samples of birds being killed by cars as the main notification sound. I’ve had to use Linux GUIs which are unwittingly displaying at 320×200 and put the OK button completely outside the display, or don’t recognise that the memory map for the video RAM is organised in such a way that everything looks like incomprehensible multicoloured vertical stripes. And yet a perfectly good, low-end 640×480 sixteen colour display could’ve been used which somehow is not the default. Why?

Don’t get the impression that I’m not as susceptible as most other people to the appeal of Windows. I was very impressed by Windows 3.1, which I didn’t come across until 1998 due to forcing myself to go cold turkey on computers for a decade or two. As far as I’m concerned, I’d be happy for the GUI to look like that today and it all seems a bit unnecessary to make it any fancier, particularly because in doing so you’re consigning millions of PCs to landfill for no good reason. I think Program Manager, the main shell for Win 3.1, hung around until at least Windows XP although it didn’t retain its look. It may be due to the fact that our 486-based 33 MHz PC with VGA graphics and non-working sound card was just less messed-about than other computers we’ve used since, but it was the most stable version of Windows I’ve ever used in earnest. It crashed once the whole time we used it, and that was a major and worrying event. Incidentally, there used to be an Explorer-style shell for Win 3.1, which made it practically indistinguishable from classic 32-bit Windows, which I used for a short period of time, and in fact I’ve even installed it as the default shell on later versions due to it being more compact and stable than the default shell.

We then leapfrogged over Windows 95 to use Windows 98 on a computer which was a total mess. It was a 120 MHz Pentium with SVGA, a working sound card and 16 MB of RAM. This is below the recommended specs for Windows 98, but there were other imponderable issues with that PC at the time. It only lasted a few months before we handed it over to a friend who was more au fait with computers, who got it to work. We replaced it with our final desktop format computer, an AST Bravo MS P/75 with an ATI Mach-16 graphics card, which incidentally has 2-D acceleration but not the 3-D which had become practically universal by that point. This was actually a downgrade, but was more reliable, and at that time I was really into the themes and skins available on Windows 98. I also struggled endlessly to get Linux to work on it. QNX was fine, but then it always is, and BeOS also worked okay. It ultimately got upgraded to 128 MB of RAM and a piggy-back 133 MHz Pentium if I remember correctly. This was the first computer to have a reliable dial-up internet connection. It was fine at running Quake but it was only barely capable of showing DivX videos even when upgraded to the nines.

The next stage came in 2002, as the start of our plan to give up television, partly for the children’s sake. This meant ultimately having access to DVDs, and therefore we bought a new computer, though at first only with a CD-ROM drive, mainly in order to get broadband internet. By this time we were accumulating E-Waste enormously because I was reluctant to let the old hardware go the way of all silicon. This PC had an Athlon and started off with Windows 98; we upgraded it to Windows XP in 2004. Windows XP was a good operating system on the whole but was the first O/S we used which broke compatibility with 16-bit applications and drivers, which necessitated the slinging out of various bits of hardware. This was also our first PC to have USB ports, which I also upgraded. The initial specifications of this machine were 128 MB of RAM, a 40 GB hard drive, a 1 GHz Athlon of some description, on-board graphics and a Silicon Integrated Systems motherboard. One thing I didn’t like about Windows XP was its childish-looking graphical scheme, but it still had Windows Classic, whose appearance was far better. This was also the first computer we used a TFT screen with, and the amount of space taken up by a 22″ CRT monitor is something to behold.

In 2007, we got a Windows Vista machine. This was because its predecessor had exploded due to me installing a graphics card whose power requirements exceeded what that machine’s power supply could deliver. Apparently it emitted something like ball lightning, although I wasn’t there at the time. This we persisted with for a further seven years. My chief issue with Windows Vista was that left to itself it seemed to spend too much time making the windows look translucent. Interestingly, the system requirements for Windows after a certain point went into reverse, probably because people were no longer impressed with the eye candy. In 2015 we acquired the computer which is currently sitting upstairs and of course runs Windows 10. Ironically, the stability of Windows 10 has made it possible to install Linux properly and use it on that PC. I have a history of using Live distributions of Linux on memory sticks as my preferred operating systems because they’re more secure that way.

Windows PCs have become very much secondary in our lives in recent years, as is probably true for many other people. We mainly use Android and Chromebooks, although Sarada still uses a Windows laptop from time to time. There no longer seems to be the pressure to upgrade, or maybe we’ve just become sidelined and lost interest in using them in other ways. I still program using FORTH and I have the modern version of the Commodore 64, which I don’t use as much as I should. To be honest, I’m not a big fan of the 64 because its BASIC doesn’t support the hardware properly and the palette is very odd for no apparent reason, but again all these are challenges to which I should rise. I’m vaguely considering writing a few arcade games for it, although I should stress that I’m very much in the Z80 camp rather than the 65 series one. I’ve never found the 6502 (technically the 6510 in the case of the 64) easy to program because I find its addressing modes and the awkwardness of dealing with sixteen-bit integers irksome, although again it’s down to possibly admirable minimalist design.

We’ve also owned a few computers I haven’t mentioned. I bought a Tandy Color Computer in 1999, there was a UK101 in our airing cupboard for a while, we had a Windows ME PC which was technically actually an ACT Apricot of all things, and also a plasma display 286-based Toshiba laptop I used DOSLynx on as a browser. That actually ran Windows 1.0!

Although I’ve never done it, I am curious as to the possibility of programming ARM or 64-bit Intel processors in machine code or assembler without an operating system as such. It would need a computer dedicated to that purpose of course. I would also imagine that attempting to use an ARM in that way would be a bit of a tall order, although I don’t know, because my understanding is that its instruction set is optimised for compiler-generated code which no human has directly written nowadays, but I presume that Intel-like CPUs are another matter. But just as I’ve never really got into coding in a major way, I doubt I’ll ever get round to it. One thing I do still do with coding is occasionally write FORTH or BASIC programs to work out maths problems, and I’ve long preferred APL to spreadsheets, which quite frankly I can’t get my head round.

I very much doubt I’ll ever make the transition to object-oriented programming.

Astronauts Vs Computers

‘Rocket To The Renaissance’, written by Arthur C Clarke in about 1960 and expanded upon in his epilogue to ‘First On The Moon’, a book by the Apollo astronauts, sets out many of his thoughts regarding the positive impact of human space travel on the human race. Since it was written in the mid-twentieth century by a White Englishman, though apparently a queer one, it unsurprisingly has its colonial biases, though not fatally so. He focusses initially on White expansion across the globe, although he does also mention the views of non-White thinkers such as 胡適 (Hu Shih). That said, his point stands, and is paralleled by Arnold Toynbee, who once said:

Affiliated civilisations . . . produce their most striking early manifestations in places outside the area occupied by the “parent” civilisation, and even more when separated by sea.

I honestly can’t read this without thinking of the genocides committed by European powers, but there is a way of defusing this to some extent. There was a time when humans only lived in Afrika and slowly radiated out from that continent into the rest of the world, a process only completed in the twentieth century CE when we reached the South Pole, and not including the bottom of the ocean, which is of course most of the planet’s surface. Something I haven’t been able to track down is that there is supposed to be a genetic marker for the people who have spread furthest from East Afrika, which I presume means it’s found in Patagonia, Polynesia and Australia, although I suspect it actually refers to Aryans because there is indeed such a concentration in the so-called “Celtic Fringe”. Even this expansion may be problematic. It’s not clear what happened when Afrikan Homo sapiens left that continent and encountered other species of humans. Our genes are mixed with theirs, but they’re extinct and we don’t know how either of those things happened. It seems depressingly probable that we are all descended from children conceived by rape within our own species, and that this, as we would understand it today, may have been the norm both between and within species. It seems more likely, though, that we simply outcompeted our relatives on the whole, and maybe the small portion of DNA from Neanderthals and Denisovans reflects their relatively smaller populations.

Leaving all this aside, the imperial winners of this million-year long onslaught on the planet benefitted culturally and technologically from it. 胡適 said:

Contact with foreign civilisations brings new standards of value.

And:

It is only through contact and comparison that the relative value or worthlessness of the various cultural elements can be clearly and critically seen and understood. What is sacred among one people may be ridiculous in another; and what is despised or rejected by one cultural group, may in a different environment become the cornerstone for a great edifice of strange grandeur and beauty.

Since I don’t want this to descend into some kind of patronising Orientalism, I’ll come back to Arnold Toynbee and his law of Challenge and Response. When difficult conditions are encountered, a minority of creative people respond by coming up with far-reaching solutions which transform their society. For instance, the Sumerians responded to the swamps in their area by irrigation and ended up kind of inventing civilisation as such, and the Church, having promulgated a belief system which caused the collapse of civilisation, went on to organise Christendom and invent Europe. We can of course still see the consequences of Sumer today all around us, but as I’ve mentioned before, the very human geography of these isles reflects their location: the “diagonal” arrangement of cultural and economic differences we see locally is due to the radial spread of change from the Fertile Crescent.

Even human expansion from East Afrika is problematic. There are clear signs that whatever it was we did led to enormous forest fires and the extinction of charismatic megafauna such as the nine metre long lizards who used to predate in Australia and the giant tortoises and birds of oceanic islands, not to mention the possibility that we helped wipe out the mammoths and woolly rhinos. Animals today tend to be nocturnal, smaller and to run away from humans because of what we’ve done in the prehistoric past. Nonetheless, there is an environment which is not problematic in this way. Actually, I should turn this round. The environments which are problematic from the viewpoint of being easily damaged and containing other sentient beings are largely confined to the thin film of air on this tiny blue speck we call Earth.

In his ‘Spaceships Of The Mind’, Nigel Calder pointed out that if we want to develop heavy industry, there’s always an environmental cost on this planet. On the other hand, if we were to do it in space, that problem goes away completely. Nothing we can do in space is ever going to make even the slightest scratch on the Cosmos in the foreseeable future. Of course, it’s worth injecting a note of caution here, because that attitude is what led to the damage to our own planet, and locally, even in space, it may not be true. Nonetheless, I do believe that one response to the energy crisis is orbiting solar power stations which beam their power back to remote receiving facilities on Earth which can then relay electricity globally, obviating the need for any fossil fuels or terrestrial nuclear power stations, or for that matter wind turbines or Earthbound solar arrays.

Space exploration has already yielded very positive results. These include the discovery of the possibility of nuclear winter, the Gaia Hypothesis, the Overview Effect and technological fallout. I’ll just briefly go into three of these.

  • Nuclear winter. When Mariner 9 reached Mars in 1971, there were problems imaging the surface due to a global dust storm. This was studied and it was noted that the fine particles in the atmosphere were blocking solar radiation and cooling the surface. The Soviet Mars 2 probe arrived at about the same time, sent a lander into the dust and it was destroyed. Carl Sagan then sent a telegram to the Soviet team asking them to consider the global implications of this event. This led to a 1982 paper which modelled the effect of nuclear firestorms and the consequential carbon particles in our own atmosphere which appeared to show that there would be a drastic cooling effect on this planet if that happened: the nuclear winter. Even now, with more sophisticated models, scientists recommend that global nuclear arsenals should be kept below the level where this is a significant risk during a nuclear exchange, and it’s also possible that it was a factor in ending the Cold War.
  • The Gaia Hypothesis. This is the belief that Earth is a homoeostatic system governed by its life. It’s still a hypothesis because many scientists still reject it or see it as only weakly supported, and it also coëxists with the Medea Hypothesis, that multicellular life will inevitably destroy itself. The roots of the hypothesis lie in Spaceship Earth and the observation that the other planets in the inner solar system, which didn’t appear to have life on them, were much less like Earth than might be expected. Up until the 1960s, life was more or less regarded as a dead cert on Mars because of the changes in appearance caused by the dust storms, which at the time were interpreted as seasonal changes in vegetation, and of course it had become popular to suppose there were canals there. On Venus, many people expected to find a swampy tropical world or a planet-wide water ocean teeming with life. When this didn’t happen, some scientists started to wonder if life had actually influenced this planet to keep it habitable rather than there already having been a hospitable environment for life which maintained itself. Viewing our whole Earth as alive is a way to engender compassion for all life, and is of course an example of hylozoism.
  • The Overview Effect. This is substantially related to the inspiration for the Gaia Hypothesis. When astronauts have seen Earth hanging in space, they have tended to gather a powerful impression of the fragility of life and the unity of the planet which has constituted a life-changing experience. The Apollo astronaut Edgar Mitchell set up the Institute of Noetic Sciences in response to his personal reaction, which was part of the human potential movement, and there are plans to make views of Earth from space available via virtual reality.

These are just three examples of how space exploration changes human consciousness for the better, and two out of three of them only happened because there were people in space, beyond low Earth orbit. Considering that even today only a tiny proportion of our species has ever been in space, and an even tinier proportion has left low Earth orbit, this is an enormous influence relative to their number. It’s evident that the more astronauts, and perhaps people living permanently off Earth, there are, the more positive the effect on the human race would be.

But instead, we’ve gone the other way.

The most notable recent change in technology from a cultural perspective is of course information technology, mainly the internet and easy access to it via relatively cheap devices. This has led to the creation of cyberspace (I was there at the birth) and a generally inward-looking culture. I would contend that up until 1972, the human race had a spatial growing point, and that this fed back into the rest of our cultures. And yes, it absolutely was the preserve of the rich and powerful countries, and yes, Whitey was on the “Moon”:

A rat done bit my sister Nell.
(with Whitey on the moon)
Her face and arms began to swell.
(and Whitey’s on the moon)
I can’t pay no doctor bill.
(but Whitey’s on the moon)
Ten years from now I’ll be payin’ still.
(while Whitey’s on the moon)
The man jus’ upped my rent las’ night.
(’cause Whitey’s on the moon)
No hot water, no toilets, no lights.
(but Whitey’s on the moon)
I wonder why he’s uppin’ me?
(’cause Whitey’s on the moon?)
I was already payin’ ‘im fifty a week.
(with Whitey on the moon)
Taxes takin’ my whole damn check,
Junkies makin’ me a nervous wreck,
The price of food is goin’ up,
An’ as if all that shit wasn’t enough

A rat done bit my sister Nell.
(with Whitey on the moon)
Her face an’ arm began to swell.
(but Whitey’s on the moon)
Was all that money I made las’ year
(for Whitey on the moon?)
How come there ain’t no money here?
(Hm! Whitey’s on the moon)
Y’know I jus’ ’bout had my fill
(of Whitey on the moon)
I think I’ll sen’ these doctor bills,
Airmail special
(to Whitey on the moon)

Gil Scott-Heron

The question here is of course which America got the moon landing, and possibly which humankind. However, is there a reason to suppose that if enough people were to go into space it wouldn’t alter their consciousness enough for them to become, for instance, anti-racist and to recognise that we really are all in it together? To a Brit reading this, the reference to doctor’s bills brings the NHS to mind, and that kind of large-scale government-sponsored undertaking is pretty similar to NASA in many ways.

Apollo was also, of course, a propaganda coup, demonstrating what the so-called Free World could do that the “Communist” countries couldn’t. However, it wasn’t done via private enterprise or competition. It is at most an illustration of what a mixed economy can achieve, not capitalism. On the other hand, it could also be seen as an example of competition between the two power blocs dominating the world at the time, but is that capitalism?

As it stands, space probes even today have relatively low specifications, possibly due to long development times. In 1997, the Mars Pathfinder mission’s Sojourner rover trundled around Mars controlled by an Intel 80C85 CPU running at about 2 MHz. The Voyager probes are often said to run on a COSMAC 1802. There was eventually a problem with the Space Shuttle program because some of its supporting equipment used 8086 processors, which became hard to find and had to be scavenged from antique PCs. The space program is startlingly primitive in this respect. As far as I know, there has only ever been one microcomputer based on the 1802 processor, the COMX-35, which came out in 1983. The Intel 8085 came out in March 1976, was a slightly upgraded version of the 8080, and was almost immediately eclipsed by the legendary Zilog Z80, which was released a few months later. It had a longer life in control applications, which is presumably how it ended up in a Mars rover. The Shuttle program ended in 2011, which was thirty-three years after the 8086, a pretty conservative design in any case compared to the 68000 and Z8000, was mass-produced. Given all that primitive technology, the achievements of space probes are astonishing, and serve to illustrate the inefficiency of popular software used on modern devices on this planet. We have our priorities wrong.

I needn’t say much about the effect of social media on society. We all know it’s there, and it’s basically an ingrowing toenail, albeit one which has ingrown so far it’s started to pierce our brains. But we could’ve had a rocket to the renaissance, and instead we got Facebook and Trump. History has gone horribly wrong.

Brain Of Britain

In the 1950s, it was estimated that if all the connections in the human brain were modelled with valves (or tubes in American English), that “brain” would be the size of what we now call Greater London. I don’t quite understand how this happened because Greater London wasn’t in existence at the time. I seem to recall that that administrative unit was created by drawing a circle twelve miles in radius around Charing Cross and including all the boroughs which its circumference passed through, but that seems to be too small. In any event, this is an area of 1569 km². This was during the time of the first generation of digital computers, when the switching elements were valves.

Later on, transistors replaced valves. These are effectively solid state switches which can be turned on or off by the application of a current, hence their tripodic nature, and were initially made of germanium, an element I used to think was called “geranium” for some reason. Later transistors were of course silicon, but in both cases they’re “doped” with small amounts of another element, often arsenic. If a human brain were to be made of that kind of transistor, it would be the size of the Albert Hall. This is a large public building in South Kensington, a few miles west of Charing Cross, which is elliptical seen from above, and has axes of almost seven dozen and six dozen metres. Taking the mean diameter, this gives it an area of 4778 m², and this gives me pause because it’s so much smaller than the first figure. Did they actually mean Greater London at all?

By the early 1960s, engineers were starting to put several transistors in the same package. Incidentally, these were not the first integrated circuits. Earlier, valve circuitry was being put into the same evacuated chamber and it was possible to make a valve comprising several discrete components, which was sometimes done in radio receivers. However, the real advance was putting components onto the same wafer, because it became feasible to make a single chip which worked as a logic gate, and later an arithmetic and logic unit and eventually an entire CPU. In 1965, Gordon Moore made the observation that the number of transistors which could be made to fit in the same area of silicon seemed to be doubling every year, and this became known as Moore’s Law. Incidentally, in my alternate history known as the Caroline Timeline I tried to imagine what would have happened if advances in this area had been linear rather than geometrical, with progress proceeding at 1979 levels. I call this Vannevar’s Law. In 1978, the BBC TV science documentary series ‘Horizon’ noted that it was now possible to fit an equivalent number of transistors to the human brain into a five metre square room. Moore’s Law no longer applies. The doubling was revised to once every two years at some point and it seems to have broken down in the late ’teens. Assuming the ‘Horizon’ room was 5×5×3 metres, or 75 m³, and taking 1979 as a starting point, there would have been eighteen doublings between then and 2015, which would’ve reduced the size by a factor of 2¹⁸, which is 262 144: equivalent to a cube less than seven centimetres on a side, or about the same size as a Rubik’s cube. It would’ve been about the same size as a human brain in 2011 or so by those calculations. Does that mean it’s been theoretically possible to build a gynoid or android with the equivalent of human abilities for a whole decade then? That would assume that human brain functions can be precisely replicated using hardware in the form of logic gates, and that’s not clear.
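
Since I’m prone to getting this sort of arithmetic wrong, here’s a quick check of the figures above in Python. The room size and the two-year doubling are the assumptions already stated; the brain volume of about 1.3 litres is a typical textbook figure rather than anything from the programme:

```python
import math

room_volume = 5 * 5 * 3                  # m^3: the assumed 'Horizon' room
doublings = (2015 - 1979) // 2           # one doubling every two years
shrunk = room_volume / 2 ** doublings
print(2 ** doublings)                                      # 262144
print(round(shrunk * 1e6), "cm^3")                         # ~286 cm^3
print(round(shrunk ** (1 / 3) * 100, 1), "cm per side")    # ~6.6 cm: Rubik's-cube-ish

brain = 1.3e-3                           # m^3: a typical human brain volume (assumption)
print(1979 + round(2 * math.log2(room_volume / brain)))    # ~2011

# The Albert Hall figure, by the method described above (mean of the two axes):
print(round(math.pi * ((84 + 72) / 4) ** 2), "m^2")        # ~4778 m^2
```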

I actually want to go the other way with this.

Edited public domain image from NASA

This image is not ideal. It excludes the Shetlands and includes about half of Ireland and bits of France, but I want to focus on Great Britain here. But we’re not quite there yet. Back to Greater London and the Albert Hall.

It wasn’t clear to me what kind of layout the artificial “brain” in the first two scenarios was supposed to have. Was it supposed to be a two-dimensional or three-dimensional equivalent? Is the Albert Hall in particular supposed to be filled with transistors or is it just a flat surface covered in them? Brains are not like that, and the human brain is even less so due to being very wrinkled. This brings up a bit of a quandary for me. My head has a feature called cutis verticis gyrata, where my scalp is convoluted in a brain-like manner. Its cause is unknown, but it makes me wonder: is my scalp folded in the same pattern as my brain? Feeling it certainly seems to divide it up into similar lobes, gyri and sulci to the presumed brain underneath it, but if so, is that because there is some geometrical reason why the folds would be in the same place as the brain, or is there some connection between my gyri and the skin of my scalp? Is it like my cortex somehow communicates with my dermis and causes it to pile up in that manner? I don’t know. Nor does anyone else actually: the condition is entirely mysterious.

An unfolded version of the human cerebral cortex would have an area of about 2 400 cm², but even then it’s a three-dimensional object and there’s more to the brain than the cerebrum. Perhaps counter-intuitively, there are several times as many neurones in the cerebellum as there are in the cerebral cortex, and these are in a similarly folded arrangement, and there are plenty of other bits inside the brain such as the hippocampus, thalamus, hypothalamus, amygdalæ, corpus callosum, basal ganglia and so on. Producing genuinely three-dimensional circuitry remains largely an unsolved problem in microelectronics (chips are at best stacked in a few layers), but if the brain can correctly be understood as circuitry, the problem has already been solved.
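One crude way to see how much of the organ that folded sheet accounts for is to multiply the unfolded area by a typical cortical thickness; the 2.5 mm used here is my assumption, not a figure from above:

```python
# Rough cortical volume from the unfolded-sheet figure.
cortex_area_cm2 = 2400     # unfolded cerebral cortex, as above
thickness_cm = 0.25        # assumed average cortical thickness, ~2.5 mm

cortex_volume_cm3 = cortex_area_cm2 * thickness_cm
print(f"Cortical sheet: ~{cortex_volume_cm3:.0f} cm^3")              # ~600 cm^3
print(f"Share of a 1 500 cm^3 brain: {cortex_volume_cm3/1500:.0%}")  # ~40%
```

So the famous wrinkly surface, even unfolded and given its full thickness, is somewhere around two fifths of the whole, which is why the cerebellum, the basal ganglia and the rest can’t simply be ignored.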

Great Britain does not have a definite area, and there are two figures for it, for reasons similar to the difficulty of working out the true size of the brain and its equivalent. The United Kingdom, i.e. including the bit of Ireland still claimed by Westminster and therefore more than just Great Britain, can be considered to have an area of 242 495 km² or 243 610 km². There are two reasons for this discrepancy. One is illustrated by the idea that “if Wales were flattened out it’d be bigger than England”. The UK is not flat. The first figure is basically the area enclosed by the coastlines considered as a two-dimensional surface. The second takes into consideration the fact that there are hills, valleys and mountains involved. This is where fractals come into play. The resolution of the hills and valleys matters here too: hills could be thought of as cones or pyramids, or they could be thought of as rough surfaces. However, this doesn’t make the area infinite, because at realistic resolutions it converges on a limit. The question of the coastline and tides also arises, although the tide level at which area is measured is standardised by some kind of international agreement. But the coastline is also fractal and this influences the area. By one measure, the coastline of Scotland is ten per cent of that of the whole of Europe, and Europe itself has an unusually long coastline compared to other parts of the world. I’d guess the enclosed area converges to a limit even though the measured length of the coastline famously doesn’t: the shorter the ruler, the longer the coast gets. Scotland’s coastline is, remarkably, almost half the length of the oceanic coastline of the whole of the United States, if the latter is calculated by a method excluding firths, but more like a ninth if the Great Lakes and the American equivalents of firths are included. Scotland is convoluted. It’s “brainy”.
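This is the classic coastline-paradox behaviour, and it can be illustrated with a toy fractal rather than the real Scottish coast. A minimal sketch using the Koch snowflake, where each refinement makes the boundary a third longer while the enclosed area creeps towards a fixed limit:

```python
import math

side = 1.0                             # unit equilateral triangle to start with
perimeter = 3 * side
area = math.sqrt(3) / 4 * side ** 2

new_triangles = 3                      # bumps added at the next refinement
new_area_each = area / 9               # each bump is 1/9 the area of its parent

for step in range(1, 9):
    perimeter *= 4 / 3                 # every segment becomes 4 at 1/3 the length
    area += new_triangles * new_area_each
    new_triangles *= 4
    new_area_each /= 9
    print(f"step {step}: perimeter {perimeter:7.3f}, area {area:.4f}")

# The perimeter grows without bound; the area converges to 8/5 of the
# original triangle (about 0.6928 here).
```

Real coastlines stop being fractal at some physical scale (a pebble, a grain of sand), but the pattern of “length blows up, area settles down” is the same.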

The other factor, which doesn’t influence Britain much, is the fact that its surface follows Earth’s curve. This gives it a slightly larger area than if it’s assumed to be flat, and also a slightly different area than if you assume it to be a 242 495 km² portion of the surface of a sphere, because Earth is not perfectly spherical but deviates by roughly a third of a per cent from the shape of a sphere (the polar radius is about 21 km shorter than the equatorial one), and not even in a particularly regular way. Hence there are difficulties.
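To put a number on the curvature part, here’s a sketch comparing a flat disc with a spherical cap of the same footprint, using the smaller UK figure above and assuming a perfectly spherical Earth of radius 6 371 km (which, as just noted, it isn’t quite):

```python
import math

R = 6371.0                 # assumed mean Earth radius, km
flat_area = 242_495.0      # the 'flat' UK figure from above, km^2

# Treat the country as a circular patch with that flat footprint.
a = math.sqrt(flat_area / math.pi)           # flat radius, ~278 km
theta = math.asin(a / R)                     # half-angle at Earth's centre
cap_area = 2 * math.pi * R ** 2 * (1 - math.cos(theta))

print(f"Flat disc:     {flat_area:,.0f} km^2")
print(f"Spherical cap: {cap_area:,.0f} km^2")
print(f"Curvature adds about {(cap_area / flat_area - 1) * 100:.3f} %")
```

The correction comes out around a twentieth of one per cent for something Britain-sized, which is why the hills matter far more than the curve of the Earth.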

For the sake of argument I’m going to assume that the area of Great Britain, i.e. the big island I live on, is 209 331 km². This excludes any claims made on Ireland, and also Lewis and the rest of Na h-Innse Gall, Ellan Vannin/Man, Ynys Môn, the Isle of Wight, Sheppey and much of Portsmouth (which is on an island). However, it would include the likes of Bede and Frog Islands in Leicester, Dungeness and the Isle of Thanet, since those are not currently islands.

When I look at a map of Great Britain it is of course very familiar to me, as it would be for most Brits, but when I look at a photograph, map or diagram of the human brain it is, unsurprisingly, less so. The brain is also much more significantly three-dimensional than Great Britain, and my poor spatial abilities are brought into play. Parts of Britain are familiar enough to me that I can fairly easily identify individual rivers, villages, towns and cities on unmarked maps and photos, particularly if I know which way the compass points lie. What I want to know is: how familiar am I with the anatomy of the brain compared to the geography of Great Britain?

The human brain has a volume of about 1 500 ml. It isn’t entirely sensible to model it as a sphere, but that’s still probably the best I can manage as a means of estimating its horizontal cross-sectional area. A sphere with a volume of 1 500 ml would have a cross-sectional area of about 158 cm². Comparing this to the area of Great Britain, every square millimetre of brain corresponds to about thirteen and a quarter square kilometres, which I’m guessing is roughly the size of a small town.
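The arithmetic behind that, as a quick Python check using the figures above (a 1 500 ml brain and a 209 331 km² island):

```python
import math

brain_volume_cm3 = 1500.0
gb_area_km2 = 209_331.0

# Model the brain as a sphere and take its equatorial cross-section.
radius_cm = (3 * brain_volume_cm3 / (4 * math.pi)) ** (1 / 3)
cross_section_cm2 = math.pi * radius_cm ** 2
print(f"Radius {radius_cm:.1f} cm, cross-section {cross_section_cm2:.0f} cm^2")

# Map that cross-section onto the island.
km2_per_mm2 = gb_area_km2 / (cross_section_cm2 * 100)   # 1 cm^2 = 100 mm^2
print(f"Each mm^2 of brain stands for about {km2_per_mm2:.1f} km^2")  # ~13.2
```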

Like my knowledge of British geography, my knowledge of brain anatomy is very uneven. I know the lobes, some of the sulci and gyri, the structures I mentioned above, and the connections made by the cranial nerves, along with the ventricles and a few other things. I’m also aware of the general layout of the neurones and supporting cells on a microscopic scale. But please don’t ask me to do brain surgery, or it’s “there go the piano lessons”, and you also have to contend with the fact that my spatial abilities are pretty poor. It took me a very long time to place even two minor brain structures in relation to each other. I’m talking months.

It amounts to this. There are small areas of Britain of which, like practically anyone else, I have detailed local knowledge. By the way, it’s an interesting exercise to assess the accuracy of Wikipedia by looking up one’s local area and comparing the articles to what one knows to be true of it, because in that respect we’re all experts. Most of us couldn’t do the same with the articles on the nucleus accumbens or the substantia nigra. The brain is also characterised by different systems marked out by their use of different neurotransmitters, which I might compare to things like the National Grid, the road network and the rail system, although, like most other things in the brain, they’re arranged three-dimensionally.

The body of a large nerve cell is about 100 μm wide. A brain scaled up to the size of Great Britain would have such cells 300–400 metres in diameter, so they’d be about the size of a medium-sized park or small lake – perhaps a street or two of a neighbourhood. You could probably jog round one in about five minutes if you were fairly fit.
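The same numbers give the scale factor directly, since lengths scale with the square root of the area ratio. A quick sketch, assuming the 100 μm cell body mentioned above:

```python
import math

gb_area_m2 = 209_331e6            # Great Britain, from above
brain_cross_section_m2 = 158e-4   # the ~158 cm^2 cross-section, from above
soma_m = 100e-6                   # assumed (large) nerve-cell body width

# Areas go with the square of lengths, so the linear magnification is
# the square root of the area ratio.
scale = math.sqrt(gb_area_m2 / brain_cross_section_m2)
soma_scaled = soma_m * scale

print(f"Linear scale factor: about {scale:,.0f} times")       # ~3.6 million
print(f"Scaled cell body: about {soma_scaled:.0f} m across")  # ~360 m
print(f"Lap around it: about {math.pi * soma_scaled / 1000:.1f} km")  # ~1.1 km
```

A lap of roughly 1.1 km takes five to seven minutes at typical jogging speeds, so the estimate above holds up.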

I want to be more familiar with the anatomy of the brain just as I want to know the geography of Great Britain, but in order to do that, I need my own brain to have sufficient ability to grasp its own three-dimensional structure, which makes it a harder task than, say, “doing the Knowledge” like a London cabbie. But isn’t it odd how we can basically be such a complex organ without having any knowledge of what we are? It isn’t even the same as self-awareness: you could have any degree of that without having a clue about the brain. I can tease out individual bits of experience and correlate them with features of the brain, such as the visual system, the sensorimotor homunculi, the reticular system and others, and I’m aware of what this particular brain probably does with certain neurotransmitters compared to others, but like most other people’s, my brain is in a dark bone box doing all this stuff and will forever remain a mystery to me, not least because in order to understand it fully I would have to have a more complex brain, which I wouldn’t then be able to understand, and so forth ad infinitum.

Life is strange.