The other night I was lying in bed listening to a radio dramatisation of ‘Fahrenheit 451’ on my Walkman through earphones when Sarada came in, and as usual I couldn’t hear what she was saying properly because of them. Ironically, if that’s the right word, the very part I was listening to was the scene where Guy Montag enters the bedroom to find his wife Mildred lying comatose on the bed with the “seashells” in her ears, “listening” to the radio. This was not lost on me; in fact I had rather wanted it to happen. My earphones weren’t wireless earbuds, because I can’t get Bluetooth to work properly and don’t approve of having basically disposable batteries in devices which in any case only last a couple of years, or so I’ve heard, but they do nevertheless resemble Bradbury’s “seashells” and the way they’re used. Ray Bradbury said he was in the business of prevention rather than prediction, but it seems someone stepped on a butterfly.
Having looked at ‘Nineteen Eighty-Four’, and a few years ago ‘Brave New World’, it seemed about time I looked at a third classic dystopian science fiction novel. I don’t know if it makes sense to rank these things, but if the first two count as numbers one and two, Bradbury’s novel surely belongs somewhere in the top half-dozen. Were it not for Zamyatin’s ‘We’ and Kazuo Ishiguro’s ‘Never Let Me Go’, it might even deserve an undisputed third place, though it seems quite crass to do that to these works. Nonetheless, I’m sure it finds its way onto high school reading lists almost as often as the others I’ve mentioned, and probably more often than ‘We’, which is relatively unknown. Ray Bradbury, though, differs from the other authors in being a genre sci-fi author. Of a kind, anyway. Kazuo Ishiguro now has tendencies in that direction but his stories haven’t always been like that. Bradbury also wrote mainstream fiction: ‘The Fruit At The Bottom Of The Bowl’ comes to mind, a wonderful study of misplaced guilt reminiscent of Lady Macbeth.
In general, I find Bradbury a slightly odd author and I can’t put my finger on why. As I understand it, he’s usually considered one of the Big Four: Asimov, Heinlein, Clarke and Bradbury. The Big Three, however, doesn’t include him. He differs from the others in having a much more mainstream literary approach, and despite his successful efforts to produce absolutely classic science fiction works such as ‘A Sound Of Thunder’, which seems to be the origin of the idea of the butterfly effect, he doesn’t really feel like an SF writer at all, even when he’s writing absolutely classic stories. He characterises carefully and uses elaborate imagery and turns of phrase, and whereas that’s admirable it also makes his prose feel foreign to the genre. To that extent, it seems inappropriate to think of his inventions as predictive, or worth considering in themselves. Science fiction is substantially two things: fiction whose plot depends non-trivially on the setting, and fiction where ideas play the role of characters. Bradbury’s work is less like this than most SF. The New Wave clearly isn’t like that either, but that was still several years off when he was at his peak. It’s been said that he’s more a fantasy and horror writer. He’s also respectable enough for my third year English teacher (the folk singer, not the guy serving time) to have us read his 22-story anthology ‘The Golden Apples Of The Sun’, although I’d already read most of them.
The second story in that collection, 1951’s ‘The Pedestrian’, is one of the sources from which ‘Fahrenheit 451’ grew. Depending on who’s reading this, my introduction to it may have come from one of you, who described its plot to me in about ’79, before I read it, although by then I had already seen the Truffaut film, apparently his only English-language production. The other source is the longer story ‘The Fireman’, which I haven’t read. I can identify quite strongly with the protagonist of ‘The Pedestrian’, who is in the habit of taking long evening walks about the city. He is stopped by an automated police car and asked to justify his behaviour, which he does, but he is assessed as mentally ill by the AI and taken to a mental hospital. This very much accords with the pedestrian-hostile nature of many US cities, many of which are apparently not walkable, jaywalking having been made an offence from the mid-1920s on. I myself spend a lot of time walking the streets for exercise and mental health, and just to get places, and I can’t imagine how that would go in the States. One thing this story does illustrate, though, is Bradbury’s strong attachment to nostalgia.
Now for the novel itself. Guy Montag, a fireman in a futuristic world which has banned books, has a job whose main activities are tracking down people who own books and burning them, and yes, that does sometimes mean the people. He meets a teenager called Clarisse whose experience of the world is more holistic and authentic than he’s accustomed to, which opens his eyes to the possibility that books must hold something of great value, given that some readers are prepared to die rather than relinquish them. In the meantime, his wife Mildred is an avid TV watcher, televisions having now become wall screens which can even be tiled to cover the entire parlour, and drifts into taking an overdose of sleeping pills, which is remedied by a couple of technicians coming over and changing her blood. As he begins to question the book ban, he starts surreptitiously collecting books himself, notably a copy of the King James Bible, and throws a sickie to stay off work. His boss Beatty then visits him at home, explains why books have been banned and hints that he knows his secret, and that other firemen always do it once but surrender the book within twenty-four hours. There’s also a robotic hound which hunts down miscreants and kills them, and which seems to “know” something about Montag, either automatically or through having been programmed to suspect him. At some point, Clarisse dies in a car accident and Mildred is completely emotionally detached about it, as opposed to her interest in something on TV called ‘The Family’. Montag recalls an incident when he met someone called Faber, a retired English professor, in a park; he makes contact with him and goes to see him. Faber decries his own cowardice in not doing more to stop the anti-intellectual drift of society or to stand up for literacy and books, and reveals to Montag a two-way radio earpiece which he uses to offer Montag guidance. Montag returns home to find Mildred has gathered some of her friends, and he tries to have a serious conversation with them, which turns out to be futile. He then shows them a book of poetry, which Mildred excuses by making up a story that it’s a ritual firemen perform once a year to show how ridiculous books are. He then goes back to the fire station with a decoy book, which Beatty discards, revealing that he was once an avid reader himself. Montag is then called out to a house which turns out to be his own and is ordered to set fire to his own books with a flamethrower. Mildred has reported him, but is distressed by the destruction of the parlour screens and walks out on him. He then burns Beatty alive with the flamethrower and is attacked by the hound, which injects him, but he destroys it with the aforesaid flamethrower. He flees a replacement hound, the pursuit being publicised on TV as a major spectacle, but escapes by crossing a river so his scent can’t be followed, and finds a rural community of people, each of whom has memorised a particular book. In a culmination of the aerial manoeuvres which have been going on in the background throughout the novel, his home city is destroyed by nuclear weapons, but the community survives and returns to the city to rebuild society.
Right, so what do I have to say about this? Well, it is considerably dated in a somewhat peculiar way and I have the strong impression that Bradbury isn’t that articulate about what he’s trying to defend. The general idea of the novel is that social and technological change have led to a general dumbing down and flatness to society, relationships and personalities because of the inconvenience of individuality and passion, which leads to life not being worth living because people drift zombie-like through it. Mildred seems to take the overdose accidentally, but she doesn’t really value her life as such so it doesn’t matter whether she lives or dies. Instead, she’s mesmerised by her TV soap opera and radio station and nothing else is going on in her life. She’s also treated like a machine, by non-medics, when she takes the overdose. It’s like changing the oil in a car – I should point out here that I have no idea what I’m talking about because I know nothing of internal combustion engines. The technicians are impersonal, callous and accidentally brutal. Mildred is really the Everywoman of that society, and this is where I start to worry and think it shows its age.
Yes, Guy Montag’s wife is the Everywoman. She doesn’t seem to do any paid work, and it seems that whereas the men have jobs, her life is vacuous: domestic labour has been rendered obsolete, but instead of being replaced by a role in which she goes out and participates in the labour market, it has simply left her without a role. What, then, is she supposed to do? Montag, the firemen and other men have that option but apparently she hasn’t, and Bradbury criticises her for it. It’s as if she’s trapped in the stereotypical place of the ’50s housewife and lacks any inherent impetus to break out of it. Then there’s Clarisse. She’s been interpreted as a manic pixie dream girl, i.e. she’s only there to enable Guy Montag’s personal growth. In more detail, the manic pixie dream girl is said to be an eccentric young woman with no internal life, often seen as wish-fulfilment on the part of a lonely male writer. The other female characters are less significant. I find both of the significant women in this book problematic and unsatisfactory, which is not surprising as it was published in 1953.
That’s one problem. Another way it dates itself is in the rationalisation of the firemen’s role. The backstory is that houses are now fireproof and there are simply no more domestic fires. Although it has fed into a dystopia, this sounds at first like a positive thing. With hindsight, we are now aware that making a house completely fireproof would have trade-offs. Given that the novel was written in the 1950s, asbestos would almost certainly be involved. A more recent approach is to use flame-retardant chemicals, which are toxic and environmentally harmful. This is what we’ve actually done, and the consequence is that our homes are still at risk of fire, though less than previously, but are more likely to give us cancer or harm us and our surroundings in other ways. It seems characteristic of mid-century optimism to assume that problems could be solved with no downside, as expressed in Donald Fagen’s ‘IGY’, a song I used to find very irritating until I got it. All that said, Bradbury does portray the disadvantage very clearly, and this again relates to gender roles.
The firemen have lost the original purpose of their work. This is a bit peculiar, as it seems to suggest that there are no longer any industrial or forest fires, nor other emergencies such as road traffic collisions from which people need rescuing, and this is too shallow for me. But it also feels like they found a new role substantially because they were underemployed: rather than simply dispensing with the role of the firefighter, society had to find them a new function. It’s almost as if the vacuum of having no station had to be filled. I very much doubt that this is the intention, but it’s productive to read it into the book. Whereas the women are left with nothing to do but fill their lives with fatuousness rather than finding other niches, the men for some reason have to be given something else to do, no matter how destructive, which they have to be paid for and which has to have meaning.
There’s also an elusive issue which arises from books themselves and Bradbury’s attitude to them. It feels like he has accepted that there’s value in them without fully understanding what that value is or allowing it to inform his writing. He defends the idea of books as good for the soul and recognises that they do things like deepen thought and improve empathy and emotional intelligence, but he himself doesn’t seem to have undertaken that journey. Even at the end of the novel, the people left behind have undergone something like rote-learning without profoundly internalising the content. The defence is symbolic. We should have a right to emotional complexity and pain even though Bradbury may not recognise all that implies. I hope I’ve captured that.
Beatty’s defence of the society’s position is very clear. His view is that books are contradictory, complex and cause pain and conflict. This is where the most difficult aspect of the entire novel comes to light. Beatty traces the history leading up to all books being banned as originating in anti-racism, and for me this makes for very uncomfortable reading. He outlines a process whereby the offensiveness of books to certain marginalised groups expanded until it was forbidden even to offend people such as dog-walkers, bird-lovers and cookery writers. Whereas it’s easy and valid to portray this as bigoted, it is true that one may need to be offended from time to time and that hurt is an important part of life. The problem, however, is that Bradbury doesn’t seem to have any sense either of immutable traits being in a special position or of the idea of punching up versus punching down. He seems to view the society that preceded his dystopia as fundamentally equal or merit-based, with the marginalised somehow in essentially no worse a position than anyone else. On the other hand, this view of mine is being expressed by someone in 2025. Perhaps I’m being confronted with something which makes me uncomfortable today, but something valuable may still have been lost. However, I simply cannot get on board with the idea that active racism is okay.
Salvaging something from that, though, Beatty seems to be saying that the process got beyond the political realm and started to be about not making anyone uncomfortable, which meant never being provocative. It’s tempting to see a parallel between the trend he describes and the trend towards supposedly being “right on”. This is surely something the Right would agree with nowadays, perhaps disingenuously, and it makes me wonder if Bradbury is essentially conservative. After all, nostalgia is about yearning for things to go back to how they used to be and there’s a strong element of that in his writing. Nevertheless, it still feels like something can be salvaged from this.
Beatty makes a couple of other points. He draws a connection between population growth and the loss of tolerance because people have little choice but to invade each other’s space. The idea of overpopulation being a problem is now thoroughly dead, so whether or not this could be a factor is now moot. Yet again this is a sign of datedness.
Then there’s the question of technological change. There’s plenty of vapidity nowadays in online coverage of books and book reviews, and that’s just concerning the ink and paper version. The books themselves can also be of very low quality. Books also compete with videos, web pages, audio books and e-books, whereas Bradbury had only identified radio and linear broadcast television as a problem. For example, he didn’t seem to anticipate video recording. On the other hand, he did anticipate the shortening of attention spans and the rise of ever shorter summaries, a tendency I probably find just as horrifying as he did.
Viewing Beatty’s exposition alongside the possibility that the firemen are engaging in malignant busywork, it begins to look highly insincere. Beatty has changed from a surreptitiously well-read younger man to a self-justifying thug. Has he maybe been brutalised by his work? I feel this takes things beyond the confines of the story.
But the book is not a lost cause by any means. It still has a lot to say about the dumbing down of culture, mob rule, shortening attention spans and the dangers of veering away from emotionally difficult and troubling themes and explorations. If the reader can look past the awkward social conservatism, it’s still possible to salvage something from this, and it is the case that with the constant use of smartphones and constant shallow entertainment, we are currently seldom left with our own thoughts uninterrupted and undistracted. Finally, in my defence I’ve been doing something like this at night since 1980 and it hasn’t fried my brain yet. And finally finally, it really ought to be 233°C, not Fahrenheit 451!
I’ve a tendency to back losers. For instance, Prefab Sprout and The The were my favourite bands when I was in my late teens and both were markedly unsuccessful. In keeping with this trend, I used to have a Jupiter Ace computer and I’ve alluded to this a few times on this blog. Jupiter Cantab, the company which designed and made it, had a total of I think five employees, seemed to work out of a garage and a suburban house in Cambridgeshire and had a turnover of around five thousand pounds. They went bust maybe a year or two after releasing the Ace in October 1982 CE and had to sell off all their old stock, a happy thing for me as it meant I could acquire a new computer. Its hardware was very basic even for late ’82, but its firmware decidedly was not. Unlike practically every other home computer of the time, the Ace used not BASIC but FORTH as its native programming language. Perversely, I considered writing a BASIC interpreter for it for a while but it seemed a bit silly so I didn’t.
FORTH, unlike BASIC as it was at the time, was considered a “proper” programming language. It has two distinctive features. One is that it uses a data structure known as the stack, which is a list of numbers in consecutive locations in memory, presented to the user as having a top and a bottom. Words in the FORTH language usually take data off the top of the stack, operate on them and may leave one or more results on it. This makes the syntax like Latin, Turkish or Sanskrit rather than English, since instead of writing “2+2”, you write “2 2 +”, which leaves 4 on the stack. The other feature is that rather than writing monolithic programs, the user defines new words, so for example to print out the character set one writes:
: CHARSET ( This is the defined word and can be whatever the user wants except for control characters ) 255 32 DO I EMIT LOOP ;
If one then types in CHARSET and presses return (or ENTER, in the Ace’s case), it will print out every character the Ace can display except for the graphics characters whose codes are below 32.
What you see above is the output from the Ace when you type in VLIST, i.e. list every word in its vocabulary. I think there are a total of about 140 words. All of them fit in 8K and show that FORTH is a marvellously compact language compared to BASIC or in fact most other programming languages. For instance, the ZX81’s BASIC has around forty-one words. FORTH on the Ace, and in general, was so fast that the cheapest computer faster than it, the IBM PC, cost more than a hundred times as much. For instance, in order to produce sound it was possible, as well as using the word BEEP, to define a word that counted from 0 to 32767 between vibrations and still produce a respectable note by moving the speaker in and out. By contrast, the ZX81 would take nearly ten minutes to count that high and had no proper sound anyway. This is a somewhat unfair comparison but illustrates the gulf between the speed of this high level language and the other.
Whittling Down The Vocab
As I’ve said, FORTH consists of words defined in terms of other words and therefore some people object to calling code written in it “programs”, preferring “words”. The fact that this definitional process was core to the language immediately made me wonder what would constitute a minimal FORTH system. There are quite a few words easy to dispense with in the vocabulary listed above. For instance, the word SPACES prints whatever number of spaces is indicated by the number on top of the stack, so 32 SPACES prints 32 spaces. However, this word could’ve been defined by the user, thus: : SPACES 0 DO SPACE LOOP ;
The DO-LOOP control structure counts between the two numbers on top of the stack and executes the code between DO and LOOP the number of times it takes to do that. It can be taken a lot further than that though. SPACE and CR are two words with a similar structure: they print out a character. SPACE unsurprisingly prints out a space. CR does a carriage return. Both are part of the standard ASCII character set and the word for printing the ASCII character indicated by the number on top of the stack is EMIT, so they can be defined thus: : SPACE 32 EMIT ;
: CR 13 EMIT ;
Hence three words are already shown to be unnecessary to the most minimal FORTH system, and the question arises of what, then, would constitute such a system. What’s the smallest set of words needed to do this?
The Ace had already added quite a lot of words which are not part of standard FORTH-79 – examples being all the floating-point words, PLOT, BEEP, CLS, VIS and INVIS – and omitted others which are easily defined. Some are trivial to define, such as 2+, 1- and 1+. Others are a bit less obvious: PICK can be used to replace DUP and OVER, while SWAP and ROT are special cases of ROLL and so can be defined in those terms. . , that full stop, which prints a number, can be replaced by the number formatting words <#, # and #>. You can continue to whittle it down until you have a very small number of words along with the software which accepts input and definitions, and you’re done. In fact, if you know the hardware well enough you can make it even smaller because, with the Jupiter Ace for example, you know where the display is stored in RAM and how the stacks work (there are actually two, because practically all computers implicitly use a stack for subroutines), and when it comes down to it it’s even possible to define words which accept machine code, the numbers computers actually use to represent simple instructions like adding two numbers together or storing one somewhere.
This is about as far as I got until, recently, I managed to join together two ideas that I hadn’t previously connected.
Logic To Numbers
As you probably know, my degrees are in philosophy, and my first degree is in the analytical tradition, which is the dominant one in the English-speaking world. It’s very common for philosophy degrees to be rubbished by the general public, and even within philosophy, continental and analytical philosophers are often hostile to each other. What may not be appreciated is that much of philosophy actually closely resembles mathematics and, by extension, computer science. When the department at my first university was closed down, some of it merged with computing. It also turns out, a little surprisingly, that one of my tutors, Nicholas Measor, was a significant influence on the theory of computing, having helped develop the modal mu-calculus, a fixed-point logic used in verifying computer systems which is closely related to temporal logic. He wrote a paper called “Duality and the Completeness of the Modal mu-Calculus” in the ’90s. This has kind of caused things to fall into place for me.
The Dedekind-Peano axioms for the set of natural numbers are central to the theoretical basis of arithmetic. They go as follows:
0 is a natural number.
For every natural number x, x=x.
For all natural numbers x and y, if x=y then y=x.
For all natural numbers x, y, z, if x=y and y=z then x=z.
For all a and b, if b is a natural number and a=b, then a is a natural number.
Let S(n) be “the successor of n”. Then for every natural number n, S(n) is a natural number.
For all natural numbers m and n, if S(m)=S(n) then m=n.
For every natural number n, S(n)=0 is false.
If K is a set such that 0 is in K, and for every natural number n, n being in K implies that S(n) is in K, then K contains every natural number.
You can then go on to define addition, subtraction, multiplication and inequalities. Division is harder to define because this is about natural numbers, and dividing one natural number by another may lead to fractions, decimals and so forth. I’ve known about all this since I was an undergraduate but hadn’t given it much thought. It is, incidentally, possible to take this further and define negative, real and presumably imaginary, complex and hypercomplex numbers this way, but knowing in principle that it’s possible is enough really.
Dyscalculic Programming
If you have a language which can express all of these axioms, you have a system which can do most arithmetic. And this is where I had my epiphany, just last week: you could have a programming language which didn’t initially use numbers at all, because numbers could be defined in terms of these axioms instead. It would be difficult to apply this to FORTH, because FORTH uses sixteen-bit signed binary integers as its only data type, but I don’t think it’s impossible, and it would mean there could be a whole earlier and more primitive programming language which doesn’t initially even use numbers. This is still difficult and peculiar because all binary digital computers, so far as I know, use sequences of zeros and ones, making this rather theoretical. It’s particularly hard to see how to marry it with FORTH.
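To make that concrete, here’s a minimal sketch in Haskell (a language I’ll come to below) of numbers built out of nothing but the axioms, with zero and a successor as the only primitives; the names are mine and this isn’t any standard library, just an illustration of the idea.

-- Natural numbers built from the Peano axioms alone: a zero and a successor.
-- No numeric literals are needed to define the numbers themselves.
data Nat = Zero | Succ Nat
  deriving (Show, Eq)

-- Addition by recursion on the first argument: 0+n = n, S(m)+n = S(m+n).
add :: Nat -> Nat -> Nat
add Zero     n = n
add (Succ m) n = Succ (add m n)

-- Multiplication follows the same pattern: 0*n = 0, S(m)*n = n + (m*n).
mul :: Nat -> Nat -> Nat
mul Zero     _ = Zero
mul (Succ m) n = add n (mul m n)

-- Only for display do we fall back on the built-in integers.
toInt :: Nat -> Int
toInt Zero     = 0
toInt (Succ n) = 1 + toInt n

-- toInt (mul (Succ (Succ Zero)) (Succ (Succ (Succ Zero))))  ==>  6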
Proof Assistants
Well, it turns out that such programming languages do exist, and that they occupy a kind of nebulous territory between what are apparently called “proof assistants” and programming languages. Some can be used as both, others just as the former. A proof assistant is a language somewhat similar to a programming language, but one which helps the user and the computer arrive at proofs together. I have actually used one of these without realising that that was what I was doing, back in the ’80s, when the aforementioned Nicholas Measor wrote an application for the VAX called “Citrus”, after the philosopher E. J. Lemmon (who incidentally died 366 days before I was born), whose purpose was to help the user prove sequents in symbolic logic. My approach was to prove them myself, then just go along to the VAX in the computer basement and type in what I’d proven, although it was admittedly helpful on more than one occasion. While using it, I mused that it was somewhat like a programming language except that it wasn’t imperative but declarative, and wondered how one might go about writing something like that. I also considered the concept of expressive adequacy, also known as functional completeness, in this setting, once again in connection with FORTH, realising that if the Sheffer stroke were included as a word in FORTH, a short chain of definitions could easily provide any bitwise function. It was also borne in upon me that all the logics I’d come across so far were entirely extensional, and that it might even be a distinctive feature of logic and mathematics per se that they are completely extensional in form. However, I understand that there are such things as intensional logics, and I suppose modality might be seen in that way, although I always conceive of it in terms of possible-world semantics and multivalent truth values.
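To illustrate that functional completeness point, here’s a hedged sketch in Haskell, using Booleans rather than FORTH’s bitwise words, of how NOT, AND and OR can all be defined from the Sheffer stroke (NAND) alone; the primed names are mine.

-- The Sheffer stroke (NAND) is functionally complete: every other
-- Boolean connective can be defined in terms of it.
nand :: Bool -> Bool -> Bool
nand True True = False
nand _    _    = True

not' :: Bool -> Bool
not' p = nand p p                   -- p NAND p is the negation of p

and' :: Bool -> Bool -> Bool
and' p q = not' (nand p q)          -- negating the NAND recovers AND

or' :: Bool -> Bool -> Bool
or' p q = nand (not' p) (not' q)    -- by De Morgan: not (not p and not q)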
It goes further than that though. I remember noticing that ALGOL 60 lacks input/output facilities in its standard, which made me wonder how the heck it was supposed to be used. However, it turns out that if you are sufficiently strict with yourself you can absolutely minimise I/O and do everything inside the compiler except for some teensy little bit of interaction with the user, and this instinct, if you follow it, is akin to functional programming, a much later idea which enables you to separate the gubbins from how it looks to the user. And there are purely functional languages out there, and at this point I should probably try to express what I mean.
From Metaphysics To Haskell
Functional programming does something rather familiar. Once you entertain the possibility that programming can dispense with numbers as basic features, the emphasis shifts to operators and functions, which become “first-class citizens”. This, weirdly but then again not so weirdly, is exactly what happens in category theory. Haskell is the absolutely paradigmatic functional language, and it’s been said that when you think you’re programming in it, it feels like you’re actually just doing maths. This approach lacks what you’d think would be a crucial feature of operating a computer, just as ALGOL 60 can’t actually print or read input, and such things are known as “side-effects” in functional programming. If a function does anything other than take its arguments, perform an operation on them and return the result, that’s a side-effect. Avoiding side-effects makes it easier to formally verify a program, which links back to the mu-calculus.
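A small sketch, purely my own, of what “functions as first-class citizens” and “no side-effects” look like in Haskell:

-- A pure function: its result depends only on its argument,
-- and evaluating it changes nothing else in the world.
double :: Int -> Int
double x = x * 2

-- A higher-order function: it takes another function as an argument,
-- treating it as just another value, a "first-class citizen".
twice :: (a -> a) -> a -> a
twice f x = f (f x)

-- twice double 3 evaluates to 12, with no side-effects anywhere.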
I’ve now mentioned Haskell, which brings me a bit closer to the title of this post, and now I’m going to have to talk about monads. Monads are actually something like three different things, and it’s now occurred to me that if you put an I at the start rather than an M you get “ionad”, which gives me pause, but this is all quite arcane enough. Firstly, Leibniz’s metaphysics prominently features the concept of monads. In 1714, he brought out a ninety-paragraph book called ‘The Monadology’ setting out his beliefs. It wasn’t originally his idea but he developed it more than others did. Leibniz’s monads are indivisible units within reality which have no smaller parts and are entirely self-contained, though not physical like atoms. Anything that changes within a monad has to arise from within itself – “it has to really want to change”. Since monads don’t interact, there’s an arrangement called “pre-established harmony” whereby things in each monad are destined to coincide appropriately. I mean, I think this is all very silly, and it arises from Descartes and the problem of the interaction of soul and body, but it’s still a useful concept and got adopted into maths, specifically into category theory. There, it’s notoriously and slightly humorously defined thus: “in concise terms, a monad is a monoid in the category of endofunctors of some fixed category”, and this at least brings us to the functor. A functor is a mapping between categories. Hence two different fields of maths might turn out to have identical relationships between their elements. It’s a little like intersectionality in sociopolitical terms, in that for example racism and sexism are different in detail but are both forms of marginalisation, the difference being that intersectionality is, well, intersectional, meaning that different kinds of oppression do interact, so it isn’t quite the same as either a monad or a functor. Finally, in Haskell a monad is – er. Okay, well, at this point I don’t really know what a monad is in Haskell, but the general idea behind Haskell was originally that it was safe and also useless, because you could never get anything into or out of a program written in it. This isn’t entirely true, because it does do work in a thermodynamic sense: if you take a computer which is switched on but doing nothing and run a Haskell program on it which does something, it does get at least slightly warmer. That is, it does stuff to the data already inside it which you can never find out about, but it’s entirely self-contained and does its own thing. So that’s all very nice for it, but rather frustrating, and just now I don’t know how to proceed with this exactly, except that I can recognise that the kind of discipline one places oneself under by not knowing how one is going to get anything onto the screen, out of the speakers or off the keyboard, trackball, mouse or joystick has the potential to make one’s programming extremely pure, if that’s the word: operating in an extremely abstract manner.
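For what it’s worth, here is a hedged sketch of the example of a Haskell monad most often wheeled out for beginners, the Maybe monad, which chains together computations that can fail; it doesn’t settle the deeper question, but it shows the general shape (the function names are mine):

-- Maybe represents a computation that can fail (Nothing) or succeed (Just x).
-- Its Monad instance lets failure propagate automatically through a chain.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

-- The do-notation is monadic sequencing: if any step gives Nothing,
-- the whole expression is Nothing; otherwise the values flow through.
calculation :: Int -> Int -> Int -> Maybe Int
calculation a b c = do
  x <- safeDiv a b
  y <- safeDiv x c
  pure (y + 1)

-- calculation 100 5 2  ==>  Just 11
-- calculation 100 0 2  ==>  Nothing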
Home Computing In The 1960s
I do, however, know how to proceed with what I was thinking about earlier. There is some tiny vocabulary of FORTH, perhaps involving a language which defines numbers in the terms outlined above, which would be simple enough to run on a very simple computer indeed, and this is where things get theoretical, because according to Alan Turing any computer, no matter how simple, can do anything any other computer can do, given enough time and resources. This is the principle of the Turing machine. Moreover, the Turing machine is equivalent in power to a language referred to as the lambda calculus.
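The lambda calculus makes the number-free point in an even starker way: a number can simply be a function. A hedged Haskell sketch of the so-called Church numerals (the names are mine):

-- In the lambda calculus a number n can be represented as the function
-- which applies another function n times: these are the Church numerals.
type Church a = (a -> a) -> a -> a

czero :: Church a
czero _ x = x                  -- apply f zero times

csucc :: Church a -> Church a
csucc n f x = f (n f x)        -- one more application of f

cadd :: Church a -> Church a -> Church a
cadd m n f x = m f (n f x)     -- apply f m times on top of n times

-- Only to display a result do we fall back on ordinary integers.
churchToInt :: Church Int -> Int
churchToInt n = n (+ 1) 0

-- churchToInt (cadd (csucc czero) (csucc (csucc czero)))  ==>  3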
Underneath the user interface of the Jupiter Ace operates the Z80A microprocessor. This has 694 instructions, which is of course quite a bit more than the 140 words of the Jupiter Ace’s primitive vocabulary. Other processors have fewer instructions, but all are “Turing-complete”, meaning that given enough time and memory they can solve any computing problem. In theory a ZX81 could run ChatGPT, just very, very, very slowly and with a very big RAM pack. So the question arises of how far down you can strip a processor before it stops working, and this is the nub of where I’m going, because actually you can do it with a single instruction, and there are even choices as to which instruction’s best.
The easiest one to conceive of is “subtract and branch if negative”. This is a machine whose every instruction consists of two operands in memory. One of them is a number, which it subtracts from the number it already has in mind (the accumulator). If the result turns out to be negative, it looks at the other operand and jumps to the address that operand indicates. Otherwise it just moves on to the next instruction in memory and repeats the operation. It would also save space on a chip if the working values were stored in memory rather than on the chip itself, so I propose that the program counter and accumulator, i.e. where the data are kept, live in main memory as well.
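Here’s a minimal sketch, in Haskell, of one way of reading that description: an accumulator plus two-cell instructions of the form (value to subtract, address to jump to if the result goes negative). It’s purely illustrative and not any real chip’s instruction set, and for simplicity the program counter and accumulator live outside the memory map here, although I’ve just proposed keeping them in memory too.

import Data.IntMap (IntMap)
import qualified Data.IntMap as M

-- Memory is a flat array of integers; each instruction occupies two cells:
-- the value to subtract, then the address to jump to if the result is negative.
type Mem = IntMap Int

-- One step of the machine: the state is (program counter, accumulator, memory).
step :: (Int, Int, Mem) -> Maybe (Int, Int, Mem)
step (pc, acc, mem) = do
  value  <- M.lookup pc       mem   -- operand 1: subtract this from the accumulator
  target <- M.lookup (pc + 1) mem   -- operand 2: branch target if the result is negative
  let acc' = acc - value
  pure (if acc' < 0 then (target, acc', mem) else (pc + 2, acc', mem))

-- Run for at most n steps, stopping early if the program counter
-- wanders off the end of the defined memory.
run :: Int -> (Int, Int, Mem) -> (Int, Int, Mem)
run 0 s = s
run n s = maybe s (run (n - 1)) (step s)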
Moreover, I’ve been talking about a single instruction but in fact that actual instruction can be implied. It doesn’t need to exist explicitly in the object code of the memory. Instead it can be assumed and the processor will do what that instruction demands anyway, so in a way this is a zero instruction set CPU.
What this very simple computer does is run a program that emulates a somewhat more complex machine, which runs the stripped-down FORTH natively. This is where it gets interesting, because the very earliest microprocessors, the 4004 and 4040, needed more transistors than this machine would, and it would’ve been entirely feasible to put it on a single silicon chip in 1970. In other words, this is a microcomputer like those found in the early ’80s, for which the technology and concepts existed before the Beatles split up.
This is of course a bit of a mind game, though not an entirely useless one. What I’ve basically discovered is that I already have a lot of what I need to know to do this task, but it’s on a level which is hard to apply to the problem in hand. But it is there. This is the way in.
There you go: don’t say I don’t listen to my readers! I don’t want this to seem self-indulgent, so before I start I want to point out that it’s a response to a comment: someone said they would like me to do something like this, so that’s what I’m doing.
Without tinkering with HTML, it seems difficult to provide links within a document in WordPress, so for now I’ll just give you a table of contents in order to prevent you being overwhelmed with the length of this post:
1. The Eternal Present
2. The Never-Ending. . .December?
3. George Orwell Is Better Than War-Warwell
4. My Secret Diary, Aged 16¾
5. A Collision With The Great White Whale
6. Armageddon
7. The Stereophonic Present
8. Harvest For The World
9. The Ending Story
10. Life Off The Fast Lane
11. Green Shoots
1. The Eternal Present
To me, the year 1984 CE is a kind of eternal present. I sometimes joke about this, saying that all the years after that one were clearly made up, and someone pointed out to me that that was highly Orwellian, but in fact it really is the case that all years are made up and we just have this arbitrary numbering scheme based on someone’s incorrect guess about the birthdate of Jesus, and yes, here I’m assuming there was an historical Jesus, which considering I’m Christian is hardly surprising.
2. The Never-Ending. . .December?
There is a fairly easy if absurd way to make it 1984 still, which is just to have a never-ending December. It’s currently Hallowe’en 2025, in which case it’s the 14945th December 1984. This wouldn’t be a completely useless dating system and I sometimes think we can conceive of time (in the waking sense: see last entry) differently according to how we choose to parcel it up. Another way of making it 1984 would be to date years from forty years later, and no that’s not a mistake as there was no year zero in the Julian or Gregorian calendars. There was one in a certain Cambodian calendar of course, from 17th April 1975, where it was inspired by the French revolutionary Year One, the idea being that history started on that date because everything that happened before that was irrelevant, being part of capitalism and imperialism I presume. My insistence that it’s always 1984 is the opposite of that, as I’m affectedly sceptical about anything happening afterwards. Coincidentally, I use a day-based dating system starting on 17th July 1975 in my diary, and I don’t actually know why I do this, but it’s only ninety-one days after the start of Year Zero (there are other things to be said about Pol Pot which would reveal the over-simplification of this apparent myth). It’s based on the first dated entry in any notebook and my mother’s suggestion that I keep a diary which I didn’t follow. It’s actually the second dated entry, as the first one is of a series of measurements of a staircase, which isn’t really about anything personal. I’ve also toyed with the idea of Earth’s orbit being a couple of metres wider, which would make the year very slightly longer but which would add up over 4.6 aeons (Earth’s age) to quite a difference, but if that were so, asteroid impacts and mass extinctions wouldn’t’ve happened which did and other ones which didn’t might’ve, so it totally changes the history of the world if you do that. If the year was a week longer, it would now be 1988 dated from the same point, but a lot of other things would also be different such as the calendar. It’s quite remarkable how finely-tuned some things are.
3. George Orwell Is Better Than War-Warwell
Although I could go on in this vein, I sense it might irritate some people, so the rest of this is going to be about my feeling of the eternal present, how 1984 actually was for me, and thoughts about George Orwell. I’m just telling you this in case you feel like giving up at this point.
I have habitually said that “George Orwell is better than War-Warwell” as a reference to Harold Macmillan’s paraphrase of Winston Churchill, and I wonder if Churchill is one of those figures who is always having quotes misattributed to him, like Albert Einstein. The trouble is, of course, that this is a practically meaningless phrase which I can’t do anything with, although Sarada has published a story with that title. I’ve read a lot of Orwell, although unlike most people who have, that doesn’t include ‘Animal Farm’. It’s been suggested that if he’d lived longer, he would’ve drifted to the Right and become a rather embarrassing figure like David Bellamy or Lord Kelvin, but of course we don’t know, and I don’t know what that suggestion is based on. He was known to be quite keen on the idea of patriotism, though, so maybe it’s that.
Within the universe of his novel ‘Nineteen Eighty-Four’, we don’t actually know that it is that year. It does seem to be about that time, because Winston Smith was a small boy just after the end of World War II. The Party is constantly revising history and is by then claiming that Big Brother invented the aeroplane, so it seems easily possible that it isn’t exactly 1984 and that new years have either been written into history or removed from it, and just maybe it’s always 1984 and has been for many years by that point. Maybe they just want to save on printing new calendars, or are trying to perfect the year by repeating it over and over again, for example. Maybe ‘Nineteen Eighty-Four’ is like ‘Groundhog Day’, and what we read is merely one iteration among many of that story. I’ve heard, although appropriately enough maybe this can’t be trusted, that Orwell simply came up with the title by transposing the last two digits of the year in which he wrote it. Whereas it’s possible to play with this, the truth is probably simply that he needed to give Winston enough time to grow up and reach his forties so he could tell the story.
It interests me that there was a somewhat jocular, artsy attempt to claim that a period called the 19A0s existed between the late ’70s and early ’80s and has since been edited out of history, which is similar to the Phantom Time Hypothesis. I’ve written about both of these before, so if you want you can read about them there.
A slightly puzzling aspect of ‘Nineteen Eighty-Four’ is why its title is spelt out rather than written in figures, but it seems that this was common practice at the time. It’s one thing that everyone gets wrong about the book, as it’s almost always referred to as ‘1984’. I should point out that one reason I didn’t get any further than A-level with English Literature is that I experience an impenetrable thicket of associations whenever I consider mainstream creative works, which makes it difficult to respond meaningfully to them. In the case of Orwell’s novel, though, since it’s arguably science fiction, indulging those associations might be more appropriate than usual, because that’s also how I respond to that genre and it feels more in keeping with that kind of imagination. I’m not alone in this, it seems: Orwell’s novel is analysed in such a manner by the YouTube channel ‘1984 Lore’. I myself used Newspeak to write a short story about a kibbutz-like community on another planet where everyone actually spoke Esperanto, to explore whether language restricts thought, portraying it in terms of the idea that it does.
4. My Secret Diary, Aged 16¾
My personal experience in the year 1984 represents a peak in my life. Note that it’s just one peak, neither the biggest nor the only one. It doesn’t overshadow the year of my wedding or the births of our children, grandchildren or anything like that. ’82 and ’83 are also significant in their own ways. ’82 I thought of as the “endless summer” characterised by the nice pictures of young people in yellow T-shirts and long blond hair on the envelopes you got back from the chemists with the photos in them, and ’83 had been particularly poignant, but the year after those had been highly focussed on for a long time in various circles by many people. 1984 opened for me hiding under a table in a suburban living room in Canterbury whispering to my friend about when midnight came. I was wearing a navy blue M&S sweatshirt whose inner flock was worn on the inside of the left elbow, a blue and white striped shirt with a button-down collar which I was only wearing because she liked it, and jeans which annoyed me by not having any bum pockets, and she was wearing jeans which did have bum pockets and a white blouse with yellow check-line lines on it, but it was completely dark so neither of us could see anything. I was sixteen and had had a lot to drink considering my age, naughtily, as had she. We eventually conjectured that midnight must have passed and I rang my dad, who came to pick me up and whom I immediately told I’d had some alcohol (Martini, Cinzano and a Snowball) which my friend saw as not only typical of my impulsiveness and indiscreetness but also liable to get me in trouble but it didn’t. The street lights looked rather blurry on the way home. Thus opened my 1984. A few days later I was back in the sixth form and my friend Mark Watts, who was later to go on to found an investigative journalism agency and uncover a number of cases of child sexual abuse, informed me that it was vital that we didn’t fall for whatever spin the media were likely to put on it being the year named after that novel and that whenever he referred to George Orwell it would be under the name Lionel Wise (Eric Blair – Lionel Blair; Eric Blair – Eric Morecambe – Ernie Wise), which was quite clever if also rather adolescent, which is what we were. We were all very conscious that it was 1984 at last. Anne Nightingale played David Bowie’s ‘1984’ and Van Halen’s ‘1984’ on her request show on the evening of New Year’s Day. I didn’t have a hangover, because I don’t get them. I asked my brother to record something off Anne Nightingale because I was about to go out again to see my friends, and it happened that the next track was Steve Winwood’s ‘While You See A Chance, Take It’, which I’d wanted to get on tape for years but he cut it off halfway through the first verse. The machine on which that was recorded was a rapidly failing mono Sanyo radio cassette recorder which my mum was annoyed was deteriorating so fast seeing as it was less than four years old and I’d got it for my thirteenth birthday. Incidentally, I’m writing all this without reference to diaries or any other kind of record. I just remember it, plainly, clearly, in great detail, and I don’t know how this compares to others’ memories. My memories of much of the ’80s are as clear as flashbulb memories because they occur within my reminiscence bump. There are errors, such as the exact name of the Steve Winwood record, but also a lot of clarity. 
Anyway, later that year, on my seventeenth birthday, 30th July, I got a stereo boom box, possibly from Sony, on which I first recorded something on 8th August, namely Tracey Ullman’s ‘Sunglasses’, followed by ‘Smalltown Boy’. In September, I got my first job, as a cashier at the new Safeway, which looked enormous to me at the time but, on returning to the Waitrose it now is, seems really tiny nowadays. I lost it after eleven weeks due to being too slow on the till, not assertive enough to turn people away from the “Nine Items Or Less” (now “fewer”, apparently) queue, and being £2 out on the cashing up on two occasions. Apparently this was a lot stricter than other places, such as Lipton’s, where my sister worked and where she was much further out than I was on many occasions when she first worked there. I could say more about her situation there but probably shouldn’t. Anyway, I got £1.41 an hour from Safeway, which I saved up to buy the first big item I’d ever got for myself: a Jupiter Ace microcomputer. Which brings me to computers.
I was very into computers in the early to mid-’80s, but also deeply ambivalent about them. At the start of the year, the family had owned a ZX81 for a year and a bit. I found this annoying because it was such a low-spec machine, but restrictions fuel creativity so it was in fact not a bad thing. I was spending a lot of my time reading computer magazines and wishing I had a better computer, which I resolved late in that year, and also writing software, mainly graphically-oriented, which was difficult considering that our computer only had a resolution of 64×48, although I was later able to increase this to 192 on the Y-axis by pointing the I register on the Z80A somewhere else than the character set, so I could make bar graphs which looked quite good. I did also write a computerised version of Ramon Llull’s ‘Machine That Explains Everything’, a couple of primitive computer viruses and an adventure game. Later on, after I got the Jupiter Ace, I got it to display runes and produce screeds of nonsense words in Finnish. As I said though, I was ambivalent. I’ve never been comfortable with my interest in IT for several reasons, and for more reasons at this point. One reason was that at the time I was communist, and also kind of Stalinist, and felt that the use of IT and automation as fuelled by the microchip boom would create massive unemployment and reduce the power of the workers to withdraw their labour. However, it isn’t clear to me now why me not having a ZX81 would’ve made any difference to that. In the middle of the year, I decided that communism was over-optimistic and there was a brief period during which people were very eager for me to adopt their views, but I quickly opted for Green politics. I was not yet anarchist and believed in a Hobbesian state of nature. Besides this perspective, I was also uncomfortable about my interest in computers because it seemed nerdy, something very negative at the time, and unbalanced – obsessive and not “humanities” enough to my taste. It felt too much like my comfort zone and not challenging enough. It did, however, become apparent that I had spent so much time studying computers, with text books as well as mags and experimentation, that I could’ve easily aced the O-level, which was another example of how my formal educational focus was outside educational institutions at the time, and it was also suggested that my aforementioned friend with whom I hid under the table and was trying to learn BASIC at the technical college, would’ve welcomed me teaching her. This got to the point where I helped her with her homework. On another occasion, an acquaintance was trying to write a FORTH programming language interpreter in Z80 assembler and I had a look through it with interest. One of my other friends later went on to write parts of the major GNU text editor “religion” Emacs, already almost a decade old by ’84, which I still use today. However, I found my interest in computers made me feel embarrassed and self-conscious and I felt somewhat ashamed of it. I think I found a lot of my interests at the time to be very personal and not something I felt comfortable sharing with others.
It was also the year of my perhaps most significant cultural shift. I entered the year enthusiastic about mainstream literature and poetry. I had been warned, though, by my O-level English teacher, that A-level English Lit was likely to spoil my appreciation of reading, and this did in fact happen. Early in the year my enthusiasm continued and I came to enjoy reading poetry and literature. I planned to continue my writing on the works of Samuel Beckett as part of my A-level and the fact we were studying Joyce gave me optimism in that regard. We had a fair bit of freedom to do that kind of thing. In the summer exams, my practical criticism of a particular poem was chosen as a model answer for others to emulate and I was able, for example, to uncover themes in poetry which my teacher hadn’t noticed, which was mainly due to my insistence on maintaining a wide education. I was applying to university in the later part of the year, having researched them in the earlier part, and having opted for degrees in English and Psychology or Philosophy and Psychology, I was clearly sufficiently committed to English at the time to consider it as a first degree. However, all of that was about to go to shit.
5. A Collision With The Great White Whale
It may be worth analysing what went wrong in some depth, but the simple facts of how it happened were as follows. My A-levels were in English, RE and Biology, which I want to stress is a very popular combination. At the end of the first year, around June, there was a marine biology field trip which was in itself quite formative for me because I didn’t relish getting stuck in the stinky, sticky black tarry mud encouraged by the anaerobic respiration in Pegwell Bay, an estuary on the edge of Thanet. It was cold and wet, and the water was of course salty, and I thought I’d ruined that sweatshirt I’d mentioned earlier which I was once again wearing. My dissatisfaction was palpable. Anyway, it was assumed by the English department that those who were off on the field trip would, possibly from their friends, learn their summer reading assignments, which were to read James Joyce’s ‘Dubliners’ anthology and Herman Melville’s ‘Moby Dick’. I didn’t get that information, didn’t talk about the assignments with my friends because it wasn’t a priority for us and consequently was confronted with reading an absolute doorstep of a book plus much of the Joyce one, which was less problematic because being short stories it was easy to catch up with that one. I was then confronted, on reading Melville’s novel, with a load of American men murdering whales for a living. Right then, I wasn’t even vegetarian but I did, like a lot of other people, believe in saving the whale. Over my childhood, I’d read a lot of story books about animals, like ‘Ring Of Bright Water’, ‘All Creatures Great And Small’, ‘Incredible Journey’, ‘Bambi’, ‘Watership Down’ and ‘A Skunk In The Family’. Of course there was peril in these and also horrible deaths on occasion, not to mention sad endings, but the focus was on the otter, the bovines, dogs, cats, deer, rabbit and skunk. There is no problem with depicting them being treated badly, suffering and so forth. But in ‘Moby Dick’, there is never any sympathy or focus on the experience of the whales or acknowledgement of them as victims, in a similar manner to the people who had lived in North America before White colonisers turned up. It was all about something else, and there wasn’t just an elephant in the room but a whale. I was unable to bring myself to step into Ishmael’s or anyone else’s shoes. The only bits I could tolerate were the encyclopaedic sections. I could go into more depth here. I think Melville was probably trying to make a whale-sized book, was using the whale as a metaphor for the intractable and incomprehensible nature of, well, nature and the world in general and as a tabula rasa, them being white like a piece of paper, and there’s the angle that the whale is in some way a phallic symbol. Ahab also anthropomorphises the whale, seeing them as a rival in a battle with him when in the end the whale is just the whale and doesn’t even realise the tiny figures above lobbing harpoons at them are even conscious beings. From the novel’s perspective, the whale probably isn’t even a conscious being. Hence I was confronted with what I read as a hostile, nasty and animal-hating, actually animal-indifferent story where I couldn’t work out whether any of the characters were supposed to be sympathetic and,moreover, the only chapters I could actually garner any interest in were dismissed as mere padding by my teachers. I also found, for some reason, that the same approach I’d been taking to poetry up until the summer no longer seemed to work. 
It probably didn’t help that one of my teachers was a frustrated Classics teacher who later left and taught that at the King’s School, although I was interested in the classics she managed to shoehorn into the lessons such as Oedipus Tyrannus, the Oresteia and Antigone. I would say, though, that I really didn’t get on with the Oresteia because I felt very much that it lacked universalism. None of that was in the exams of course, but I wasn’t ever very oriented towards those. I was more just interested or not.
The autumn of the year was marked mainly by anxious procrastination about submitting my UCCA form, which I handed in a month later than I was supposed to, due to indecision about what to put in my personal statement. The statement wasn’t up to much, partly because I didn’t want to admit what I was interested in, and partly because I hadn’t pursued it in any public way due to the shame I felt about admitting it. I also got annoyed with universities insisting on being put first, so rather than selecting places I actually wanted to go to – although I was very keen on my first choice, Keele, due to the balanced and eclectic nature of their educational approach – I deliberately listed Nottingham, Reading and Exeter, followed by Sheffield, in which I was in fact fairly interested. I got rejected by all of them except Keele and Sheffield, Exeter apparently by return of post. Among the polys, I applied to Hatfield, Oxford and NELP, and would in fact have got into NELP. I liked the modular nature of the course at Oxford, which appealed to me for the same reason as Keele did.
6. Armageddon
Another association which arrived in 1984 and which has been with me ever since is the idea of “proper Britain”. I may have mentioned this before, but the notorious nuclear holocaust drama ‘Threads’ was broadcast on 23rd September 1984, notable for being the first depiction of nuclear winter in the mass media, and I remember being edgelordy about it by saying to my friends that it was over-optimistic. I was ostentatiously and performatively depressive at the time. I did not in fact feel this, but my takeaway from it was probably unusual. There’s a scene at the start where Ruth and Jimmy are canoodling on Curbar Edge above Hope Valley which really struck me. It was grey, drizzly and clearly quite cold, even though I think the action begins in May. There’s also the heavily built up large city of Sheffield, where I might be going in a year or so, and it suddenly crystallised my image of what Britain was really like. Not the South with its many villages and small towns densely dotted about with relatively dry and sunny weather, which I was used to, but the larger block of large post-industrial cities with redbrick terraced houses, back-to-backs, towerblocks and brutalist municipal architecture set against a background of rain, wind and greyness. I relished that prospect, and it felt like real Britain. This is how the bulk of the British population lives, and it becomes increasingly like that the further north you get, hence my repeated attempts to move to Scotland, which in a way I feel is more British than England because of many of those features. By contrast, if you go from Kent to France it’s basically the same landscape and climate with different furniture. Maybe a strange reaction to a depiction of a nuclear war, but there you go.
I did, however, also feel very much that it would be strange and foreign to move away to an area dominated by Victorian redbrick terraced houses. I couldn’t imagine that they’d ever feel like home to me and I couldn’t envisage settling down there. I was still very much a Southerner at that time. I was also, however, fully aware of the privileged bubble I was living in and it made me feel very awkward.
Nor am I ignoring the actual content of the film. The Cold War and the threat of nuclear destruction was very high in many people’s minds at the time and it almost seemed inevitable. This made even bothering to make plans for the future seem rather pointless and almost like busy work. We all “knew” we were going to die horribly, as was everyone around us, so doing the stuff I’ve mentioned, like applying to university, seemed more like something I did as a distraction from that worry than something with an actual aim sometimes, depending on my mood. This had a number of consequences. One is that I wonder if a lot of Gen-Xers underachieve because they missed out on pushing themselves into things in their youth, expecting the world to end at any time. Another is that as the ’80s wore on, pop music and other aspects of popular culture began to reflect that anxiety. Ultimately even Squeeze (basically) ended up producing an eerie and haunting post-nuclear song in the shape of ‘Apple Tree’. Alphaville’s ‘Forever Young’ particularly captures the attitude and is widely misunderstood. The reason we’d be forever young is that we’d never get a chance to grow up and live out full lives. That single was released a mere four days after ‘Threads’ was broadcast.
7. The Stereophonic Present
Speaking of music, there were something like four bands in the Sixth Form at that point, the most prominent being The Cosmic Mushroom, clearly influenced by the Canterbury Scene even in the mid-’80s. My own attitude to music was to concentrate on cassettes because I didn’t trust myself to take care of vinyl properly. The advent of proper stereo in my life was on my birthday at the end of July, and there’s something vivid and recent-sounding about all stereo music I own for that reason. This is in fact one factor in my feeling that 1984 is current rather than in the past. The present is characterised by clear, stereophonic music, the past by lo-fi mono, and that switch occurred for me in summer that year. This is actually more vivid than the earlier shift between black and white and colour TV. Incidentally, CDs were out there for sure, but only for the rich, having been first released two years previously. Like mobile ‘phones, they were a “yuppie” thing, like jug kettles. Back to music. Effectively the charts and my perception of them that year were dominated by ‘Relax’, by Frankie Goes To Hollywood. This was released in November the previous year and entered the charts in early January. This got banned as it climbed the charts, which boosted its popularity enormously and got it to number 1. It stayed in the Top 100 until April the next year. We played it at the school discos, the other standard being ‘Hi-Ho Silver Lining’, which we all used to sing along and dance to. My personal preferences included The The, Bauhaus and The Damned at the time, although the ongoing appreciation of the likes of Kate Bush continued.
8. Harvest For The World
On 24th October, the famous Michael Buerk report on the famine in Ethiopia was broadcast. This led over the next couple of years to Live Aid and Run The World, but from that year’s perspective it had only just begun. There’s been a lot of justified criticism of the media framing of the famine, but as a naive teenager I didn’t have much awareness of that and simply saw it as a disaster which required a response from me, initially in the form of a sponsored silence for the whole school in the sports hall, then later a sponsored 24- or 36-hour fast supervised by one of my biology teachers in which I also participated. Although I can’t really mention this without pointing out that the whole thing was dodgy, it did start a ball rolling which continued in much later political activism on my part and a passionate youthful idealism to make the world a better place, which I felt confident had to come soon and meant action from me. ‘Do They Know It’s Christmas?’ was a further effort in that campaign, satirised by Chumbawamba as ‘Pictures Of Starving Children Sell Records’ and roundly criticised by the World Development Movement, but at the time I knew nothing of this. By the way, it’s remarkable how that unpopular Chumbawamba cynicism managed to get from the political fringe into the mainstream in just a few years, with the Simpsons parody ‘We’re Sending Our Love Down The Well’ appearing only eight years later, although that was apparently also aimed at a Gulf War song, which in any case belongs to the tradition I first became aware of, superficially, that year. In fact I can’t overstate the importance of this sequence of events, even with its grubby and cynical connotations, and my support of it had a simplicity and innocence which I wish in a way I still had. I want the world to be one in which something like that works straightforwardly and simply. As I’ve said before, nobody is Whiter or more middle class than I am.
A rather different aspect of this is that I and someone called Louise almost got the giggles during the sponsored silence, and we both spent most of our time, which was I think a whole hour, trying not to laugh. A while after that the same thing happened with the two of us in an English class, though on that occasion we gave in to it even though there was actually nothing provoking it at all. It then spread through the whole class. Once again, in an English class shortly after that, the teacher, discussing ‘Moby Dick’ of course, unexpectedly took out a model of a sperm whale on wheels and rolled it up and down the desk, which again led to uncontrollable laughter. This was Thatcher’s Britain, yes, and most of us hated her, but it wasn’t grim or joyless, at least for seventeen-year-olds, and I actually managed to get some pleasure out of Herman Melville’s writing!
CND was very active at the time. I, however, was not, for a couple of reasons. I was slightly uncomfortable with the idea of unilateral disarmament, which was in fact the last of the standard lefty/Green causes I committed to, and although I had a feeling they were right and wanted to go on the demos, I never actually did. This is by contrast with the Miners’ Strike. Kent, like Northern France, was a coalmining area, and the strike was very close to us because several of my friends were from coal miners’ families. I asked what I could do but nothing really came to mind. I was also aware of hunt sabbing but was unable to work out how to find out about it. Had I got involved in that, I might’ve gone vegan years earlier than I did.
9. The Ending Story
Then there was cinema. My aforementioned friend under the table rang me up one day and just said we should go and watch ‘Champions’ at the ABC. That cinema, incidentally, was managed by someone I later got to know when he and I both coincidentally moved to Leicester. I was surprised my friend just spontaneously bet on the horses, something I’d never have dreamt of doing at the time because it was gambling. The film, in case you didn’t know, as it may be quite obscure, was based on the true story of a famous jockey who had cancer and survived. One impression I got from it was that he looked like Lionel Blair, which is the second time I’ve mentioned him today. At this time it was still possible to sit in the cinema for as long as you wanted while the same films, yes, films plural, played over and over again. This was actually the last year it was possible. The year after, I’d just finished watching ‘Letter To Brezhnev’ when the ushers chucked us all out. It was a real shock, and you don’t know what you’ve got till it’s gone. It meant that parents could use cinemas as babysitting services, though this may have been somewhat reckless by today’s standards. They did the same with swimming pools: Kingsmead had this going on, although specifically in ’84 I didn’t exercise much apart from walking eight miles, to school and back, every day. This lazy year ended abruptly with my New Year’s resolution to go running every morning from 1st January 1985.
‘Ghostbusters’ was also quite memorable. I took my younger brother to see it and I wasn’t expecting the whole audience to shout the song when it came on. It’s a good film, with a memorable scene involving a fridge and an unforgettable line towards the end which is usually cut. It also mentions selenium for no apparent reason, and has Zener cards at the start. At the time, rather surprisingly, it seemed to be generally accepted even in academia that some people were psychic. I often wonder whether it’s really good-quality research which has led to received opinion on this changing or whether it’s just a reputational thing that psi is now widely rejected by academic researchers. The other major film I remember watching was ‘Star Trek III’, which is also very good. At the time there was no plan to bring Star Trek back, and one of my friends considered it a sequel too far, so it looked like the show was completely defunct and they were trying to revive it beyond all reason. I also saw ‘2010’, which I liked for incorporating the new findings about Europa, but it definitely lacks the appeal of the original. Incidentally, the long gap between Voyager visits to Saturn and Uranus was underway and the remaining probe wouldn’t get there for another two years. The original ‘Dune’ also came out this year, and although I wanted to see it, I don’t think it came to Canterbury. Having seen it since, I don’t think I would’ve liked it at the time, and oddly I had the impression it was in a completely different directing style and that it was a 3-D film. It may also have been the most expensive feature film ever made up to that point. ‘1984’, of course, also came out then, but that deserves its own treatment. As other people of my age I’ve since got to know have commented, ‘The Neverending Story’ marked the first time I perceived a film as definitely too young for me, and in a way that realisation reflected the twilight before the dawn of adulthood to me.
10. Life Off The Fast Lane
Speaking of marks of adulthood, many of my peers were learning to drive and passing their tests at this point. Although I got a provisional licence that year and my parents strongly suggested I learn, I refused to do so for environmental and anti-materialistic reasons. Although I’ve had lessons since, I’ve never in fact got there, and I’ve also heard that an ADHD diagnosis can bar one from driving in any case, if it affects one’s driving ability. I’m not sure mine would, but I do think my dyspraxia is a serious issue there. 1984 is in fact the only year I’ve independently driven any motorised vehicle, namely one friend’s scooter and another’s motorbike. Like the underage drinking, it’s apparent that we didn’t take certain laws particularly seriously at the time, and I wonder whether that was just us, just our age, or whether things have changed since. I was dead set against learning to drive, and this was probably the first thing which marked me out as not destined to live a “normal” adult life. It has on two occasions prevented me from getting paid work.
Television didn’t form a major part of my life at the time. We couldn’t get Channel 4 yet, so the groundbreaking work done there was a closed book to me. ‘Alas Smith And Jones’ started in January and, incredibly, continued to run for fourteen years. I’d stopped watching ‘Doctor Who’ two years previously, when ‘Time-Flight’ was so awful that I decided it was a kids’ show and put it away. Tommy Cooper died on stage. The second and final series of ‘The Young Ones’ was broadcast. ‘Crimewatch UK’, which would eventually become compulsive but guilty viewing for Sarada and me, started. In a somewhat similar vein, ‘The Bill’ started in October; I used to enjoy watching it years later because of the handheld camera work, which made it seem very immediate and “real” somehow. ‘NYPD Blue’ is like that too, incidentally, for other reasons. ‘Casualty’ was still two years in the future and ‘Angels’ had just ended, so I was in a wilderness of no medical dramas.
11. Green Shoots
Also, of course, the Brighton hotel bombing took place, and many of my friends felt very conflicted: on the one hand there was the general sympathy and empathy for people being attacked, injured and killed, but on the other the Tories were very much hated for what they were doing. I’m sure this was a widespread feeling, and there is of course the band Tebbit Under Rubble, which very much expresses one side of that sentiment. Greenham Common was in progress and a major eviction took place in March. Although I was later to become heavily involved in the peace movement, at the time I was still very much on the sidelines, although some of the people I knew were connected, and I do remember thinking that computer and human error were major and unavoidable risks which meant that the very existence of nuclear arsenals was too dangerous to be allowed to continue.
Then there was the Bishop of Durham, and since I was doing an A-level in RE at the time, his stance was highly relevant. The Sea Of Faith Movement was in full swing, promoting a kind of secularised Christianity which was largely non-theistic or even atheist in nature, and foundations were being laid in my mind which I’d later extend but then, almost inexplicably, allow the high-control group I became involved in to demolish. Over that whole period, I was expected to read a newspaper of my choice and take cuttings from it on relevant religious and moral issues to put in a scrapbook, so my long-term readership of ‘The Guardian’ began a few months before this and persisted through the year. It was either 25p or 30p at the time, and this was before colour newspapers had come to be. I had also been an avid Radio 4 listener since 1980, but unlike later, I also listened to Radio 3 a bit, never really managing to appreciate classical music to the full.
This was also the year I finally decided I wanted to become an academic philosopher, and I still think I could’ve followed that through, though it didn’t happen. It was the end of a kind of winnowing process, probably connected to my dyspraxia, in which I became increasingly aware of practical things I simply couldn’t do; I’d been put off biology by the griminess and unpleasantness of field work, and therefore philosophy was the way forward. That said, like many other people I was also very motivated to study psychology in an attempt to understand myself, and as you probably know, a lot of psychology undergraduates begin their degrees concerned about major issues in their own personalities, so in that respect I’m not unusual. I also presented two assemblies, one on existentialism and the other on the sex life of elephants as a parable of romantic love.
I feel like this could go on and on, so I’m going to finish off this reminiscence in a similar way to how I started. My emotional world revolved around the friend I was hiding under the table with at the beginning of the year, and our significance to each other was important to both of us. About halfway through the year, when I had just visited her, she became concerned that she and I were going to be found alone in the house together by her parents, who were coming back unexpectedly, so I left by the back door and crept surreptitiously across the front garden, only to be stopped and “citizen’s arrested” by the next-door neighbour. This turned out to make the situation more embarrassing for her and me than it would’ve been if I’d just left when they came back. I don’t know whether any picture can be drawn of who she or I was at the time by putting those two incidents together.
I’m aware that I haven’t talked about Orwell’s book and its adaptations as much as I’d like, so that’s something I’ll need to come back to, and there are huge things I’ve missed out, but I hope I’ve managed to paint a portrait of my 1984 and possibly also yours. I may also have portrayed someone who peaked in high school, but I do also think tremendous things happened afterwards. 1984 is, though, the first foothill of my life, which makes it significant. It’s sometimes said that the reminiscence bump is only there because fifteen to twenty-five is the most eventful period of one’s time here, but maybe not. It’s hard to say.
There’s a popular idea in nerd circles that at some point there will be a technological singularity. This means that rates of technological and scientific progress are accelerating, so that if it were possible to plot a graph of such change it would be exponential and eventually become almost vertical. This is the singularity. In graphical form, it looks like this:
This is a bit abstract, so it can be illustrated with a familiar example. From the start of the 1960s CE, the number of transistors which could be fitted into a given area doubled about once every two years. This manifested in various ways. It meant that every couple of years, the RAM available on a computer of, say, £1000 doubled, the speed at which it worked doubled and so on. This is called Moore’s Law, and it might no longer apply. The problem with telling whether it still does is that it isn’t in the chip makers’ commercial interests to admit it has stopped, and in fact commercial interests may have driven it all along. There are other areas where acceleration of this kind can be seen, such as the sequencing of DNA. The human genome project was described as being like the Apollo missions when it started in 1990. It was declared complete in 2003, but today a genome can be sequenced in a few weeks for a couple of hundred quid or less, hence 23andMe and the others. This kind of acceleration could be expected across the board, and different areas of science and technology help each other along.
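To make the doubling concrete, here’s a rough sketch in Python of what a two-year doubling time implies; the 1971 starting point of about 2,300 transistors (the first commercial microprocessor) and the strict two-year period are simplifying assumptions of mine rather than precise history.

# Rough sketch of a two-year doubling time. The base year, base count and
# doubling period are illustrative assumptions, not exact figures.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{transistors(year):,.0f}")
# Fifty years of doubling every two years is a factor of about 33 million,
# which is why the curve looks almost vertical on a linear graph.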
The usual scenario envisaged where this happens is via artificial intelligence, and looking at the likes of Midjourney and ChatGPT, one could be forgiven for thinking it’s about to happen, but Ray Kurzweil first published his book predicting the singularity in 1999. Predictions were made of something similar before that. Murray Leinster’s 1946 story ‘A Logic Named Joe’ told of a point when computers on the internet would achieve sapience and be able to solve any problem, including giving sex advice to small children, planning perfect murders and curing drunkenness instantly (one of these things is not like the other but I’m in a hurry) due to the information available to them online. This story is yet another example, incidentally, of how the internet is one of the least surprising things ever to happen. In 1946, the most advanced computers were – well you know how the routine goes. Massive great room-sized devices less powerful than a digital watch, or whatever.
But the future is not like the past. That’s what makes it the future. That said, things have happened in the past which might be clues as to what will happen. At the moment, one common assumption is that because scientific and technological progress has accelerated steadily, it will continue to do so. There are even deep-time-based views which see the current acceleration as a continuation of an acceleration of biological evolution over æons, and this does make some sense. Over most of Earth’s history, life consisted of single-celled organisms, which appeared very soon after this planet formed; living things didn’t develop hard parts until 600 million years ago, almost nine-tenths of the way through this planet’s history, and when modern humans first appeared we spent hundreds of thousands of years not doing anything like cave paintings and so forth until the last tenth or so of our own history; then there were the twenty-five millennia or more between that and the emergence of agriculture, the thousands of years between that and the Industrial Revolution and so on. However, much of this is very centred on the way we live now being the focus of progress, and it’s a platitude to say that that may not actually be anything to be proud of.
There is another suggestion: that progress is slowing down. Neil Armstrong stepped onto the surface of another world less than six dozen years after the Wright brothers achieved the first powered flight. At the time, there was a plan to put people on Mars just over a decade later, but the last time humans left Low Earth Orbit is now more than fifty years ago. Project that span backwards another fifty-one years and you land only a couple of years after the first trans-Atlantic flights. Now imagine the rate of progress since then applied to that earlier fifty-one years instead: it would have meant no flights from Britain to Australia until at least the early 1970s, no communication satellites, no Skylab, no Shuttle. In that particular area at least, progress has ground to a halt. Admittedly, this is partly because of advances in automatic space probes, but it isn’t the only way in which progress has decelerated. For instance, by 1970 the developed world had motorways, good sewers, commercial air travel, mechanised farming, long-distance ‘phone calls and all sorts of other things, and these are still seen as the features of modern life over fifty years later. Although there have been disruptive changes since, the difference between life in 1920 and 1970 here in the Western world was surely far bigger than that between 1970 and today in 2023.
That said, there are indeed still new disruptive technologies such as social media, smartphones, 3-D printing and video calls. Many of them, however, are either tightly focussed on ICT or rely on it in some way. Another relatively disruptive piece of tech which arose recently is outpainting, which takes a photograph or other image and imagines its surroundings. Applying that to the graph above would lead to a steepening curve and a singularity.
But what if it’s like this?
We aren’t precisely aware of most technological and scientific trends, although we arguably are in digital electronics. Hence even a subtle deviation from an exponential curve wouldn’t be easy to spot. This is an outpainting of the left half of this curve using the prompt “A curve on a line graph with axes”:
This starts to deviate from the actual curve at about 1 on the X axis. I haven’t made any assumptions which would suggest this in the prompt.
My curve without cropping was a sigmoid. Sigmoid means “S-shaped”, which is a bit peculiar because an actual sigma is often not S-shaped; the word is used to refer to the letter S rather than the Greek letter. I can recall, from when I was eleven years old, plotting the temperature of ice being heated over a Bunsen burner and finding the graph had this shape: it takes more heating to change ice to cold water than to heat cold water to hot, and more again to boil hot water than to raise warm water by the same number of degrees. The shape also crops up as the logistic function, which expresses population growth. A few individuals in a closed habitat will initially increase their population exponentially, assuming the gene pool is large enough, but it will eventually level out as resources are used up. This exact example may in fact be relevant to progress, if progress depends on people having ideas and being able to act upon them, because the world’s human population has expanded and education has increased, leading to more scientists and engineers and more people able to put their ideas into production. Now that population growth is decelerating, perhaps progress will as well. The availability of resources is particularly relevant here, since we are artificially restricting ourselves to non-renewables, and the failure to follow a plant-based diet also means, strictly on the dietary front, that our resources are more limited than they need to be. There are many other examples. A game between two players is unlikely to be won in the first few moves, and a player who is losing is unlikely to turn that around in the last few. Likewise, a tumour is likely to grow exponentially until it is killing the patient, at which point it no longer has a hospitable environment to grow in, because the body keeping it alive can no longer function properly.
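To make the shape concrete, here’s a small Python sketch of the logistic curve; the carrying capacity, growth rate and starting value are arbitrary, chosen only to show that the early part of the curve is practically indistinguishable from an exponential and only later flattens out.

# Minimal sketch of the logistic ("S-shaped") curve next to a pure exponential.
# K, r and x0 are arbitrary illustrative values, not fitted to anything real.
import math

def logistic(t, K=1000.0, r=1.0, x0=1.0):
    # Solution of dx/dt = r * x * (1 - x/K) with x(0) = x0
    return K / (1 + (K / x0 - 1) * math.exp(-r * t))

def exponential(t, r=1.0, x0=1.0):
    return x0 * math.exp(r * t)

for t in range(0, 15, 2):
    print(f"t={t:2d}  logistic={logistic(t):8.1f}  exponential={exponential(t):12.1f}")
# Early on the two columns track each other closely; later the logistic
# flattens out towards K while the exponential keeps on climbing.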
Another rather salient curve of this shape is the learning curve. It takes a long time to start to learn something, then there’s a smooth increase in skill and experience which levels off again as one completes the task. On the other hand, the more one learns about something, the more one realises there is to learn, so that looks more like an exponential curve. The question is whether there is a limit to human knowledge and ability in general. Are we learning and becoming more capable in a finite space of possible knowledge and skills or are we discovering new vistas all the time? On top of that, are we going to cease to be capable of understanding more or being able to do more after a certain point because of our own limitations? Can we overcome those limitations through what we learn, for instance with cognitive enhancing drugs or AI?
Actual technological product lines do mature in a sigmoid way. Pocket calculators, for example, still exist, presumably for use in exams. They took a long time to evolve from abaci and mechanical adding machines, but in the 1970s and 1980s they became increasingly sophisticated very quickly. Nowadays they’ve levelled off. Mobile ‘phones seem to be similar. Early on, they were brick-like devices which could only be used for voice calls, and it took a long time for them to emerge from that stage. Then there was a period between the early ’90s and the early ‘noughties, also known as a decade I suppose, when they made rapid progress. Once the smartphone became popular, this changed to incremental progress on such things as resolution, camera quality and battery life. Making them a different size would make them less user-friendly, and some of the facilities, such as video calls, are not actually that popular. Resolution on any device is a case in point, because the theoretical useful limit is reached when the pixels become smaller than the angular resolution of the eye, which is determined by the cone cells in the retina. In fact the limit is a little further than that because of the Nyquist-Shannon sampling theorem, which says that a signal has to be sampled about twice as finely as the finest detail to be reproduced if undesirable artifacts are to be avoided, so a pixel actually has to be about half the size the eye can resolve. This means that any increase in resolution beyond a certain point becomes mere hype, and also wasteful, because it needs four times the storage to double a resolution which is already visually perfect.
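As a back-of-envelope illustration of that limit, here’s a Python sketch; the figures of roughly one arcminute for the eye’s resolution and 30 cm for viewing distance are my own round-number assumptions, not a display standard.

# Back-of-envelope sketch of the "useful resolution limit" argument.
# Assumptions (illustrative only): the eye resolves about one arcminute,
# the 'phone is held about 30 cm away, and the sampling theorem adds a
# factor of two.
import math

ARCMINUTE = math.radians(1 / 60)        # one arcminute in radians
viewing_distance_mm = 300               # about 30 cm

finest_detail_mm = viewing_distance_mm * ARCMINUTE   # smallest detail the eye resolves
pixel_pitch_mm = finest_detail_mm / 2                # Nyquist: sample twice as finely
ppi = 25.4 / pixel_pitch_mm

print(f"Detail the eye can resolve at 30 cm: {finest_detail_mm:.3f} mm")
print(f"Pixel pitch needed with the factor of two: {pixel_pitch_mm:.3f} mm, about {ppi:.0f} PPI")
# Beyond roughly this pixel density, further increases are invisible at that
# distance, and quadrupling the storage for double the resolution buys nothing.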
Hence there really is a sigmoid function in the improvement of certain devices. Toothbrushes used to go through a cycle where they returned to their well-known “default” form as other things were tried and rejected. Presumably the other versions are “nice tries” or possibly ways to get people to buy expensive new-style trendy toothbrushes. Razor heads are a notorious example of this as they simply seem to involve adding another blade every few years, although there are now bidirectional razors which really do seem to be an improvement. There is conflict between the needs of capitalism and technological progress in certain directions. For instance, there is never going to be a time under capitalism when cars or mobile ‘phones are going to be able to reproduce themselves in some way, and durable products which continue to be useful because they’re not wearing out are not profitable. A couple of years ago, I got interested in a company which sold more sustainable ‘phones, so I went to their website and was asked if I already had a mobile. On answering “yes”, I was sent away again because the greenest thing to do in those circumstances is to hang on to the one you’ve got! This sounds like a terrible business model because the first thing they do, and try quite hard at it, is to turn away customers. I have no idea if they’re still in business or if they have another way of surviving and making a profit.
The bigger question is whether technological and scientific progress in general follows a sigmoid curve, just as progress in specific areas of technology does. If it didn’t, that would be because radically new forms of science and technology come along to fill the gap left by mature older theories and devices, or because there is tech which can simply keep improving drastically for centuries. Arthur C. Clarke once said that when a distinguished elderly scientist states that something is impossible, they are very probably wrong. And this does happen. An example which sticks in my head is Lord Kelvin, who in his old age insisted that Earth couldn’t be more than a couple of hundred million years old because of how it would cool over that time, not realising that radioactivity continues to heat the planet from within. David Bellamy’s climate change denial might also be an example. The question arising in my mind as I write this is, have I just got old? Am I saying further progress is impossible just because that’s what old age makes me think? But I’m not that old. I’m four dozen and eight. And in spite of that possibility, or perhaps because of it, it still seems very much that just as there’s a limit on individual technologies, there’s also a limit of a similar kind on tech in general.
This would mean we are currently living through an era of rapid progress which will slow down. If that’s so, is there an easy way to estimate where we are in that process and when it will reach a plateau? If it’s true that progress has indeed slowed since the 1960s, that might be some kind of inflection point, where the curve stopped accelerating and began to decelerate, and if a measure could be found for when it really took off, that might give an estimate of how long we have until it levels out. The Industrial Revolution started around 1760 and Apollo 11 was in 1969. If history obeyed these kinds of laws, the levelling out could therefore be expected to occur around 2178. Another way of looking at this is similar to the way the Doomsday Argument works. The astrophysicist Richard Gott, from Louisville, Kentucky, visited the Berlin Wall in 1969 and predicted that the Wall would stand for at least another 2⅔ years but no more than another two dozen years. This was not based on any special understanding of international relations or politics, but on statistics. At the time, the Wall had been in existence for eight years, and on that basis he estimated that it would continue to stand for between a third of and three times its then age, based on the principle of mediocrity, i.e. that there was nothing special about his particular visit, and the principle of indifference, that in the absence of information all possibilities are equally probable. This is true if probability is a statement of rational degree of belief. Given the second principle, half of all visits to the Berlin Wall could be expected to occur over the middle half of its lifetime, that is, between a quarter and three-quarters of the way through it, and from any moment in that window the Wall’s remaining life is between a third of and three times its age so far. In fact it fell in November 1989. The same principle has also been used to conclude, probably wrongly as the linked post argues, that one’s own birth falls about half way through the total number of human births; when I measured from 200 000 BP, used an estimate made in 1976 that there had been 75 milliard human births, and assumed the population would go on doubling every twenty-eight years, I got the result that the last human birth would occur some time around 2134. These estimates, though, are egocentric: someone born thousands of years ago could have estimated that the human race should’ve ended by now, and it obviously hasn’t. Also, anyone visiting or being born outside that middle window, i.e. near the end of the Wall or of the human race, or near their beginnings, will be very wrong. It’s just that the chances are that we are inside the window.
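The arithmetic of the Wall prediction is simple enough to sketch in a few lines of Python; the only assumption is the one above, that the visit falls somewhere in the middle half of the Wall’s lifetime.

# Gott's 50% interval: if the moment of observation is nothing special, the
# remaining lifetime is between a third of and three times the age so far.
def gott_interval(age_so_far):
    return age_so_far / 3, age_so_far * 3

low, high = gott_interval(8)            # the Berlin Wall was 8 years old in 1969
print(f"Expected to stand for between {low:.1f} and {high:.0f} more years")
# i.e. between about 2.7 and 24 more years; it actually fell 20 years later.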
As already mentioned, human population growth is likely to be sigmoid due to loss of resources, because species in a particular habitat have sigmoid population growth for that reason. It would be interesting and relevant to know whether this applies to omnivores, since one option we have which wouldn’t be available to, say, dolphins or cats, would be to modify our diet and start using other resources; maybe that’s what omnivorous species do. This kind of growth also scuppers the Doomsday Argument, and in fact population growth is slowing, so for the purposes of that particular graph the line has already stopped accelerating. For technological and scientific progress, though, what are the results? The earlier limit, the take-off point, is vague, and possibly in different places for technological progress and scientific progress. The spinning jenny, invented in 1764, is often mentioned as the start of the former. Steam engines are a bit of a weird jumping-off point because they have existed for a surprisingly long time, having been invented in China around a thousand years ago and in Greece about twice that long back; it seems to have been James Watt’s improvements which made them practical as a source of power. These enabled iron to be refined more efficiently, and machine tools were the final piece of the jigsaw. This all points to around 1760. Neil Armstrong stepped off the Eagle 209 years later, when I was almost two; hence I was born 207 years after the onset of the Industrial Revolution, at a time when the global human population was doubling every twenty-eight years. There were around eight hundred million people in 1750 and 3 610 million in 1970. Very approximately, this means that about six thousand million people were born between 1750 and 1970, meaning that by Gott’s argument there ought to be somewhere between two thousand million and eighteen thousand million further births before progress flattens out. The lower estimate means that it would’ve happened already, and we know it hasn’t, so maybe the half-way version works better in this case: progress would flatten out after the births of six thousand million more people, counted from 1967, the year of my birth, and we could already be more than half way there. The current world population is around 7 888 million, and about half the population alive in 1970 have died, so that’s an increase of 2 473 million, with spurious accuracy. If that rate doubles in the next fifty years, that takes us near the six-thousand-million point, so we could expect significant technological progress to end by about 2080, probably before.
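The same interval applied to the birth figures above comes out as follows; the six thousand million input is this post’s own back-of-envelope number, not demographic data.

# Gott's 50% interval applied to the rough count of births between the start
# of the Industrial Revolution and 1969. The input is the post's own rough
# figure, used only to reproduce the numbers quoted above.
births_so_far = 6_000_000_000

low = births_so_far / 3     # about two thousand million further births
high = births_so_far * 3    # about eighteen thousand million further births
print(f"Further births before progress flattens out: {low:,.0f} to {high:,.0f}")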
All that said, there are a couple of ways in which very obvious progress could be made but hasn’t been. It’s been noted that technology is biassed towards able-bodied White cis men of a certain age range, isn’t particularly suitable for marginalised people, and in fact can even kill them. In this respect, we’ve not made much progress. The other way is linked to this: we are not living in an age of progressive politics. Quite the reverse. If a graph could be drawn for progressive politics, it would’ve peaked in about 1978 and is currently back in the 1960s or earlier. Things are going backwards in that respect and don’t show any signs of reversing. This influences technological and scientific progress. The increase in belief in Young Earth Creationism, for example, will have a knock-on effect on cancer research, to pick a fairly clear example, because cancer is effectively independent evolution. The oppression of female, queer and Black people deprives the world of their talents and skills, not only because of their special perspective but simply because they’re human beings who would otherwise be able to exercise and develop them. However, perversely this could mean that progress can continue for longer because it means the curve we currently experience is shallower and more drawn out due to the relative lack of talent. Simply emancipating women to the same degree as men would telescope the curve to half its length.
Why might we want progress to end though? There are a couple of reasons, linked to each other. One is that although humans probably evolved during a time of relatively rapid change, we throve during the flat period extending through the Palæolithic. We got used to the process where wisdom gained by elders could be usefully passed on to future generations. If someone discovered that onions were edible, something which has long mystified me, that could be passed on to grandchildren and we still have that knowledge today. By contrast, if someone in the early twentieth century learnt to write cursive with a fountain pen and that it wasn’t a good idea to share them because the nibs bend according to the individual writer, that information is now almost useless because people don’t even write much with bics nowadays, let alone pens with proper nibs. This means that older people are not so much fonts of useful knowledge and are probably less respected as a result. I can probably put on an LP at the right speed without scratching it, use a rotary dial telephone and other people can drive using manual transmission, but the former two of these are already useless and the latter will be too once cars are all electric. I might sound like an old fogey saying all this but it means that a corpus of a particular kind of skill is constantly lost rather than built up precisely because we are building on our predecessors’ achievements so quickly.
The other, which is again linked, is future shock. Heidi and Alvin Toffler famously dealt with this in 1970, although the term dates from 1963. Many aspects of their work are outdated, but the continuing existence of future shock as a general experience is indisputable. The concept is based on culture shock. I can’t use chopsticks or sign language and I walk most places; these three things would make it hard or impossible for me to adjust to life in East Asia, the deaf community or most of urban America. The same kind of difficulty emerges for us all due to rapid technological change, which according to the Tofflers goes hand in hand with social change. It involves confusion, anxiety, isolation and depression. Disposability, built-in obsolescence, the end of tradition and a new kind of nomadic existence, provoked by the need to change careers often as old industries end, new ones start and skills become outdated, are all features of contemporary life. There have also been changing social norms, some of which seem quite positive, such as the greater acceptance of homosexual relationships. However, it may be that this kind of change is temporary. We don’t know what will come out the other side, of course, but a new set of traditions could be built up, and in fact that’s nothing new, because much of what we think of as tradition was actually invented in the nineteenth Christian century.
Both of these aspects might end at some point, always assuming we last long enough as a species, and we would return to a stabler way of life, one much more high-tech and scientifically advanced than before but no longer changing as rapidly as today.
Finally, I want to point out how useful this might be to an SF writer. This post was inspired by an observation someone made about Asimov’s stories, in that the kind of robots who exist thousands of years in the future are not in fact very different to the ones which exist in his fictional twenty-first century. Another aspect of this in his writing is how oddly similar the culture and technology of his late Galactic Empire, some thirty thousand years after Hiroshima, are to the time he was writing. Books are on microfilm, people still smoke tobacco, there are apparently no robots and there are voice-operated machines to be sure, but they’re typewriters. Computers don’t seem to have a significant role at all. This looks very dated by today’s standards, but maybe in a way it’s a more accurate view of the future than one in which enormous change is ongoing. It makes it easier to write and imagine, and whereas it does become increasingly dated, it avoids zeerust, Douglas Adams’s concept of datedness which afflicts the now retrofuturistic.
If we survive, we don’t know what the world will be like centuries from now, but it’s also possible that the world in two hundred years won’t be that different technologically from the world in five hundred. Maybe it’s progress itself which will become dated, though hopefully not before environmental and social progress have made their marks.
Hoarding tends to be frowned upon. Of course, to the hoarder, it seems entirely sensible and “normal” to engage in the practice others describe in this way. Aristotle had something to contribute to this. He was the apparent inventor of the concept of the “happy medium” (which I think turns up in ‘A Wrinkle In Time’, but I may be misremembering). That is, each virtue is the ideal position between a pair of vices. Courage, for example, lies between cowardice and recklessness. However, the happy medium is never exactly halfway between its corresponding vices: courage is more like recklessness than cowardice, for example. Likewise, tidiness is going to be closer to one extreme than the other. Most people seem to place it nearer to obsessive over-neatness, where you can’t do anything for fear of causing a mess, than to slovenliness. To my mind, the happy medium is closer to messiness. Somebody writes the psychiatry textbooks and manuals, and those people are likely to normalise their own methodical tendencies, which could manifest as excessive neatness, and therefore to regard untidiness as problematic.
Now don’t get me wrong. It is problematic, and it’s also much easier to become untidy than tidy. Nonetheless, a couple of observations will be made at this point by that nebulous generic subject which makes them appear objective by using an impersonal construction. One of them is that I collected old copies of the ‘Radio Times’, not to be confused with the ancient Greek philosopher Θεραδιοτιμης, for six years until my dad got annoyed with the clutter and had me throw them out. I doubt it was exactly six years, but at four dozen editions annually over half a dozen years that’s a couple of gross, and since each one goes for £7.50 on eBay, that’s over two thousand quid’s worth of magazines. I also still have a fair number of ‘2000 AD’ comics from 1977, which are worth a fair bit. I do not believe it was the right decision to throw those things out.
This brings me to the subject of this blog post: the Jupiter Ace, which I’m always tempted to call the “Joobrrace” because it’s one of those terms you can use to practise rolling your Rs. I should point out first that the name “Jupiter Ace” has actually been used for two completely separate things. There’s the computer illustrated at the top of this post, and there’s a band which had a minor hit in 2005 called ‘A Thousand Years’. Although this is slightly confusing, I’ve long thought that the sleeve design for that single would work as the cover illustration for a computer manual:
Given the appearance of the ZX81 manual, can you not just see how this would work really well?
Leaving the band aside though, once upon a time there were a lot of home computers, all unique. Each one had a personality of its own and was usually incompatible with all the others, although they did tend to have standard interfaces. I first paid close attention to microcomputers in 1981, and up until that point I’d made various assumptions about them which turned out to be untrue and, to me, rather startling. I had assumed that they would all use the programming language Pascal or something similar, and was very surprised to find that they nearly all used BASIC. As far as I was concerned, BASIC was just a language for people starting out in programming and wouldn’t be used on “proper” computers, which was in fact so on mainframes and minicomputers around this time. The languages I was familiar with, such as Algol-60, COBOL and FORTRAN, were a lot more popular on those, so I just assumed they would be used on microcomputers, in ROM, so that the machines would boot into something like a development environment which would let you enter lines of FORTRAN, say, then compile and run the program. As I said, I assumed that Pascal would be the favourite because to me that language had a kind of contemporary vibe at the time. It was being pushed fairly hard, but initially, like BASIC, it was intended as a language for teaching programming rather than for serious use. In particular, the idea behind Pascal was that it should be structured: that the code could be read and planned easily and methodically, with blocks and control structures imposed on the user. By 1981 it had started to fall from grace, because this very approach to structure restricted its flexibility. I’m not going to get all technical on you here because that’s not what I’m getting at, but in general I tended to be confounded by programming languages as they were presented, because they didn’t seem to have any facilities for using things like sound and graphics, or even for interacting with a CRT-style display, having been designed for a world of punchcards and teletypes. It was all rather puzzling.
There were a few exceptions. For instance, going way back to 1975, IBM had introduced a desktop computer (not a micro as its processor occupied an entire board) which ran APL, “A Programming Language” based on symbols rather than words of which I happen to be a fan due to its lack of language bias and terseness. An APL-native micro also existed in the early 1980s, and APL was used to do the exploding and rotating Channel 4 station ident in 1982. The more expensive business machines also had no programming language firmware and the user would have to purchase a programming language as an additional piece of software, so the situation wasn’t just that BASIC was universal. There were also some home micros, such as the Microtan 65, which could only be programmed in machine code, and others which would boot into a “monitor”, which is a simple program with single letter commands for manipulating and viewing memory contents, and executing machine code programs either loaded or typed in by the user, as a series of hexadecimal numbers.
The standard practice of using BASIC in firmware on home micros usually went further than just the unextended form of the language. It was usually Microsoft BASIC, often in an extended form which constituted a de facto standard. There were other versions of BASIC, used particularly in British as opposed to American home computers, such as Sinclair BASIC used in the ZX80, ZX81 and Spectrum, and BBC BASIC, which began on the BBC Micro and Electron but was later adapted for use on IBM PC clones and other machines such as the Tatung Einstein. It was also possible to buy alternative programming languages such as FORTH. And of course the mention of FORTH brings me to the main object of today’s discussion: the Jupiter Ace.
Clive Sinclair was apparently not a particularly easy person to work with. Shortly after the ZX Spectrum had been designed, a small number of employees, possibly just two, left the company to found Jupiter Cantab, apparently retaining their intellectual property in certain aspects of that phenomenally successful computer, and proceeded to design, manufacture and market a radically new machine, the Jupiter Ace, in autumn 1982. The hardware of the computer in question was not particularly special: it comes across as a cross between a ZX81 and a Spectrum, though without colour or true high-resolution graphics. However, the really unusual thing about the Ace was that instead of having BASIC in ROM, it had FORTH. This is a highly idiosyncratic language with two distinctive features. Firstly, it uses Reverse Polish Notation: instead of “2+2” it uses “2 2 +”. Most computers have a structure in memory called the stack, a series of consecutively stored numbers originally used to hold the addresses to which a program will return. In FORTH’s case, a number typed in is placed on the stack, and a “word”, such as “+”, expects a certain number of values on that stack and operates on them accordingly, often depositing its own result on the stack for future use. Secondly, rather than writing programs as such, the user defines new words in terms of existing ones, so, for example, squaring a number could be defined thus:
: SQUARED
DUP *
;
“DUP” duplicates the number on top of the stack, “:” opens a definition of a new word, in this case “SQUARED”, and “;” closes it. Thenceforth, typing something like “9 SQUARED” would put 81 on top of the stack and so on.
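For anyone who has never met FORTH, here’s a toy sketch, in Python rather than Ace FORTH, of how that interplay of the stack, words and colon definitions works; it only handles integers and simple one-level definitions, but it does run.

# Toy sketch of a FORTH-like interpreter: numbers go on the stack, words
# consume and produce stack values, and ":" ... ";" defines new words from
# existing ones. This is only an illustration of the idea, not Ace FORTH.
stack = []
words = {
    "+":   lambda: stack.append(stack.pop() + stack.pop()),
    "*":   lambda: stack.append(stack.pop() * stack.pop()),
    "DUP": lambda: stack.append(stack[-1]),
    ".":   lambda: print(stack.pop()),
}

def run(tok):
    if tok in words:
        words[tok]()
    else:
        stack.append(int(tok))              # anything unrecognised is a number

def interpret(line):
    tokens = iter(line.split())
    for tok in tokens:
        if tok == ":":                      # start of a colon definition
            name = next(tokens)
            body = []
            for t in tokens:                # collect the body up to ";"
                if t == ";":
                    break
                body.append(t)
            words[name] = lambda body=body: [run(t) for t in body]
        else:
            run(tok)

interpret(": SQUARED DUP * ;")
interpret("9 SQUARED .")                     # prints 81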
Advantages of FORTH include structure and speed. The standards at the time didn’t include floating point numbers, but the Ace had a few proprietary extensions which allowed them. They could’ve been defined by the user, but since the stack has to contain ordinary floating point values, it makes more sense to extend the user interface to recognise any series of digits with a decimal point as a floating point number. Unlike the BASIC available on most home micros at the time, Ace FORTH didn’t support text strings in an easily-used way, but it did have arrays and a text buffer and again, it could be modified to allow them.
The Jupiter Ace did very badly. Although it was an interesting device, it was let down by the absence of colour and by poor sound. The keyboard was similar to the Spectrum’s, which was fairly normal for the time, but because the Ace couldn’t use the Sinclair system of producing an entire keyword with a single keystroke, the keyboard got much heavier use, which made its cumbersome nature far more obvious. The Ace comes across very much as the kind of computer which might’ve been produced in the late ’70s, such as the TRS-80 Model 1 from 1978, though in a much better case and with better interfaces. Consequently, Jupiter Cantab went bust and sold off their remaining stock to Boldfield Computing Limited, which in turn reduced the price from £89.95 to £30. This happened in 1984.
Another thing which happened in 1984 was that Safeway opened a branch in Canterbury for the first time, leading to my first paid job, as a cashier, at the age of seventeen. I was paid £1.41 an hour, which was a huge amount for me at the time: this was before the minimum wage, and prior to that I’d only had a pound a week. I lost the job after only twelve weeks due to my unassertiveness. For instance, I was on the “9 Items Or Less” (sic) till but couldn’t bring myself to turn customers away when they brought whole trolleys of stuff, and I didn’t want to ask for extra change so I ended up paying people in pennies. However, in that time I succeeded in amassing enough cash to buy a Jupiter Ace, so around October I received one, and at the same time I bought a 16K RAM pack to upgrade the memory to 19K. I can’t remember how much that cost, but the initial outlay would’ve been about twenty-one hours’ work.
Unlike most people who bought an Ace, although I found the FORTH interesting, I actually got it as an upgrade. My previous computer, a 16K ZX81 which my father bought for the whole family, was the absolute cheapest computer available at the time. It was ingeniously designed to be as cheap as possible, and that design rendered it rather atypical as a computer. For instance, to this day computers use the ASCII character set, although nowadays this is a subset of the much larger Unicode, which includes most of what you might ever want to type, although I find it inadequate due to things like its lack of Bliss symbols, which I use extensively in writing. The ZX81, though, only used sixty-four symbols, including twenty-two graphics characters used to draw Teletext-style pictures, and it lacked lowercase letters and a lot of the usual graphemes such as “@” and “\”. It also defaulted to black text on a white background, had an unchangeable white border, and in its 1K version barely had enough memory to display a full screen of text, so it saved memory by not storing the unused ends of lines shorter than thirty-two characters. The screen also didn’t scroll unless the program included an instruction to do so, at which point it would scroll a single line, and the cursor for input stayed at the bottom of the screen. There was also no sound. However, because Sinclair had a fair bit of financial oomph behind it, they were able to design a large custom chip which did everything the computer needed apart from processing programs and storing information, and to this day I find this very impressive, because the total chip count is only five:
This is the kind of achievement which impresses precisely because of the limitations the available technology imposed on the designers. It’s similar to the helical scan mechanism in a VCR, in that only a flash of inspiration like that makes the thing possible at all.
By contrast, the Ace had a full ASCII character set with redesignable characters, single-channel pitch-duration sound, a proper scrolling screen and a white on black display like a “proper” computer. It also had slightly more memory. However, Jupiter Cantab were a tiny and impoverished company, so small in fact that their turnover, not adjusted for inflation, actually overlapped with my own turnover as a herbalist in the ‘noughties, though over that period sterling had halved in value. It’s remarkable to contemplate that the size of the company was less than one order of magnitude greater than our partnership. One practical consequence of this was that they were unable to have the kind of custom chip designed and produced for them which gave Sinclair the advantage with the ZX81 a year earlier and had to resort to discrete logic. I’ll come back to that in a minute, but I want to make the observation that this is a good example of how poverty is expensive. Instead of employing one chip, Jupiter Cantab had to use many:
Those smaller components on the right-hand side of the board are mainly doing similar jobs to the large chip on the left of the ZX81’s, but there are many more of them. They also need to be soldered onto the printed circuit board, and they make the design of the board more complex. This makes the whole computer more expensive to make, and unlike the Sinclair machines, the components could only be bought in smaller quantities, making them more expensive per unit. On the other hand, unlike the ZX81 and Spectrum, the Jupiter Ace is not really a “pile ’em high and sell ’em cheap” product, because they didn’t have the option of making them cheaply. There are, even so, clear signs of cost-cutting. The sound is produced using a buzzer rather than a speaker, apparently identical to the Spectrum’s. An odd design decision found in a number of British micros is that, rather than routing the audio through the TV speaker, a separate loudspeaker or, unfortunately, a buzzer was fitted to the motherboard; I don’t know much about the design, but that seems to me to add to the cost of the hardware while degrading the quality of the sound.
The chips involved were bought off the shelf and are available to the general public even today. To repair a ZX81, the ULA, the large chip on the left which does “everything” (it actually does less than the discrete logic on the Ace board, because much of the work of putting the older computer’s display on a TV is done in system software), has to be replaced by another large chip that does “everything”. With an Ace there is a “right to repair”, as it were, because all that need be done is for the malfunctioning chip to be located and replaced by another, very cheap, integrated circuit. In fact it’s still possible to build an Ace today from scratch with pretty basic equipment. It’s also possible to build a ZX80 in the same way, and since a ZX81 is, functionally speaking, just a ZX80 with different firmware, that can be done too, but not with only five chips and a simple motherboard.
The personal significance of the Ace to me, as a milestone in my life, is that it was the first durable and substantial product I bought with my own money. This landmark would for many people be followed by increasingly impressive and expensive things rather rapidly, ramping up over less than a decade to the likes of a car and a house. This never happened for me for reasons I can’t explain, and in fact if I knew why my life considered in such terms failed so badly, the chances are it wouldn’t have done. It’s probably connected to neurodiversity and mental health issues, but in any case it means this very cheap product bought nearly forty years ago has more sentimental significance to me than most others. I have now succeeded in buying a second-hand car, although I can’t drive so it’s for Sarada, and for most people this is the kind of thing they manage to do by the time they’re in their early twenties and they’d be able to drive it themselves. Hence the kind of failed product the Ace is reflects my own sense of failure in life.
There’s another, rather similar, aspect to this. I always tend to back the loser. Probably the most obvious example of this is that I’m a Prefab Sprout fan. This band is known mainly for a novelty song, ‘The King Of Rock And Roll’, which is about a band known mainly for a novelty song. It’s unintentionally meta. There are other aspects of their career which are like this. For instance, the lead singer and songwriter Paddy McAloon once penned and sang the lines “Lord just blind me, don’t let her innocent eyes remind me”, and then proceeded to go blind suddenly as he drove along a motorway. Fortunately he survived. Anyway, there would have been a point, back in 1982, when Prefab Sprout released ‘Lions In My Own Garden’, and then other bands, maybe Lloyd Cole And The Commotions or Frankie Goes To Hollywood, released their own debut singles, and somehow I got into the first and only to a limited extent the other two. Granted, most of this is down to the fact that most undertakings are unsuccessful, but for some reason my interest in something seems to be the kiss of death. Prefab Sprout and the Jupiter Ace computer were both critically acclaimed and enthused about with good reason; both were also commercially unsuccessful. I could name all sorts of other things with a similar trajectory about which I was quite keen at the time. What does this mean?
All that said, there is a sense in which the fortunes of the Jupiter Ace have now changed. Like the Radio Times, they are now a lot more valuable than they were when they first came out; they can go for more than a thousand quid each. The trouble is, mine doesn’t currently work. I suspect it’s fried, but it may not be. This is where something unexpected may come to my rescue.
I am, as you probably know, a philosophy graduate. Most people say it’s an excellent qualification for flipping burgers, but in fact it isn’t, because like many other people I examined arguments for veganism while I was studying and became vegan as a result, so the burgers in question should probably be veggie. However, it is in fact useful in various ways, one of which is that you get to understand symbolic logic and Boolean algebra. There are various reasons for this, such as helping one understand the foundations of mathematics and distinguishing between valid and invalid arguments, but in any case logic is central to philosophy. While I was studying the subject, another student found that applying a particular technique from logic to the design of digital circuits helped him simplify them and use fewer components. In general, there happens to be an enormous overlap between philosophy and computing. After the philosophy department was closed down, its logic and scientific method subsection merged with computing, and as far as I know survives to this day.
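To give a rough idea of the kind of thing I mean, here’s a minimal sketch in Python rather than in actual hardware, and the expression is made up for illustration rather than taken from any real circuit: the same laws of Boolean algebra you meet in a logic course let you collapse three gates’ worth of logic into one, and you can verify the equivalence exhaustively with a truth table.

```python
from itertools import product

# A made-up three-input expression, as it might be wired with discrete gates:
# (a AND b) OR (a AND NOT b) OR (NOT a AND c)
def original(a, b, c):
    return (a and b) or (a and not b) or ((not a) and c)

# Boolean algebra: (a AND b) OR (a AND NOT b) = a, and a OR (NOT a AND c) = a OR c,
# so the whole thing collapses to a single OR gate.
def simplified(a, b, c):
    return a or c

# Exhaustive truth-table check over all eight input combinations
for a, b, c in product([False, True], repeat=3):
    assert original(a, b, c) == simplified(a, b, c)

print("Equivalent functions; the simplified version needs far fewer components.")
```

The exhaustive check is only practical because there are so few inputs, but that is exactly the scale of problem a student with a Karnaugh map or a page of algebra would have been tackling.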
One practical consequence of this is that I have no problem understanding how computers work, at least simple ones such as this, and one possible upshot is that I might even be able to repair it and sell it. I should add, however, that mere knowledge of how the logic circuits, for want of a better word, work still leaves a massive chunk of ignorance about electronics in general. I do know why the machine is broken. It’s because the polarity of the power supply was reversed, meaning that current flowed in the wrong direction through the circuit, damaging at least some of the components beyond repair. What I’m hoping, though I’m not terribly optimistic about it, is that the voltage regulator was destroyed but protected everything else. Even so, the components are cheap enough that it would still be cost-effective to replace everything on the board and end up with a working Ace, since they sell for such a high price. This is, however, a philosophical issue, because it amounts to the Ship of Theseus paradox: if everything which makes up the Ace is replaced by something else with the same function, is it still an original Ace? And what does that mean for its value?
There’s something out there called a Minstrel:
This is an updated Ace. It costs £200 but has 49K memory rather than 19K and seems to be able to use USB storage. I don’t know much about it, but I am aware that it works with newer televisions. One of the differences between the two boards, other than the larger memory chips, is the absence of the silver and red Astec modulator, whose function is to interface with a conventional CRT television. Unlike many other cheap computers of the time, the Jupiter Ace had the rudiments of a monitor interface available without modification, although the signal needed to be amplified, and nowadays a modulator just gets in the way because it means you have to have an old-style TV as well.
Although it’s tempting to attempt to upgrade this computer, I am under no illusions regarding my abilities, and it would be an achievement if I even ended up with a working machine at the end. It would be interesting to know how much a non-working Ace would go for, but clearly a working one would be worth more.
This is the plan:
1. Ensure a good connection between the Ace and a CRT TV via a cable.
2. Use a ZX81 power supply to turn it on.
3. If it doesn’t work, replace the voltage regulator.
4. If it still doesn’t work, replace every component until it does.
5. Sell it.
Right, so that’s it for today. I was going to talk about nostalgia a bit but I’ve probably bored you senseless.
Legs & Co dancing to Kool & The Gang on TOTP in 1981
Sarada does not like Legs & Co. I don’t know what she thinks of Kool & The Gang or ‘Jones vs Jones’ but I do know that Top Of The Pops was a huge influence on her generation in this country, as it was mine. But one interesting thing about our relationship is that she and I are from different generations. I’m a Gen-Xer and she’s Generation Jones. Steve, who also reads this blog, is too.
Phrases such as Baby Boomers, the Beat Generation, Millennials and Generation X are all well-known, and I think probably all coined by journalists. However, there doesn’t seem to be a popular term for the people born between the Baby Boom and Gen-X. Consequently the term “Generation Jones”, which refers to these people, doesn’t seem to be widely known. This actually reflects the essence of Generation Jones as a group of people who have tended to miss out and be ignored. The generations before and after them are connected to each other. Here’s a graph of the birth rate in the UK from 1940 CE to the 2010s:
The Baby Boom is really clear. It stands out on the graph, and it also seems to have two peaks, perhaps for when eldest and second children were born. Gen-X is also fairly clear and can be seen as the single, gentler slope up and down, peaking in the mid-1960s. It’s a smoother curve because it represents a different kind of generation. People seem to have had children when they were about twenty according to this graph, although they tended to wait longer before they settled down, and the double peak after the War is also manifested in the fact that these were parents of varying ages. There is then a rapid decline into the mid-’70s followed by a less regular, shallower and longer peak from about 1979 to 2000, then another even vaguer peak around 2010. This represents the smearing of ages which occurs across generations. If you have a 23andMe account, you can see this in the estimates of your ancestry, which get longer in duration the further back in time you go. If you imagine the average age of a parent to be twenty-three (roughly three score years and ten divided by three) but possibly as young as eighteen or as old as twenty-eight, that gives the generation before you a range of ten years, the generation before that a range of twenty, and the one before that a range of thirty. There was a specific, definite event just after the War which is becoming smoothed out by this effect, meaning that the age distribution of society is returning to how it was before the Second World War.
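Just to check that arithmetic, here’s a tiny sketch which assumes nothing beyond the figures above, namely that each parent was somewhere between eighteen and twenty-eight when their child was born; on that assumption the spread of birth years widens by ten years with every generation you go back.

```python
# Assumes only what's stated above: each parent was between 18 and 28
# (centred on about 23) when their child was born.
MIN_AGE, MAX_AGE = 18, 28

for generations_back in (1, 2, 3):
    youngest = MIN_AGE * generations_back  # every ancestor had children as early as possible
    oldest = MAX_AGE * generations_back    # every ancestor had children as late as possible
    span = oldest - youngest
    print(f"{generations_back} generation(s) back: born between {youngest} and "
          f"{oldest} years before you, a spread of {span} years")
```

So one generation back covers ten years, two generations twenty and three generations thirty, which is the same smearing effect the graph shows working on the post-War spike.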
Gen-Xers are the children of Baby Boomers. This is not precisely true, but it is a fair guide to where the peak of that generation occurs. Boomers and Gen-Xers are the peak generations in terms of population; Generation Jones sits in the trough between them, which may give them common ground with people born in the late 1970s. It means that culture was more youth-oriented before and after they were young, because there were more young people at those times, but not when it was their turn. The Swinging ’60s were something exciting happening to older people, and the Yuppies and the Second Summer Of Love happened after they’d got past the point when they could enjoy such things. Generation Jones, sadly, occupies a dip.
I’m aware that I’m talking about this second-hand. I am not myself a member of this generation, although because my parents were older than average when I was born, and also late adopters, I might have more in common with Jonesers than many of my contemporaries do. If my mother had had me when she was 23, I would’ve been born in 1956. This is a peculiar counterfactual conditional but I’m going to let it pass, because I think you know what I’m saying. I think I’m a mixture for this reason, and it may be a factor in Sarada and me being together. Just to make a general point: if a couple have a big age difference, maturity and life stages are not the only factors in making the differences between them productive or problematic. Being in different generations can be equally important. A fairly trivial example of this in our own relationship is that I like music videos and Sarada hates them, and this is purely a generational difference. In the past I could also have noted that people say exactly the same things about The Smiths and Leonard Cohen, except that I actually think Leonard Cohen is bloody brilliant and am completely disillusioned by Morrissey’s recent behaviour, so that doesn’t really work.
The term “Generation Jones” was coined by Jonathan Pontell (I have very little idea who that is, by the way, and this time Google is not my friend) as a way of pointing out that the Boomer and Gen-X peaks are far apart and that there was a distinct experience pertaining to people born between 1954 and 1965. These people were children during Watergate and stagflation, that is, the economic situation where unemployment and inflation are both high, which was distinctive of the ’70s and had an obvious major influence on family life. Divorce was also becoming more common at this time, as were single mothers. In America, a lot of Gen-Xers would’ve grown up in an atmosphere of cynicism about politics because of Watergate. In Britain it would’ve included the Three Day Week and power cuts, but on a different note we all remember the summer of ’76, though how formative that was I don’t know. Jonesers tend to be pessimistic, cynical and distrustful of government. This actually doesn’t sound like Sarada at all.
Why is it called Jones? Well, they’re also known as the Lost Generation, which makes more sense to me at least, because they’ve missed out. But apparently it’s because they “jones” a lot, meaning that they hanker after the more prosperous and optimistic past of recent memory which they saw disappear as they reached adolescence. There’s also the aspect of “keeping up with the Joneses”, i.e. trying to be as “good” as the people next door, in this case temporally because their neighbours are the Baby Boomers and perhaps also us lot, the Gen-Xers. They had high expectations as children which were dashed as they reached adulthood. For us, that didn’t happen because we basically can’t remember the ’60s so we are strangers to that wave of optimism and are used to hopelessness. Most of them are not the children of people who fought in the War or were on the Home Front at that time, although some are. Another link with the name Jones is that it’s one of those very common surnames which is used to suggest anonymity, because these people are not seen, recognised or noticed.
Four out of five Jonesers do not identify with either Boomers or Gen-Xers. They tend to be less idealistic than their predecessors. They’re used to struggling to find work or make money from what they do. They experienced the loss of secure employment. After retirement, many of them wish to reconnect with the optimism and idealism they experienced second-hand in their childhood. They want to do it themselves rather than just watching others do it. There’s a sense of constant unrequited craving in their lives. Their reminiscence bumps would range from 1969-79 to 1980-90. The very different characters of the ’70s and ’80s suggest that they themselves might be divisible into two halves.
They’re said to be more practical and rational in their approach to change because they were forced to be pragmatic by conditions in their early adulthood. They dislike high-pressure sales techniques and are more likely to do digital detoxes because they have extensive experience of the pre-Web world as adults. Some of them see themselves as pioneers because they were forced to make things work after the old world had changed due to what the Boomers had done and due to the collapse of Keynesian economic policies.
So far so good, then, but to my mind there are one or two problems with this idea. One is that it reads a little like a horoscope. It feels sufficiently vague, and maybe flattering, or perhaps “sympathetic” is a better word, that most people would feel it describes them. The second problem is that to a great extent it feels like it describes me, even though it’s supposed to be specific to Generation Jones. This might be because I’m Generation X but older than most of my cohort, having been born in 1967, which makes me almost a Joneser, and possibly also because my parents were older than the average Gen-Xer’s. I can also see some of it in Sarada but not all of it, but then why would I? Everyone is also an individual.
I want to end this post by addressing Jonesers personally, as people with direct experience of being from this generation. In particular, I’m talking to you, Sarada, and you, Steve, but anyone else is free to respond too. Do you feel that this is you? Does it chime with you? Or is it more like a load of things cobbled together which could apply to anyone? How do you see me, as a Gen-Xer, as different or similar to this?
In this blog, I’ve made occasional references to what I call my “Reënactment Project”, which is a long-term ongoing thing I’ve been doing since about 2017. The idea is that every day I make an at least cursory examination of the same day thirty-nine years previously. The reason for choosing thirty-nine years is that, for the initial year I planned to do it, all the dates fell on the same days of the week, meaning that the years concerned were substantially similar. The very basic arithmetic involved is of some interest and I’ll be returning to it later in the post. A side-effect of the thirty-nine year difference is that I am thirty-nine years younger than my father, so back then he would’ve been the age I am now, which focusses me on ageing, life stages and how to stay as young as possible by doing things like addressing my balance through Yoga so it doesn’t deteriorate as fast as it has for him. I can see the end result and know some of the things to avoid, which means that if I do reach his current age I’ll probably have a completely different set of health problems, from which my own hopefully not estranged descendants will in turn know what they should avoid. And so on.
My motivation for doing this stems from the disconcerting awareness that we edit our memories, and are also only able to experience things as we were at the time. Various media and popular misconceptions also lead us to forget and mutate the memories we do believe ourselves to have, and this was particularly important for 1978, which included the famous Winter Of Discontent, also the Winter Of Discotheque, a season I feel we may have been usefully manipulated into seeing in a particular way to justify everything that came after it. I also want to know how I was as a child and adolescent, to pay attention to the things which are the seeds of how I am now, and also to what was in me which I didn’t end up expressing. There is of course a bit of a risk here, because I’m living in the past and to some extent dwelling upon it, but I do have a life outside this project and find it quite informative and enriching for today’s experiences. In general, though, it’s just interesting.
I’ve now reached 1982, and am in the depths of the Falklands War, which was significant in securing Margaret Thatcher a second parliamentary term. Well, I say “in the depths”: in fact an end to hostilities was announced on 20th June and the Canberra was almost home by 7th July, which is when I’m writing this. I more or less stand by the position I had reached by the mid-’80s on this subject, which is that Galtieri and Thatcher were both aware that a war would be likely to boost their popularity, although at the time I thought it was an actual conspiracy between them whereas now I just think they were both aware of its expediency. It came as something of a shock to me, a year later, when I realised we didn’t have fixed-term parliaments and therefore the Tories could take advantage of their victory by calling an election whenever they wanted. ‘Shipbuilding’ is redolent of the time:
Although I know Elvis Costello wrote and performed the song, the Robert Wyatt version is the one I associate most closely with the conflict. Robert Wyatt was part of the Canterbury Scene and an early member of Soft Machine, so I’m obviously more likely to associate it with him. Just in case you don’t know, Wyatt got drunk and fell out of a window in 1973, paralysing himself from the waist down. Jean Shrimpton, my second cousin once removed, gave him a car and Pink Floyd raised £10 000 for him in a benefit concert. Tommy Vance once described him as “a man who has had more than his share of bad luck in life”.
Another association I make with the Falklands from the time is a play about an Irish barman who was accepted as a member of his community in London until the outbreak of the war. He finds himself sandwiched between Irish Republicans and his customers, with racism growing against him, culminating in his murder. This was originally a radio play but later appeared on TV. Although the Troubles were significant and also a spur to creativity, there was a long period during which practically every new play seemed to be about them, and it became tedious and annoying. This wasn’t yet the case in ’82 though. There’s also the 1988 BBC TV drama ‘Tumbledown’.
1982 was probably the last year there was really any hope that the previous pattern of alternating Conservative and Labour administrations we were used to would continue into the decade. In fact, this had been a relatively recent development. The first Labour government after the Second World War had been followed by thirteen years of Tory rule, and it was only after that that an alternation of parties in power had begun, lasting only fifteen years. Nonetheless, up until 1982 that’s what most people seemed to expect, and that alternation had held policies and the general tenor of the country in the political centre, because the next government could be expected to come along and undo much of what the previous one had done, and so on. This was satirised on the Radio 4 comedy programme ‘Week Ending’, which depicted the future of privatisation and nationalisation as oscillating ad infinitum every five years, which was probably one reason I thought we had fixed terms.
I was a communist in ’82, and when I say “communist” I mean Stalinist. I took it seriously enough that I attempted to learn Russian and listened regularly to Radio Moscow, and I was very upset when Leonid Brezhnev died. I was completely convinced that what the Soviet Union was saying about us and themselves was accurate and that the BBC and the like were nothing more than propaganda. I was also very concerned indeed about unemployment, racism and homophobia. I considered being called racist to be the worst insult imaginable, which of course misses the point. I was, however, still a meat eater and was, as you can probably tell, quite naïve. I was also a lovesick teenager in love with the idea of being in love.
However, this isn’t just about 1982 and the events of that year, for me or the world, but also about the value of the exercise. It’s often been suggested that I have autistic tendencies, and I imagine that this kind of meticulous rerun of the late ’70s and early ’80s is going to come across as confirmatory evidence for that. Clearly people do do things just because they want to and then come up with reasons afterwards to justify themselves to other people. My novel ‘1934’ describes a community which has chosen to relive the mid-twentieth century over and over again in an endless loop because its leaders think everything has gone to Hell in a handcart ever since, and that would not be a healthy attitude. I made the mistake, a few years ago, of re-reading my diary in a particular way and found myself falling back into the mindset I had at the time in a way which felt distinctly unhealthy. Nonetheless, I consider this activity to be worthwhile, because our memories are re-written, and history is written by the winners, in this case the winners of the Falklands War, so our memories are re-written by the winners too.
It’s been said that films set in the past usually say more about the time they were made than the period they’re supposed to have happened in. Hence ‘Dazed And Confused’ is really about the 1990s, for example. We generally have a set of preconceptions about a particular period within living memory which turn into a caricature of the time which we find hard to penetrate to reach the reality, and it isn’t the reality in any case because it’s filtered through the preconceptions of the people at the time, even when those people were us. This much is almost too obvious to state. However, there’s also continuity. Time isn’t really neatly parcelled off into years, decades and centuries. People don’t just throw away all their furniture at the end of the decade, or at least they shouldn’t, and buy a whole new lot. We’re all aware of patterns repeating in families down the generations. It isn’t really possible to recapture the past as if it’s preserved in amber. But it is possible to attempt to adopt something like the mindset prevalent at the time, or the Zeitgeist, to think about today, and the older you get the more tempting it is to do so. Since the menopause exists, there must be some value in becoming an elder and sharing the fruits of one’s experience, even when one is in cognitive decline. And of course the clock seems to have been going backwards since 1979, making this year equivalent to 1937. World War II was so 2019.
How, then, does 2021 look from 1982? On a superficial level, it tends to look very slick and well-presented, although airbrushing had a slickness to it too. The graphic at the top of this post is more ’87 than ’82, but it does succeed in capturing the retro-futurism. Progressive politics was losing the fight with conservatism at the time, but the complete rewrite of how we think of ourselves had not yet happened. Nowadays, people are wont to parcel up their identity and activities into marketable units because they have no choice but to do so. The fragmentation there is as significant as the commodification. The kind of unity of experience which existed in terms of the consumption of popular culture back then is gone, although it was gradually disintegrating even then. We were about to get Channel 4, and video recorders were becoming popular among the rich, although the manufacturers were still insisting there was no way to get the price below £400 at the time, which is more like £1 400 today. It’s hard to tell, but it certainly feels like the mass media, government and other less definable forces have got better at manipulating public opinion and attitudes. This feels like an “advance” in the technology of rhetoric. However, we may also be slowly emerging from the shadow of the “greed is good” ethic which was descending at the time, because we’ve reached the point where most public assets have been sold off and workers’ rights have been so eroded that reality tends to intrude a lot more than it used to, and I wonder if people are now more aware of the discrepancy between what they’re told and what they experience. Perhaps the rise in mental health problems is related to this: people are less able to reconcile their reality with the representation of “reality”, and are therefore constantly caught in a double bind.
It isn’t all bad. It’s widely recognised now that homophobia, sexism, racism, ableism and other forms of prejudice are bad for all of us, and people seem to be more aware that these are structural problems as well. Veganism is better understood, but it’s also very commercialised, which takes it away from its meaning. Social ideas which are prevalent among the general public today may have been circulating in academia at the time, with their wider influence yet to be felt; this is probably part of a general trend. There was also a strongly perceived trend towards secularisation which has in some respects now reversed. The West was in the process of encouraging Afghan fundamentalists, and may also have begun arming Saddam Hussein by this point, although that might’ve come later. CND was in the ascendancy, and the government hadn’t yet got into gear dissing them.
Another distinctive feature of the time was the rise of home microcomputers, although for me this was still somewhat in the future. I’ll focus more on my suspicion and distrust here. To me, silicon chips were primarily a way to put people out of work, and therefore I didn’t feel able to get wholeheartedly into the IT revolution with a clear conscience. I had, however, learnt BASIC the previous year. I don’t really know what I expected to happen, as computers were clearly getting going and it all seemed inevitable, and there was in any case only a rather tenuous connection between a home computer and automation taking place in factories. However, by now the usual cycle of job destruction and creation really has ceased to operate, as the work created by automation is nowhere near as much as the work replaced by it, or rather, now done by computers or robots in some way. My interest in computers was basically to do with CGI, so the appearance of a ZX81 in my life proved to be rather disappointing.
1982 was also the only year I read OMNI. Although it was interesting, and in fact contained the first publication of ‘Burning Chrome’ that very year, it also came across as very commercialised and quite lightweight to me compared to, for example, ‘New Scientist’. It was also into a fair bit of what would be called “woo” nowadays, and it’s hard to judge but I get the impression that back then psi was more acceptable as a subject of research for science than it is today. This could reflect a number of things, but there are two ways of looking at this trend. One is that a large number of well-designed experiments were conducted which failed to show any significant psi activity. The other is that there is a psychologically-driven tendency towards metaphysical naturalism in the consensus scientific community which has little basis in reason. I would prefer the latter, although the way the subject was presented tended to be anecdotal and far from rigorous. From a neutral perspective, there does seem to be a trend in the West away from belief in the supernatural, and the fact that this was thirty-nine years ago means that trend is discernible on this scale.
Then there’s music, more specifically New Wave. For me, because of my age and generation, New Wave doesn’t even sound like a genre. It’s just “music”. This may not just be me, because it’s so vaguely defined that it seems practically meaningless. It’s certainly easy to point at particular artists and styles as definitely not New Wave though, such as prog rock, ABBA, disco and heavy metal, but I perceive it as having emerged from punk, and in fact American punk just seems to be New Wave to me. It’s also hard for me to distinguish from synth-pop at times. British punk could even be seen as a short-lived offshoot of the genre. By 1982, the apocalyptic atmosphere of pop music around the turn of the decade was practically dead, although I still think there’s a tinge of that in Japan, The Associates and Classix Nouveaux. The New Romantics had been around for a while by then. I disliked them because I perceived them as upper class and vapid. I was of course also into Art Rock, and to some extent world music.
In the visual arts, 1982 saw a resurgence in my interest in Dalí, who had interested me from the mid-’70s onward, but this time I was also drawn to other surrealists such as Magritte and Ernst, and to some extent Dada. As with the New Romantics, Dalí was a bit of a guilty pleasure, as I was aware of his associations with fascism. This was all, of course, nothing to do with what was going on in the art scene of the early ’80s, although I was very interested in, and felt passionately positive about, graffiti. I felt that the destruction of graffiti was tantamount to vandalising a work of art. To be honest, although I’m concerned that people might feel threatened by it, and I think a lot of it is rather low-effort and unoriginal, I’m still a fan, though I wouldn’t engage in it myself.
1982 was close to the beginning of the cyberpunk æsthetic. I’ve already mentioned William Gibson’s ‘Burning Chrome’, which first appeared in OMNI this month in 1982, and there was also ‘Blade Runner’, which was already being written about, again in OMNI, although it wasn’t released until September. The influence of the genre can be seen in the graphic at the top of this post. To a limited extent even ‘TRON’, from October, was a form of bowdlerised cyberpunk, with the idea of a universe inside a computer. Cyberpunk is dystopian, near-future, can involve body modification, does involve VR and has alienated characters and anarcho-capitalism, with a world dominated by multinationals. ‘Johnny Mnemonic’ had been published, also in OMNI, the year before. The question arises of how much today’s world resembles that imagined by cyberpunk, and to be honest I’d say it does to a considerable extent, and will probably do so increasingly as time goes by.
On a different note, although the days and dates match up between 2021 and 1982, this will only continue until 28th February 2023, after which a leap day for 1984 will throw them out of kilter again. It can almost be guaranteed that years twenty-eight years apart will have the same calendar. One thing which can’t be guaranteed is the date of Good Friday and the other days influenced by it, which means there is almost always some difference between calendars even when the days of the week match up. I also said “almost be guaranteed”. Because the Gregorian calendar skips leap days in a ’00 year whose century is not divisible by four, we are currently in a long run of matching twenty-eight-year cycles which began in 1900 and will end in 2100. Hence up until 1928 the years of the twentieth century don’t match the years twenty-eight years before them, and likewise from 2072 onward the pattern will be disrupted again. There are also other periods which match between leap days, such as the thirty-nine-year one I’m currently exploring, which began last year and includes two complete years as well. This also divides up the years a little oddly, because since I was in full-time school at the time, academic years were also quite important to me, and in fact continued to be so right into the 1990s. This makes one period between 29th February 1980 and the start of September 1980, and will make a further period between September 1983 and 29th February 1984. Finally, astronomical phenomena don’t line up at all: solar and lunar eclipses, and transits of Venus and Mercury, for example, won’t correspond.
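For what it’s worth, the matching can be checked in a few lines. This is just a sketch using Python’s standard library, with the simple criterion that two years share a calendar when they start on the same weekday and are either both leap years or both common years; the year ranges are there only to illustrate the run described above.

```python
import calendar
from datetime import date

def same_calendar(y1, y2):
    """True when two years start on the same weekday and are both leap or both common."""
    return (date(y1, 1, 1).weekday() == date(y2, 1, 1).weekday()
            and calendar.isleap(y1) == calendar.isleap(y2))

print(same_calendar(1982, 2021))  # True: the 39-year offset holds for these two years
print(same_calendar(1984, 2023))  # False: 1984 was a leap year, 2023 isn't
# The 28-year rule holds throughout the run between the skipped leap days of 1900 and 2100:
print(all(same_calendar(year, year + 28) for year in range(1901, 2072)))  # True
```

Good Friday would need a separate computus calculation, which is why it refuses to cooperate even when everything else lines up.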
So anyway, that’s one of the possibly pointless things I do with my time at the moment. It does bring home to me how slowly time does in fact go, because to be honest doing this seems to have slowed the pace of the passage of time back to how it was when I was fourteen or fifteen. What other effects it has on my mind I’m not sure, although I think there must be both positive and negative influences.