My Hardware History – A Tale of Mass Disempowerment

(c) Gary Cattell

Not gonna lie: this post is inspired by Brian of Brian’s Blog fame, which you should all of course now visit and to which you should subscribe. But his post is about operating systems, whereas mine is about how it’s all gone to Hell in a handcart.

Ethical decision making in the marketplace often comes down to trust versus self-sufficiency. On the one hand, when you buy something, even from a sole trader, you are kind of out-sourcing your ethical choices to a third party, whom you may not know and in whom your trust may be misplaced. To some extent, it may not even be their fault that they have been forced to take negative and harmful paths in what they do, because it’s the system, not the people. The very fact that we live in a monopoly capitalist society forces us to harm others. Leaving the larger political picture out of the equation for a bit, one can take control of one’s life to varying degrees on the ethical front, but ultimately this usually amounts to self-sufficiency. The extent to which one is self-sufficient correlates with the degree of responsibility one is taking for how one’s actions affect others, including the planet. However, even there it’s important to recognise privilege. One may have access to an allotment and be able to grow food, but that depends on various factors: the good fortune of having that access at all, given how much land formerly devoted to allotments has been sold off to the highest bidder, not living in a food desert, having the time to put into raising that food and so on. The same applies, to a greater extent, to gardening for food. Not everyone has the luxury of a garden where they can grow crops, and foraging only works even to the degree it does because relatively few people do it. There’s also a knowledge, experience and skills base some people can draw on and others can’t. The fact that I’m a herbalist makes it easier for me to forage, for instance.

In the case of growing food, one usually at least has a fairly good chance of having the facilities and skills needed to manage this. In other areas this may not be so. For instance, we can’t make a gate ourselves, or so I tell myself (I do have very limited carpentry skills, which I never use and which were pretty poor anyway, so in theory I could), but we could buy a gate and fix it onto the wall using the abilities available in this household. This costs about £50. Presumably buying the timber, given the tools we’ve already invested in, would cost us less, but when we asked someone else to do it for us we were quoted a figure of many hundreds of pounds.

The opposite end of this kind of self-sufficiency is found in the likes of computers and allied devices. The degree of skill and automation, and the combined effort of innumerable people, are what make devices like laptops, tablets and smartphones, and their associated storage media and support items such as chargers, cables and Wi-Fi, possible. That huge morass of people doing things for you is difficult to replace with your own skills because there is only so much that can be fitted inside a single head. Given a few lengths of wire and some relays, I could probably put together an arithmetic and logic unit which worked on a very short binary word length, and it isn’t often appreciated that one major reason I can do this is that I’m a philosophy graduate. I’m also dyspraxic, though, so it would be a struggle for reasons other than the mere knowledge of how to do it. Consequently I rely on others for the digital electronics I use, and that reliance means that, as usual, I’m handing over responsibility for ethical choices to other people, whose own choices are compromised by working within a capitalist system.

I need to get down to specifics.

In the 1970s, the microprocessor was seen as a threat to manual labour. I was entering a Marxist phase at the time, so it simply seemed wrong to have anything to do with microcomputers back then, since it would mean supporting companies whose products were putting people out of paid employment. Compared to my peers, our family were relatively late adopters of all sorts of technology, such as colour TV, cassette recorders and stereo record players. I knew one early adopter whose family bought a ZX80 when it came out. My family and others were rather disdainful of this and saw it as kind of “uppity”, I suppose. We got a ZX81 in November 1982, some time after the price had come down owing to the introduction of the ZX Spectrum. After only a week, we purchased a 16K RAM pack. However, by the time we had one, I’d known BASIC for about a year, having taught myself from textbooks with no hands-on experience. I was still almost superstitiously suspicious of even home micros at the time. After all, Clive Sinclair had claimed that the ZX80 was powerful enough to control a nuclear power station, so taking that at face value, even that had the potential to replace human workers aplenty. At the time I had no concept of automation creating jobs to replace the ones that had been destroyed, so all I could see for the future was mass unemployment.

The ZX81, and to an extent its predecessor, has a kind of nostalgic charm to it even for the time. For instance, like older and larger computers it has its own character set, exclusively upper case letters and only sixty-four printable characters, and in particular it uses “**” to denote raising to a power rather than “^”, or actually “↑” for most micros of that vintage if I remember correctly. It also displays black characters on a white background, giving the impression that the output is all coming out of a printer onto the old-fangled paper which looked like it had green musical staves on it and holes up the sides, and was folded with perforations separating the sheets. It was also, in practical terms, silent out of the box. My enduring impression of the ZX81 is that it’s an early ’60s minicomputer trapped in a plastic matchbox, and as such it had a flavour of former glories about it.

To me, the most mystifying thing about this computer was that it somehow seemed to be able to produce characters as detailed as those you’d expect from a much higher-resolution display, yet could not actually address those pixels directly. Why couldn’t it draw in as much detail as it displayed text? There were 2×2 block graphics characters, but they only afforded a resolution of up to 64×44. It also didn’t scroll unless you told it to. I attempted to combine characters by printing them in the same location, expecting a kind of overstrike effect, but that didn’t work. Then there was the question of machine code. At the time, I assumed that computers directly acted upon the BASIC code they were given. When I saw the table at the back of the manual showing the machine code instruction equivalents of the alphanumeric and other characters, I drew the erroneous conclusion that the microprocessor simply read the BASIC line by line off the display file and executed it, so that for example SIN was a series of three instructions which together could achieve a floating point trig function, and so could COS, and so forth. This is kind of how the human brain operates with language, so I would defend this naïve view.

Another unexpected feature of microcomputers was that they almost all used BASIC. I had expected that the relatively cheap home computers such as the VIC-20 or TI-99/4A would use that language, because it’s Beginner’s All-purpose Symbolic Instruction Code, but that more expensive computers would have built-in PASCAL, FORTRAN or ALGOL. This was, however, only true to a very limited extent.

It was eventually borne in upon me that programming languages were themselves software, written in a more fundamental language called machine code which controlled the microprocessor directly, so I learnt Z80 machine code and programmed in it in a limited way. I discovered there were ways of getting the ZX81 to produce sound and increase its vertical resolution, and even managed to produce a program which played the drum machine bit of ‘Blue Monday’. This suggests that I traversed a very steep learning curve very quickly since we acquired the computer in late November 1982 and New Order’s single was released less than five months later. I felt uncomfortable with the extent to which I seemed to be fixated on computer programming and tried to maintain interest in other topics. My attitude to computers has always had this ambivalence to it. It’s also very likely that my pursuit of this hobby adversely affected my O-level results, and it’s a shame that the knowledge I was acquiring couldn’t have been used to get an O-level in computing. I actually used to help pupils who were studying computing with their homework, and I’ve often wondered what the disconnect is here. It reflects a pattern in my life of not being able to integrate my skills and experience with formal accreditation or recognition, and I suspect it’s linked to neurodiversity but I don’t know how.

Much of the time I spent programming the ZX81 was also spent envying the specifications of more powerful computers, but at the time I think my parents were trying to motivate me to find paid work, which I did in fact do, and I proceeded to buy, of all things, a Jupiter Ace. This is a fairly famous computer designed by the team who did the ZX Spectrum at Sinclair Research Ltd and then left to form their own company, and it is chiefly known for having the programming language FORTH in ROM rather than BASIC. This was almost unique. There was an attempt to design a more advanced FORTH-based micro called the Microkey 4500, basically a wishlist fantasy computer which sounded excellent but hardly even got to the drawing board stage, but to me the main appeal of the Ace was that it behaved like a “proper computer”. It has the complete ASCII character set, displays white text on a black background and has a Spectrum-style keyboard. It is in fact very similar to the Spectrum, even down to the font it uses, but lacks colour and point-addressable high-resolution graphics. However, by the time of its release in October 1982, consumers were expecting computers to have high-resolution colour graphics and proper sound. For some reason I’ve never understood to this day, most British micros at the time had built-in speakers rather than using the speaker of the TV they were plugged into, which sometimes reduced the sound to a barely audible beep and meant adding sound hardware while the audio capability of the TV sat there unused. A strange decision which would probably make more sense if I knew more about electronics. Jupiter Cantab, the company which made the Ace, went bust after less than two years, and this is what enabled me to buy the Ace. It has a special place of its own in my life because it was the first durable product I ever bought with money I’d earned myself, and I still own it today.

FORTH is sufficiently close to the machine that its core can be implemented directly in machine code, and much of the language as supplied is defined in terms of more primitive words in the language itself. FORTH’s appeal to me was that it enabled me to work very closely with the hardware of the computer without having to use actual op codes. I attempted to design a prefix-notation version of the language, and although writing a complete description of it came quite naturally to me, I never finished implementing it. I also noticed in myself a tendency to sacrifice user-friendliness to ease of implementation: for instance, I opted to use signed sixteen-bit integers as the only data type and to express them in hexadecimal with leading zeros.
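To give a flavour of what I mean by words defined in terms of more primitive words, here’s a toy sketch in Python (not real FORTH, and the word names are just illustrative): a handful of “primitives” standing in for the machine code core, and SQUARE defined purely in terms of them.

```python
# A toy illustration (in Python, not real FORTH) of the idea that most of the
# language is just words defined in terms of a handful of primitives.

stack = []

# A few "primitive" words: the bits that would be machine code on a real system.
primitives = {
    "DUP": lambda: stack.append(stack[-1]),
    "*":   lambda: stack.append(stack.pop() * stack.pop()),
    "+":   lambda: stack.append(stack.pop() + stack.pop()),
    ".":   lambda: print(stack.pop()),
}

# Words defined in terms of other words, as FORTH itself mostly is.
colon_words = {"SQUARE": ["DUP", "*"]}

def interpret(tokens):
    for token in tokens:
        if token in primitives:
            primitives[token]()
        elif token in colon_words:
            interpret(colon_words[token])   # run the definition word by word
        else:
            stack.append(int(token))        # anything else is a number

interpret("7 SQUARE .".split())             # prints 49
```

In real FORTH the definition would simply be “: SQUARE DUP * ;”, compiled into the dictionary alongside the primitives.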

By this time I’d been aware of operating systems for about two years. I was at first only cognisant of CP/M, which had been devised in 1974, and Unix, which I think dates from 1969, although clearly not in its later form, as the Unix epoch begins in 1970. MS-DOS and PC DOS were also out there somewhere, but since IBM PC compatible computers cost upwards of £3000 at the time I regarded them as permanently out of reach. Oddly, we did in fact have a PC clone briefly in our house in 1983, although it was never switched on and was simply being stored there for my father’s workplace. Incidentally, at the time that workplace had been using a confoundingly simple minicomputer they’d bought in the 1960s which appeared to have only four op codes. I found this hard to believe even at the time, but extensive study of the technical manuals showed that this was indeed the case. I have no idea now what it was, but it was very strange and sounds like it would’ve been a challenge to program, though an interesting one.

For me, Unix was associated with minicomputers and wouldn’t even get out of bed for what at the time seemed like a ridiculously vast amount of RAM. However, there were also versions for the 32/16-bit 68000 and I think also the failed Z8000, although oddly CP/M was also rewritten in C for the 68000, which seemed ridiculously underpowered to me at the time. It was at that point that an annoying trend became apparent to me, one which had been going on since at least the 1970s, when I know it was implemented on the PDP-11 range of minicomputers: privileged and user instructions. On the 6809 CPU, there had been a system and a user stack pointer, though both were available to the user via machine code and assembler (a one-to-one, slightly friendlier, human-readable form of machine code). On the more powerful 68000, the system stack pointer was unavailable to the user and only the user stack pointer was accessible. Other tendencies also came into play, such as many of the system flags being alterable only by the operating system and whole areas of memory being locked off from user access. This is done for security and stability purposes, but to me it felt patronising and hand-holding, and also like an in-built class system. There’s hoi polloi, the likes of us, and there’s the Programmer, who wrote the operating system and has total control. We are merely their minions, second-class computer users, and this was the start of a trend to lock people out of controlling their own devices which continues today. It’s the opposite of self-sufficiency and it means you have to trust whoever wrote, and often sells or licenses, the operating system.

There was also another trend which drives me round the bend: virtual memory. When I first learned about multiuser systems, I was astonished to find that they would sometimes save out the whole program one user was running and switch to another user, loading their program and running that, continuing in that cycle depending on how many users were on the system. Since hard drive storage is mechanical, it’s many orders of magnitude slower than solid-state RAM or ROM, and this makes things very slow, so I assumed that this approach would soon be superseded by cheaper and larger memory. This didn’t happen. What happened instead was that later operating systems were designed to pretend there was more physical memory than there actually was, with the result that it was all too easy for a computer to get lost in its own internal musings and kind of forget there was some person out there trying to use the bloody thing. Fortunately we now have solid state drives and the situation is somewhat better.

Way into the late 1980s I would still doodle Z80 assembler programs, meant to do various things, in notebooks, though without being interested in implementing them. By that time, GUIs were starting to take over, and thereby hangs another tale.

I liked the Xerox Star and the Apple Lisa, which stole the former’s user interface and preceded the Macintosh. That did seem to me to be the way forward for computers at the time. Later on, Microsoft tried to copy it, and the result seemed like a pointless thing bolted onto the front of the operating system which slowed it down so much that it wasn’t worth whatever benefits it might bring. The same applied to GEM, its main competitor. To me, a GUI feels like another way computer design is taking control away from the user. This was just the beginning. I am used to imperative and procedural programming. As far as I’m concerned, a computer program is a list of orders or instructions, organised into smaller subroutines, which tell the computer what to do. Putting it like that, it seems absurd to me that anything else would ever be so. Object-oriented programming began to become very popular, and at no point have I remotely understood it. Every description of what it is seems to use a metaphor which fails to describe what’s actually going on inside the device. It will say something like: there is a class of vehicles which have the properties of size, number of wheels and so forth, which can be extended to create new concepts of vehicle, such as distinguishing a car, a motorbike and a lorry. That’s all very well, but I can never make a connection between that metaphor and computer programming, which looks like something completely different. It also uses a declarative paradigm, where you just seem to tell the computer what’s there and leave it be, which baffles me, because how can anything actually be happening if you haven’t made it happen? I’ve attempted to describe my understanding in terms of variables and procedures, but people in the know have always told me I haven’t got it right. It’s been said, and I’ve come up against this, that if you’ve learned imperative and procedural programming first, it makes it so much harder to learn OOP (object-oriented programming). I also can’t shed the impression that a lot of it is obscurantist technobabble hiding a naked emperor. And if you can’t see what’s going on inside, how can you have control?
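For the record, the kind of thing people show me when they try to explain it looks roughly like this – a made-up Python rendering of the vehicle metaphor, nobody’s real code:

```python
# The vehicle metaphor as it tends to be presented: a class bundles data
# (wheels, size) together with the procedures that act on that data.

class Vehicle:
    def __init__(self, wheels, size):
        self.wheels = wheels        # data attached to each object
        self.size = size

    def describe(self):             # a procedure attached to that data
        return f"a {self.size} vehicle with {self.wheels} wheels"

class Motorbike(Vehicle):           # a new concept built from the old one
    def __init__(self):
        super().__init__(wheels=2, size="small")

class Lorry(Vehicle):
    def __init__(self):
        super().__init__(wheels=6, size="large")

print(Motorbike().describe())       # a small vehicle with 2 wheels
print(Lorry().describe())           # a large vehicle with 6 wheels
```

Underneath, of course, it all still compiles down to the same lists of instructions; the classes are bookkeeping for the programmer rather than anything the processor itself knows about.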

Another annoying trend has been created by the easy availability of memory. Back in the old days, computers needed to be programmed efficiently in terms of memory. For instance, a spell checker would be based on the rules of English spelling and object to, say, a Q not followed by a U or the use of “AY” in the middle of a word, but it didn’t have a dictionary. Nowadays it does, and that takes up many megabytes in its uncompressed form, although I’m sure it is compressed. Likewise, chess games have tended to store whole lists of possible moves and search their way through a tree to the best result, using up huge amounts of memory, whereas previously they would’ve used the rules of chess and, I presume, some cleverly written strategy to win. To me, this seems lazy and disappointing. I want programs to be optimised to the same extent as they were when 64K seemed like impossible luxury. So much processing power is also wasted on running the GUI. We don’t need this, because it isn’t really using the computer; it’s just making the computer easier to use by tying up processing power in unnecessary trinkets.

So: you might think I’m a Linux person. I did too for a long time, but I’m not. As far as I’m concerned, you have to take care of your hardware and ensure it remains useful for as long as possible for a number of reasons:

  1. It takes a lot of energy and resources, and does a lot of environmental damage, to make a computer, and the working conditions of the people who mined the minerals, refined the compounds and put the device together are often questionable. Once all of that is done, you should be able to hang onto it for a long time.
  2. We ourselves haven’t necessarily got that much money and we should expect our devices to be as useful as possible. That means access to hardware and no hand-holding.
  3. When we finally discard our hardware it goes to landfill or to some poor Third World community, where it poisons the rivers and leaves the people disassembling it with physical disabilities, cancers and birth defects, not to mention what it does to the biosphere.

All of this has got to be the number one priority when we consider computers and other devices, and to be fair Linux does go a considerable way towards addressing it. But there’s a problem. A lot of the people who are involved in designing and coding for Linux are very focussed on supporting up-to-date hardware. It’s a big community and they aren’t all doing that, but many of them are, and it’s often hard to find people who are more genuinely concerned about the e-waste and other sustainability aspects of the issue. The other thing, and this may be less of a problem today than it used to be, is that Linux people are often not people people, and in the end this amounts to user interfaces which are not very friendly. I’m reminded of the Dilbert strip in which the programmer says “it’s my philosophy that the computer interface should hurt the user”, and employs samples of birds being killed by cars as the main notification sound. I’ve had to use Linux GUIs which are unwittingly displaying at 320×200 and put the OK button completely outside the display, or which don’t recognise how the video RAM is mapped, so that everything looks like incomprehensible multicoloured vertical stripes. And yet a perfectly good, low-end 640×480 sixteen-colour display could’ve been used, which somehow is not the default. Why?

Don’t get the impression that I’m not as susceptible as most other people to the appeal of Windows. I was very impressed by Windows 3.1, which I didn’t come across until 1998 due to forcing myself to go cold turkey on computers for a decade or two. As far as I’m concerned, I’d be happy for the GUI to look like that today, and it all seems a bit unnecessary to make it any fancier, particularly because in doing so you’re consigning millions of PCs to landfill for no good reason. I think Program Manager, the main shell for Win 3.1, hung around until at least Windows XP, although it didn’t retain its look. It may be that our 486-based 33 MHz PC with VGA graphics and a non-working sound card was just less messed-about than other computers we’ve used since, but it was the most stable version of Windows I’ve ever used in earnest. It crashed once in the whole time we used it, and that was a major and worrying event. Incidentally, there used to be an Explorer-style shell for Win 3.1 which made it practically indistinguishable from classic 32-bit Windows, and which I used for a short period of time; in fact I’ve even installed it as the default shell on later versions, it being more compact and stable than the standard one.

We then leapfrogged over Windows 95 to use Windows 98 on a computer which was a total mess. It was a 120 MHz Pentium with SVGA, a working sound card and 16 MB of RAM. This is below the recommended specs for Windows 98, but there were other imponderable issues with that PC at the time. It only lasted a few months before we handed it over to a friend who was more au fait with computers, who got it to work. We replaced it with our final desktop-format computer, an AST Bravo MS P/75 with an ATI Mach-16 graphics card, which incidentally has 2D acceleration but not the 3D which had become practically universal by that point. This was actually a downgrade, but it was more reliable, and at that time I was really into the themes and skins available on Windows 98. I also struggled endlessly to get Linux to work on it. QNX was fine, but then it always is, and BeOS also worked okay. It ultimately got upgraded to 128 MB of RAM and a piggyback 133 MHz Pentium, if I remember correctly. This was the first computer to have a reliable dial-up internet connection. It was fine at running Quake, but it was only barely capable of showing DivX videos even when upgraded to the nines.

The next stage came in 2002, as the start of our plan to give up television, partly for the children’s sake. This ultimately meant having access to DVDs, and therefore we bought a new computer, though initially with only a CD-ROM drive, mainly in order to get broadband internet. By this time we were accumulating e-waste enormously because I was reluctant to let the old hardware go the way of all silicon. This PC had an Athlon and ran Windows XP. Windows XP was a good operating system on the whole, but it was the first one to break compatibility with 16-bit applications and drivers, which necessitated the slinging out of various bits of hardware. This was also our first PC to have USB ports, which I upgraded as well. The initial specification of this machine was 128 MB of RAM, a 40 GB hard drive, a 1 GHz Athlon of some description, on-board graphics and a Silicon Integrated Systems motherboard. It started off with Windows 98 and we upgraded it to XP in 2004. One thing I didn’t like about Windows XP was its childish-looking graphical scheme, but it still had Windows Classic, whose appearance was far better. This was also the first computer we used a TFT screen with, and the amount of space taken up by a 22″ CRT monitor is something to behold.

In 2007, we got a Windows Vista machine. This was because its predecessor had exploded after I installed a graphics card whose power requirements exceeded what that computer’s power supply could deliver. Apparently it emitted something like ball lightning, although I wasn’t there at the time. We persisted with this machine for a further seven years. My chief issue with Windows Vista was that, left to itself, it seemed to spend too much time making the windows look translucent. Interestingly, the system requirements for Windows went into reverse after a certain point, probably because people were no longer impressed with the eye candy. In 2015 we acquired the computer which is currently sitting upstairs and of course runs Windows 10. Ironically, the stability of Windows 10 has made it possible to install Linux properly and use it on that PC. I have a history of using live distributions of Linux on memory sticks as my preferred operating systems because they’re more secure that way.

Windows PCs have become very much secondary in our lives in recent years, as is probably true for many other people. We mainly use Android and Chromebooks, although Sarada still uses a Windows laptop from time to time. There no longer seems to be the pressure to upgrade, or maybe we’ve just become sidelined and lost interest in using them in other ways. I still program using FORTH, and I have the modern version of the Commodore 64, which I don’t use as much as I should. To be honest, I’m not a big fan of the 64 because its BASIC doesn’t support the hardware properly and the palette is very odd for no apparent reason, but again all these are challenges to which I should rise. I’m vaguely considering writing a few arcade games for it, although I should stress that I’m very much in the Z80 camp rather than the 65-series one. I’ve never found the 6502 (technically the 6510 in the case of the 64) easy to program, because I find its addressing modes and the sheer awkwardness of dealing with sixteen-bit integers irksome, although again that’s down to possibly admirable minimalist design.

We’ve also owned a few computers I haven’t mentioned. I bought a Tandy Color Computer in 1999, there was a UK101 in our airing cupboard for a while, we had a Windows ME PC which was technically an ACT Apricot of all things, and also a plasma-display, 286-based Toshiba laptop on which I used DOSLynx as a browser. That machine actually ran Windows 1.0!

Although I’ve never done it, I am curious about the possibility of programming ARM or 64-bit Intel processors in machine code or assembler without an operating system as such. It would need a computer dedicated to that purpose, of course. I would also imagine that attempting to use an ARM in that way would be a bit of a tall order, although I don’t know, because my understanding is that its instruction set is nowadays optimised for compiler-generated code rather than anything a human would write directly, but I presume that Intel-like CPUs are another matter. But just as I’ve never really got into coding in a major way, I doubt I’ll ever get round to it. One thing I do still do with coding is occasionally write FORTH or BASIC programs to work out maths problems, and I’ve long preferred APL to spreadsheets, which quite frankly I can’t get my head round.

I very much doubt I’ll ever make the transition to object-oriented programming.

Subtract One And Branch

In case you’re wondering why I’m not talking about the murder of Sarah Everard and the fallout from that, I try to avoid covering gender politics on this blog because I have another blog devoted to that alone, and I also tend to avoid saying anything (see Why I’ve Gone Quiet) because cis women need to be heard more than I do on these issues and I don’t want to shout them down. In fact the other blog is quite infrequent for the same reason. It doesn’t mean I don’t care or don’t consider it important.

You are in a living room, circa 1950. In one corner sits a Bakelite box, lit up from within behind a hessian grille with a design like rays from a rising Sun on the front, and a rectangular panel carrying names like “HILVERSUM” and “LUXEMBOURG”. In the other corner sits another, somewhat similar, Bakelite box with a series of lamps at the top and a slot at the bottom, into which cards with rectangular windows punched out of them can be inserted. There is a row of push buttons above it. This is the domestic computer. It didn’t exist, but could it have?

In this post I mentioned that some say the last computer which could be “completely understood” was the BBC Micro, released in 1981, and expressed my doubt that this was true, because apart from memory size even an upgraded IBM PC would probably be about as sophisticated. However, this got me thinking about a tendency I have towards minimalism in IT, and wondering how far it could be taken and still leave useful hardware, which in turn brings up the question of what counts as useful.

In this other post, I described a fictional situation where, instead of the Apple ][ being one of the first mass-market microcomputers, Sinclair ended up bringing out the ZX Spectrum six years early, that is, just after its Z80 CPU was released. That isn’t quite what I said: read the post to see what I mean. The point is that the specifications of the two computers are very similar, and if the ULA in the Speccy is realised via discrete logic (smaller and simpler integrated circuits), all the hardware necessary to construct a functional equivalent to it, except for the slightly faster microprocessor, was already available, and if someone had had the idea, they could’ve made one. Then a similar mind game arises: how much further back is it possible to go and still manufacture a reasonably small but workable computer? Could you, for example, even push it back to the valve era?

Apologies for already having said things which sound off-putting and obscurantist. As an antidote to this, I want to explain, just in case you don’t know, how digital computers work. Basically there’s a chip which fetches codes and data from memory. The codes tell the computer what to do with the data, and the chip can decide where to look next based on the results of what it’s done with those data. It’s kind of like a calculator on wheels which can read instructions from the path it’s travelling on.
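If it helps, here’s that calculator-on-wheels idea as a few lines of Python. The little instruction set is entirely made up for illustration; the point is just the cycle of fetching a code, acting on the data and deciding where to look next.

```python
# A toy fetch-and-execute loop. Each memory cell holds a made-up code and a
# piece of data; the loop fetches one, acts on it, and works out where to go.

memory = [
    ("LOAD", 7),         # put 7 in the accumulator
    ("ADD", 5),          # add 5 to it
    ("JUMP_IF_POS", 4),  # a decision: if the result is positive, skip to cell 4
    ("ADD", 100),        # (skipped in this run)
    ("HALT", 0),
]

acc, pc = 0, 0                  # accumulator and program counter
while True:
    op, value = memory[pc]      # fetch the code and its data
    pc += 1
    if op == "LOAD":
        acc = value
    elif op == "ADD":
        acc += value
    elif op == "JUMP_IF_POS":   # where to look next depends on the result so far
        if acc > 0:
            pc = value
    elif op == "HALT":
        break

print(acc)                      # prints 12
```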

There are two basic design philosophies taken with microprocessors, which for the sake of brevity I’m going to call Central Processing Units, or CPUs. One is to get it to do lots of sophisticated things with a large number of instructions. This is called the CISC approach – Complex Instruction Set Computer. The CPUs in most Windows computers are CISCs. The other is to get it to do just a few things with a small number of instructions, and this is called a RISC – Reduced Instruction Set Computer. Chips in tablets and mobile phones are usually RISCs. In particular they’re probably going to be using one of the CPUs in the ARM series, the Acorn RISC Machine. The Pentium and the like are kind of descended from the very early Intel 8080, which had a large number of instructions, but the ARMs are the spiritual successors of the 6502.

The 6502, which was the CPU used in the Apple ][ and BBC Micro, and the similar 6510 used in the Commodore 64, were designed with simplicity in mind, and this involved using far fewer transistors than the Z80(A) found in the Spectrum. The latter had 694 instructions, many of which could only be accessed via prefix bytes acting a bit like shift keys on a typewriter. By contrast, not taking addressing modes into consideration, the 6502 only had fifty-six. Processors usually have single or double cells of memory, used to store numbers to work on or otherwise use internally, called registers. The Z80 had nineteen, I think, but the 6502 only had six. To be honest, I was always a lot more comfortable using the Z80 than the 6502, and as you may be aware this was a constant source of debate between nerds in the early ’80s, and to some extent still is. The 6502 always seemed really fiddly to me. However, it was also faster than its competitor, and comprised fewer transistors because of the corners which had been cut. Unlike the Z80, it used pipelining: it fetched the next instruction while executing the previous one. It was also faster because it had fewer instructions: it needed less time to work out what to do because the list of instructions was much shorter.

Research undertaken in the 1980s counted the proportions of instructions used in software, and it was found, unsurprisingly, that for all processors a small part of the instruction set was used the majority of the time. This is of course the familiar Pareto principle, the eighty-twenty pattern which applies to a wide range of phenomena, such as the fact that eighty percent of the income from my clients used to come from twenty percent of them, and that one herb out of five is used in four out of five prescriptions, and so on. In the case of CPUs, this meant that if the jobs done by the rarely used majority of instructions could be performed using the minority of often-used ones, the others could be dispensed with at the cost of taking up more memory. An artificial example of this is that multiplying six by seven could be achieved by adding seven to a running total six times, and therefore an integer multiply instruction isn’t strictly necessary. Of course, this process of performing six additions could take longer than the multiplication would have in the first place, but choosing the instructions carefully would lead to an optimal set, all of which would be decoded faster due to the smaller variety.
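To make that concrete, here’s the multiply-by-repeated-addition idea as a tiny Python sketch – roughly what a processor with no multiply instruction ends up doing, trading extra steps for simpler hardware:

```python
# Multiplication by repeated addition: more steps, less silicon.

def multiply(a, b):
    total = 0
    for _ in range(b):      # add a to the running total, b times over
        total += a
    return total

print(multiply(7, 6))       # 42, after six additions rather than one multiply
```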

The question therefore arises of the lowest possible number of instructions any CPU could have and still be Turing-complete. A Turing-complete machine is a computer which can do anything the theoretical machine Alan Turing described in 1936 could do; that machine consisted of an infinitely long strip of tape and a read-write head whose behaviour is influenced by the symbol underneath it. It more or less amounts to a machine which, given enough time, can do anything any digital computer could do. The calculator on wheels I described above is effectively a Turing machine. What it means, for example, is that you could take a ZX80, rewire it a bit, give it enough storage and have it do anything today’s most expensive and up-to-date PC could do, but of course very, very slowly. But it can’t be just any digital computer-like machine. For instance, a non-programmable calculator can’t do it, nor can a digital watch. The question is, as I’ve said, which instructions would be needed to make this possible.
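Here’s a toy version of Turing’s machine in Python, with a made-up rule table that does nothing more ambitious than add one to a binary number. The tape, the head and the table of rules are all there is:

```python
# A minimal sketch of a Turing machine: a sparse tape, a read-write head, and
# rules keyed on (state, symbol) giving (symbol to write, move, next state).

def run_turing(tape, rules, state="start", head=0, halt="halt", max_steps=1000):
    cells = dict(enumerate(tape))           # the tape; missing cells are blank "_"
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(lo, hi + 1))

# Rules for adding one to a binary number, head starting on the rightmost bit.
rules = {
    ("start", "1"): ("0", "L", "start"),    # 1 plus a carry becomes 0, carry on
    ("start", "0"): ("1", "L", "halt"),     # 0 plus a carry becomes 1, done
    ("start", "_"): ("1", "L", "halt"),     # ran off the left end: new digit
}

print(run_turing("1011", rules, head=3))    # 1011 + 1 = 1100
```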

There used to be a very simple programming language used in schools called CESIL (I created and wrote most of that article, incidentally – there are various deeply obscure and useless articles on Wikipedia which were originally mine). As you can see from the link, there are a total of fourteen instructions, several of which are just there for convenience, to enable input and output, such as LINE and PRINT. It’s a much smaller number than the 6502’s, but also rather artificial, since in reality it would be necessary to come up with code for proper input and output. Another aspect of redundancy is the fact that there are three jump instructions: JUMP, JINEG and JIZERO – unconditional jump, jump if negative and jump if zero. All that’s really needed there is JINEG – jump if negative. It can be ensured that the content of the accumulator register is negative in advance, in which case JINEG performs the same function as an unconditional jump, and since zero is only one greater than the largest negative integer, simply subtracting one and then performing JINEG does the same job as JIZERO, at least so long as the accumulator is known not to be negative to start with. Hence the number of instructions is already down to six, since multiplication and division are achievable by other means, although as constituted CESIL would then be too limited to allow for input and output because it only uses named variables. It would be possible, though, to ensure that one variable was a dummy which was in fact an interface with the outside world via some peripherals.
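Here’s a sketch in Python of that stripped-down six-instruction machine – my own toy interpreter, not real CESIL – showing both tricks: JINEG standing in for an unconditional jump by forcing the accumulator negative first, and standing in for JIZERO by subtracting one first.

```python
# A toy CESIL-like machine with only LOAD, STORE, ADD, SUBTRACT, JINEG and HALT,
# an accumulator and named variables. Program lines are (label, op, argument).

def run(program, mem):
    acc, pc, labels = 0, 0, {}
    for i, (label, *_rest) in enumerate(program):
        if label:
            labels[label] = i
    while True:
        _, op, arg = program[pc]
        pc += 1
        if op == "LOAD":
            acc = mem[arg] if isinstance(arg, str) else arg
        elif op == "STORE":
            mem[arg] = acc
        elif op == "ADD":
            acc += mem[arg] if isinstance(arg, str) else arg
        elif op == "SUBTRACT":
            acc -= mem[arg] if isinstance(arg, str) else arg
        elif op == "JINEG":                 # the only jump we keep
            if acc < 0:
                pc = labels[arg]
        elif op == "HALT":
            return mem

# Count COUNT down to zero using JINEG for every kind of jump.
program = [
    ("LOOP", "LOAD",     "COUNT"),
    (None,   "SUBTRACT", 1),
    (None,   "JINEG",    "DONE"),   # subtract one then JINEG: stands in for JIZERO
    (None,   "STORE",    "COUNT"),  # COUNT wasn't zero, so store COUNT - 1 back
    (None,   "LOAD",     -1),       # force the accumulator negative...
    (None,   "JINEG",    "LOOP"),   # ...so JINEG acts as an unconditional JUMP
    ("DONE", "HALT",     None),
]
print(run(program, {"COUNT": 3}))   # {'COUNT': 0}
```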

It has in fact been determined that a machine can be Turing-complete even if it has only one instruction, namely “Subtract One And Branch If Negative”. Incidentally, with special arrangements it can even be done with Intel’s “MOV” instruction, but that needs extensive lookup tables and special addressing modes. Hence there can be a SISC – a Single Instruction Set Computer, more often called an OISC, a one-instruction set computer. It isn’t terribly practical because, for example, to add two numbers in the hundreds this instruction would need to be executed hundreds of times. It depends, of course, on the nature of the data in memory, and this has an interesting consequence. In a sense, it is a zero instruction set computer (which is officially something different), because it can be assumed that every location pointed to by the program counter carries an implicit “SOBIN” instruction. The flow and activities of the program are actually determined by a memory location field in each word which tells the CPU where to go next. This means that the initial value of the data in the accumulator needs to be fixed, probably as a word with every bit set.
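Here’s one possible reading of such a machine as a Python sketch – my own interpretation, not a real design. Every word of program memory is just a pair of addresses, the instruction is always SOBIN, and branching to an address off the end of the program serves as a halt. Even counting one cell down from a hundred takes a couple of hundred executions, which is exactly the point.

```python
# A toy SOBIN machine: each program word is (cell to decrement, branch target).
# Subtract one from the cell; if the result is negative, jump to the target,
# otherwise fall through to the next word.

def run_sobin(program, mem, pc=0, max_steps=10_000):
    steps = 0
    while 0 <= pc < len(program) and steps < max_steps:
        cell, target = program[pc]
        mem[cell] -= 1                      # the only operation there is
        pc = target if mem[cell] < 0 else pc + 1
        steps += 1
    return mem, steps

mem = [100, -10**9]     # cell 1 is scratch, kept hugely negative so that
                        # decrementing it always branches: an unconditional jump
program = [
    (0, 2),             # count cell 0 down; once it goes negative, halt (2 is off the end)
    (1, 0),             # "jump" back to the start via the always-negative scratch cell
]

mem, steps = run_sobin(program, mem)
print(mem[0], steps)    # -1 201: two hundred and one executions just to count to a hundred
```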

This would all make for a very difficult machine to work with, but it is possible. It would be very inefficient and slow, but it would also reduce the number of logic gates needed to a bare minimum. It would simply consist of an arithmetic unit, which thanks to two’s complement would in fact just be a binary adder, since subtraction amounts to adding a negated value. Negative integers can be represented simply by treating the second half of the interval between zero and two raised to the power of the word length as if those numbers were negative.
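A quick sketch of the two’s complement point, assuming an eight-bit word: the top half of the 0–255 range is read as -128 to -1, and subtraction falls straight out of addition because everything wraps modulo 256.

```python
# Two's complement in an eight-bit word: store signed values as raw bytes and
# let ordinary addition, wrapping modulo 256, do the subtraction for free.

BITS = 8
MOD = 1 << BITS                     # 256 for an eight-bit word

def to_twos(n):                     # represent a signed value as a raw byte
    return n % MOD

def from_twos(raw):                 # read a raw byte back as a signed value
    return raw - MOD if raw >= MOD // 2 else raw

# 5 - 3 done purely with the adder: add the two's complement of 3.
a, b = to_twos(5), to_twos(-3)      # -3 is stored as 253
print(from_twos((a + b) % MOD))     # prints 2
```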

This is a one-bit binary adder:

It works like this: 0+0=0 with no carry, 0+1=1 with no carry, 1+0=1 with no carry and 1+1=0 with carry. The symbols above, by the way, are XOR – exclusive “either/or” – at the top, AND in the middle and OR (strictly AND/OR, the inclusive version) at bottom right. These connectives describe the inputs and outputs, where one, i.e. a signal, is truth and zero, i.e. no signal, is falsehood. Incidentally, you probably realise this, but this logic is central to analytical philosophy, meaning that there are close links between philosophy, mathematics and computer science, and if you can understand logic you can also understand the basis of much of the other two.
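In code, the gates named above behave like this. (My own sketch: strictly, the truth table I’ve just given is the “half adder” – sum from the XOR, carry from the AND – and chaining two of them with an OR on the carries gives a full adder which also accepts a carry coming in from the bit below.)

```python
# The adder logic built from the three gates named above.

def xor(a, b): return a ^ b
def and_(a, b): return a & b
def or_(a, b): return a | b

def half_adder(a, b):
    return xor(a, b), and_(a, b)            # (sum, carry)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, or_(c1, c2)                  # (sum, carry out)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))   # matches the table above
```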

Most processors would do all this in parallel – an eight-bit processor would have eight of these devices lined up, enabling any two integers between zero and two hundred and fifty-five to be added, or any two integers between -128 and 127 to be added or subtracted, either way in one go. But this could also be done in series if the carry is stored and the serial format is converted to and from parallel to communicate outside the processor. This reduces the transistor count further. All logic gates can be implemented using NAND gates, or if preferable NOR gates. A NAND gate can be implemented with two transistors and three resistors, and a NOR gate with the same components connected differently. Transistors could also be replaced by valves or relays, and it would also be possible to reuse some of the components in a similar manner to the serial arrangement with the adder, although I suspect the point would come when the multiplication of other components would mean this was no longer worthwhile.
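To back up the claim that NAND is enough on its own, here’s a sketch building the other gates out of nothing but NAND and then doing an eight-bit addition bit by bit with a stored carry, in the serial spirit described above:

```python
# Everything reduced to NAND, then an eight-bit add done one bit at a time.

def nand(a, b): return 1 - (a & b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def add8(x, y):
    carry, total = 0, 0
    for i in range(8):                       # serial: one bit at a time, carry stored
        a, b = (x >> i) & 1, (y >> i) & 1
        s = xor(xor(a, b), carry)
        carry = or_(and_(a, b), and_(xor(a, b), carry))
        total |= s << i
    return total                             # wraps modulo 256, as the hardware would

print(add8(200, 56))    # 0, because 256 wraps back round to zero in eight bits
```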

I can’t really be precise about the exact number of components required, but clearly the number is very low. Hence some kind of computer of a reasonable size could have been implemented using valves or relays, or discrete transistors. ROM could be realised via fuses, with blown fuses as zeros and intact fuses as ones, and RAM by capacitors, by Williams tubes – cathode ray tubes with a pickup plate to read the charge pattern back off the screen – or with little magnetic rings threaded onto wires, all methods which have been used in the past. Extra parts would of course be needed, but it is nonetheless feasible to build such a computer.

I feel like I’m on the brink of being able to draw a logic diagram of this device, but in practical terms it’s beyond me. I haven’t decided on input and output here, but that could be achieved via arrays of switches and flashing lights. And if this could be done with 1950s technology, who knows what the limit would be?