Paul McLellan

Programming Early Computers Was Very Different from Today

15 Apr 2021 • 10 minute read

In my post "I Couldn't Imagine Being Too Poor for Servants, or Rich Enough for a Car" I wrote about that quote from Agatha Christie. Today, when a flashlight app on your smartphone needs "only" 250KB, it is hard for people, and by people I mean engineers and programmers, to understand just how limited the capabilities of early computers were. One example that I'll get to later in this post is the IBM 1130, which could have as little as 4K words of memory (16-bit words, so we are talking 8 kilobytes). But it could compile Fortran programs in that space...even Fortran programs that required more memory than that to run. It had a 15-bit address space of 16-bit words, so what we would call 64KB today.

A computer of very limited power I've written about before (in my post The First Computer on the Moon) was the Apollo guidance computer. It had 2,800 chips, each of which contained two 3-input NOR gates, for a total of about 8.5K gates. That number is based on the fact that a 3-input NOR gate needs six transistors in CMOS, and so would count as 1.5 gates using the normal rule of thumb of four transistors per gate, based on a 2-input NAND gate. But, of course, this wasn't CMOS, so you are welcome to do your own calculation by some other metric. When I came to the US and started work at VLSI Technology, the largest ASIC design we could consider was around 10K gates, so a little bigger than an Apollo guidance computer (and the name ASIC had not yet been invented; we said "custom" as opposed to "off-the-shelf", like suits).
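For the curious, here is that estimate worked through as a quick sanity check. This is just the rule-of-thumb arithmetic described above, not a statement about the AGC's actual RTL circuits:

```python
# Back-of-the-envelope version of the gate-count estimate above. This follows
# the rule of thumb in the text, not the AGC's actual RTL transistor counts.
chips = 2800                 # dual 3-input NOR ICs
nor_gates = chips * 2        # two NOR gates per chip
transistors_per_nor = 6      # a 3-input NOR in CMOS (the text's approximation)
transistors_per_gate = 4     # one "gate equivalent" = a 2-input NAND
gate_equivalents = nor_gates * transistors_per_nor / transistors_per_gate
print(f"{nor_gates} NOR gates is about {gate_equivalents:.0f} gate equivalents")
# -> 5600 NOR gates is about 8400 gate equivalents, i.e. roughly 8.5K
```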

The computers I know best are the ones I actually spent time programming. So I'm going to be self-indulgent and use my own experience as the story of the development of computers. I realize there were lots of other computers at each stage in this story, but I didn't program them so I don't know much about them.

ICT/ICL 1900

I learned to program when I was 14. That was in FORTRAN IV and also in a sort of fake assembly code known as City & Guilds Mnemonic Code. Initially, it was all a bit theoretical, since this was the mainframe era and access to a computer was not easy to come by. Cheltenham Technical College, about 20 miles away, had an ICL 1904. In fact, it was an ICT 1904, since it had been acquired before the company changed its name to International Computers Limited from International Computers and Tabulators. The college ran courses on the two programming languages, and the head of the math department at my school managed to get some of us enrolled. Of course, we only got to run our programs when we were on-site; this was before the days of modems.

We then persuaded ICL at its headquarters in London to give us free computer time, and persuaded the school to purchase a teletype so we could prepare punched paper tape. Running a program consisted of punching it out on paper tape; once a week we packed up all the tapes and mailed them (by snail mail) to Putney, in the west of London, where ICL was located. They would run the programs, which produced output on line printers, and then mail the printouts back. So if you had a syntax error in your program, it took you a week to fix.

It is hard to comprehend how impractical it is to debug a program when you only get to run it once a week and there are no interactive debugging facilities.

Somehow, we also got access to an ICL 1904 at a nearby RAF airbase, with the delightfully Harry-Potterish name of RAF Quedgeley. I'm not sure how; probably one of the pupils at the school had an RAF officer parent who pulled some strings. Now we didn't have to mail tapes to London; two evenings a week we got an hour of time there. I was usually the person who had to go and get the programs run. At first, I was just allowed to load the tapes in the reader, and the operator (mainframes had operators) would type the commands to read the tape, compile the program, and run it. He soon got bored with that and let me do it myself while he had a coffee before he had to do all the magnetic tape backups of the disks. It was definitely fun being a 16-year-old with full control of a million-dollar mainframe computer. Yes, I was that woman in the picture above.

But apart from the operator's console, which couldn't be used for editing or debugging programs, we still got only a couple of runs of our programs each week, with nothing but print statements (actually WRITE in FORTRAN) for debugging.

I wouldn't get to use an interactive text editor until I left high school. In those days, Cambridge University had its own entrance exam in December for the following September, which meant doing an extra term of high school and then having nine months to do something else. Nowadays, taking a year off between high school and university is called a "gap year", but it wasn't back then. Anyway, for part of that time I went and worked for a friend of my father's (actually my godfather) at the Cambridge University Engineering Department.

Atlas II

The main machine I used there was called Atlas II, and it was installed at the Computer-Aided Design Center somewhere outside the city. I don't remember ever seeing it. Finally, I got to use a teletype to edit and debug programs. This still took place at just 10 characters per second, but it was a wonderful step up from punching paper tapes and getting two runs a week. Since I didn't get an office, I worked out of a shared area on the top floor of the engineering building.

It was a very odd machine when I think about it. Most of the time I was programming in FORTRAN, so the compiler hid the architecture. It had 48-bit words. There were 128 registers, some of which had dedicated purposes (one was the program counter, and register 0 always held 0). Surprisingly for the time, it had virtual memory (2 million 48-bit words, so 12MB in modern terminology). But the physical memory was just 16K words, or 96KB. One result of this was that when you tried to run a program, you often had to wait 30 minutes for enough memory to become available for the operating system to load it. If the text editor had been treated as a normal program like that, it would have taken forever to get anything done; instead, it was built into the operating system and, in a tour de force by whoever wrote it, fitted into a single disk page.
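Putting those numbers into modern units makes the imbalance obvious. A quick sketch, assuming 48-bit (6-byte) words as stated above:

```python
# Atlas II memory sizes in modern units, assuming 6-byte (48-bit) words.
WORD_BYTES = 48 // 8                     # 6 bytes per word
virtual_words = 2 * 1024 * 1024          # 2M words of virtual address space
physical_words = 16 * 1024               # 16K words of physical core

print(virtual_words * WORD_BYTES / 2**20, "MB of virtual memory")   # 12.0 MB
print(physical_words * WORD_BYTES / 2**10, "KB of physical core")   # 96.0 KB
print(virtual_words // physical_words, "times overcommitted")       # 128
```

With the virtual address space 128 times larger than physical memory, it is no wonder loading a program could mean a long wait.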

I taught myself to program it in assembly code, which was a challenge. There was no real mnemonic assembler, so you had to learn the numeric opcodes (121 was load, I still remember) and keep track of what was in the 128 registers yourself. The only concession to modernity was that you could label storage for variables, jump targets, and subroutine calls. So very primitive.

In the unlikely event that you want to know more about this computer (of which only three or four were made), Wikipedia has a page.

IBM 1130

In the basement of the engineering building was an IBM 1130 that had some primitive graphics and could be booked for exclusive use for periods, especially in the evenings when no courses were being run. There's one in the Computer History Museum, too, that I took the above photograph of. I think the Cambridge Engineering Department 1130 had 16K words of memory (16-bit words), so 32K bytes. As I said at the start of the post, it had a FORTRAN compiler, which was another tour de force since it could run in just 4K words. It seems it had a 3.6µs memory access time, so perhaps an average of 5µs per instruction, or 0.2 MIPS. But compared to the Atlas it was blazingly fast, since you were not sharing it with anyone. It also had disks with exchangeable cartridges, each of which held...be still my beating heart...one megabyte. There were two drives. One held the operating system and applications; the other could be used by a class to hold the students' code, or, if you were an employee, you might have your own cartridge, as I did.
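Those figures work out as follows. A rough sketch; the 5µs-per-instruction average is the guess made above, not a measured figure:

```python
# Rough IBM 1130 numbers from the figures above; the 5us-per-instruction
# average is the guess made in the text, not a measured figure.
words = 16 * 1024                 # 16K 16-bit words
print(words * 2 // 1024, "KB of memory")          # 32 KB

avg_instruction_us = 5.0          # assumed average, given a 3.6us memory cycle
print(1 / avg_instruction_us, "MIPS")             # 0.2 MIPS
```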

I suppose it was like my first PC: a computer that I had exclusive access to (well, along with whatever friends were there, too) and on which I could edit and run programs. I don't even remember much of what I wrote. One program I remember creating had the London Underground ("the tube") map in a data file and would work out reasonable routes between destinations (such as getting to Victoria Station from Acton Town). I was pretty pleased with that.
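That routing program is long gone, but the idea is easy to sketch. Here is a minimal breadth-first search over a toy fragment of the network; the stations and connections below are illustrative stand-ins, not the original data file:

```python
from collections import deque

# A toy fragment of the Underground map. The stations and connections are
# illustrative stand-ins; the original program's data file is long gone.
TUBE = {
    "Acton Town": ["Hammersmith"],
    "Hammersmith": ["Acton Town", "Earl's Court"],
    "Earl's Court": ["Hammersmith", "Victoria"],
    "Victoria": ["Earl's Court"],
}

def route(start, goal):
    """Return a fewest-stops route from start to goal, or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in TUBE.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

print(" -> ".join(route("Acton Town", "Victoria")))
# Acton Town -> Hammersmith -> Earl's Court -> Victoria
```

A fewest-stops search is not quite the same as a "reasonable" route (real journey planners also weight interchanges and walking time), but it is the same shape of problem, and it fits comfortably in a few kilobytes of code and data.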

I think that to debug programs you could set breakpoints and single-step. So this was a true luxury compared to only being able to run a program once or twice a week. And no waiting for your program run to start.

IBM 370/165

Once I arrived as an undergraduate, the computer we were meant to use was an IBM 370/165 run by the university computing service. It was on an inaccessible floor of the building, attended by a priesthood of operators. I think we got a guided tour once as computer science majors, but most people never saw inside the room. The best I can do for a picture is the 360 at the Deutsches Museum in Munich (obviously it would normally have had its covers on when in use).

For most people, the main means of access was via what was known as the cafeteria system. You prepared your programs or data on punched cards. You went to a public lobby, put your cards in the card reader, and it would read them and run the job; then you walked over to the line printer to get your printout. You could then go to the punch room, where there were lots of card punch machines, make any changes, and go back and repeat. The same computer ran dozens of terminals all over Cambridge. I know it had one megabyte of memory. Because it had a cache and virtual memory backed by a fixed-head disk, it is hard to work out the performance. The IBM archive says:

Both Models 155 and 165 have two-level memory systems: a very high-performance buffer storage backed by a large main core storage. This hierarchy, in which the CPU gets data directly from the faster buffer most of the time, significantly reduces the effective main storage cycle and closely matches the memory cycle to CPU cycle. Model 165 main storage has a 2-microsecond cycle and its buffer storage has an 80-nanosecond cycle. The Model 155 main storage has a 2.1-microsecond cycle while its buffer has a 115-nanosecond cycle. In operation, the 155 accesses four bytes in two cycles—230 nanoseconds.
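To see why that buffer mattered so much, here is the standard effective-cycle calculation using the Model 165 figures from the quote. The 90% hit rate is purely an assumption for illustration; the quote gives no figure:

```python
# Effective memory cycle for the Model 165's two-level memory, using the
# cycle times from the IBM archive quote. The 90% buffer hit rate is an
# assumption for illustration; the quote gives no figure.
buffer_ns = 80       # Model 165 buffer (cache) cycle
core_ns = 2000       # Model 165 main core storage cycle
hit_rate = 0.90      # assumed

effective_ns = hit_rate * buffer_ns + (1 - hit_rate) * core_ns
print(f"{effective_ns:.0f} ns effective cycle")   # 272 ns with these assumptions
```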

Later, when I was in my final year doing computer science, I finally got to use interactive terminals on the 370. Around then, they added the...ta-da...second megabyte of memory. Unusually for the time, it was not ferrite core, like the first megabyte, but "solid-state". I don't know what the memory technology was, but anyone today would be surprised that it was across the room from the CPU. Since light travels about a foot in a nanosecond, you could not run memory at anything like today's speeds over that distance.

Later Computers I Programmed

I planned to write about the VAX 11/780, Apollo workstations, Sun workstations, PCs, and more. But this post is long already, so enough of my reminiscing and comparing how hard things used to be with today, when you have stacks of open-source libraries, fast computers, IDEs, and effectively unlimited memory. And that's just in your smartphone.

Yes, I have programmed that, too. See my 2009 EDAGraffiti post EDA on the iPhone.

Four Yorkshiremen

Of course, a blog post like this comes across as "you young people today don't know how things used to be", of which the perfect example is Monty Python's Four Yorkshiremen sketch.


Sign up for Sunday Brunch, the weekly Breakfast Bytes email.