Table of Contents
- EXTENDED ITEMIZED FOOTNOTES
- HOW RARE ARE THESE COMPUTERS? (an estimate)
- SYSTEM STATISTICS (summary of units sold, etc. of the systems covered)
EXTENDED ITEMIZED FOOTNOTES
VIDEO URL: Domesticating the computer: how the appliance computer came to be – YouTube
The following notes were in the original video transcript, which was deemed too verbose. But they do highlight some additional details and background about why these systems were pertinent.
The Datapoint was designed by a couple of ex-NASA engineers in San Antonio, who hired a professional industrial designer out of New York for the case. At first glance, this machine looks like just another terminal, which was something you would use to connect to a larger mainframe. Well, turns out, this Datapoint was a standalone general purpose programmable system. Problem was, nobody knew how to market it – everyone leased computers in those days and CTC didn’t want to ruffle the feathers of IBM.
But there was a company, Pillsbury Farms, that took notice of these Datapoint systems. They got the idea to put these small units right in their poultry farms, programmed them to do payroll overnight, and then mailed updates to their main computers in Minneapolis. That is, they programmed the machine to do something beyond what it had originally been built to do.
During development of the DP2200, CTC contracted with Intel to make a processor that would run this system. At the time, Intel made EPROMs and memory chips; it was this outside company that fed them the idea of making a microprocessor. The contract agreement for that work is still in archives in San Antonio.
The engineers between CTC and Intel collaborated to define the instruction set, and that work eventually became the 8008 chip.
Now, Intel ended up being late (apparently there was a recession and they needed to concentrate on existing products that were making a profit). CTC needed to get their own product out the door, so they went with TTL chips instead and developed a kind of serial processor that received the low-order bits first. Basically, that means it was little-endian. The TTL design was later replaced by Intel’s eventual chip, but to maintain backwards compatibility with the fielded CTC systems, Intel kept the little-endian byte order – a convention its processors retain to this day.
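To make the byte-order idea concrete, here is a minimal Python sketch (not from the video – the function names are mine) contrasting the two interpretations of the same byte stream:

```python
def from_little_endian(data):
    """Assemble an integer from bytes received low-order first,
    as the Datapoint's serial design (and Intel's chips) do."""
    value = 0
    for i, byte in enumerate(data):
        value |= byte << (8 * i)  # each later byte is more significant
    return value

def from_big_endian(data):
    """The same bytes interpreted high-order first, for contrast."""
    value = 0
    for byte in data:
        value = (value << 8) | byte
    return value

# The two-byte sequence 0x34, 0x12 means 0x1234 on a little-endian machine:
assert from_little_endian([0x34, 0x12]) == 0x1234
assert from_big_endian([0x34, 0x12]) == 0x3412
```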
Next up is the KENBAK-1. Only about 40 to 50 of these were ever made, but they were sold fully assembled for $750 in 1972.
The KENBAK was only intended for educational purposes, to learn the concepts and ideas of digital processing. It had no microprocessor, but it did have a complete instruction set. So you could program this machine, like a modern-day headless Arduino, but it only had 256 bytes of memory.
Years later, in 1986, the KENBAK-1 was officially designated the first personal computer by a panel of experts from two prominent computer museums. And it’s true, this was a “pure computer” that an individual could buy and do stuff with right out of the box. But as far as a consumer appliance goes, we weren’t quite there yet.
Intel Intellec 8
Now, most everyone knows the Intel 4004 and 8008 microprocessors came out at about the same time in early 1972. They were a new kind of chip, and didn’t do much on their own. TTL chips continued to be overall faster than microprocessors for a few more years.
But the 8008 was a starting point. To show developers how to use this chip, Intel had this expensive kit called the Intellec – they had a version of this kit for all their processors.
Intel provided RAM and EPROMs in the kit on some prebuilt cards, along with lots of
documentation describing the instruction set and how to put a system together.
But more than that, Intel also provided development tools, like a PL/M or FORTRAN compiler. So the kit was expensive, and you also needed some good hardware already to do your development on. Intel stayed focused on their patents and core chip production and left it to someone else to figure out how to make a computer system out of it.
We’ll get to that shortly; first there are a few other systems to get through.
One of those steps is this little device called the Odyssey. Developed by Ralph Baer and released in 1972, it had an interface adapter to connect the video output to a standard television set that people had at home.
The Odyssey box would depict a paddle and ball image onto the television set, and you could interact with those graphics to virtually bounce the ball across the screen.
What’s important here is that the Odyssey was one step into people’s homes. It opened up the idea of having a small electronic gadget at home interacting with the television set – and of how to market such a thing.
And once other engineers saw this, naturally they expanded upon this idea, which as we’ll see, they did in a big way a few years later.
Next is the Hewlett-Packard model 9830A. This device was the peak of a series of programmable calculators from the prior several years.
The 9830A had a full implementation of BASIC in a built-in ROM, along with several other software packages in ROMs that could be inserted into the side.
With a full keyboard and tape storage, this is starting to look like a practical personal computer – except for two issues: the machine had only a single-row display, and it cost about $6000 in 1973, which was more than the cost of an average car at the time.
Still, these HP calculators served many NASA engineers very well throughout the ’60s and ’70s, since they could crunch numbers on their own without being dependent on any mainframe connection.
Another great system of 1973 is the Wang 2200.
At first glance it looks like it was the first liquid-cooled computer, but afraid not: it just had a very large external CPU. The very first versions of the Wang 2200 also had an external power supply; they solved that fairly quickly and revised the power supply to fit within the main cabinet.
Speaking of power supplies, we consumers tend to take all that for granted. But to the electronics folks, stable and reliable power is a big deal – just ask them about ripple. Power supplies were difficult to make in the early 1970s, and small reliable switching power supplies were relatively new. On top of that, power lines of a building were also not quite as reliable or consistent as they are today.
Anyhow, the main cabinet here is starting to look like a functional desktop computer: a full keyboard and screen, with an up-front storage device. And the system started right up to a prompt where you could enter BASIC programming code.
But one issue here is that even the BASIC was implemented in TTL logic. As they found bugs, which they did, it became difficult to fix or improve that software. One such bug, or limitation, is why the BASIC could only address up to 32KB instead of a full 64KB.
Still, the Wang series of systems was useful to many small businesses, especially in word processing. The founder, An Wang, was able to fund this business because, more than a decade prior, he had sold several patents to IBM – including one on core memory, a technology that ended up in the famous IBM System/360 of 1964.
Next up is the Micro Computer Machines model 70, developed in Toronto and first sold in 1974.
The designer was Mers Kutt, who was motivated to build a more efficient computing system after experiencing the agonizing delays of punched cards and batch processing.
Like the HP 9830A, the MCM/70 was often viewed as a fancy programmable calculator – but with two key differences. First, it was programmed using APL instead of BASIC, making it one of the only portable devices ever to do so. And second, it used the new Intel 8008 microprocessor. So by all accounts it was a portable computer: it could fit under the seat of an airplane.
Since the 8008 wasn’t yet available during development, Kutt and his team used an 8008 emulator written in FORTRAN to produce the APL interpreter.
If you ever come across an MCM/70, one thing to notice is that it has no apparent ON/OFF switch; there is a START key on the keyboard for that purpose. Another interesting feature of the MCM/70 is its use of virtual memory, where the tape drives are used to give the system about 100KB of programming workspace – huge for 1974, especially given the 8008’s 14-bit addressing (a 16KB address space).
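As a rough, hypothetical sketch of that technique – the actual MCM/70 implementation is not described here, and all the names below are invented – tape-backed virtual memory keeps only a few pages resident in RAM and swaps the rest to a backing store on demand:

```python
from collections import OrderedDict

PAGE_SIZE = 256   # hypothetical page size
RAM_PAGES = 16    # only ~4KB of the 8008's 16KB space kept resident

class TapeBackedMemory:
    """Toy model of a workspace far larger than physical RAM."""
    def __init__(self, total_pages):
        # The "tape" holds every page; RAM holds a small LRU-ordered subset.
        self.tape = {p: bytearray(PAGE_SIZE) for p in range(total_pages)}
        self.ram = OrderedDict()

    def _page_in(self, page):
        if page in self.ram:
            self.ram.move_to_end(page)            # mark as recently used
        else:
            if len(self.ram) >= RAM_PAGES:        # RAM full: evict oldest
                old, data = self.ram.popitem(last=False)
                self.tape[old] = data             # write evicted page back
            self.ram[page] = bytearray(self.tape[page])  # copy in from tape
        return self.ram[page]

    def read(self, addr):
        return self._page_in(addr // PAGE_SIZE)[addr % PAGE_SIZE]

    def write(self, addr, value):
        self._page_in(addr // PAGE_SIZE)[addr % PAGE_SIZE] = value

# A ~100KB workspace, addressed byte by byte, backed by only 16 pages of RAM:
mem = TapeBackedMemory(total_pages=400)   # 400 * 256 = 102,400 bytes
mem.write(90_000, 42)
assert mem.read(90_000) == 42
```

The trade-off, of course, is speed: a tape seek on every page fault would be orders of magnitude slower than RAM, which is why this was workspace expansion rather than a performance feature.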
By 1974, the idea and the term “personal computer” were very much in the air and on people’s minds, especially after the July 1974 article in Radio-Electronics about the Mark-8 computer.
While the MCM/70 was both personal and portable, it still cost more than a car. Plus, with the
screen being a single line, the overall system was mostly a niche unit for engineers.
So our story now continues, to the IBM 5100.
One thing to know about the 5100 is that it was released right after the trial portion of the famous IBM antitrust case had begun. The case had started 6 years earlier and would not be dismissed until nearly 6 years later.
The case centered around the Sherman Act of 1890, related to corporate monopolies and non-competitive practices. In this specific lawsuit, it related to the idea of “bundling” of both software and hardware.
Shortly after the case started in 1969, IBM began leasing (and eventually selling) software separately from the hardware. Keep in mind, that was a radically new concept in the early 1970s.
Historically, the price of a computer was both the hardware and necessary software it needed, generally produced by those who made the hardware. The idea of software as a separate media, like books or movies, wasn’t quite there yet. There was no “packaged software” or “software industry” as of yet.
So, for 6 years, IBM had been on relatively good behavior, to avoid giving the US Government any more evidence of practices that might be considered overly non-competitive.
This is partly why IBM systems were still so expensive. During the trial, IBM raised prices to deliberately reduce their market share percentages, to help make them look like less of a monopoly.
And, it turns out this is why the IBM 5100 supports BASIC.
Let’s back up a little here… The IBM 5100 prototype was called SCAMP, and the “A” in SCAMP meant APL. That prototype was started in 1973, directly because an IBM engineer rolled a Wang 2200 into an executive’s office to show the idea of a “portable computer.”
Of course, with that dangling external power supply and CPU, the Wang system didn’t seem very portable. The IBM folks figured they could do better, and APL was very much in fashion at the time. The SCAMP prototype was a great success – although they did have their own power supply issues, which they sorted out.
But another IBM executive voiced one concern: supporting only one language might appear to be non-competitive. If they could show that the system supported multiple languages, it would at least appear as if additional languages could also be supported.
And conceptually, that is true, given how the Executive ROS of the system was set up. One could produce a FORTRAN card, or COBOL card, if it was written using one of the instruction sets that the IBM 5100 Executive already supported. The microcode technology used in the IBM 5100 made that more easily possible.
But more practically speaking, during review of the SCAMP prototype, another engineer also suggested that the 5100 might not be successful if it didn’t support BASIC just due to BASIC being so much easier to use.
So, it was settled: despite costing some development time (which pushed the release from 1974 into 1975), support for BASIC would be added to the system, essentially using the same technique they used to support APL, but with a different instruction set.
To explain a little bit about that: the IBM 5100 uses its own native instruction set to emulate the instruction sets of other systems. This is somewhat like the concept of Java, where byte code gets translated on the fly by the JVM to something the native system understands.
So, the IBM 5100 did have lots of existing software support borrowed from the System/360 and System/3. But, just like with Java, the system runs slower as it does a double translation of the operations.
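As a toy illustration of that emulation idea – this is not the 5100’s actual microcode, and the instruction set below is invented – a “host” loop decodes each “guest” instruction at runtime, so every guest operation costs several host operations:

```python
# Toy "guest" instruction set: (opcode, operand) pairs. The host loop
# decodes each one on the fly, like the 5100's Executive emulating a
# borrowed instruction set (or a JVM interpreting bytecode).
def run_guest(program):
    acc = 0                       # guest accumulator
    for op, arg in program:
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        else:
            raise ValueError(f"unknown opcode {op}")
    return acc

# (2 + 3) * 4 expressed as guest instructions:
program = [("LOAD", 2), ("ADD", 3), ("MUL", 4)]
assert run_guest(program) == 20
```

Each guest instruction here triggers a dispatch, a comparison chain, and an arithmetic step on the host – which is exactly why emulated software runs slower than native code.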
It is only slower when those languages are involved. The 5100 also has a built-in “monitor” called the Diagnostic Control Program, which can be used to directly enter native machine code, with the option to save and load those programs to tape. I show a demonstration of this here.
The biggest complaint about the 5100 was the size of its 5″ CRT. But the 5100 also had an external BNC connector that could daisy-chain a mirrored image across several monitors – like the typical 9″ or 14″ RCA monitors available at the time. So, for example, this could be helpful for reviewing lab results of an experiment running in a different room – maybe a room full of radiation. There were several “scientific instrument adapters” and plotters made for the 5100.
One last thing about the IBM 5100: around the year 2001, there were some mysterious forum messages from a John Titor looking for an IBM 5100, stating that it was needed for some kind of time travel. That’s beyond the scope of this discussion, but years later in 2011, a half-decent anime (Steins;Gate) was created based on this time-travel theme and the use of an IBM 5100 (which, in the story, is relabeled as the IBN 5100).
It is somewhat of a Back to the Future like story, but with a more serious tone and involves CERN, the big research organization over in Europe.
(more details about the IBM 5100 can be found here)
So now we get to the very famous Altair 8800.
The Altair is well written about and is widely said to be the first “personal computer.” Now, as we’ve seen, there are some earlier systems that were in that category of being a personal computer. (and near the end, we show samples of even earlier personal computers)
But what makes the Altair special was the price and the S-100 bus (which was briefly also called “Altair bus”). A company called MITS developed the Altair 8800, and they were able to get the price down by making a special deal with Intel on a bulk order of 8080 processor chips. We’re talking $75 per chip instead of over $300, so it was a significant discount.
As for the S-100 bus, this meant the Altair could be expanded to do many more things. So, once again, let’s back up a little bit.
Before the Altair 8800, there were two other famous kits: the TV Typewriter in 1973, and the Mark-8 in 1974. The Mark-8 was based on the more primitive 8008, but it whetted the appetite for the idea that a hobbyist could build their own computer.
But also, one could build their own keyboard and TV interface. Combine all that together, and you’ve got a personal computer. Although, you do still need a practical way to program the thing (EPROMs were still difficult and expensive to work with).
The Altair 8800 packaged up a lot of this into one kit, and it was way cheaper than the Intel Intellec – except recall, the Intellec came with lots of EPROM and software development tools. At the very least, the Altair had flip-switches to use for programming. But with the S-100 bus, you could acquire or build other ways to interface with the system.
And by mid-1975, Bill Gates, Paul Allen, and Monte Davidoff had developed and sold to MITS a BASIC interpreter for the Altair 8800. They didn’t invent BASIC — as we saw, BASIC was present on some earlier systems — but they were first to bring it to a microcomputer. While the Altair kit was around $450, to pull off running BASIC was still probably $1000 to $1500 of kit at the time.
Gates aggressively promoted the idea that software should be a paid-for product – which was a big paradigm shift and began the idea of software licenses. This naturally caused a rift in the homebrew computing community, but that’s a whole other discussion.
MITS soon had the problem of not being able to make their product fast enough for the orders being received. And therein was the problem with the Altair 8800: it was a viable personal computer, but there was no process or preparation for mass production.
And to expand the system into that personal computer, one would still spend several thousand dollars, or be savvy enough on acquiring the necessary parts. So, the Altair wasn’t quite yet a
practical home computer for the masses. Also, not long after, there were Altair clones that cut into MITS profits.
So then about a year later, in 1976, the SOL-20 started to address some of those issues. To complete the system still required several S-100 cards, but the SOL-20 started to refine the idea of what a “home personal computer” would look like.
And, more software was becoming available, including the CP/M operating system and a word processing title called Electric Pencil. The SOL-20 and its accessories were still fairly expensive,
since there was still no streamlined approach to building them.
Another note here, Dennis Hayes also designed one of the first S-100 modems around this time. An asynchronous serial IO card lets two computers communicate with each other, but a modem allowed two distant computers to communicate across a phone line.
These modem devices led to the creation of electronic Bulletin Board Systems, the first of which went online in 1978. AT&T still had very tight control over the phone network and what kinds of devices could be used across it – even answering machines (specifically, they were concerned about electrical pulses being sent over the lines). But the FCC reined them in, and AT&T was eventually split up in 1984.
TRINITY OVERVIEW – 1976
Now we enter the pivotal year of 1977. But before doing so, it is important to talk about the “single board computer” work of 1976. This was the “Model T” moment of personal computers: to produce thousands of the same product with reasonable time and quality, the process needs to become as efficient and inexpensive as possible.
The more functionality that fits onto a single board, the fewer passes through an assembly line and the fewer parts there are to test and subsequently fail. It also means the overall computer cabinet can become smaller, and all of this helps reduce the cost.
There were several single-board designs complete enough to be sold in 1976, though they still required some technical know-how to set up and operate. Only about 200 Apple-I boards were sold, but that was enough to help kickstart financing of the more well-known Apple 2 a year later. The less expensive KIM-1 sold over 10,000 units (the “dash 1” meant 1K of memory).
Inspired by the KIM-1, a group of NEC engineers in Japan made the TK-80, where TK means Training Kit. Over 20,000 of those were sold. Development of the TK-80 eventually led to the PC-8001 system in 1979.
But besides being fancy calculators, what could these systems do? How would they be marketed to everyday consumers?
Well, other than BASIC and Electric Pencil, another major piece of software made for these early microcomputers was Microchess by Peter Jennings. Using just 1K of memory, you could actually play a full game of chess against the computer. The famous chess master Bobby Fischer was also involved in that project, which went on to be improved for many years across a wide range of platforms.
Anyhow, in 1974–1975, the fashion was that an amateur computer would generally consist of at least six separate boards. By 1976, this got refined and consolidated down to a single motherboard.
With that in mind, now we’re ready to talk about The Trinity…
TRINITY – 1977
The original three microcomputers that ended up being the most successful were the Commodore PET, Apple 2, and Tandy TRS-80.
The term “Trinity” came many years later, around the mid-1990s. But back in 1977, there was a lot of microcomputer competition, and nobody yet knew which ones would end up becoming successful.
One competitor was IMSAI, who had basically upscaled the Altair design and made it into a more PDP-8 like minicomputer style. They were successful for what they were, but weren’t sticking with the affordable-computing ideal that started the homebrew computing clubs.
SPHERE was another almost-was competitor; they had a functional personal computer by 1975, based on the Motorola 6800 chip. Allegedly only about 1,000 units were sold before they had to fold up shop.
But as one looks at this variety of competitors, you see that the form and shape of a personal home computer is still up in the air.
One main theme in this generation of microcomputers was the built-in ROM BASIC at startup: you could turn the system on and start programming right away, or load in an existing program from tape. Nobody had disk drives yet, since nobody sold any software on disks anyway. Disks became more common by 1980, when Zork-1 was first ported from a PDP-10 to the TRS-80 – Zork on a cassette just wouldn’t work.
The other difference was that these were prepackaged standard models, which could then have standardized training and technical manuals. The all-in-one Commodore PET was an easy system to order for schools, and Apple followed suit in catering to schools since there was a corporate tax incentive for doing so.
Still, for the first time, an individual could go to a computer store and come home with a personal computer. No soldering iron or electronics know-how was needed; you could set it up in the dining room, living room, or a home office space. These pre-packaged units were called “appliance computers” – you set them on a desk like other standard home appliances (a toaster, microwave, clothes iron, coffee maker, etc.), plugged them in, and turned them on.
Commodore and Apple negotiated retail sales at various computer stores, such as the “CompuMart” shops. You can see in the old BYTE articles where they list the addresses of these stores across many states. In contrast, Tandy was able to use its already established Radio Shack stores.
So, here we are, the first generation of appliances that could be called personal home computers. You didn’t need to lease anything, you just power them on and write whatever software you wanted. And they were affordable enough that a friend at school or in the neighborhood probably had one too, so you could start sharing software and ideas on what to use the system for.
1977 was just the beginning of these desktop-sized personal home computers. Sales didn’t really take off until 1978, as the companies involved ramped up production capacity, acquired chip supplies, and began to build up a reputation in what software was available. But each of them was all-in on making this new home appliance a reality, offering a warranty, lots of documentation, and encouraging third-party software and technical manuals.
The Zilog Z-80 processor was a cheaper alternative to the Intel 8080. But the CPU is just one aspect of the computer system as a whole, like the main engine in your car; the features added around it impacted the price and success of a system. The Z-80 ended up being used in a long line of successful systems.
Gary Kildall, author of the CP/M operating system, also particularly favored the Z-80 processor – which we’ll come back to a little later.
Notice how these systems attached to a television, a concept we mentioned earlier with the Odyssey. While this reduced the cost of the system, the standard television resolution made it difficult to present 80-column text. That was OK for now, but just like having only a single 64KB address space, it limited what the systems could be used for.
Similar to the Z-80 and 8080 relationship, the 6502 was a cheaper alternative to the Motorola 6800.
And the 6502 was one of the most successful processors of all, being used in the Apple 2 and later in the Commodore C64 (which was made famous because of its SID audio chip).
While these were all fun and affordable systems, the reality is they were still “trainer systems”. The TV resolutions were hard on the eyes for any serious document work, and each system had a different scan code or character set, making them all largely incompatible with each other. BASIC helped bring some uniformity, but things like vector graphic modes or software that needed interrupts for specialized timing ended up needing specialized extensions to BASIC.
Up until 1980, the TRS-80 remained the most popular system in terms of units sold. It was the least expensive and the Radio Shack stores did very well at advertising and providing a reliable kind of service shop for those systems.
But two things changed Apple’s fortunes right around 1980:
First was the development of VisiCalc, a groundbreaking piece of software that introduced the concept of the spreadsheet and was first made available on the Apple 2. NOTE: Leonard Tramiel (of Commodore) said in a 2022 interview that he had an early review of VisiCalc, but as a proficient programmer himself he didn’t see much value in it at the time – and this is why VisiCalc first ended up on the Apple 2 instead.
Second, Paul Allen of Microsoft had the brilliant idea of taking advantage of a feature of the Apple 2 bus slot 0 that would let them insert a Z-80 processor. At the time, Microsoft was focused on supporting upcoming 16-bit processors and didn’t have the resources to port all their software to 6502-based systems like the Apple 2. But now, with the Z-80 SoftCard, all of Microsoft’s development tools could be used on an Apple 2, including BASIC, FORTRAN, and COBOL compilers – and in addition, Apple 2 users could also run the CP/M operating system and all its established software (WordStar and dBase being two prominent examples).
This card ended up being very successful, and was a win-win for both Apple and Microsoft in popularizing both of their main product lines. Of course, Steve Jobs wasn’t too happy about it, since Apple wanted to maintain tighter control of the software on its platform – and Microsoft had just broken right through that.
ATARI 400/800 AND TI-99 (sadly, discussion of these systems got cut in the final, instead we did retain a longer segment about VisiCalc)
Before moving on, there is one other important detail to mention about 1979: this was the year that both Atari and Texas Instruments got into the microcomputer business.
The founders of Apple, Steve Jobs and Steve Wozniak, had both worked at Atari. And by this time,
Atari already had the very popular Atari 2600 game console.
Meanwhile, Texas Instruments was still the largest manufacturer of microchips at the time.
They were rather late to the microprocessor party, but their TI-99 offering was the first true 16-bit microcomputer.
And with so many players now involved, it was just a matter of time before IBM would react in one way or another.
IBM PC 5150
So, that brings us to the next chapter in the story which is the dramatic entrance of the IBM PC.
Recall the story of the Wang 2200 being rolled into an IBM executive’s office years earlier, which resulted in the SCAMP prototype and the IBM 5100?
Well, a similar thing happened again in 1980: a group of IBM engineers demonstrated an Atari 800 to an IBM executive, while also discussing how VisiCalc had been ported to nearly all these platforms.
So, a team was given the go-ahead to make a prototype within a few months and a final product within a year. But this time around, the team was able to use any off-the-shelf parts they wanted, rather than strictly in-house IBM parts.
There is much already written about the IBM PC and its development decisions. But a key
reminder here is that IBM was still under the antitrust lawsuit from 1969.
So IBM was still on their best behavior, and every aspect about the new IBM 5150 system was described in the manuals — so there was no mystery on how it worked.
Most computer manuals at the time included wiring-diagram schematics; that had been a tradition for many years. But IBM also included the full source code of the BIOS, and that had some huge ramifications. It’s one thing to provide a binary, but should others be able to copy your source code when you’ve printed it all out for them?
Well, within a year, various companies were starting to make IBM PC clones. We’ll just say some companies were more successful at handling the legal issues of making an IBM clone than others.
NOTE: Eagle PC tried to use an exact copy of the IBM BIOS. Whereas COMPAQ went through great pains to derive a compatible substitute using “clean room” techniques.
The other interesting story about the IBM PC is the operating system. It would seem almost by
default they would use CP/M, as it had already established itself in the 8080 community.
But recall how the Z-80 SoftCard allowed Microsoft to continue focusing on 16-bit software development?
Well, when the IBM and CP/M collaboration fell through, Bill Gates was ready with a backup plan: he quickly acquired 86-DOS before the ink had even dried on its user manual. He offered this up to IBM for use as PC-DOS, while Microsoft retained the rights to continue developing its own MS-DOS. This was largely due to that antitrust case, where IBM wanted to avoid again being accused of overly bundling software. In fact, IBM executives rejected the idea of buying Microsoft, even though the offer was right in front of them.
Many in the industry got hints on what was coming from IBM – but few thought IBM could actually pull it off so quickly.
Like nearly all first-edition microcomputers, the IBM PC wasn’t perfect on launch. Some BIOS updates and refinements were needed over the next year. But even so, the presence of IBM in the microcomputer domain sent a shockwave. The 80 column MDA screen allowed crisp word processing work. This was finally a professional machine, yet still affordable enough to be used at home.
In addition, the overall “look” of the home computer was established with front mounted disk drives, detached keyboard, and detached high resolution monitor. And PC-DOS for sure had its limitations, but Microsoft was quickly expanding its capabilities – some of the most important being the software interrupts and support for device drivers.
A proper DOS helped push the system beyond 64KB – in fact, up to ten 64KB regions were available, meaning a full 640KB of RAM could be used. The remaining upper 384KB of address space was reserved for other functions – a design decision with some negative ramifications later, but good enough for now, since hardly anyone could afford that much memory anyway. Most mainframes didn’t even have that much memory.
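The arithmetic behind those numbers can be sketched in a few lines – this is the 8088’s standard segment:offset scheme, shown here as my own framing rather than the video’s:

```python
def linear_address(segment, offset):
    """Combine a 16-bit segment and a 16-bit offset into a 20-bit
    physical address, as the 8088 does: (segment << 4) + offset,
    wrapped to the 1MB address space."""
    return ((segment << 4) + offset) & 0xFFFFF

# Ten 64KB regions fill the conventional-memory area:
assert 10 * 64 * 1024 == 640 * 1024

# The last byte of conventional memory is at 640KB - 1:
assert linear_address(0x9000, 0xFFFF) == 640 * 1024 - 1

# Above that, the reserved area spans 384KB (0xA0000 through 0xFFFFF):
assert 0x100000 - 0xA0000 == 384 * 1024
```

Note that many segment:offset pairs map to the same physical byte (e.g. 0x0010:0x0000 and 0x0000:0x0100), one of the quirks that later complicated memory management on these machines.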
With all the other microcomputer competition now eating up market share, that pesky antitrust lawsuit was abruptly dropped just a few months after the IBM PC 5150 launch. Despite the competition, the IBM PC soon had the same problem as the MITS Altair from years earlier: they couldn’t make the systems fast enough! But with the extensive resources of IBM, soon the IBM PC was the first computer to reach over 1 million units sold. And IBM was encouraging industry participants and partners to make aftermarket peripherals and software for this new system.
As mentioned earlier, there were a lot of IBM PC compatible clones that came on the market.
Tandy was one of the successful companies to do this, but with a bit of irony in doing so.
In 1984, IBM began to market a less expensive system called the IBM PCjr. With a wireless keyboard, 16-color graphics, 3-voice audio, and a cartridge system, it was intended for younger users – but the keyboard was awful, and several cost-cutting design choices made the machine noticeably slower overall than the original IBM PC (despite using the same 8088 processor). The use of proprietary “sidecar” expansions was also awkward. On top of all that, the IBM PCjr wasn’t all that much cheaper.
Now, a year earlier, Tandy had a Model 2000 that also wasn’t very successful – one of the only systems ever built around the 80186 CPU. It wasn’t quite IBM PC compatible and had other technical issues. But Tandy learned from those mistakes, and the subsequent Tandy 1000 was said to be what the IBM PCjr should have been.
IBM discontinued the IBM PCjr within a year, while the Tandy 1000 line was improved and sold well for many years afterwards.
Now we come to the famous Macintosh. Its 1984 Super Bowl commercial remains memorable to this day, themed on George Orwell’s famous novel of the same name, “1984.”
The Macintosh advertisements set a kind of uppity tone that contrasted with the grassroots, humble beginnings of the homebrew computer clubs.
But like the Lisa before it, the Macintosh strove for design excellence – and nailed it, both in the overall look of the system and in the graphical operating system.
The Macintosh finally brought together concepts from over a decade earlier into an affordable desktop appliance…
(the portion of the Engelbart video describing the mouse is played)
The Xerox Alto was a legendary system from 1973 that included a graphical interface, a mouse, and network capabilities – but the system cost more than an average house.
As some neat trivia, the base version of both the Alto and the Macintosh had 128KB and both systems could be expanded to 512KB.
The Macintosh was a great achievement and did show the “art of the possible” in a microcomputer package. But Apple itself had other product offerings that were less expensive and more popular in terms of sales.
In general, the text-based operating systems continued to dominate for the remainder of the
decade, until OS/2 WARP and Windows. But that’s another story.
The last chapter of this story is how computers became not only personal, but also truly portable.
The shape and form of a desktop computer had been established, and microprocessors used in
integrated systems had proven themselves performant enough to provide a great deal of utility at an affordable price.
Now manufacturers began to shrink the computer even further, trying new ideas in industrial design, and integrating with more power efficient technologies like LCDs, so that the entire computer could run on batteries for many hours.
The Model 100 was another successful Tandy product, with over 6,000,000 sold. Used by students, journalists, and authors, it was very convenient for typing notes, keeping schedules, and playing simple games on the go.
The Sharp PC-5000 was far less successful, as its 8-row screen was hard to read and its bubble-memory technology ended up being overly expensive. Bubble memory was thought to be more reliable, and it had been used in the earlier GRiD Compass that flew aboard the NASA Space Shuttle. Sharp came out with a much improved model a year later, the PC-7000, which did not use any bubble memory.
(note: 1976 was the bicentennial year of the United States, and the idea of “revolution” was in the air – not a violent revolution, but in this context a revolution of bringing down high-priced computers and making the technology more widely available)
So, that’s the story of how the home computer came to be.
This is by far not the whole story. Minicomputers continued to progress and be used, such as by
CompuServe or in the making of movies like Tron, as well as many other medium-business needs.
And game consoles were also evolving during this journey, pushing the envelope of audio/visual experiences.
But microcomputers did introduce the idea of store-packaged software. This was an entirely new form of media that began to appear in stores – small boxes with extensive printed manuals that, collectively, taught people how to use these new computers.
It is this software that answers the question of what anyone would do with a microcomputer. Aside from productivity packages like spreadsheets, word processors, and charting tools, it was interactive games that offered an entirely new way for stories to be told – as with the Sierra graphical adventure games – and allowed much more complex strategy games than traditional board games. The Ultima series was a great early example of that.
Digital logic combined with complex math modeling could be used to present simulated virtual experiences, with one of the first prime examples being Microsoft Flight Simulator.
Another example is AutoCAD, a software title that continues to be used today for civil engineering projects.
And it was the single-board design facilitated by microprocessors, combined with affordable CRTs, more efficient power supplies, and SRAM technologies, that was key to making all this available to everyone.
Ever since the invention of the lightbulb, electricity has been used for many household consumer devices. And almost exactly 100 years later, the desktop computer became one of the last major additions to that list of appliances that make our lives more convenient.
HOW RARE ARE THESE COMPUTERS?
I am an engineer, not a collector, so the following is purely my own biased estimate/opinion. But I did study antique upright pianos for a while (ones from 1880 or earlier), including an examination of the world of antique piano restoration. One thing I learned is that folks in that domain value the materials involved – the “denser woods” that are much more difficult to obtain these days. Another thing they value is “100% original parts” (to the extent reasonably possible). There is general agreement that those antique pianos will simply never again be “great instruments” (they require constant re-tuning), but they are valued as artistic and historic pieces (like old paintings).
The equipment before 1977 was largely NOT mass produced (one exception probably being the Odyssey). This meant limited production runs, as workers largely put the systems together “by hand.” In this context, “by hand” means the individual components or the arrangement of those components might differ between one build and the next (whereas “factory produced” tends to be more consistent, although there can sometimes be batch variations, or US vs. Asian factory production runs). As an example, the IBM 5100 has internal parts identified by hand-written numbers.
The “price” or “value” of these items is proportional to their rarity, or how difficult they are to find. For example, only about 120 Xerox Altos were made, but they were very capable and highly influential systems, so they are extremely valuable. In contrast, only 40-50 KENBAK-1 units were made. These are rarer, but they were not as capable or influential (so they are still valuable, but not as valuable as an Alto despite being relatively more rare).
Later systems had many variants and subsequent models. This makes it hard to precisely capture “units sold”: does the figure cover only the original model, or also a few years of subsequent model updates? For example, the IBM PC 5150 is said to have sold between 200,000 and 700,000 units in its first year, but the model was slightly updated in 1984 and sold until 1987 (meaning that over its life, many millions of 5150s were produced and sold).
My general estimate is this:
- The models from 1970-1974 sold about 3,000-5,000 units each. This includes the Datapoint, Wang, Intellec, HP programmable calculators, and MCM/70. This may have been roughly that many per year, but that figure also includes subsequent models and variants.
- Models from 1975-1977 sold about 10,000 units each. This includes the Altair 8800, SOL-20, and IBM 5100. The number of Altair and SOL kits is not known, nor the number of completed kits. For the IBM 5100 and the subsequent 5110 and 5120, it is said that about 10,000 of each model were made (about 30,000 units in total, though the percentage of APL vs. BASIC configurations is unknown – it is said only about 10% of them had APL).
- NOTE: It is known that only about 180 Apple-1 boards were made; in contrast, somewhere between 10,000 and 20,000 KIM-1 boards were made (as the KIM-1 was about 1/3rd the price and arguably more capable). This does make the Apple-1 boards more valuable, since each one was handcrafted and/or inspected and handled by either Jobs or Wozniak themselves. To sell that many KIM-1 boards, they couldn’t possibly have been made by just a few individuals, so they were a production run (and the KIM-1 remained useful and popular for many years, with ads for it as late as 1980).
- NOTE: Prototype boards, which are generally non-functional (that is, they no longer function), are highly sought after. They are like the “original draft” of a famous novelist – they represent “the thought process.” In 2022, a famous auction saw a “cracked Apple-1 prototype board” sell for quite a bit. The engineer in me doesn’t see much value in these, but they do correspond to rather “unique moments in human history,” and collectors appreciate them for that.
- Between 1977 and 1980, the “trinity” units were available but not yet all that popular. Estimates are roughly 30,000 units per year (with the majority going to the TRS-80). Sales didn’t really begin until the very end of 1977, so there are really only about two years of production here. The TRS-80 may have been closer to 50,000 units per year and did reach 100,000 units sold before 1980.
- NOTE: An important point here is that Apple’s success was by no means guaranteed in 1977-1979. There were numerous other (cheaper) options, and the community was wary of startups that suddenly disappeared, unable to support their products (Sphere, for example). Apple had gambled a lot on aggressive full-page advertisements, and smartly took tax advantages by donating systems to schools (which Commodore was doing as well).
- In 1980, the combination of VisiCalc and the Z-80 SoftCard did change Apple’s fortunes. There was also an FCC issue that forced Tandy to halt production of the TRS-80. But “appliance computer” sales in general were still small. The Apple II Plus was a good improvement and may have achieved 100,000 unit sales per year, though Apple’s sales didn’t really take off until the Apple II+ Enhanced in 1983.
- The VIC-20 was indeed the first to reach 1,000,000 units sold, in 1981. But this is a bit of a fluke (re: the $50 coupon).
- The IBM PC 5150 didn’t go on sale until late 1981, so its 1981 sales can’t really be counted. But its first year of sales was essentially more than all the other computers combined (including Apple’s) – it really was a huge hit. IBM did struggle to ramp up production, caught off-guard because they didn’t expect it to be so successful either. But IBM had the resources to ramp up that production. Aside from the VIC-20 fluke, the IBM PC was the first to reach 1,000,000 units sold. Apple didn’t reach those numbers until one or two years later (with the Apple II+ Enhanced).
From there, “the rest is history.” To summarize:
- 1970-1974 systems: 3000-5000 units per year (roughly speaking; not over 10K/yr)
- 1975-1977 systems: 10,000 units per year (not over 20K/yr)
- 1977-1980 systems: 30,000 units per year (except the IBM 5110/5120, which was ~30K total)
- 1981: VIC-20 1,000,000 units fluke
- 1982: first year of IBM PC, nearly 1,000,000 units per year
- 1983: Apple II (and variants) reached 500,000 units per year
And this can help give a rough idea of rarity and why some of the older systems are more valuable (by being much harder to find). Consider that a full football stadium holds roughly 50,000 people. We don’t have accurate information on how many units remain. Many systems (Wang 2200, IBM 5100) were passed down to colleges or high schools after being replaced by newer systems.
NOTE: Another thing to consider is that systems from before 1975 may have “difficult” power supplies. The Datapoint 2200 in particular had a “troublesome” power supply. The development of reliable switching power supplies was still fairly new at the time.
SYSTEM STATISTICS (TBD)
Prices are in the “as released” year (not adjusted).
| Index | Name | Release Date | Price Low | Price High | Memory | Processor | Units Sold |
|---|---|---|---|---|---|---|---|
| 6 | Intel Intellec 8 | 1973-06 | $2,395 | | 5KB-16KB | Intel 8008 | |
| 10 | Micral N | 1973 | $1,750 | | | Intel 8008 | |
| 12 | MITS Altair 8800 | 1975-01 | $650 | $1,239 | | Intel 8080 | |
| 13 | IBM 5100 | 1975-09 | $8,975 | $19,975 | 8KB-64KB | TTL Microcode | |
| 14 | Tektronix 4051 | 1975 | $5,995 | | | Motorola 6800 | |
| 18 | Apple I | 1976-04-11 | $666 | | | MOS 6502 | |
| 20 | Commodore PET | 1977-01 | $795 | | 4KB-32KB | MOS 6502 | |
| 21 | Apple II | 1977-06-10 | $1,298 | $2,638 | 4KB-48KB | MOS 6502 | |
| 25 | Atari 400/800 | 1979-11 | $550 | $1,000 | 4KB/8KB | MOS 6502B | |
| 26 | Sinclair ZX80 | 1980-01 | | | | Zilog Z80 | |
| 27 | Tandy Color Computer | 1980-09 | $399 | | 4KB-32KB | Motorola 6809E | |
| 29 | IBM PC 5150 | 1981-08-12 | $1,565 | $6,500 | 16KB-64KB | Intel 8088 | |
| 30 | BBC Micro | 1981-12-01 | $250 | $350 | 16KB-32KB | MOS 6502 | |
| 31 | Commodore C64 | 1982-01 | $595 | | 64KB | MOS 6510 | |
| 33 | Sharp PC-5000 | 1983-11 | $1,995 | | 128KB | Intel 8088 | |
| 34 | Tandy Model 100 | 1983 | $1,099 | $1,399 | 8KB-24KB | Intel 80C85 | |
| 35 | Apple Macintosh | 1984-01-24 | $2,495 | | 128KB | Motorola 68000 | |
| 36 | Tandy 1000 | 1984-11 | $1,200 | | 128KB-640KB | Intel 8088 | |