You often hear people describe modern smartphones as a “supercomputer in your pocket.” There’s definitely some truth to that, especially when you compare today’s phones with the supercomputers of 40 years ago. In the mid-1980s, supercomputer manufacturers were still struggling with concepts like parallel processing, i.e., breaking down a program into smaller tasks that could run simultaneously on multiple microprocessors. Today, in contrast, multi-core, multi-threaded CPUs are the norm on just about every personal computing device.
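That idea of parallel processing, splitting one job into smaller tasks that run at the same time on multiple cores, can be sketched in a few lines of modern Python. This is purely illustrative; the worker function and the four-way split are my own assumptions, not anything from the episode:

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    """One independent task: sum a slice of the data."""
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Split the work four ways; each stride covers every fourth element.
    chunks = [data[i::4] for i in range(4)]
    # Each chunk can run on a separate CPU core simultaneously.
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(data))  # True
```

The striding trick (`data[i::4]`) just guarantees every element lands in exactly one chunk, so the partial sums recombine into the serial answer.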
For this episode, I’m going to handle things a bit differently. There was only a single guest–Stanford University computer science professor Edward Feigenbaum–and the subject is one that, quite frankly, does not strike me as all that interesting. So rather than do an extended point-by-point recap of the episode, I’m just going to summarize in broad strokes. Trust me, if you had watched the episode, you’d thank me.

Stanford Professor Discusses AI, Future of Japanese Computing Initiative

You may recall that Feigenbaum appeared in an earlier episode, which I recapped in Part 20.
Personal computers of the early 1980s were often limited to just a few colors for on-screen graphics. The Apple IIe, for example, could display up to 16 colors at one time depending on the screen resolution. And of course, no home computer of this era could produce genuine 3D graphics. That capability was limited to very high-end machines designed for industrial or commercial use.

The Special Talents of Computer Graphics

Which brings us to our next Computer Chronicles episode from 1984.
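As an aside, those color limits come down to how many bits of memory each on-screen pixel gets: n bits per pixel can encode 2^n distinct colors, so a 16-color display needs 4 bits per pixel. A minimal sketch of that relationship (a simplified model; actual Apple II video modes were more idiosyncratic than this):

```python
def max_colors(bits_per_pixel):
    """Each pixel's bits select one entry from a palette of 2**n colors."""
    return 2 ** bits_per_pixel

# 1 bit -> monochrome, 2 bits -> 4 colors, 4 bits -> the 16 colors
# of a machine like the Apple IIe in its low-resolution mode.
print(max_colors(1), max_colors(2), max_colors(4))  # 2 4 16
```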
In Part 14 of this series, the Computer Chronicles first discussed the subject of “expert systems.” This referred to computer knowledge bases that purported to replicate a human’s expertise in a particular field. This next Chronicles episode revisits the idea of expert systems as part of a broader discussion of artificial intelligence.

The Link Between the Mechanical World and Abstract Concepts

Herbert Lechner is back as Stewart Cheifet’s co-host for this episode.
When Apple released the Macintosh–later known as the Macintosh 128K–in January 1984, its main selling point was the graphical user interface (GUI). Although the original Macintosh operating system’s GUI was largely based on what Apple deployed on the Lisa a year earlier, the company believed the new machine’s lower price point would make the interface more accessible to a larger audience. Of course, the Macintosh was not exactly cheap, even by 1984 personal computer standards.
Today, Python is probably the most popular computer programming language taught in elementary and secondary schools. (There’s even a terrific podcast, Teaching Python, on this subject.) But back in the 1980s, BASIC was the language of choice for many introductory computer classrooms. Specifically, versions of Microsoft BASIC came with many popular 8-bit microcomputers, including the Apple II and Commodore 64, which were also commonly used in schools at the time.
The episode I’m covering today taped on January 18, 1984, which was four days before Super Bowl XVIII. That game would go down in computing history for the famous Apple “1984” commercial that announced the launch of the original Macintosh (later known as the Macintosh 128K). As this Chronicles episode aired the week after the Super Bowl, Stewart Cheifet devoted a good portion of the post-show “Random Access” segment to the new machine and what it might mean for Apple for the rest of 1984.
Today’s episode contains what Stewart Cheifet would later describe as one of the classic “near disasters” involving a product demonstration on The Computer Chronicles. The subject was the first Xerox Color Laser Printer, which was actually a prototype not yet available for sale when this Chronicles episode taped in October 1983. Cheifet recounted the event to Tonya Hall of ZDNet in a November 2020 interview: We introduced the very first color laser printer on the show by Xerox.
Computer architecture is usually described in terms of bits. For instance, we often speak of early personal computers from the late 1970s and early 1980s as 8-bit machines. In simple terms, this means that the CPUs in these computers could only process 8 bits of data at a time, with each bit representing a single binary digit (0 or 1). But even when the first episodes of The Computer Chronicles started to air in late 1983, there were already 16-bit processors on the market, such as the Intel 8086, and 32-bit machines had started to become a reality.
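A quick illustration of what that bit width buys you: an n-bit register can hold 2^n distinct patterns, so an 8-bit CPU works with unsigned values from 0 to 255 in a single operation. The snippet below is just arithmetic, not tied to any specific machine:

```python
BITS = 8
patterns = 2 ** BITS          # 256 distinct bit patterns
max_unsigned = patterns - 1   # largest unsigned value: 255

# A 16-bit processor like the Intel 8086 widens this to 65,536 patterns.
print(patterns, max_unsigned, 2 ** 16)  # 256 255 65536
```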
David Murray, who goes by The 8-Bit Guy on YouTube, had a great video a couple of years back on “How Speech Synthesizers Work.” He explained that early devices like the Texas Instruments “Speak & Spell” were not true speech synthesizers, as they relied on a limited vocabulary of pre-recorded words. But even in the mid-1980s there were speech synthesizers that could build words out of basic sounds. Today’s episode of The Computer Chronicles from early 1984 also examined the status of speech synthesis during this time period.
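The build-words-from-basic-sounds approach Murray describes can be caricatured in a few lines: render a word by concatenating short stored sound units (phonemes) rather than replaying one pre-recorded word. Everything here, the phoneme names and the one-byte stand-in “waveforms,” is made up for illustration; a real synthesizer stores and blends actual audio samples:

```python
# Hypothetical phoneme table: each entry stands in for a stored
# waveform fragment (a real device would hold audio sample data).
PHONEMES = {"HH": b"\x01", "EH": b"\x02", "L": b"\x03", "OW": b"\x04"}

def synthesize(phoneme_codes):
    """Concatenate the stored fragment for each phoneme in order."""
    return b"".join(PHONEMES[p] for p in phoneme_codes)

# "hello" rendered as a sequence of sound units rather than one recording.
print(synthesize(["HH", "EH", "L", "OW"]))  # b'\x01\x02\x03\x04'
```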