Computer Chronicles Revisited 32 — UNIX System V Release 2, BSD 4.2, and the HP Integral PC


In 1969, a computer scientist at Bell Labs named Ken Thompson wrote an operating system for a Digital Equipment Corporation PDP-7 minicomputer that he initially dubbed Unics. This was later renamed UNIX and continued to be developed at Bell throughout the 1970s, with each numbered version corresponding to the current edition of the accompanying UNIX Programmer’s Manual. In fact, just a few days ago marked the 50th anniversary of UNIX Version 1, dated November 3, 1971.

Under a 1956 antitrust consent decree with the U.S. Department of Justice, the Bell System (which owned Bell Labs) could not market UNIX as a commercial product. Nevertheless, Bell offered highly restricted UNIX source code licenses to corporate and educational customers. Basically, these licenses offered UNIX “as is” without any warranty or commercial support.

One of these early UNIX licensees was the University of California, Berkeley. While on sabbatical from Bell Labs in 1975, Thompson helped Berkeley install what was then UNIX Version 6. Berkeley then created its own derivative that became known as the Berkeley Software Distribution (BSD) 1.0.

Fast forward to the early 1980s. Another federal antitrust case ended in a 1982 consent decree that broke up the Bell System into AT&T and several regional telephone companies, effective January 1, 1984. One consequence was that the 1956 consent decree’s prohibition on marketing commercial products no longer applied. This meant AT&T, which retained ownership of UNIX, could now offer it as a commercially supported product. This led to the release of UNIX System III in 1982, followed by UNIX System V in 1983. (Don’t ask me why there was no System IV.)

This leads into our next Computer Chronicles episode from early 1985, which focused on the current state of UNIX. At this point the most recent AT&T release was UNIX System V, Release 2 (SVR2, or 5.2) and BSD was now at version 4.2. There were also several other UNIX derivatives active on the market, such as XENIX (originally developed by Microsoft from Version 7 UNIX, with later releases based on System III), SunOS (a BSD 4.2 derivative used by Sun Microsystems), and the newly created HP-UX (a Hewlett-Packard System III derivative), which was featured in one of this episode’s product demonstrations.

Eunuchs on a Computer Show?

Stewart Cheifet did his cold open for this week from some unidentified facility running a bunch of UNIX machines. We see one man running an Amdahl 580 mainframe. Cheifet noted that while there was a time when UNIX was only used on these types of large computer systems, nowadays it was showing up on personal computers, such as the Radio Shack TRS-80 Model 16B. Could this mean UNIX would become the next MS-DOS?

In the studio, Cheifet relayed this anecdote to Gary Kildall: “Gary, I told a friend of mine we were doing a show on UNIX. And he said to me, ‘Why in the world would you want to feature a eunuch on a computer show?’” The moral of the story, Cheifet said, was that while some computer people were talking about nothing but UNIX, many personal computer users didn’t even know what UNIX was. So why all the sudden excitement about UNIX? Kildall replied that the excitement wasn’t really sudden. UNIX itself had been around since the late 1960s. The problem was that microcomputers didn’t have the power to support UNIX in terms of memory, hard disk capacity, processor speed, and so forth. Now there were micros that did have such power, which made UNIX a serious contender for an operating system standard.

Could UNIX Prove Itself on Micros?

Wendy Woods’ first remote segment for the week featured B-roll of UNIX-based computer systems in use at the University of California, Berkeley. Woods noted the original UNIX developed at Bell Labs in the 1960s was later adopted by the University of California for academic research and publishing. Here at Berkeley, the history of science department relied on a network of UNIX terminals to typeset their journals and directories.

Why use UNIX? Woods said that apart from the fact a major version of UNIX was developed at Berkeley, UNIX was an eminently portable system that ran on everything from micros to minis to mainframes. UNIX could be easily adapted to these different systems with few or no changes required.

Woods said that the typical UNIX network ran on minicomputers. User-initiated tasks were distributed among the computers through port selectors. But the advantages of UNIX were not limited to multi-user portability. The system was also fast and powerful. Once the user had mastered UNIX’s “unorthodox” commands, Woods said they were rewarded with some very sophisticated features such as background processing. This meant that after initiating a task, the user could initiate a second task while the system continued to work on the first.
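
The background processing Woods described is built on the UNIX fork() primitive, the same mechanism the shell invokes when you append & to a command. Here is a minimal sketch of the idea in modern C (my own illustration, not anything shown in the episode):

```c
#include <stdio.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void)
{
    /* fork() splits the process in two: the child carries on with
       the long-running first task "in the background" while the
       parent is immediately free to start a second task. */
    pid_t pid = fork();
    if (pid < 0) {
        perror("fork");
        return 1;
    }
    if (pid == 0) {
        sleep(5);                             /* simulate a slow task */
        printf("background task finished\n");
        _exit(0);
    }
    printf("second task starts while the first still runs\n");
    wait(NULL);                               /* reap the child at the end */
    return 0;
}
```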

Surrounding the kernel of the UNIX system, Woods said, was a shell or central command interpreter that redirected applications, controlled input and output, manipulated files, and stored command sequences. The shell structure was user-friendly and allowed even a first-time user to perform complex tasks. Indeed, Woods said that UNIX users gave the system high marks for versatility, but up until now the operating system had been largely confined to a small group of specialized users. The question was whether or not UNIX could prove its value to the “rest of us.”
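
Woods’ description of the shell “controlling input and output” has a concrete mechanical meaning. The sketch below, again my own modern C illustration with a hypothetical output file name, shows roughly how a shell carries out a command like ls > listing.txt: fork a child, splice the file onto the child’s standard output, then exec the program, which never knows the difference.

```c
#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/wait.h>

/* Rough sketch of how a shell might perform "ls > listing.txt". */
int main(void)
{
    pid_t pid = fork();
    if (pid < 0) {
        perror("fork");
        return 1;
    }
    if (pid == 0) {
        int fd = open("listing.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0) {
            perror("open");
            _exit(1);
        }
        dup2(fd, STDOUT_FILENO);  /* stdout now points at the file */
        close(fd);
        execlp("ls", "ls", (char *)NULL);
        perror("execlp");         /* reached only if exec fails */
        _exit(1);
    }
    wait(NULL);
    return 0;
}
```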

“The Normal Behavior Was to Find a Guru or Wizard and Chain Him to the Machine”

In the studio, Mark Sobell and John Mashey joined Cheifet and Kildall. Cheifet asked Kildall to provide a brief background on UNIX. Kildall explained that UNIX had “come down from” the world of minicomputers to microcomputers. UNIX carried with it the C programming language, the UNIX file system, and the UNIX interface. Kildall noted the C language in particular was wildly successful on a lot of operating systems, although the user interface still needed to be addressed, as it was considered unfriendly.

With that, Kildall asked Sobell, the author of A Practical Guide to the UNIX System, to demonstrate an application that aimed to make UNIX more user friendly. Sobell explained that UNIX involved using a lot of short commands, as it was originally designed for slow teletype terminals. Today, you were seeing more applications built with user interfaces derived from MS-DOS. Using an AT&T-branded UNIX workstation, Sobell demonstrated one application, called File It, which he said incorporated UI elements from Lotus 1-2-3.

Kildall said that one of the ideas pioneered by UNIX was the idea of software portability–i.e., you could move an application around to various types of computers. Mashey, who worked on UNIX at Bell Labs in the early 1970s and was now with MIPS Computer Systems, said that was probably the most important aspect of UNIX. He said UNIX started out on some Digital Equipment Corporation machines and had since been moved to between 50 and 100 different kinds of processors. That was important because if you were investing a lot of effort to build software, it was helpful to know that you could move it around, say from a personal computer to a large mainframe.

Kildall added this was also true for software designers, because they could take a language like C and its standard runtime library and write an application that would run on UNIX as well as MS-DOS or Digital Research’s Concurrent DOS. Mashey noted that it was the fact you could move C programs very easily that also made it possible to move UNIX to other machines.
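
Kildall’s point is easy to see in miniature. A program that confines itself to C’s standard runtime library contains nothing operating-system-specific, so the same source could be fed to a UNIX cc, an MS-DOS C compiler, or one targeting Concurrent DOS. A trivial example (written in modern ANSI-style C rather than the K&R style of 1985):

```c
#include <stdio.h>

/* Nothing below depends on the host operating system: only the C
   standard I/O library, which every hosted C implementation must
   provide. The same source could be compiled under UNIX, MS-DOS,
   or Concurrent DOS. */
int main(void)
{
    FILE *f = fopen("hello.txt", "w");
    if (f == NULL) {
        perror("fopen");
        return 1;
    }
    fprintf(f, "portable across operating systems\n");
    fclose(f);
    return 0;
}
```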

Kildall noted that one criticism of UNIX was that it had been unreliable. Was that still the case? Mashey said he thought UNIX was fairly reliable these days. He (semi) joked that, “originally when you got your UNIX system, the normal behavior was to find a guru or wizard and chain him to the machine,” so that someone was there to pick up the pieces when the computer broke. That was around 1972-1973, he said. But as UNIX started moving into computer centers and more people depended on it, a lot of work was done to make it a much more reliable system. Today, UNIX was used in some fairly demanding applications. He added that one thing many people didn’t realize was that UNIX touched them through the telephone system almost every day–if you wanted to get your phone fixed, the system that tracked all that work was run by UNIX.

Cheifet chimed in at this point, noting that while Sobell’s earlier demo showed how you could “friendly up” UNIX to make it look like MS-DOS, why should the average PC user care? What did they get from a UNIX-based application that they couldn’t get from a DOS application? Sobell replied that not everyone wanted UNIX. If a user had a DOS system running their applications and it did what they needed it to, then maybe UNIX wasn’t for them. As for the advantages of UNIX, Sobell said it was a multi-user operating system. For example, he could perform a query and look at data while at the same time another user could change that same data or run a graphics or word processing package on the same machine. That way, users didn’t have to fight over one machine.

Sobell also noted that UNIX was a multi-tasking operating system. Kildall pointed out that UNIX was no longer the only microcomputer operating system with that feature. He noted that IBM’s forthcoming TopView would have multi-tasking, and his own Concurrent DOS did as well. Sobell added that at some point in the future, local area networking would be a viable alternative to the traditional multi-user environment. Right now, he said, “You can’t plug every computer into the same network.” But that was something to watch for in the future.

Kildall ended the segment by bringing up the subject of graphics. He said that assuming TopView would address the problem of a standardized graphics interface, at least as far as the PC was concerned, what would that mean in terms of the evolving UNIX standard for graphics display? Mashey said at this point it was hard to tell. UNIX was originally designed for fairly slow-speed terminals, and that was still its standard interface. UNIX had also become fairly standard in its handling of CRT displays. But work still needed to be done on graphics standards.

Bill Joy on the Future of BSD and Its “Open Source Code”

Before returning to the studio for the final round table, Wendy Woods presented her second remote feature, this time on Sun Microsystems and one of its co-founders, Bill Joy, who was also behind the Berkeley Software Distribution (BSD) 4.2 version of UNIX. Woods said the success of BSD could be measured by the success of Sun itself, which produced a line of minicomputers and workstations based on the 4.2 release. Woods noted that Sun had grown from 6 to 1,000 employees in just three years. The company currently shipped more than $7 million in equipment each month, primarily to the technology and engineering communities, which found BSD 4.2 the ideal operating environment for their specialized software needs.

Despite this success, however, Woods said Joy wasn’t allowing BSD 4.2 to stagnate. He continued to work closely with his university roots to improve upon UNIX. Joy told Woods that UNIX had hit the point where we were “starting to get standardized by standards committees” and there was an attempt by major corporations like IBM and AT&T to control development. That inevitably meant slowing down the rate of change in the operating system.

Regarding the future of BSD, Woods said its “open source code,” versatility, and ability to work on a variety of machines meant it would remain popular with scientists and engineers for some time. Joy added that UNIX would be important if it could deliver applications that people wanted to run. The technology market needed more operating system capabilities than you were likely to see from the smaller systems. In that sense, Joy said there really wasn’t an alternative application delivery vehicle at the moment.

But would UNIX ever make it big in the business market, Woods asked in narration. She said Joy doubted it. He said what would be needed was a UNIX-based program like Lotus 1-2-3–something big and successful. And until that was written, Joy was not super hopeful about UNIX’s future as a business OS.

Why Were There So Many Versions of UNIX?

Back in the studio, Doug Hartman and Mike Denney joined Cheifet and Kildall. Hartman was a UNIX specialist with Hewlett-Packard, which had recently released a new UNIX-based microcomputer, the HP Integral PC. Kildall opened by noting that microcomputer users now expected some form of graphical, animated communication with their programs. Hartman said the Integral relied on windows and menus to give the user a friendlier UI.

The HP Integral PC, a portable computer featuring a detachable keyboard, a small black-and-amber screen, and a 3.5-inch floppy disk drive attached to the side of the monitor.

Sticking to his main talking point from earlier, Cheifet asked how this made UNIX special, given that non-UNIX computers could do some of this stuff as well. Hartman, also sticking to the talking points, said UNIX offered compatibility with other machines and software portability. He said the fact that you could take programs written on one machine and bring them to another UNIX machine with very little effort made the software developer’s job easier.

Cheifet then asked if his guests could clear up the confusion regarding the different UNIX variants. He listed just a few: BSD 4.2, Xenix, UNIX, HP-UX, PC/IX, System III, and System V. “Why are there so many versions of UNIX?” he asked. Denney replied, “Just luck.” Denney, the co-founder of a company called Basis, Inc., explained that the proliferation of names besides UNIX was dictated by AT&T’s requirement that licensees not call any of the derivatives UNIX. This meant every manufacturer had to come up with its own name. For the most part, however, everything was fairly standard. A Xenix machine looked like UNIX to him. It might have a few extras here and there, but even moving between the BSD and AT&T camps of UNIX showed they were strikingly similar.

Kildall asked why system programmers felt the need to make local changes. He then managed to answer his own question by citing something that happened at Digital Research. Kildall said he walked into his computer center one night and asked one of the programmers about an I/O statement that wasn’t working right. The programmer made changes right there and compiled them on the spot. This meant they were now running a different operating system. Denney noted that everyone “improved” the UNIX recipe they received from grandma (or in this case, Ma Bell), and in particular, computer manufacturers received original UNIX source code and implemented it for different target markets.

Kildall asked about AT&T’s role in all this. Were they trying to bring all of these different versions back into one standard? Denney said it was quite apparent that AT&T would like to define System V Release 2 (SVR2, or 5.2) as the standard. But there was also the competing BSD 4.2 standard. The two were actually quite different in terms of the underlying code, although they looked similar to the outside user apart from some utilities. Indeed, Denney said it was quite common for manufacturers to adopt System V but put BSD utilities on top as extensions.

Kildall added that IBM also supported a couple different versions of UNIX. Was this just to fill the market out or were they really getting behind it? After all, IBM and AT&T were competitors. Denney said he wasn’t sure but pointed out that AT&T did use IBM computers in its organization.

Cheifet turned back to Hartman and asked about HP-UX, which was Hewlett-Packard’s implementation of UNIX on the Integral PC. Cheifet noted this was a $5,000 PC, while the AT&T machine Mark Sobell used earlier was priced between $12,000 and $15,000. Did we lose anything in the implementation of UNIX on these lower-cost machines? Hartman said the one thing left out of the Integral was a hard drive, which would have allowed users to do their own program development. Leaving out the hard drive allowed HP to reduce cost while still giving the user enough memory to run applications.

Cheifet said he’d read in the trade papers recently that Commodore had plans to come out with a “Mac lookalike” running UNIX. Was that possible? Hartman said the technology was certainly there to do such a machine. It was a question of whether the applications were there to support the home user. Kildall remarked that if this rumor were true, Commodore’s machine would have to be different from the standard computer they sold at K-Mart.

The main part of the episode then concluded with yet another discussion of the relative advantages of UNIX over other operating systems, which largely reiterated the earlier talking points.

“The Wrong Tool in the Wrong Place”

Paul Schindler’s commentary can be summed up by this statement: “UNIX is the wrong tool in the wrong place in the micro world.” He argued that UNIX was nothing more than a development environment for minicomputers. But using it on a micro was akin to using a screwdriver to open a can of tomato juice instead of a can opener (which he demonstrated). Schindler largely dismissed UNIX’s advantages in terms of software portability, noting that there were “incompatible UNIX dialects galore,” and UNIX was ultimately a “user-hostile operating system” that was dramatically short on software.

Summer CES Features Robots, New PCs from Commodore and Atari

Before delving into this episode’s “Random Access,” a brief note on the timeline. Computer Chronicles would rerun episodes on PBS. In fact, I recently learned there were separate episode numbers assigned to the reruns. My theory is that this was likely because the producers had to cut or swap out “Random Access” segments for the reruns.

That would appear to be the case with the recording of this episode posted to the Internet Archive. Although this episode has a broadcast date in late February, context clues place this “Random Access” segment sometime in June.

With that clarification, here is the “Random Access” segment presented by Stewart Cheifet:

HP Managed to Pack UNIX Down Onto a Single ROM Chip

UNIX was a heavy operating system by the microcomputer standards of the time. For example, the Commodore 64, the most popular home computer around 1984-1985, shipped with just 16 KB of built-in ROM containing both the computer’s operating system (the KERNAL) and the BASIC programming language. In contrast, the HP Integral PC seen in this episode came with a 256 KB ROM that included the HP-UX operating system and the accompanying user interface. Of course, the Integral PC started somewhere around $5,000, while the Commodore 64 could be had for around $150 by early 1985.

And Hewlett-Packard had to put in quite a bit of work to make a UNIX operating system fit into a 256 KB ROM. Keep in mind, UNIX traditionally required a hard drive, as the typical size of a full installation was around 6 MB. As Doug Hartman explained, HP decided to forgo the hard drive to keep costs down, so its engineers instead built what they described as a “RAM-based emulation of a disc device” to mount the UNIX root file system.

In a Hewlett-Packard Journal article detailing the Integral PC’s development, Ray M. Fajardo, Andrew L. Rood, James R. Andrews, and Robert C. Cline claimed they had developed the “first RAM disc for a personal computer to manage its own space” dynamically. This meant unused RAM disc space was not “statically allocated” and could be “periodically freed for use by the rest of the system.” The authors claimed this actually made the RAM disc perform better than a hard drive as there was no “time-consuming mechanical head movement.”
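
The Journal article doesn’t include source code, but the dynamic scheme it describes boils down to allocating blocks from the general memory pool only on first write and returning them when freed, rather than fencing off a fixed region up front. A purely hypothetical sketch in C (the names and sizes here are mine, not HP’s):

```c
#include <stdlib.h>
#include <string.h>

#define BLOCK_SIZE 1024   /* hypothetical block size */
#define MAX_BLOCKS 4096   /* hypothetical disc capacity */

/* Each block is taken from the general heap only when first written,
   and handed straight back when the file system frees it, instead of
   being "statically allocated" as one fixed region. */
static char *blocks[MAX_BLOCKS];  /* NULL = block not allocated */

int ramdisc_write(int blkno, const char *data)
{
    if (blkno < 0 || blkno >= MAX_BLOCKS)
        return -1;
    if (blocks[blkno] == NULL) {
        blocks[blkno] = malloc(BLOCK_SIZE);   /* grow on demand */
        if (blocks[blkno] == NULL)
            return -1;                        /* RAM exhausted */
    }
    memcpy(blocks[blkno], data, BLOCK_SIZE);
    return 0;
}

void ramdisc_free_block(int blkno)
{
    if (blkno < 0 || blkno >= MAX_BLOCKS || blocks[blkno] == NULL)
        return;
    free(blocks[blkno]);   /* memory returns to the rest of the system */
    blocks[blkno] = NULL;
}

int main(void)
{
    char buf[BLOCK_SIZE] = "hello";
    ramdisc_write(7, buf);      /* block 7 materializes on demand */
    ramdisc_free_block(7);      /* and its memory goes back to the pool */
    return 0;
}
```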

The authors also decided to eliminate the traditional UNIX practice of storing data in an “intermediate buffer” before writing it to disc, which they said made the file system “extremely susceptible to corruption,” especially if the user suddenly removed a disc. For this reason, the Integral PC “immediately posted” all user data to disc without using any buffer, so that if the user pulled the disc out suddenly it would not damage the file system. The Integral also contained special processes for automatically mounting removable discs, which was not standard UNIX practice at the time.
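
The same write-through idea survives in modern POSIX systems, where a program can ask that every write reach the medium before the call returns instead of sitting in an intermediate buffer. A modern analogue of the design choice (my illustration, not HP’s actual implementation):

```c
#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>

int main(void)
{
    /* O_SYNC asks the kernel to push every write() through to the
       medium before the call returns, rather than parking the data
       in an intermediate buffer that a yanked disc would lose. */
    int fd = open("data.txt", O_WRONLY | O_CREAT | O_SYNC, 0644);
    if (fd < 0) {
        perror("open");
        return 1;
    }
    const char msg[] = "safe on the medium once write() returns\n";
    if (write(fd, msg, sizeof msg - 1) < 0) {
        perror("write");
        close(fd);
        return 1;
    }
    close(fd);
    return 0;
}
```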

I don’t get the sense that HP managed to sell that many Integral PCs. As one writer, Nick Walker of Personal Computer World, noted in his June 1985 review of the Integral, “If a UNIX machine was what you are looking for, then this may well be the cheapest way of obtaining one with the added advantage of portability, although higher-priced UNIX machines will be multi-user and higher performance.” Walker added that he was initially skeptical about a ROM-based UNIX system, but he thought HP had managed to create a “machine that makes an awful lot of computer power easy to use.”

While the Integral PC might not have been a market blockbuster, the HP-UX derivative of UNIX proved to have significant staying power. Hewlett-Packard Enterprise continues to develop the operating system for its servers to this day.

The Looming UNIX Wars

Although Bill Joy came from Berkeley, the other co-founders of Sun Microsystems were from Stanford University. (“Sun” was derived from the initials of the “Stanford University Network.”) During the 1980s, Sun was known as the place to buy UNIX-based workstations. According to a history published by FundingUniverse, Sun was the “fastest-growing company in the United States” between 1985 and 1989, with a “compound annual growth rate of 145 percent.”

As Wendy Woods noted, much of that success was powered by Bill Joy’s work on BSD 4.2 and its associated utilities. This ultimately led to a 1987 alliance with AT&T in an attempt to unify the BSD and System V code bases. That, in turn, led to the so-called “UNIX wars.” I won’t go into detail on this right now, but suffice to say there were other UNIX licensees who were opposed to the AT&T-Sun alliance. These licensees created a new organization, the Open Software Foundation, to promote a competing standard.

To make a very long story short, the UNIX wars led to the creation of new BSD derivatives that did not use any of the copyrighted AT&T UNIX code. These BSD derivatives are still in active use and development today under a number of projects, notably FreeBSD, NetBSD, and OpenBSD.

As for AT&T, it ended development of System V after version 4 and transitioned to a joint venture with Novell to make a new UNIX system called UnixWare. Novell later took over the project on its own before transferring it to the Santa Cruz Operation in the mid-1990s. Meanwhile, Sun continued to develop its own SunOS, re-basing its later releases on System V and renaming the project Solaris. After Oracle acquired Sun Microsystems in 2010, it renamed the operating system Oracle Solaris.

Notes from the Random Access File