Received: from delta.eecs.nwu.edu by MINTAKA.LCS.MIT.EDU id aa04164; 27 Dec 93 7:19 EST
Received: by delta.eecs.nwu.edu id AA15949 (5.65c/IDA-1.4.4 for telecom-recent@lcs.mit.edu); Mon, 27 Dec 1993 03:20:34 -0600
Received: by delta.eecs.nwu.edu id AA07995 (5.65c/IDA-1.4.4 for /usr/lib/sendmail -oQ/var/spool/mqueue.big -odi -oi -ftelecom-request telecomlist-outbound); Mon, 27 Dec 1993 03:20:01 -0600
Date: Mon, 27 Dec 1993 03:20:01 -0600
From: TELECOM Digest
Message-Id: <199312270920.AA07995@delta.eecs.nwu.edu>
To: telecom@delta.eecs.nwu.edu
Subject: Special Report: Early History of Unix

Here is a special report I think will be of interest to telecom readers on the forthcoming 25th anniversary of the invention of the Unix kernel at Bell Labs in 1969.   PAT

From: ronda@umcc.umcc.umich.edu (Ronda Hauben)
Subject: Early Days of Unix - Draft for Comment
Date: 26 Dec 1993 18:48:31 -0500
Organization: UMCC, Ann Arbor, MI

I am in the process of working on the current draft and I would appreciate any comments, suggestions, additional information, etc. regarding the early days of Unix development and the work to develop computer science that this early work on Unix represented. Thanks.

Ronda

DRAFT

On the Evolution of Unix and the Automation of Telephone Support Operations (i.e. of Computer Automation)

by Ronda Hauben

Abstract: 1994 is the 25th anniversary of the invention of the UNIX kernel at Bell Labs. The following article is a chapter in a longer paper documenting some of the events that have contributed to the development of a Global Computer Network in the past 25 years. This article describes how the need to automate telephone support operations in the U.S. in the late 1960s and the early 1970s nourished the birth and development of the UNIX operating system, and how academic computer science contributed to and gained from the development of UNIX. This article is intended as a contribution to a 25th anniversary commemoration of the significance of the UNIX breakthrough and the lessons that can be learned for making the next step forward.

"I don't believe UNIX is Utopia. It's just the best set of tools around."
-- Dick Haight, Unix Review, Jan. 1985, p. 117

"What does industrial computer science research consist of?....Although work for its own sake resulting, for example, in a paper in a learned journal is not only tolerated but welcomed, there is strong though wonderfully subtle pressure to think about problems somehow relevant to our corporation....Indeed, researchers love to find problems to work on; one of the advantages of doing research in a large company is the enormous range of puzzles that turn up....Thus, computer research at Bell Labs has always had a considerable commitment to the world...."
-- Dennis Ritchie, "Reflections on Software Research," Communications of the ACM, vol. 27, no. 8, August 1984, p. 759

"Bell had already gained some field support experience switching machines and their software. Supporting a network of minicomputers would be a significantly different problem."
-- August Mohr, "The Genesis Story," Unix Review, Jan. 1985, p. 24

"From hence it necessarily follows...Rich and Poor, Young and Old, must study the Art of Number, Weight, and Measure."
-- Sir William Petty, "Political Arithmetic," in Collected Works, vol. 1, p. 261.
During the formative years of the creation of the Arpanet, which was to become the backbone of the Global Computer Network, there were similarly seminal developments taking place at Bell Laboratories, the research and development unit of the Bell System. These developments were to have a significant impact on the future course of computer science research and networking in the world.

As early as 1957, Bell Labs found they needed an operating system for their in-house computer center, which was then running many short batch jobs. Describing the situation facing the Labs, Victor Vyssotsky, who had been the technical head of the Multics project at Bell Labs and was later Executive Director of Research in the Information Systems Division of AT&T Bell Labs, explains, "We just couldn't take the time to get them on and off the machine manually. We needed an operating system to sequence jobs through and control machine resources." (from "Putting Unix in Perspective," an interview with Victor Vyssotsky by Ned Pierce, in Unix Review, Jan. 1985, p. 59)

The BESYS operating system was created at Bell Labs to deal with these in-house needs. When asked by others outside the Labs to make a copy available, they did so, but with no obligation to provide support. "There was no support when we shipped a BESYS tape to somebody," Vyssotsky recalls. "We would answer reasonable questions over the telephone. If they found troubles or we found troubles, we would provide fixes." (Ibid., p. 59)

By 1964, however, the Labs was adopting third-generation computer equipment and had to decide whether to build their own operating system or to go with one built outside the Labs. Vyssotsky recounts the deliberations of the time: "Through a rather murky process of internal deliberation we decided to join forces with General Electric and MIT to create Multics," he explains. The Labs planned to use the Multics operating system "as a mainstay for Bell Laboratories internal service computing in precisely the way that we had used the BESYS operating system." (Ibid., p. 59)

The collaborative project by GE, MIT and AT&T to create a computer operating system that would be called Multics (1965-68) was to "show that general-purpose, multiuser, timesharing systems were viable." (See Douglas Comer, "Pervasive Unix: Cause for Celebration," Unix Review, October 1985, p. 42) Based on the results of research gained at MIT using the Compatible Time-Sharing System (CTSS), AT&T and GE agreed to work with MIT to build "new hardware, a new operating system, a new file system, and a new user interface." (Ibid.) Though the project proceeded slowly and it took several additional years to develop Multics, Doug Comer, a Professor of Computer Science at Purdue University, explains that "fundamental issues were uncovered, new approaches were explored and new mechanisms were invented." (Ibid.) The most important, he explains, was that "participants and observers alike became devoted to a new form of computing (the interactive, multiuser, timesharing system). As a result, the Multics project dominated computer systems research for many years, and many of its results are still considered seminal." (Ibid.)

Evaluating the influence of the Multics research on Bell Labs researchers, Comer points out that top researchers in computer science and mathematics from the world's premier industrial research center, Bell Labs, were able to work with top researchers from academia.
When Ken Thompson, Dennis Ritchie and their "Bell Laboratories colleagues," writes Comer, "later began work on their own implementation of a Multics-like time-sharing system, they drew heavily from the Multics experience. So, despite popular myth, UNIX was not an accidental discovery at all -- it evolved directly from experiences with academic research." (Ibid., p. 41-42)

By 1969, however, AT&T had made the decision to withdraw from the project. Describing that period, Dennis Ritchie, another of the inventors of UNIX at Bell Labs, writes, "By 1969, Bell Labs management, and even the researchers came to believe that the promises of Multics could be fulfilled only too late and too expensively." (from Dennis Ritchie, "The Development of the C Language," ACM, presented at the Second History of Programming Languages Conference, Cambridge, Mass., April 1993, p. 1)

Detailing the reasons for the decision, Vyssotsky responds, "It turned out that from our point of view the Multics effort simply went awry. In the first place, we were naive about how hard it was going to be to create an operating system as ambitious as Multics. It was the familiar second system syndrome. You put in everything you wished you'd had in the other one." (Vyssotsky, p. 59) He also details how GE, MIT, and AT&T each had different goals for the project, which made it difficult for them to work together. While GE wanted to develop Multics to "strengthen its product line," MIT wanted Multics "to advance the state of the art" of computing, and Bell Labs' purpose was "to have a good environment for our people to work in." (Ibid.) Given these different objectives, Vyssotsky explains, "It turned out that under the stress of slipping schedules and the increasing realization that we had difficulty agreeing on a common course of action, we ended up simply pulling out of Multics. We said, `OK, it's too wet to plow. We aren't going to get from here to there.'" (Ibid.)

When the decision to pull out of the Multics project was made by AT&T, Vyssotsky explains, there was an operating system that he called a "precursor of Multics" running on their GE 645 computer. "From the point of view of the few people who could use it," he notes, "it was a very nice programming environment. In particular, Ken Thompson thought it was a very nice programming environment." (Ibid.) However, when Bell Labs pulled out of the Multics project, they took the Multics precursor off their GE 645 computer and put up GECOS, a much less state-of-the-art operating system. "If you were an old line Spanish American War type computer user like me," Vyssotsky admits, "GECOS was a perfectly satisfactory system for getting from here to there in a well-designed application. You knew what it was going to do." (Ibid., p. 60)

But for a research computer scientist like Ken Thompson, GECOS was inadequate. According to Vyssotsky, "It was nowhere near as satisfactory if you were trying to do things that were technically difficult and imperfectly defined, which is the main task of research." (Ibid.) Not only for Ken Thompson's work, but for the research purposes of the Labs, an operating system more like what Multics had promised was needed. "I wanted a much more flexible system than BESYS or GECOS or OS360 or anything I could see," Vyssotsky recounts. "I had various things that I was trying to do with computers that were just plain hard to do with existing operating systems." (Ibid.)
"Moreover, for people like Ken Thompson," Vyssotsky emphasizes, "having this embryonic version of Multics taken away and GECOS slapped down in its place was something of a disaster. Suddenly they were back to square one."(Ibid.) With the loss of the Multics experimental operating system, Ken Thompson, Dennis Ritchie and the others at the Labs who began work on UNIX, realized they had to focus on creating an operating system for their programming needs. "I don't think," Vyssotsky relates, "that either of them was particularly fascinated by operating systems until they found themselves cast back upon GECOS. They sort of got interested in the subject out of self defense."(Ibid.) In his account of this period, Dennis Ritchie writes, "Even before the GE-645 Multics machine was removed from the premises, an informal group, led primarily by Ken Thompson, had begun investigating alternatives." ( Ritchie, pg. 1) Thompson and Ritchie presented Bell Labs with proposals to buy them a computer so they could build their own interactive, time sharing operating system. Their proposals weren't acted on. Eventually, Ken Thompson found a little used and obsolete PDP 7 computer. According to Vyssotsky the orphaned PDP-7 computer was a tiny machine, "more nearly in the class of a Commodore 64 than the class of a PC-AT." (Vyssotsky, pg. 60) Ritchie explains that Ken Thompson was attempting to create a programming environment which included "many of the innovative aspects of Multics," such as "an explicit notion of a process as a locus of control, a tree-structured file system, a command interpreter as a user-level program, simple representation of text files, and generalized access to devices." (Ritchie, p. 1-2) Describing the primitive conditions that Thompson faced, Ritchie writes, "At the start, Thompson "did not even program on the PDP itself, but instead used a set of macros for the GEMAP assembler on a GE-635 machine. A postprocesser generated a paper tape readable by the PDP-7. These tapes were carried from the GE machine to the PDP-7 for testing until a primitive UNIX kernel, an editor, an assembler, a simple shell (command interpreter), and a few utilities (like the Unix rm, cat, cp commands) were completed. At this point, the operating system was self- supporting; programs could be written and tested without resort to paper tape, and development continued on the PDP-7 itself." (Ibid., pg 2) The result, Ritchie explains, was that "Thompson's PDP-7 assembler outdid even DEC's in simplicity; it evaluated expressions and emitted the corresponding bits. There were no libraries, no loader or link editor: the entire source of a program was presented to the assembler, and the output file -- with a fixed name -- that emerged was directly executable.(Ibid., pg. 2) The operating system was named UNIX, to distinguish it from the complexity of MULTICS. Vyssotsky recalls that in addition to Thompson and Ritchie, "the two most active contributors at that stage were Joe Ossanna and Rudd Canaday. I should also add," he explains, "that Doug McIlroy was tremendously influential on their thinking."(Vyssotsky, pg.60) Vyssotsky elaborates, "I don't think that Doug actually contributed much of the programming, but for example, the appearance of pipes in UNIX was clearly a result of Doug's discussions with Ken and Dennis." (Ibid. ) Ken put them in, but "it was McIlroy who said, "Look you ought to do it. Pipes, like most things in UNIX were not a radically new idea. 
Co-routines had, after all, shown up in SIMULA by the end of 1967." (Ibid.)

As work continued on the Bell Labs operating system, the researchers developed a set of principles to guide their work. Among these principles were:

"(i) Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features.

(ii) Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.

(iii) Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.

(iv) Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them."

(from M.D. McIlroy, E.N. Pinson, and B.A. Tague, "Unix Time-Sharing System: Foreword," The Bell System Technical Journal, July-Aug. 1978, vol. 57, no. 6, part 2, p. 1902)

By 1970, Ritchie writes, the UNIX researchers were "able to acquire a new DEC PDP-11. The processor," he remembers, "was among the first of its line delivered by DEC, and three months passed before its disk arrived." (Ritchie, p. 5) Soon after the machine's arrival, and while still waiting for the disk, Thompson, Ritchie recalls, "recoded the Unix kernel and some basic commands in PDP-11 assembly language. Of the 24K bytes of memory on the machine, the earliest PDP-11 Unix system used 12K bytes for the operating system, a tiny space for user programs, and the remainder as a RAM disk." (Ibid., p. 5)

"By 1971," Ritchie writes, "our miniature computer center was beginning to have users. We all wanted to create interesting software more easily. Using assembler was dreary enough that B, despite its performance problems, had been supplemented by a small library of useful service routines and was being used for more and more new programs." (Ibid., p. 6)

"C came into being in the years 1969-1973," Ritchie explains, "in parallel with the early development of the Unix operating system; the most creative period occurred during 1972." (Ibid., p. 1) "By early 1973," he continues, "the essentials of modern C were complete. The language and compiler were strong enough to permit us to rewrite the kernel for the PDP-11 in C during the summer of that year. (Thompson had made a brief attempt to produce a system coded in an early version of C -- before structures -- in 1972, but gave up the effort.)" (Ibid.)

Each program they built developed some simple capability, and they called that program a tool. They wanted the programs to be fun to use and to be helpful to programmers. Doug McIlroy, one of the researchers and Thompson's department head when UNIX was created, describes the atmosphere at the lab:

"Constant discussions honed the system....Should tools usually accept output file names? How to handle demountable media? How to manipulate addresses in a higher level language? How to minimize the information deducible from a rejected login? Peer pressure and simple pride in workmanship caused gobs of code to be rewritten or discarded as better or more basic ideas emerged. Professional rivalry and protection of turf were practically unknown: so many good things were happening that nobody needed to be proprietary about innovations." [from M.D. McIlroy, "Unix on My Mind," Proc. Virginia Computer Users Conference, vol. 21, Sept. 1991, Blacksburg, p. 1-6.]
The research done at the Labs was concerned with using the computer to automate programming tasks. Through a scientific approach to their work and careful attention to detail, Bell Labs researchers determined the essential elements in a design and then created a program to do as simple a job as possible. These simple computer automation tools would then be available to build programs to do more complicated tasks. They created a UNIX kernel accompanied by a toolbox of programs that could be used by others at Bell Labs. The kernel consisted of about 11,000 lines of code. Eventually, 10,000 lines of the code were rewritten in C and thus could be transported to other computer systems.

"The kernel," Ken Thompson writes, "is the only UNIX code that cannot be substituted by a user to his own liking. For this reason, the kernel should make as few real decisions as possible." (from K. Thompson, "UNIX Implementation," The Bell System Technical Journal, vol. 57, no. 6, July-August 1978, p. 1931) Thompson describes creating the kernel: "What is or is not implemented in the kernel represents both a great responsibility and a great power. It is a soap-box platform on `the way things should be done.' Even so, if `the way' is too radical, no one will follow it. Every important decision was weighed carefully. Throughout, simplicity has been substituted for efficiency. Complex algorithms are used only if their complexity can be localized." (Ibid., p. 1931-2)

The kernel was conceived as what was essential, and other features were left to be developed as part of the tools or software that would be available. Thompson explains: "The UNIX kernel is an I/O multiplexer more than a complete operating system. This is as it should be. Because of this outlook, many features are found in most other operating systems that are missing from the UNIX kernel. For example, the UNIX kernel does not support file access methods, file disposition, file formats, file maximum sizes, spooling, command language, logical records, physical records, assignment of logical file names, logical file names, more than one character set, an operator's console, an operator, log-in, or log-out. Many of these things are symptoms rather than features. Many of these things are implemented in user software using the kernel as a tool. A good example of this is the command language. Maintenance of such code is as easy as maintaining user code. The idea of implementing "system" code and general user primitives comes directly from MULTICS." (Ibid., p. 1945-6)

Evaluating the achievement represented by the kernel, Vyssotsky explains, "I would say that the greatest intellectual achievement embedded in UNIX is the success Ken Thompson and Dennis Ritchie had in understanding how much you could leave out of an operating system without impairing its capability." (Vyssotsky, p. 60-62) "To some extent," he continues, "that was forced by the fact that they were running on small machines. It may also have been a reaction to the complexity of Multics...It took some very clear thinking on the part of the creators of UNIX to realize that most of that stuff didn't have anything to do with the operating system and didn't have to be included." (Ibid., p. 62)

Eventually the UNIX operating system was adopted in other departments at AT&T to do a variety of work.
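The toolbox philosophy that Thompson and McIlroy describe lends itself to a small illustration. The pipeline below is a sketch written in the spirit of the design principles quoted earlier, not an example drawn from the Bell Labs sources cited here; the input file name "draft.txt" is hypothetical, and the command syntax is that of a modern Unix shell.

    # Report the ten most frequent words in a file: split the text into
    # one word per line, fold it to lower case, sort so duplicates are
    # adjacent, count each distinct word, sort by descending count, and
    # print the first ten lines. Each stage is a small, single-purpose
    # tool; the pipe connects them into a new program.
    tr -cs 'A-Za-z' '\n' < draft.txt | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn | sed 10q

Each stage knows nothing about the others; it simply reads its input and writes its output, which is exactly the behavior that principles (i) and (ii) above call for.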
"There is one piece of history that I think is very important to understand," explains Vyssotsky, "When UNIX evolved within Bell Laboratories, it was not a result of some deliberate management initiative. It spread through channels of technical need and technical contact ... this was typical of the way UNIX spread around Bell Laboratories. You had MTSS Supervisors and Department Heads saying we had to go in this direction while Executive Directors were saying, `Well, I'm awful nervous about it. But if you guys say that is what we've got to do, I'll back your play."(Ibid, pg. 62-64) Explaining the importance of how unix was implemented organizationally within the Bell System, Vyssotsky comments, "There are a lot of organizations that do not work that way. I brought out that little hunk of history to point out that the spread and success of UNIX, first in the Bell organizations and then in the rest of the world, was due to the fact that it was used, modified, and tinkered up in a whole variety of organizations within Bell Laboratories ... the refinement of UNIX was not done as the result of some management initiative or council of vice presidents. It was the supervisors saying, "This thing is already better than our other options and flexible enough for us to make it a go." (Ibid. p. 64) During the same period that the search for an operating system to replace the promise of Multics had begun by Bell Labs computer programming researchers, the Bell System was faced with the problem of automating their telephone operations using minicomputers. Describing the problem facing the Bell System during this period, August Mohr, in an article in Unix Review, "The Genesis Story"(January 1985, p. 22), writes "Bell was starting to perceive the need for minicomputer support for its telephone operations." (Mohr was editor of /usr/group 's CommUNIXations newsletter.) "The discovery that we had the need -- or actually, the opportunity -- in the early '70s to use these minis to support telephone company operations encouraged us to work with the UNIX system," confirms Berkley Tague. ("Interview with Berkley Tague," Unix Review, June 1985, p. 59) "We knew we could do a better job with maintenance, traffic control, repair, and accounting applications." (Ibid.) "The existing systems were made up of people and paper," he relates, "The phone business was in danger of being overwhelmed in the early '70s with the boom of the '60s. There was a big interest then in using computers to help manage that part of the business. We wanted to get rid of all of those Rolodex files and help those guys who had to pack instruments and parts back and forth just to keep things going." During the late 1960's, AT&T was under pressure from regulatory bodies like the New York Public Service Commission, to solve what was termed as a "service crisis." (See especially, "Wrong Number," by Alan Stone, N.Y., 1989, p. 145) This pressure encouraged AT&T to explore technological advances that would make its support operations more efficient. Tague explains that there had been local mechanization of processes but not large scale integration of the mechanization. "Take repair," he suggests as an example, "A lot of it deals with keeping the connections straight between what we call the main distribution frames in the central office and the wires that tie residential telephones into the switch. Prior to the use of computers, `mechanization' consisted of somebody on a remote test bench using electrical meters and instruments to test lines. 
To get those connections made, an intercom was used to broadcast requests to a bunch of people standing around with alligator clips and soldering irons down in the wire center. The requests went something like, `Would you kindly connect jumper x to terminal y?' to get testing done." (Ibid., p. 60)

Tague describes how the minicomputer made it possible to automate this process. "First, we were able to get more instructions out to the people actually making the connections. And, at the other end, we were able to centralize information about entire systems and end-to-end circuits."

"This meant," he elaborates, "that if I was responsible for keeping the Superbowl broadcast on the air between New Orleans and New York, I could -- with a single console -- view all the connections on that link and have access to all of the information automatically being collected about it. If something broke, I could immediately recognize that and orchestrate the process of getting it repaired. The repair itself would ultimately be left to a person working in much the same way as before." (Ibid.)

This change affected workers like those "plugging in an alternate module or pulling a manual switch and going to a backup system," he clarifies. "Suddenly, their work became much faster because the information was all in one place -- unlike earlier days when eight guys would have had to collect and sort out the trouble data in a series of phone calls before actually being able to get down to the business of working on solutions." (Ibid.)

Other applications were affected as well, he explains, "in areas like cable and wiring layouts. The algorithms applying to these layouts were well known here at the Laboratories, but they were not the sort of thing you could usefully put into a manual. They were, however, easily put into computer programs. Optimum layouts could thus be generated using the computer to assess all the complicated engineering tradeoffs." (Ibid.)

Not only did they need a good programming environment, but Mohr emphasizes that the Bell System applications required "Operations Systems, not Operating Systems. With the number of systems under consideration, the possibility of being tied to a single vendor, or having each site tied to a different vendor, induced a kind of paranoia. There just had to be another way." (Mohr, p. 22) Tague elaborates, "If we faced the phone company with 18 different vendors and 19 different environments, neither the developers nor the phone companies were going to be able to maintain the thing once it got out in the field in large numbers. As a planner, I was trying to focus on a few vendors. At that time, it was primarily Hewlett-Packard and DEC, plus a few IBM systems." (Tague, p. 60)

This led to the realization of a need for an operating system. "Vendor operating systems were available as a starting point," he adds, "but a number of people had already started to build their own when they realized that what the vendors had was not adequate." (Ibid.)

Tague explains that his role in planning for the transition meant that he tried to warn those involved that they would need a good software environment in which to develop the software required to put the minicomputers to these new uses. "I observed," he comments, "that people were starting to put these minis out in the operating company, and saw that it was an area of both opportunity and potential problems.
I found," he adds, "that some of the people in development had never built an operating system for any computer before; many of them had very little software background. They were coming out of hardware development and telephone technology backgrounds, and yet were starting to build their own operating systems. Having been through that phase of the business myself, it seemed silly to go through it another hundred times, so I started pushing the UNIX operating system into these projects." (Mohr, pg. 22) Tague was familiar with UNIX and its capabilities and tells the variety of reasons ranging from inadequate file systems, to inadequate performance, to poor user interface that he recommended the initial adoption of UNIX to start the work. "We sold those first application developers on UNIX simply by pointing out that the first job they were going to have to do was program development and that by using the UNIX operating system they could get that job done more easily. I did not argue with them about whether or not they should develop their own operating systems -- knowing in my heart of hearts that once they got on UNIX they wouldn't be able to do any better with the experience and the schedules they had. Indeed, that is what happened." (Tague, pg. 60-1) Tague's backing of UNIX, as a development system for operations, was not just a personal preference. "I had every confidence in the people who built it because I'd worked with them on Multics," he explained. "With their experience and training, I figured they could build a much better operating system than somebody who's building one for the first time, no matter how smart that person is." (Mohr, pg 22) Tague describes how UNIX had been functioning in the research environment and thus had demonstrated that it could be used as a beginning basis for this important job. Also, he knew that there would be a need to develop a support system for those operating companies around the country that would begin to use UNIX: "We were starting to put these things in the operating companies all around the countryside," explains Tague, "and the prospects were that there were going to be several hundred minis over the next few years that were going to have to be maintained with all their software and hardware." (Ibid., pg. 24) Bell had created the needed field support system to maintain the electronic switching machines and software that were now being upgraded. "Supporting a network of minicomputers would be a significantly different problem, though," August Mohr explains. "Maintaining an operating system is not at all like maintaining an electronic switching system. The minicomputers had different reliablity demands, requiring a different support structure in the organization -- one that did not yet exist in any form. In many ways, the operations group was breaking new ground," writes Mohr. (Ibid.) As head of the Computer Planning Department, Tague had been responsible for systems engineering. In 1971 Tague garnered support for UNIX to be adopted. Then he pushed to have UNIX made the internal standard and to provide central support through his organization. By September, 1973, he was able to form a development organization to provide support for a "standard Unix." This group, called UNIX Development Support worked with Bell Labs Research. Though the two groups sometimes diverged regarding their priorities, Mohr explains that they agreed on the need for UNIX portability. 
According to Mohr, "Tague foresaw the possiblity of UNIX becoming an inteface between hardware and software that would allow applications to keep running while the hardware underneath was changing." (Ibid., p. 24) "From the support point of view," he continues, "such a capability would solve a very important problem. Without UNIX and its potential portability, the people building the operations support systems were faced with selecting an outside vendor that could supply the hardware on which to get their devlopment done. Once that was complete, they would be locked into that vendor." However, according to Mohr, "Portability obviated this limitation and offered a number of other advantages. When making a hardware upgrade, even to equipment from the same vendor, there are variations version to version. That could cost a lot of money in software revisions unless there were some level of portability already written into the scenario." (Ibid., pg. 24-25) Just as Operating Systems people in the Bell system had come to recognize the need for portability in a computer operating system, Ritchie and Thompson and the other programming researchers at Bell Labs had created the computer language C and rewritten the majority of the UNIX kernel in C and thus had made the important breakthrough in creating a computer operating system that was not machine dependent. Describing their breakthrough with UNIX, Thompson and Ritchie presented their first paper on UNIX at the Symposium on Operating Systems Principles, IBM Thomas J. Watson Research Center, Yorktown Heights, New York, October 15-17, 1973,(reference from UNIX(tm) Time-Sharing System: Unix Programmers Manual, 7th edition, vol 2, Murray Hill, f/n pg 20). See also Ritchie's account of the creation of C by early 1973 in "The Development of the C Language," ACM, presented at Second History of Programming Languages conference, Cambridge, Mass, April 1993, p. 1) Describing this important achievement by Bell Labs researchers, Mohr writes, "the integral portability of the system developed by Research proved adequate to make UNIX portable over a wide range of hardware." With the research breakthrough of a portable computer operating system, "the first UNIX applications were installed in 1973 on a system involved in updating directory information and intercepting calls to numbers that had been changed. The automatic intercept system was delivered for use on early PDP-11s. This was essentially the first time UNIX was used to support an actual, ongoing operating business." (Mohr, pg. 26) Different operations sites had taken on to create computer software to meet similar needs, such as print spooling, mail, help, etc. Tague's group's assignment was to gather the software and to determine what the standard should be and send the standard back out to the sites. Tague credits the technical strength of UNIX for making software standardization possible. UNIX "made it easy," he explains, "to get the right stuff in without upsetting the whole world." Establishing a standard UNIX, according to Tague, was "a process of negotiation and compromise with the UNIX-using community -- not a unilateral decision." (Ibid.) His group and the people at the variety of Bell sites "often ended up arguing things out until everybody understood the issues and a suitable compromise was made," he relates. (Ibid.) Tague describes how his group the UNIX Support Group (USG) which had been established in September of 1973 "released the first C version of UNIX internally. 
[Generic I, II, and III were produced by these initial efforts.] In parallel with our efforts," he notes, "the Programmer's Workbench gang under Rudd Canaday worked the same vein over in the BIS [Business Information Systems] area." (Tague, p. 61)

The application of UNIX to automating operations at Bell also involved automating the monitoring and measurement of calls, help with routing, and assurance of call quality. That was a "tall order," writes Tony Cuilwik, "given the standards people have come to expect...but the fact remains that the fundamental integrity of the national telecommunications network depends on more than 1000 real-time, mini-computer-based systems that are built on a version of the UNIX operating system." (from "Reach Out and Touch the Unix System," by Tony Cuilwik, Unix Review, June 1985, p. 50. Cuilwik was the head of the Operations Systems Development Department at Bell Laboratories and then director of AT&T Information Systems Laboratories in Columbus, Ohio.)

Describing the functions that UNIX makes possible, he writes, "Among the varied and wide-ranging functions these systems perform are network performance measurement, automated network testing, circuit order planning, circuit order record-keeping, automated trouble detection, automated or directed trouble repair, service quality assurance, quality control, inventory control, customer record-keeping, and customer billing -- as well as any number of other operational and administrative functions. These functions all require," Cuilwik explains, "the ability to present data to users in real-time." (Ibid.) The object in these systems is "to guarantee a minimal acceptable human response time. This challenge has been met by tuning the underlying UNIX system." (Ibid.)

Cuilwik describes how the need for such real-time applications was determined in the 1969-70 period, just when UNIX was being created. Development, he reports, "began in earnest in 1971. Early in this period," he writes, "it was determined that an operating system and environment should be provided to system designers, who would then only need to develop application-specific software." By 1974, he reports, "several sites had chosen the UNIX operating system as this development environment. A few, meanwhile, had also selected it as an execution environment and were busy designing enhancements and improvements for the system." (Ibid.) The need was also recognized for "a common operating environment between projects." (Ibid., p. 50-52)

"Major additions," he writes, "necessary to move the timeshared UNIX system into real-time applications included interprocess communications (named pipes, messages, semaphores, and shared memory), file access (logical file system, record access system), error recovery, power fail/restart, and line and terminal disciplines. These additions were developed, integrated or donated to the common good by people developing specific systems. By 1979," he reviews, "there was an enhanced real-time UNIX system that was centrally supported, offering a collection of tools and a number of human/machine interface designs to protect system users from direct contact with UNIX primitives." (Ibid., p. 52)

The process by which UNIX came to contain such a range of options also involves its adoption and development by the academic research community. Early in its development, word of the UNIX operating system and its advantages spread outside of Bell Labs.
(Several sources attribute this to the paper that Ritchie and Thompson presented on UNIX at the Symposium on Operating Systems Principles at Purdue in November 1973. See for example McKusick, "A Berkeley Odyssey," Unix Review, January 1985, p. 31, and Peter Ivanov, "Interview with John Lions," Unix Review, October 1985, p. 51, about the publication of the paper in July 1974 in the Communications of the ACM.)

The Labs made the software available to academic institutions at a very small charge. For example, John Lions, a faculty member in the Department of Computer Science at the University of New South Wales, in Australia, reported that his school was able to acquire a copy of research UNIX Edition 5 for $150 ($110 Australian) in December 1974, including tape and manuals. (See "An Interview with John Lions," Unix Review, October 1985, p. 51)

UNIX was attractive to the academic computer science community for several reasons. Describing these reasons, John Stoneback writes: "UNIX came into many CS departments largely because it was the only powerful interactive system that could run on the sort of hardware (PDP-11s) that universities could afford in the mid '70s. In addition, UNIX itself was very inexpensive. Since source code was provided, it was a system that could be shaped to the requirements of a particular installation. It was written in a language considerably more attractive than assembly, and it was small enough to be studied and understood by individuals." (from John Stoneback, "The Collegiate Community," Unix Review, October 1985, p. 27)

Describing how research UNIX helped make it possible for academic computer science departments to establish and develop research in computer science, he writes: "UNIX had another appealing virtue that many may have recognized only after the fact -- its faithfulness to the prevailing mid-'70s philosophy of software design and development. Not only was UNIX proof that real software could be built the way many said it could, but it lent credibility to a science that was struggling to establish itself as a science. Faculty could use UNIX and teach about it at the same time. In most respects, the system exemplified good computer science. It provided a clean and powerful user interface and tools that promoted and encouraged the development of software. The fact that it was written in C allowed actual code to be presented and discussed, and made it possible to lift textbook examples into the real world. Obviously, UNIX was destined to grow in the academic community." (Ibid., p. 27)

In trying to teach his students the essentials of a good operating system, John Lions describes how he prepared a booklet containing the source files for a version of Edition 6 of research UNIX in 1976, and the following year completed a set of explanatory notes to introduce students to the code. "Writing these," he recounts, "was a real learning exercise for me. By slowly and methodically surveying the whole kernel, I came to understand things that others had overlooked." This ability to present his students with a real example of an operating system kernel was a breakthrough. Lions writes: "Before I wrote my notes on UNIX, most people thought of operating systems as huge and inaccessible. Because I had been at Burroughs, I knew that people could get to learn a whole program if they spent some time working at it. I knew it would be possible for one person to effectively become an expert on the whole system.
The Edition 6 UNIX code contained less than 10,000 lines, which positioned it nicely to become the first really accessible operating system." (Lions, p. 52-3)

In keeping with the UNIX community spirit of helping each other, Lions wrote a letter to Mel Ferentz, Lou Katz and others from Usenix and offered to make copies of his notes available to others. After some negotiation with Western Electric over the patent licensing, he distributed the notes, titled "A Commentary on the UNIX Operating System," to others with UNIX licenses under the conditions that Western Electric had set out. (Ibid., p. 53) Lions describes how he helped to develop a UNIX tool, "pack," which was eventually combined with tools created at Bell Labs called huff and unhuff and distributed as a standard UNIX command. He and others from his college were invited to spend periods of time at Bell Labs to work with the UNIX researchers there. (See for example Lions, p. 57)

Describing how research UNIX and its adoption at academic institutions served to develop computer science, Doug Comer writes: "The use of UNIX as a basis for operating systems research has produced three highly desirable consequences. First, the availability of a common system allowed researchers to reproduce and verify each others' experiments. Such verification is the essence of science. Second, having a solid base of systems software made it possible for experimenters to build on the work of others and to tackle significant ideas without wasting time developing all the pieces from scratch. Such a basis is prerequisite to productive research. Third, the use of a single system as both a research vehicle and a conventional source of computing allowed researchers to move results from the laboratory to the production environment quickly. Such quick transition is mandatory for state-of-the-art computing." (Comer, p. 44)

Not only did research UNIX serve the academic community, but the contributions of the academic community were incorporated into research UNIX. An example is the work by Babaoglu and Porcar at UC Berkeley designing a virtual memory version of UNIX for the VAX computer, which was later optimized by Bill Joy and incorporated into a release of UNIX. (Ibid.) Academic contributions which were incorporated into research UNIX included the vi editor, which was created by Bill Joy at the University of California at Berkeley.

Describing this phenomenon, Comer writes: "Many universities contributed to UNIX. At the University of Toronto, the department acquired a 200-dot-per-inch printer/plotter and built software that used the printer to simulate a phototypesetter. At Yale University, students and computer scientists modified the UNIX shell. At Purdue University, the Electrical Engineering Department made major improvements in performance, producing a version of UNIX that supported a larger number of users. Purdue also developed one of the first UNIX computer networks. At the University of California at Berkeley, students developed a new shell and dozens of smaller utilities. By the late 1970s, when Bell Labs released Version 7 UNIX, it was clear that the system solved the computing problems of many departments, and that it incorporated many of the ideas that had arisen in universities. The end result was a strengthened system. A tide of ideas had started a new cycle, flowing from academia to an industrial laboratory, back to academia, and finally moving on to a growing number of commercial sites." (Comer, p. 43)
In the process of using UNIX within Bell Labs, bugs would be discovered and reported to the programmers, or new applications would be created by the departments using the programs for their own tasks. The research lab would need to provide maintenance and updates of the software, as well as to get the bug reports to the programmers and send out fixes. To automate this maintenance work, Mike Lesk, one of the Bell Labs computer researchers, proposed an automated maintenance system that would make it possible to have the research computer call up the computers in the departments, automatically deliver updated software, and test that it worked on the remote computer. As part of the automated maintenance system, Lesk created a UNIX program called UUCP (Unix-to-Unix copy), which made it possible to use a phone or hard-wired connection to have one computer poll another computer and deliver the software.

Describing the considerations at Bell Labs at this time, Vyssotsky explains, "In 1976, there were those three versions of UNIX. The Change Control Process on all three of those versions was such that, at any moment in time, the people who were programming could tell what changes had gotten in and what changes were scheduled to go in. However, it was still a little hard for the users to tell what they were getting. It wasn't until 1978 that we had anything that I would consider to be a reasonable configuration management process of UNIX. That was the point at which we finally realized we had something which, like it or not, was a major product. So we said, `Given that it is a major product, there can be no horsing around.' We could no longer regard it as something in the underbrush. We had to regularize our arrangements. We set up a process for configuration management and we focused the thing in the direction of a coherent system." (Vyssotsky, p. 64-68) But he emphasizes, "Perhaps the most important one was that UNIX was being used as the operating system basis for a bunch of operations support systems in the Bell Operating Companies and we could not afford to let those support systems go down. We put configuration management and all of the associated paraphernalia in place about 1978." (Ibid., p. 68)

Lions says about this freezing of the system, "Much of the development of UNIX in Bell Laboratories occurred before 1978. After Edition 7, many of the original group went off to do other things. At the same time, UNIX was becoming important within the Bell System, which gave rise to a support group whose charter was to develop a polished and stable version of UNIX. This group was less interested in innovation than in stabilizing the system. Universities have simply picked up the slack." (Lions, p. 56)

Meanwhile, academic UNIX users had to do their own software maintenance. Lions describes how a community of academic UNIX users grew up who were willing to help each other. "One very positive effect, however," writes Lions, "is that the number of universities using UNIX and the lack of any formal support forced us to band together into AUUG (the Australian UNIX Users Group -ed.). The connections we have thereby made have created and cemented bonds between people in the different departments. UNIX has been a very unifying influence for computer science within Australia. This cannot be overestimated." (Ibid., p. 57)

UUCP made such exchanges easier. It was included with Version 7 UNIX, which was made available to the academic community outside of Bell Labs.
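As a rough illustration of the kind of transfer UUCP automated, a file copy on a UUCP-connected machine could be requested with a single command, queued locally, and carried out later when the transfer program polled the neighboring system over a phone line or hard-wired link. The sketch below assumes a Bourne-style shell; the host names and file names are hypothetical, and option details varied from one UUCP release to another.

    # Queue a local file for delivery to the public UUCP directory
    # ("~" below) on a neighboring machine named "research"; the actual
    # transfer happens later, when uucico calls the remote system.
    uucp notes.tar research!~/incoming/notes.tar

    # Electronic mail travelled over the same links, addressed by naming
    # each hop in turn (a "bang path") -- here two hypothetical hosts.
    mail research!csvax!ron < note.txt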
UUCP made it possible for UNIX users to communicate with each other even when they were at geographically distant locations. Using UUCP, the UNIX community was able to pioneer still another advance, Usenet News. "Though large institutions have been able to avail themselves of communications networks such as ARPANET, the UNIX community has made inexpensive electronic communication available to all of its members via Usenet," writes Stoneback. "A community that already had so much in common," he explains, "was strengthened and enhanced by the ability to move software easily among locations and to maintain a reasonable electronic mail system. The cost of this network has been borne at least in part by private industry, thus mitigating expenses for the users themselves. The Usenet network stands today as a clear sign that the UNIX community is solidly in place. It now includes numerous corporate members providing universities on the network with the added advantage of pooling academic researchers, industrial developers, industrial researchers and regular users. Combined with a functional, cheap electronic communication system, Usenet offers the academic community unique advantages." (Stoneback, p. 26)

"The network," he points out, "is the direct result of a community that supports its members and in turn is nurtured by the ones it serves. The community is a reasonably democratic one, reasonably open to new ideas, reasonably open to change, and reasonably generous with its benefits." (Ibid.)

Thus by 1980, a survey of academic institutions conducted by the Computer Science Network (CSNET) to find out what computer systems they used found that "over 90 percent of all departments were served by one or more UNIX systems." (Comer, p. 42)

Explaining the surprising popularity that UNIX achieved despite its grassroots distribution system, McIlroy writes, "Therein lies the genius of Unix, which, without a sales force, and without the support of hardware makers, was enthusiastically adopted around the world ..." ("Unix on My Mind") "Unix," he emphasizes, "was the distilled essence of operating systems, designed solely to be useful. Not to be marketable. Not to be compatible. Not to be an appendage to a particular kind of hardware. Moreover a computer running Unix was to be useful as a computer, not just a `platform' for canned `solutions'. It was to be programmable - cumulatively programmable. The actions of program builders were to be no different in kind from the actions of users; anything a user could do a program could do too...." (Ibid.)

Describing the environment that gave birth to these advances, McIlroy writes, "Open systems! Our systems! How well those who were there remember the pipe-festooned garret where Unix took form. The excitement of creation drew people to work there amidst the whine of the computer's cooling fans, even though almost the same computer access could be had from one's office or from home. Those raw quarters saw a procession of memorable events. The advent of software pipes precipitated a day-long orgy of one-liners...as people reveled in the power of functional composition in the large, which is even today unavailable to users of other systems. In another memorable event, the unarticulated notion of software tools, which had been bolstered by pipes, was finally brought home by the liberation of the pattern matching program grep from within the editor." (Ibid.)

He continues: "A parade of visitors came to marvel at the system and to copy it.
The makers of our 1972 model phototypesetter goggled when they saw the paper tape input replaced by wires straight from a computer. On-line PicturePhone[r] service caught attention. Synthetic speech was initiated by a memorable `Come here, Watson' event when words typed in a remote office rang out clearly in the lab: `It sounds better over the telephone.' The computer's readings and misreadings became a constant crowd pleaser. There was great, if somewhat conspiratorial, excitement over a stealthy version of the C compiler that would recognize and silently bug the Unix login program and would propagate the ability through future generations of the compiler itself....No trace of the bug appeared in source code." (Ibid.)

And UUCP, and then Usenet News, made this experimental research environment available to those not at Bell Labs or without access to the experimental Arpanet. "Eager to distribute his software quickly and painlessly, Mike invented uucp, thereby begetting a whole global network," McIlroy writes. (from "A Research UNIX Reader: Annotated Excerpts from the Programmer's Manual, 1971-1986," by M. D. McIlroy, Computing Science Technical Report No. 139, AT&T Bell Laboratories, June 1987, p. 3)

Summarizing the relationship between Bell Labs and the academic community in developing UNIX, Comer concludes: "UNIX was not invented by hackers who were fooling around, nor did it take shape in a vacuum. It grew from strong academic roots and it has both nurtured and taken nourishment from academia throughout its development. The primary contributors to UNIX were highly educated mathematicians and computer scientists employed by what many people feel is the world's premier industrial research center, Bell Laboratories. Although they were knowledgeable and experienced in their own right, these developers maintained professional contacts with researchers in academia, leading to an exchange of ideas that proved beneficial for both sides. Understanding the symbiotic relationship between UNIX and the academic community means understanding the background of the system's inventors and the history of interactions between universities and Bell Laboratories." (Comer, p. 34, 42)

Describing this cross-fertilization, Dennis Ritchie wrote, "... Unix enjoyed an unusually long gestation period. During much of this time (say 1969-1979) the system was effectively under the control of its designers and being used by them. It took time to develop all the ideas and software, but even though the system was still being developed people were using it, both inside Bell Labs, and outside under license. Thus, we managed to keep the central ideas in hand, while accumulating a base of enthusiastic, technically competent users who contributed ideas and programs in a calm, communicative, and noncompetitive environment. Some outside contributions were substantial, for example, those from the University at Berkeley." ("Reflections on Software Research," August 1984, vol. 27, no. 8, p. 75)

John Lions, reviewing his experience as part of the UNIX community, concludes, "We have made a large number of contacts and exchanged a great deal of information around the world through this UNIX connection. Possibly that is the nicest thing about UNIX: it is not so much that the system itself is friendly but that the people who use it are." (Lions, p. 57)
It is a rare and wonderful event in the development of human society when a scientific and technological breakthrough is made which will certainly affect the future course of social development, and which becomes known while its midwives are still alive to tell us about it. UNIX, the product of researchers at Bell Labs, the then-regulated AT&T system, and academic computer science, and a valuable invention for computer science, for computer education and for the education of the next generation of computer scientists and engineers, is such an event.

Ronda Hauben
Amateur Computerist
ronda@umcc.umich.edu  or  ae547@yfn.ysu.edu