
a BRIEF HISTORY of GUI

Introduction
As an 'Eisenhower Baby' I've lived through most of the tumultuous history of Graphic User Interface (GUI) development. It has been an extremely interesting ride so far. This short essay was developed from notes I've used for seminars on UI design; I am surprised and honored to have many links to it from essays and school papers.
 
If you want to know where you are going, it's useful to know where you've been.

 
a Brief History of GUI
 
Graphic User Interfaces were considered unnecessary overhead by early computer developers, who were struggling to develop enough CPU horsepower to perform simple calculations. As CPU power increased in the sixties and early seventies, industrial engineers began to study the terminal entry programs of mainframes to optimize entry times and reduce mistypes. The earliest mainframe query protocols still in use, i.e., airline reservation systems, were developed during this period to pack as much information as possible into the shortest command. Essentially, operators were trained to perform computer language interpretation in their heads.
 
For an example, read this vision of future computing from James P. Hogan's 1977 science fiction novel Inherit the Stars:
 
"What do I do now?"
"Type this: FC comma DACCO seven slash PCH dot P sixty-seven slash HCU dot one. That means 'functional control mode, data access program subsystem number seven selected, access data file reference "Project Charlie, Book one," page sixty-seven, optical format, output on hard copy unit, one copy.'"

In the middle to late seventies several companies, including IBM and Xerox, began research on the "next generation" of computers, based on the assumption that computing power would drop in price to the point where many more individuals in companies would be able to effectively use them. IBM directed most of its efforts at mainframe development, but also started a small division to design and produce a "personal computer", which, despite its obscure operating system, would recreate the home-built small computer market. Other companies were struggling to produce cost-effective small computers using the CP/M operating system.
 
The most notable interface research program was at a facility owned by Xerox called the Palo Alto Research Center (PARC). In 1973 the PARC team began work on the Alto computer system as "an experiment in personal computing, to study how a small, low cost machine could be used to replace facilities then provided only by much larger shared systems." The Alto project continued into 1979, replaced by the Star computer, which many consider the forerunner of the Macintosh. The Alto had many unique features, and pioneered the use of the mouse, the portrait monitor, WYSIWYG, local area networking, and shared workspaces.
 
Alto, and the later Star computers, derived many of these features from cognitive psychology work. The designers attempted to communicate with users more effectively by making the computer communicate in ways the brain uses more readily; using icons for instance, because the visual part of the brain can track their presence and state much better than words. They developed ways of organizing information in patterns which the eye can track through more easily, drawing attention to the work in progress. They developed the model of WYSIWYG (what you see is what you get) to improve print proofing performance, and found through testing that the digital representation of black text on a sheet of white paper increased information legibility and retention. The Star interface added the concept of the desktop metaphor, and overlapping and resizable windows. PARC discovered along the way that whole new subsystems had to be developed to enable their technology to work; but once demonstrated, testing showed dramatic improvements in productivity, job satisfaction, and reduced training time for users. PARC's research clearly showed that a computer system of sufficient power could be optimized for human use, and that optimization would be paid back with a range of productive (and profitable) behavior and attitude improvements.
 
In the early eighties the IBM PC running DOS became the runaway best seller among computers. DOS presented a cryptic command line interface, a direct descendant of the mainframe world. The PC had many limitations, including memory access, power, and a lack of color or graphic standards; but it delivered enough productivity to justify the purchase of millions of units.
 
At the same time, a small group of designers at a company called Apple Computer made a deal with Xerox PARC. In exchange for Apple stock, Xerox would allow Apple to tour the PARC facility and incorporate some of their research into future products. Apple took elements of the Star interface, refined them and produced the Lisa computer. The Lisa failed, owing to its cost, lack of software availability, and other factors. Apple's next try with an enhanced and friendlier Lisa interface was the Macintosh, which found a small market foothold in the design and publishing markets. Apple was committed to its GUI, spending millions of dollars over the next ten years to research and implement enhancements; their commitment paid off in the late eighties as the desktop publishing market exploded and Apple's interface was widely acclaimed by the artists, writers, and publishers using the computers. Interestingly, one of the most successful Macintosh application developers was the Microsoft Corporation of Redmond, Washington, owner of MS-DOS. Microsoft, following the Apple GUI standards, developed a spreadsheet for the Mac which set new standards for ease of use. This product was, of course, Excel.
 
Apple worked with artists, psychologists, teachers, and users to craft revisions to their software and developer guidelines. For example, in California they sponsored an elementary school where every student had an Apple Computer. Each year the teachers and Apple programmers spent the summer planning new lessons and making enhancements to the software used to teach them, because Apple believed that children give the truest reactions to basic interface issues. Although Apple today runs a distant second to IBM compatibles in number of systems, its closed hardware and software implementation at one point made it the largest personal computer manufacturer in the world, eclipsing IBM in 1992. Apple believes that the principal contributor to its success has been the consistent implementation of user interfaces across applications. Macintosh users have been able to master multiple applications easily because commands and behavior are the same across applications: Command-S is always save.
 
In the late 1980s Microsoft Corporation, producer of DOS, DOS applications, and Macintosh applications, began a joint project with IBM to develop a new graphic user interface for IBM compatible computers. This partnership later dissolved, but Microsoft went on to take user interface lessons learned from their successful Macintosh products, Excel and Word, and created a series of graphic shells running on top of DOS which could mimic many of the Macintosh GUI features. Microsoft and Apple became involved in extensive litigation over ownership of many of these features, but the case was eventually dismissed. Later versions of the Windows operating system became increasingly Macintosh-like. Today Microsoft gives little credit to Apple for pioneering and validating many of the ideas which it has copied.
 
With increasing desktop power and continued reductions in CPU pricing, another area of GUI development also entered business: that of UNIX. Like DOS, UNIX is a child of the seventies and inherits a powerful but obscure command line interface from mainframes; unlike DOS, it had been used in networked applications and high-end engineering workstations for most of its life. In the eighties UNIX GUI shells were developed by consortiums of workstation manufacturers to make the systems easier to use. The principal GUIs were OPEN LOOK (Sun Microsystems and AT&T), Motif (Open Software Foundation, or OSF), and later NeXTSTEP (NeXT Computer).
 
Altogether new graphical operating systems were also developed for the emerging families of RISC desktop computers and portable devices; these include Magic Cap (General Magic), the Newton (Apple Computer), People, Places, and Things (Taligent), Windows CE (Microsoft), and the Palm interface (US Robotics Pilot).
 
The mid 1990s brought two new movements to GUI design: the Internet browser, with its limited but highly portable interface, and Linux, a freely distributed, UNIX-like operating system. Which of these will have the greater long-term impact is open to debate, but it appears that the browser has had a widespread effect on GUI design, and on human culture.
 
The HTML/browser interface comes in a bewildering variety of implementations. With the limited interaction available in forms, designers were forced back to basics, building and testing iteration after iteration. Fortunately, HTML is relatively easy to create, though some would suggest difficult to master. Newer versions of HTML and descendants like DHTML, XML, WML, and SMIL offer greater potential for true interactive experiences, but at the cost of increased download times and questionable compatibility with a diverse legacy of installed browsers. Over time the legacy browser problem will be solved as users upgrade their systems, and bandwidth issues should also improve. But the important thing GUI designers have learned from the Web is that screens do not have to be complicated to be useful: if the form solves a need and is easy to use, then people will use it.
 
Linux represents another trend in computing and GUIs, that of group-developed software based on components. Facilitated by the Web, software designers can collaborate and produce startling work in short timeframes. Linux is small and reliable, yet supports a large base of usable software. Along with Java, Linux represents a possible future of portable software running on compatible systems anytime, anywhere.
 
 
Common concepts in good GUIs
 
Important similarities exist among these GUIs. These shared concepts are based on sound principles of cognitive psychology and have been proven through thousands of hours of testing and billions of hours of use. They are summarized below:
  • Consistency: Once a set of rules is picked for a GUI, it is vital that different applications share methods for invoking similar features (external consistency), and that applications use the same methods for similar functions within the program (internal consistency).
     
  • Metaphor: To make complex processes easier to understand and manipulate, it is useful to choose a familiar "real world" process to associate with the application, e.g., the desktop for managing files and choosing office applications. Use of visual images, sound, and actions serves to reinforce the illusion and make it more understandable.
     
  • User Centered: The user is in charge of the interaction on several levels. Actions are initiated and controlled by the user, the user selects the objects the action will affect, the user sees immediate visible results of actions to confirm their changes, and the user is warned about negative effects of their actions. Ideally, the user cannot be wrong; he or she can always recover from an error (see the sketch after this list). When questions arise during development of new applications, they should always be settled to the users' benefit. Design specifications should arise from user needs, and research on the efficacy of the design must be done with users. The user is not a programmer, and the user will make errors, but the user is in control.
     
  • WYSIWYG: Everything is seen, and features are not hidden except by the user; for example, the tab settings in a word processor are visible unless turned off. Items which exist in the real world should look like their real-world counterparts on screen, especially if they may be printed, such as an invoice in an accounting program.
     
  • Aesthetics and Environment: The human eye and mind evolved to make sense out of a disordered world. However, this process can completely consume the resources of the human brain; chaotic screen designs take a long time to understand and use. Information should be ordered into a simplified grid or list, organized hierarchically according to importance, and grouped into similar tasks. The application should have a 'look' which reinforces the sense of craftsmanship required to create quality applications.
     
    At the same time, the design should help the user navigate the system; compatible changes to detailing, color, and patterns along with title bars help the user to recognize where they are in the application.
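
The "user centered" point above has a concrete architectural consequence: if the user must always be able to recover, every destructive action needs a remembered inverse. Below is a minimal sketch of that idea as a command/undo stack, written in Python purely for illustration; the class names and the document example are my own assumptions, not code from any of the systems described in this essay.

  # A sketch of the "user can always recover" principle as a command/undo stack.
  # Names here (Command, UndoStack, the document example) are illustrative only.

  class Command:
      """One reversible user action."""
      def __init__(self, do, undo, label):
          self.do, self.undo, self.label = do, undo, label

  class UndoStack:
      """Applies commands and lets the user back out of any of them."""
      def __init__(self):
          self._done = []

      def execute(self, command):
          command.do()
          self._done.append(command)   # remember how to reverse the action

      def undo(self):
          if not self._done:
              return None              # nothing to undo is not an error
          command = self._done.pop()
          command.undo()
          return command.label

  # Example: the user deletes a paragraph, then changes their mind.
  document = ["Intro", "Body", "Closing"]
  removed = document[1]
  delete = Command(
      do=lambda: document.remove("Body"),
      undo=lambda: document.insert(1, removed),
      label="Delete paragraph",
  )

  history = UndoStack()
  history.execute(delete)    # document is now ["Intro", "Closing"]
  print(history.undo())      # prints "Delete paragraph"; the document is restored

Real applications layer redo, grouping of keystrokes, and persistence on top of this, but the contract the user relies on is the same: nothing they do is unrecoverable.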

 
To make things even more complicated, designers of user interfaces are aiming at a moving target. GUIs evolve, hardware systems get faster, displays get larger, and the user is changing. As media theorist Marshall McLuhan observed while studying television, "the medium is the message." This is because we must reject carrier information to extract real information from the events around us. For example, when we watch television we make constant evaluations about whether the information we see is important: the announcer's words matter, the color of his green jacket does not. If we did not evaluate the jacket color as unimportant and "reject" it, we would have great difficulty deciding what was important in the huge flow of information coming out of a TV. As we watch television we are constantly learning which new information must be rejected, so in many ways the TV is affecting our values and thought patterns. Similarly, a GUI presents many levels of information which users learn to reject. A consistent interface makes it easier for the user to quickly extract information from the screen. Conversely, changes from learned ways of displaying or manipulating data lead to confusion and doubt, since the user must build a new internal model of the information hierarchy. Since most users work in multiple applications, and upgrades are constantly loaded, the user must make these evaluations daily. Users become more sophisticated, but they also develop technology-induced blind spots which may prevent them from seeing important information.
 
 
Copyright 1995-2005, pRCarter.
This work is licensed under a Creative Commons License.
Some rights reserved.
Last Update: 21jun2005