Central Problem

This section of Grudin’s historical account addresses a pivotal transformation in human-computer interaction: the shift from mandatory to discretionary computer use during 1980–1985. The central problem concerns how this transition fundamentally altered the priorities, methods, and disciplinary orientations of HCI research and practice.

When computer time was expensive and hardware dominated costs, human factors engineering optimized efficiency for workers mandated to use systems (data entry clerks, operators, reservation agents). The psychological profile of such users—trained, repetitive, captive—differed radically from discretionary users who could choose whether to adopt technology. The emergence of affordable microcomputers, personal computers, and office minicomputers created markets of technically unsophisticated users who received minimal training and would simply abandon tools that frustrated them.

The problem extends beyond mere interface design: it encompasses the formation of new professional communities (ACM SIGCHI), the tension between cognitive science and behavioral human factors, and the recurring cycles of AI hype and disappointment. How should research priorities shift when users have choice? What methods suit discretionary contexts where first impressions and learnability matter more than expert efficiency? Why did European and American HCI develop different emphases? These questions animate the period’s defining debates.

Main Thesis

Grudin argues that the 1980–1985 period represents a watershed when discretionary use emerged as the dominant paradigm for personal computing research, catalyzing the formation of CHI as a distinct field that diverged from traditional human factors engineering while inheriting—often unknowingly—the visions of earlier pioneers.

The thesis unfolds across several dimensions:

Technological enablement: The convergence of affordable displays, quality printers (HP inkjet 1984, laser printers 1984–1985), the IBM PC (1981), and the Hayes Smartmodem (1981) created conditions for mass discretionary use. Community bulletin boards proliferated, reaching 60,000 systems serving 17 million users by the mid-1990s.

Institutional crystallization: SIGCHI emerged from SIGSOC in 1982–1983, dominated by cognitive psychologists from seven key organizations: IBM, Xerox PARC, CMU, MRC APU, Bell Labs, Digital, and UCSD. Half of the 58 papers at CHI’83 came from these institutions. The field appropriated the legacy of Bush, Sutherland, and Nelson—pioneers most early CHI researchers had not read—to gain legitimacy.

Disciplinary divergence: CHI and human factors, despite the conference subtitle “Human Factors in Computing Systems,” moved apart after 1985. Card explicitly reduced cognitive science emphasis in CHI’86 to prevent it from “becoming human factors again.” The GOMS model, paradoxically, addressed expert repetitive use characteristic of human factors rather than the discretionary novice focus CHI championed.

AI cyclicality: The Fifth Generation panic (1981–1983) triggered massive investments—ESPRIT, Alvey, DARPA’s Strategic Computing Initiative ($400M by 1988), MCC consortium—yet produced another AI winter. Ovum’s 1985 prediction of 1,000% revenue growth proved wildly optimistic; actual growth was under 20%.

Historical Context

The early 1980s witnessed the AT&T divestiture (1984), ending a monopoly where neither customers nor employees had technological choice. AT&T’s 1985 Unix PC failed precisely because the company lacked experience designing for discretionary use. This regulatory transformation symbolized broader shifts toward consumer choice.

Office automation emerged as a distinct research field with the 1980 Stanford International Symposium (featuring two Engelbart papers), ACM SIGOA formation, AFIPS conferences, and journals like Office: Technology and People (1982) and ACM Transactions on Office Information Systems (1983). Minicomputers from Digital, Data General, and Wang brought computing to workgroup budgets—Wang’s founder became the fourth wealthiest American through word processing systems.

The cognitive revolution in psychology provided theoretical foundations. Behaviorists like Skinner restricted themselves to measurable outputs; cognitive psychologists insisted that internal memory and mental processes mattered, and the computer itself offered an existence proof that such internal structures could do real work. This "heated war" through the 1960s–1980s shaped HCI's methodological battles: human factors leaders dismissed cognitive theorizing as "unobservable will o' the wisps," while cognitive scientists accused behaviorists of ignoring memory and problem-solving.

European HCI developed differently: few mass-market software companies meant focus on in-house development. Behaviour and Information Technology (1982) and HUSAT at Loughborough emphasized job design, labor division, and ergonomics rather than first-time user experience.

Philosophical Lineage

```mermaid
flowchart TD
    Skinner --> HumanFactors[Human Factors]
    HumanFactors --> Card
    HumanFactors --> Gould
    CognitivePsychology[Cognitive Psychology] --> Norman
    CognitivePsychology --> Moran
    CognitivePsychology --> Lewis
    Newell --> Card
    Newell --> GOMS
    Card --> GOMS
    Moran --> GOMS
    GOMS --> CHI
    Norman --> CHI
    Engelbart --> OfficeAutomation[Office Automation]
    OfficeAutomation --> SIGOA
    SIGOA --> CHI
    Shackel --> HUSAT
    HUSAT --> INTERACT
    HumanFactors --> INTERACT
    CHI --> UsabilityEngineering[Usability Engineering]
    FifthGeneration[Fifth Generation] --> AIWinter[AI Winter]
    Feigenbaum --> FifthGeneration
    Lenat --> CYC
    CYC --> AIWinter

    class Skinner,Card,Gould,Norman,Moran,Lewis,Newell,Engelbart,Shackel,Feigenbaum,Lenat internal-link;
```

Key Thinkers

| Thinker | Dates | Movement | Main Work | Core Concept |
| --- | --- | --- | --- | --- |
| Card | 1945– | Cognitive Science | *The Psychology of Human–Computer Interaction* (1983) | GOMS, keystroke-level model |
| Newell | 1927–1992 | Cognitive Science | *Human Problem Solving* | Production systems, cognitive architecture |
| Norman | 1935– | Cognitive Engineering | "Design Principles for Human–Computer Interfaces" (1983) | User satisfaction functions, cognitive engineering |
| Gould | | Human Factors | CHI'83 iterative design paper | User-centered iterative design |
| Shneiderman | 1947– | Human-Computer Interaction | HCIL founding (1983) | Direct manipulation, interface guidelines |
| Shackel | 1927–2007 | Human Factors | INTERACT'84 chair | Usability, European HCI |
| Feigenbaum | 1936– | Artificial Intelligence | Fifth Generation warnings (1983) | Expert systems, singularity |
| Lenat | 1950– | Artificial Intelligence | CYC project (1984) | Common-sense knowledge base |

Key Concepts

| Concept | Definition | Related to |
| --- | --- | --- |
| Discretionary use | Computer use where the user chooses whether to engage, contrasted with mandated operational tasks | Human-Computer Interaction, Usability |
| Mandatory use | Required computer use by operators, clerks, and workers with no choice about tool adoption | Human Factors, Efficiency |
| GOMS | Goals, Operators, Methods, Selection rules: a cognitive model of expert user performance | Card, Newell, Moran |
| Keystroke-level model | Predicting expert task time based on keystroke sequences | Card, Human Factors |
| Office automation | Minicomputer-based productivity tools for workgroups: word processing, email, file sharing | Engelbart, SIGOA |
| Cognitive engineering | Application of cognitive science to interface design, distinguishing CHI from behavioral human factors | Norman, CHI |
| Fifth Generation | Japanese government AI initiative (1982–1992) triggering Western competitive panic | ICOT, AI Winter |
| AI winter | Period of reduced funding and interest following unfulfilled AI promises | Minsky, DARPA |
| Usability engineering | Systematic approach to designing usable systems through iterative testing | Gould, Lewis |
| Bundling | Practice of including software free with hardware purchases, ended by IBM antitrust action (1969) | Software Industry, IBM |
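The keystroke-level model defined above reduces to a simple sum: each physical or mental operator in an expert's method contributes a fixed time, and the predicted task time is their total. A minimal sketch, using the commonly cited operator times from Card, Moran, and Newell; the operator string and helper function are illustrative, not from the source:

```python
# Keystroke-level model (KLM) sketch. Operator times in seconds,
# per the commonly cited values from Card, Moran & Newell.
OPERATOR_TIMES = {
    "K": 0.2,   # keystroke or button press (average skilled typist)
    "P": 1.1,   # point at a target with the mouse
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence: str) -> float:
    """Predict expert task time by summing operator times for a
    sequence such as 'MHPKK'."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Hypothetical method: mentally prepare, home to the mouse,
# point at a menu item, then type two keystrokes.
print(round(klm_estimate("MHPKK"), 2))  # prints 3.25
```

The model's paradox, noted above, is visible here: it predicts routine expert performance well, but says nothing about the first-time discretionary users CHI championed.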

Authors Comparison

| Theme | Grudin | Card/Newell |
| --- | --- | --- |
| Primary concern | Historical sociology of HCI fields | Cognitive modeling of performance |
| View of human factors | Valuable for mandatory use contexts | "Classical human factors has second-class status" |
| Methodological preference | Archival, interview-based history | Experimental, model-building |
| Discretionary use | Central organizing concept | Not primary focus (GOMS models expert repetitive use) |
| Disciplinary politics | Documents CHI-HF divergence neutrally | Actively shaped CHI to prevent it "becoming human factors" |
| AI assessment | Cyclical hype/winter pattern | Neural networks, production systems as cognitive models |

Influences & Connections

Summary Formulas

  • Grudin: The 1980–1985 period crystallized discretionary use as the organizing paradigm for CHI, differentiating it from human factors through cognitive psychology methods and first-time user priorities, while AI investments yielded another cycle of inflated expectations and disappointment.

  • Card/Newell: Transform HCI into “hard science” through cognitive modeling (GOMS, keystroke-level) that predicts expert performance, avoiding the “second-class status” of classical human factors.

  • Bannon: The shift “from human factors to human actors” has broader implications than CHI’s narrow focus on novice first-time use acknowledges; European perspectives emphasize ongoing skilled work.

Timeline

| Year | Event |
| --- | --- |
| 1979 | VisiCalc spreadsheet for Apple II demonstrates business potential |
| 1980 | Stanford International Symposium on Office Automation; SIGOA formed |
| 1981 | IBM PC released; Hayes Smartmodem enables BBS proliferation; Xerox Star; Symbolics/LMI Lisp machines |
| 1982 | Japanese Fifth Generation project announced; SIGOA initiates COIS; Behaviour and Information Technology launched |
| 1983 | First CHI conference (1,000+ attendees); Card/Moran/Newell, The Psychology of Human–Computer Interaction; HCIL founded at Maryland; MCC consortium formed |
| 1984 | Apple Macintosh; INTERACT'84 London; AT&T divestiture; HP inkjet printers; MCC purchases Lisp machines; Loughborough HCI program |
| 1985 | Human–Computer Interaction journal founded; AT&T Unix PC fails; HP/Apple laser printers; Ovum predicts 1,000% AI revenue growth |

Notable Quotes

“Hard science, in the form of engineering, drives out soft science, in the form of human factors.” — Newell and Card

“It’s not enough just to establish what computer systems can and cannot do; we need to spend just as much effort establishing what people can and want to do.” — Smith and Green

“Text editors are the white rats of HCI.” — Green