Computer-Mediated Communication Magazine / Volume 1, Number 2 / June 1, 1994 / Page 5

Fair Information Practices with Computer Supported Cooperative Work (CSCW)

by Rob Kling

The term "CSCW" was publicly launched in the early 1980s. Like other important computing terms, such as artificial intelligence, it was coined as a galvanizing catch-phrase, and given substance through a lively stream of research. Interest quickly formed around the research programs, and conferences identified with the term advanced prototype systems, studies of their use, key theories, and debates about them. CSCW offers special excitement: new concepts and possibilities in computer support for work.

CSCW refers both to special products (groupware) and to a social movement by computer scientists who want to provide better computer support for people, primarily professionals, to make collaborating easier. Researchers disagree about the definition of CSCW, but the current definitions focus on technology. I see CSCW as a conjunction of certain kinds of technologies, certain kinds of users (usually small self-directed professional teams), and a worldview which emphasizes convivial work relations. These three elements, taken together, differentiate CSCW from other related forms of computerization, such as information systems and office automation, which differ as much in their typical users and in the worldviews describing the role of technology in work as in the technology itself (Kling, 1991). CSCW is the product of a particular computer-based social movement rather than simply a family of technologies (Kling and Iacono, 1990).

The common technologies that are central to CSCW often record fine grained aspects of people's activities in workplaces, such as typed messages, notes, personal calendar entries, and videotapes of personal activity. Electronic mail is the most popular of the CSCW technologies (Bullen and Bennett, 1991) and is a useful vehicle for examining some of the privacy issues in CSCW. Many electronic mail messages contain personal communications which include opinions and information which many senders would prefer not to be public information. However, most electronic mail system users I have spoken to are ignorant of the conditions under which their transmissions will be maintained as private communications by their own organizations. (Many assume that their electronic communications will be treated as private by their organizations; others are extremely sensitive to the possible lack of privacy and security of email transmissions.)

Discussions of computerization and privacy are highly developed with respect to personal record systems which contain information about banking, credit, health, police, schooling, employment, insurance, etc. (Dunlop and Kling, 1991: Section V). Definitions of personal privacy have been examined in an extensive literature about personal privacy and record-keeping systems. Analysts have been careful to distinguish security issues (e.g., locks and keys for authorized access) from privacy issues -- those which involve people's control over personal information. There has also been significant discussion of the interplay between privacy and other competing social values. The privacy issues in CSCW have both important similarities to and differences from the issues of personal record systems. We can gain helpful insights by building on this body of sustained thinking about privacy and record systems to advance our understanding of privacy issues in CSCW.

Another related and helpful set of inquiries examines the surveillance of workers through the measurement of activities related to quality of service and individual productivity (Attewell, 1991; Kling and Dunlop, 1993). Some of the most intensive fine grained electronic monitoring involves listening to the phone calls of service workers such as reservationists, and fine grained productivity counts, such as the number of transactions that a worker completes in a small time period. While all managers have ways of assessing their subordinates' performance, clerks are most subject to these fine grained forms of electronic surveillance. The CSCW community has focussed on professionals as the key groups to use groupware and meeting support systems. Consequently, electronic monitoring has seemed implausible.

The computing community is beginning to become collectively aware of the possible privacy issues in CSCW applications. Professionals who use CSCW can lose privacy under quite different conditions than clerks who have little control over the use of electronic performance monitoring systems. And personal communication media, such as electronic mail or debate-support systems like gIBIS, record personally sensitive information under very different conditions than do information systems for regulatory control, such as systems of motor vehicle, health, and tax records.

The use of email raises interesting privacy issues. In the case of email, privacy issues arise when people lose control over the dissemination of their mail messages. When should managers be allowed to read the email of their subordinates? One can readily conjure instances where managers would seek access to email files. These can range from curiosity (such as when a manager wonders about subordinates' gossip, and requests messages which include his name in the message body), through situations in which a legal agency subpoenas mail files as part of a formal investigation. A different, but related set of issues can occur when a manager seeks mail profiles: lists of people who send more than N messages a day, lists of people who read a specific bulletin board or the membership of a specific mailing list.

CSCW systems differ in many ways that pertain to informational control. For example, systems such as email and conferencing systems retain electronic information which can be reused indefinitely with little control by the people who wrote with the system. One can imagine cases in which managers may wish to review transcripts of key meetings held by computer conferencing to learn the bases of specific decisions, who took various positions on controversial issues, or to gain insight into their subordinates' interactional styles. Other systems, such as voice and video links, are often designed not to store information. But they can raise questions about who is tuning in, and the extent to which participants are aware that their communication system is "on." In the literature about computerization and privacy, similar questions have been closely examined -- regulating the duration of records storage, the conditions under which people should be informed that a third party is seeking their records, and the conditions under which individuals may have administrative or legal standing in blocking access to their records (see Dunlop and Kling, 1991, Section V).

One of the peculiarities of CSCW in contrast with traditional record keeping systems is the nature of the social settings in which systems are being developed and explored. Most personal record systems are developed in relatively traditional control-oriented organizations. In contrast, most CSCW applications have been developed in academic and industrial research labs. These settings are protective of freedom of speech and thought and less authoritarian than many organizations which ultimately use CSCW applications. In fact, relatively few CSCW applications, other than email and Lotus Notes, are used by thousands of people in traditional organizations (Bullen and Bennett, 1991). Further, CSCW systems are primarily designed to be used by professionals rather than technicians and clerks. Professionals generally have more autonomy than clerks, who are most subject to computerized monitoring (Attewell, 1991). As a consequence, many CSCW developers don't face problems of personal privacy that may be more commonplace when prototype systems are commercialized and widely used.

These contrasts between R&D with CSCW and the likely contexts of application should not impede us from working hard to understand the privacy issues of these new technologies. CSCW applications are able to record more fine grained information about peoples' thoughts, feelings, and social relationships than traditional record keeping systems. They can be relatively unobtrusive. The subject may be unaware of any scrutiny. In R&D labs, we often have norms of reciprocity in social behavior: monitoring can be reciprocal. However, in certain organizations, monitoring may follow a formal hierarchy of social relations. For example, supervisors can monitor the phone conversations of travel reservationists and telephone operators, but the operators cannot monitor their supervisors. The primary (publicized) appropriations of "private email" have been in military organizations, NASA, and commercial firms like Epson, rather than in university and industrial laboratories.

CSCW creates a new electronic frontier in which people's rights and obligations about access and control over personally sensitive information have not been systematically articulated. I believe that we need to better understand the nature of information practices with regard to different CSCW applications that balance fairness to individuals and to their organizations.

It is remarkable how vague the information practices regulating the use of the few commonplace CSCW applications are. Yet we are designing and building the information infrastructures for recording significant amounts of information about people's thoughts and feelings which are essentially private and not for arbitrary circulation, without the guidelines to safeguard them. People who use computer and telecommunications applications need to have a basic understanding of which information is being recorded, how long it is retained (even if they "delete" information from their local files), who can access information about them, and when they can exercise some control over restricting access to their information.

In the late 1970s the U.S. Privacy Protection Study Commission developed a set of recommendations for Fair Information Practices pertinent to personal record keeping systems (PPSC, 1977:17-19). A concern of Commission members was to maximize the extent to which record systems would be managed so that people would not be unfairly affected by decisions which relied upon records which were inaccurate, incomplete, irrelevant or not timely. Commission members believed that record keeping systems in different institutional settings should be regulated by different laws. For example, people should have more control over the disclosure of their current financial records than over the disclosure of their current police records. On the other hand, the Commission proposed that each institutional arena should be governed with an explicit set of Fair Information Practices. In a similar way, different families of CSCW applications or different institutional settings may be most appropriately organized with different Fair Information Practices. In the case of CSCW applications, fairness may have different meanings than in the case of decisions based upon personal records systems.

We need fearless and vigorous exploratory research to shed clear light on these issues. This rather modest position contrasts strongly with that taken by Andy Hopper of Olivetti, one of the panelists at the plenary session at CSCW '92. He was enthusiastic about the use of "active badges" (Want, Hopper, Falcao, and Gibbons, 1992) and insisted on discussing only their virtues. He argued that one can imagine many scenarios in which people are harmed by some uses of a particular technology, but that discussing such scenarios is usually pointless. Hopper's 1992 co-authored article about active badges examines some of the privacy threats their use can foster. But on the plenary panel he was critical of people who asked serious questions about the risks, as well as the benefits, of new CSCW technologies. In this way, he took a position similar to that taken by spokespeople of many industries, such as the automobile industry, who have delayed serious inquiries and regulatory protections for environmental and safety risks by insisting on unambiguous evidence of harm before investigating plausible problems.

The active badge systems which Hopper described seem to be regulated by Fair Information Practices in his own research laboratory (e.g., no long term storage of data about people's locations, reciprocity of use, discretion in use). These sorts of Fair Information Practices may be required to help ensure that active badges are a convenient technology which does not degrade people's working lives. Other kinds of information practices, such as those in which location monitoring is non-reciprocal and non-discretionary, may help transform some workplaces into electronic cages. Hopper and his colleagues briefly mention such possibilities in their 1992 ACM TOIS article about active badges. And their article deserves some applause for at least identifying some of the pertinent privacy problems which active badges facilitate. However, they are very careful to characterize fine grained aspects of the technological architecture of active badges, while they are far from being comparably careful in identifying the workplace information practices which can make active badges either primarily a convenience or primarily invasive. I believe that CSCW researchers should be paying careful attention to social practices as well as to technologies. Richard Harper's (1992) ethnographic study of the use of active badges in two research labs illustrates the kind of nuanced analysis which we need, although Harper also glosses over the particular information practices which accompanied the use of active badges in the two labs.

Unfortunately, delays in understanding some risks of emerging technologies have led the public to underestimate the initial magnitude of problems, and to make collective choices which proved difficult to alter. Our design of metropolitan areas in ways that make individually operated cars a virtual necessity is an example. In the early stages of use, the risks of a new family of technologies are often hard to discern (see Dunlop and Kling, 1991, Part VI). When major problems develop to the point that they are undeniable, amelioration may also be difficult.

I characterized CSCW, in part, as a social movement (Kling and Iacono, 1990). Most of us who study, develop, or write about CSCW enthusiastically (and sometimes evangelistically) encourage the widespread use of these new technologies. However, as responsible computer scientists, we should temper our enthusiasms with appropriate professional responsibility. CSCW applications open important organizational opportunities, but they also open privacy issues which we don't understand very well.

The new ACM Ethical Code (ACM, 1993) also has several provisions which bear on privacy issues in CSCW. These include provisions which require ACM members to respect the privacy of others (Section 1.7), to improve public understanding of computing and its consequences (Section 2.7), and to design and build information systems which enhance the quality of working life (Section 3.2). The ACM's code is rather general and does not give much specific guidance to practitioners. The CSCW research community is well positioned to conduct the kinds of research into the social practices for using these technologies which could shape meaningful professional guidelines for their use in diverse organizations. Will we take a leadership role in helping to keep CSCW safe for users and their organizations?

Note: I appreciate discussions with Jonathan Allen, Paul Forester, Beki Grinter, and Jonathan Grudin which helped clarify some of my key points.


  1. Association for Computing Machinery. 1993. "ACM Code of Ethics and Professional Conduct." Communications of the ACM. 36(2)(Feb.):99-103.
  2. Attewell, Paul. 1991. "Big Brother and the Sweatshop: Computer Surveillance in the Automated Office" in Dunlop and Kling 1991.
  3. Bullen, Christine and John Bennett. 1991. "Groupware in Practice: An Interpretation of Work Experience" in Dunlop and Kling 1991.
  4. Dunlop, Charles and Rob Kling (Ed). 1991. Computerization and Controversy: Value Conflicts and Social Choices. Boston: Academic Press.
  5. Harper, Richard H.R. 1992. "Looking at Ourselves: An Examination of the Social Organization of Two Research Laboratories" Proc. CSCW '92: 330-337.
  6. Kling, Rob. 1991. "Cooperation, Coordination and Control in Computer-Supported Work." Communications of the ACM 34(12)(December):83-88.
  7. Kling, Rob and Charles Dunlop. 1993. "Controversies About Computerization and the Character of White Collar Worklife." The Information Society. 9(1)(Jan-Feb):1-29.
  8. Kling, Rob and Suzanne Iacono. 1990. "Computerization Movements." Chapter 19, pp. 213-236 in Computers, Ethics and Society, David Ermann, Mary Williams and Claudio Guitierrez (eds.). New York: Oxford University Press.
  9. Privacy Protection Study Commission. 1977. Personal Privacy in an Information Society, U.S. Government Printing Office, Washington D.C. (briefly excerpted in Dunlop and Kling, 1991.)
  10. Want, Roy, Andy Hopper, Veronica Falcao and Jonathan Gibbons. 1992. "The Active Badge Location System" ACM Transactions on Information Systems. 10(1)(January): 91-102.

Rob Kling is a Professor in the Department of Information & Computer Science and the Center for Research on Information Technology and Organizations at the University of California at Irvine. The above essay is based on a paper which appeared in the SIGOIS Bulletin, July 1993.

Copyright © 1994 Rob Kling. This article may be redistributed or printed when there is no charge other than nominal costs of reproduction. Other reproductions and redistributions require the permission of the author.
