Communication Studies 197B

Information and Institutional Change

Spring 2000

Phil Agre

phone: (310) 825-7154
email: pagre@ucla.edu
home: http://dlis.gseis.ucla.edu/pagre/

Everybody knows that information technology is going to change the world, but nobody knows how. This class offers useful concepts about the subject from computing, economics, law, politics, and management.

Course assignments. Your major assignment for the course is to apply the concepts from the lectures and readings to some institution that you know and care about -- for example, the institution where your career plans will take you. If you have no career plans, this would be a good time to get some, at least provisionally. I can help with this. Much of the course grade will derive from a single paper whose purpose is simply to apply many of the course concepts to your chosen institution in a way that lets you construct a coherent argument about it. Shorter writing exercises will be due every week as preparation for this larger task.

In addition to the readings that I have listed week-by-week below, I will also be assigning a large number of newspaper articles. Some of these are included in the reading packet, and others will be handed out in class. I want to create a sense of our course as a breaking news story, and I also want the real examples reported in the news articles to be case studies and reality checks for the many theories we will be considering. I believe that you don't understand a theory until you can apply it to cases, and that's why I want you to analyze your own case in a sustained way from many angles, as well as a large number of smaller cases as reported in the newspaper articles. The newspaper articles are listed at the bottom of this syllabus, and you should read them all during the first few weeks of heavy theorizing.

Week 2 (April 11th). Background.

This week we will gather a variety of perspectives that might motivate the project of understanding the place of information in institutional change. The readings will not have any particular unity. My hope, rather, is that everyone will find at least one point of entry to the material.
Philip E. Agre, Yesterday's tomorrow, Times Literary Supplement, 3 July 1998, pages 2-3.

This is a rapid sketch of at least one cluster of institutional approaches to computing and its place in society. It tries to get beyond prevailing ideas that the Internet is a "cyberspace" separate from the real world. Those ideas make sense in historical terms and as misunderstandings of a temporary situation, but in the long run it will be crucial to understand the Internet as something very much embedded in, and coevolving with, social relationships as they already exist.
Scott Shane, Dismantling Utopia: How Information Ended the Soviet Union, Chicago: Dee, 1995. Chapter 4: The KGB, father of perestroika.

A functioning modern society requires open information, and so many people believe that open information caused the collapse of the Soviet Union. While this story might be too simple, the place of information in the actual functioning of Soviet society is fascinating. This chapter concerns the role of the Soviet secret police, the KGB, which was the only organization in the whole country that had a complete view of what was really going on. This is why, counterintuitively, it was the KGB that set in motion the reform process that led to the Soviet system's demise.
The Economist, Special section on electronic commerce, 26 February 2000.

The big information story this month concerns "electronic commerce", which usually means companies selling things to people on the Web. Although this is a very narrow view of what "electronic commerce" could possibly mean, this bundle of articles on the topic from the Economist is good enough that you will not have to read anything else for a while.
Rob Kling and Suzanne Iacono, Computerization movements and the mobilization of support for computing, in Jacques Berleur, Andrew Clement, Richard Sizer, and Diane Whitehouse, eds, The Information Society: Evolving Landscapes, Springer-Verlag, 1990.

Where do computers come from? In a shallow sense they come from research labs and computer companies. But in a deeper sense they are shaped and proliferated by social movements that agitate for computing by means of utopian ideologies that are remarkably consistent both in their ideas and in their detachment from reality. This article, written before the craze for "virtual reality" and "cyberspace", identifies the patterns in computerization movements.
Week 3 (April 18th). Basic economics of information technology.

We will focus very hard this week on three ideas that will recur endlessly in the weeks to come. These ideas are: economies of scale (the contrast, anomalous in the terms of conventional economics, between the high fixed costs of producing useful information goods and the low variable costs of reproducing them), positive feedback (the patterns by which institutions, once established, tend to persist, so that the strong get stronger and social arrangements hang around after they have outlived their usefulness), and network effects (the distinctive economic properties of goods such as telephones and operating systems whose utility depends on the number of other people who use them).
John Cassidy, The force of an idea, New Yorker, 12 January 1998, pages 32-37.

This is a popular article about Brian Arthur's theory of positive feedback through increasing returns, and how this theory led to, among other things, the Microsoft antitrust trial. The article probably gives too much credit to Arthur (through no fault of his), but it's still an interesting story.
Brian Arthur, Increasing returns and the new world of business, Harvard Business Review, July 1996, pages 100-109.

In this article, also relatively nontechnical, Arthur applies his theory of positive feedback to business strategy. Competition has distinctive dynamics in markets that exhibit increasing returns. Because the market tends to pick a single winner and then lock in that winner's position almost irreversibly, it pays to be lucky or to make a huge up-front investment.
Carl Shapiro and Hal Varian, Information Rules: A Strategic Guide to the Network Economy, Boston: Harvard Business School Press, 1998. Chapter 7: Networks and positive feedback.

This is a long chapter about the economic dynamics of technical standards and some of their consequences for competition. When products need to be compatible with one another, consumers will want to get what everyone else is getting. As a result, it can be hard to sell anything at first (everyone is waiting to see what everyone else will get). But once a critical mass emerges, the market will change dramatically overnight. The chapter ends with some extended case studies from familiar technologies such as video recorders.
Marc Pesce, The great leap downward, Feed, February 1997.

A brief, humorous article about the author's attempt to establish a standard for online "immersive environments" called VRML. In retrospect VRML was a bad idea because the technology was not yet powerful enough to do anything interesting with it. But Pesce wanted to avoid a proliferation of incompatible standards later on, and he engaged in endless politics to prevent major vendors from creating their own proprietary alternatives. The machinations that ensued are instructive and sometimes hilarious. Observe especially the interaction between politics and economics.
Week 4 (April 25th). Institutions.

Now we get to the central analytical category of the course, the concept of an institution. An institution is a persistent form of relationships among people; examples include horseracing, the medical system, greeting rituals, the university, the stock market, management consulting, Christmas, the family, the common law, and the nation-state. Institutions can vary across history and between different societies, but they are remarkable for their ability to remain relatively unchanged for hundreds of years at a time. Institutional persistence is a bad thing when the institutions are unjust or inefficient, but it is a good thing when they enable people to predict the future, focus their attention, and compel others to keep their promises.
Robert E. Goodin, Institutions and their design, in Robert E. Goodin, ed, The Theory of Institutional Design, Cambridge University Press, 1996.

This is a difficult chapter that surveys ideas about institutions in various social scientific fields. The background is that the design of institutions became a hot topic after the fall of the Soviet Union, when literally dozens of countries were faced with rebuilding and reinventing social institutions that were no longer legitimate, or that no longer worked. The story of American "experts" flying in to offer their advice is not entirely savory, not least because the experts, never having lived in a society with dysfunctional institutions, had only a superficial idea of how institutions really worked. Nonetheless, a fair amount of valuable theorizing and scholarship was done during this period, and this is a good enough place to begin.
Douglass North, Institutions, Journal of Economic Perspectives 5(1), 1991, pages 97-112.

This is an only slightly less difficult article about the evolution of economic institutions. North believes that history is a great evolutionary march toward the idealized market institutions that were first imagined in England in the 18th century, and he tries to explain this march in terms of the successive increments by which people reform their market institutions to be more efficient. He is smart enough to know that this theory is wholly implausible in a world of politics and power, and so this is a snapshot of his attempt to define a theory of institutional change that accommodates his overall picture to the inconvenient reality.
Week 5 (May 2nd). Organizations.

Institutional theorists make a big point of distinguishing between organizations, which come and go, and the persistent institutions within which those organizations are embedded. For example, one distinguishes between the institution of the university and particular universities, or between the institution of broadcast journalism and particular news stations. Having established what institutions are and what can be gained by analyzing them, we can look with fresh eyes at the place of information -- and information technology -- in organizations. Although organizations include government agencies, nonprofits, and civic associations, by "organizations" here we will mostly mean "companies". That's because most of the money, and thus most of the useful literature, has been focused on business.
James Brian Quinn, Intelligent Enterprise: A Knowledge and Service Based Paradigm for Industry, New York: Free Press, 1992. Chapter 4: Revolutionizing organizational strategies.

Quinn surveys the revolution in service industries that information technology makes possible. A key idea is that communications networks and detailed computerized tracking of work activities make middle managers much less necessary. Extrapolating from this idea, Quinn describes a family of organizational forms that reduce work to its "least replicable units" and then network it into "flat" organizations with a minimum of centralized overhead. The point is not that freedom and democracy reign in the corporate world, but rather that control is immanent in the work processes themselves, for example through thoroughgoing measurement of the outcomes of work.
Thomas H. Davenport, Saving IT's soul: Human-centered information management, Harvard Business Review, March 1994, pages 119-131.

An old view of information treated it as an industrial material like steel, and traditional information systems design automated already-rationalized flows of documents without wondering what the documents meant. This view has reached the end of the line as organizations have regained awareness of information as a political football to be hoarded, filtered, and spun. Davenport describes a wide variety of these human phenomena around information and sketches their consequences for the design of organizations and technologies.
Wanda J. Orlikowski, Learning from Notes: Organizational issues in groupware implementation, in Rob Kling, ed, Computerization and Controversy: Value Conflicts and Social Choices, second edition, Academic Press, 1996.

In this celebrated case study, Orlikowski watched a major consulting firm adopt a "groupware" tool called Lotus Notes. The company's chief technology manager decided that simply installing Notes would touch off a revolution. The revolution didn't happen, and Orlikowski explained why. One problem was cognitive: nobody explained the vision for the technology, so people used it in the same way that they had used earlier, familiar technologies like e-mail. Another problem concerned incentives: consultants' promotions depended on their building a distinctive practice, and this made sharing information a bad idea. The major lesson is simple enough in retrospect: the software has to fit with the culture.
John L. King, Where are the payoffs from computerization? Technology, learning, and organizational change, in Rob Kling, ed, Computerization and Controversy: Value Conflicts and Social Choices, second edition, Academic Press, 1996.

This is an introduction to the "productivity paradox": despite the huge investments in information technology over the last decades, it is hard to demonstrate a net payoff in terms of increased efficiency in industry. The problem may partly be one of measurement -- how do you measure increases in quality as opposed to quantitative increases in output? But another likely answer is that the real benefits of information technology do not come until institutions have changed, and institutions change slowly.
Week 6 (May 9th). Market structure.

Last week we looked at organizations in isolation, and now we look at the interfaces between them. In other words, we look at the ways that companies buy and sell goods among themselves. Why are the boundaries between organizations located where they are? How does the business at those boundaries get transacted? And how do the boundaries change when the technology underlying those transactions changes as dramatically as it is changing right now?
Friedrich A. Hayek, The use of knowledge in society, in Individualism and Economic Order, Chicago: University of Chicago Press, 1963.

Hayek was to capitalism what Marx was to communism: not its inventor, but its most influential modern intellectual/activist. And where Marx foresaw that a fragmented market without central coordination would tear itself apart, Hayek argued that central coordination was impossible because no centralized body could possibly integrate the vast quantities of local information that market participants took into account every day. A building full of Linux supercomputers might have kept the Soviet Union running a couple of decades longer, but Hayek would have been unimpressed.
Ronald H. Coase, The nature of the firm, Economica NS 4, 1937, pages 385-405. Reprinted in Oliver E. Williamson and Sidney G. Winter, The Nature of the Firm: Origins, Evolution, and Development, Oxford: Oxford University Press, 1991.

If the market is so great, and if hierarchies are so bad, why is so much of the economy coordinated by big hierarchical corporations? If trade between individuals in the market is the most efficient way to run an economy, why do companies exist at all? Coase's immensely influential argument is that companies would not exist if market mechanisms were costless to operate, but that markets require (what later came to be called) "transaction costs" such as searching for goods, negotiating contracts, handling money, monitoring to ensure that contractors do what they're supposed to, and fighting about the matter in court. Other things being equal, a reduction in transaction costs -- especially the kind that results from new information and communication technologies -- predicts that companies will break into parts. Other things are rarely equal, but it's a productive argument anyway.
Mark Casson, Economic perspectives on business information, in Lisa Bud-Frierman, ed, Information Acumen: The Understanding and Use of Knowledge in Modern Business, London: Routledge, 1994.

Market transactions require large amounts of information, and many institutions can be understood as responses to information problems. Trading customs, for example, can give way to localized negotiation as information about traders and their goods becomes cheaper to acquire and process. Casson's chapter surveys the startling number and variety of consequences that follow from this basic argument.
Daniel F. Spulber, The Market Makers: How Leading Companies Create and Win Markets, New York: McGraw-Hill, 1998. Chapter 5: Intermediation.

Markets do not happen by magic; they are human, social processes that have an economics of their own. Spulber develops abstruse economic theories of market mechanisms, and this is a chapter from a book that he wrote to explain the strategic consequences of his theories for the companies that actually make markets. The chapter concerns intermediaries: companies that bring buyers and sellers together. The Internet is often said to eliminate intermediaries by enabling buyers and sellers to deal with one another directly. This idea turns out to be wildly false, but it does at least draw our attention to the diverse bundles of roles that intermediaries can play. Those roles usually get redefined in the context of new communications technologies like the Internet, and Spulber provides a taxonomy of the roles that helps us to make predictions.
Week 7 (May 16th). Libraries.

Now that we've established some theory, this week begins a series of case studies of particular institutions. A recurring futurology of the Internet imagines that institutions will simply disappear -- that libraries, for example, will turn into big online databases. Such things are easy to say, but they fail to reckon with the full complexity of documents and the roles they play in institutional life. These readings survey some of the issues. Even though they are framed in terms of libraries, their lessons apply more widely.
J. C. R. Licklider, Excerpt from Libraries of the Future, in Mark Stefik, ed, Internet Dreams, MIT Press, 1996.

Licklider was one of the visionaries who saw modern information technologies coming, and during his time as a research director for the US military he was able to fund some of them. In these excerpts from his 1965 book about "libraries of the future", he offers a long series of predictions. Many of them, including the rise of the networked personal computer, were strikingly correct. In fact close study reveals a pattern: all of his predictions that computers would exhibit any intelligence at all proved completely wrong, and all of his other predictions were not optimistic enough.
David M. Levy and Catherine C. Marshall, Going digital: A look at assumptions underlying digital libraries, Communications of the ACM 38(4), 1995, pages 77-84.

Because viewing digital libraries as big databases is too limiting, Levy and Marshall explore how digital libraries can be understood as components of larger patterns of activity. They suggest that digital libraries will only be useful if they are designed with a full awareness of the interactions among documents, technology, and work.
Andrew Blau, Floods don't build bridges: Rich networks, poor citizens and the role of public libraries, in Sally Criddle, Lorcan Dempsey, and Richard Heseltine, eds, Information Landscapes for a Learning Society, London: Library Association, 1999.

Blau looks critically at the received idea that big collections of digital information will improve democracy by giving everyone access. This scenario ignores the social mechanisms by which people can evaluate information. When information flows in stable institutional channels, such as associations with their settled relationships and professions with their claims to expertise, each channel can develop a track record. New technologies will not help democracy if they lead to the chaotic rise and fall of communities that create no lasting social bonds. Blau argues, therefore, that libraries retain an important role in keeping the public sphere glued together, for example by cultivating communities of common information.
Pamela Samuelson, Encoding the law into digital libraries, Communications of the ACM, 41(4), 1998, pages 13-18.

This is a preview of a theme that will be central in the last week of the class: the ways in which information technologies encode rules. Even though a legislature might establish intellectual property rules, software and hardware can be designed to enforce a different set of rules. In particular, the "fair use" provisions that limit copyright holders' control over individuals' uses of their materials can be undermined if digital libraries and other networked applications require users to authenticate themselves and then track and regulate what the users do with the information.
Week 8 (May 23rd). University.

Our next case study is the university, whose demise the futurology of the Internet also predicts. Universities are all about the life of the mind, aren't they? So it stands to reason that the bricks-and-mortar campus can be replaced by big online databases and large-scale groupware applications. Although true in bits and pieces, this idea is wildly simplistic, and our goal will be to locate the dividing line between the physical and the digital. As in the case of libraries, this analysis will be applicable to a variety of other institutions.
Philip E. Agre, The distances of education, Academe 85(5), 1999, pages 37-41.

The standard story about distance education is that instructional delivery, quote-unquote, should exhibit vast economies of scale; students assembling their educations over the Internet will have an infinite variety of inexpensive online courses to choose from. This vision is appealing, but it doesn't make sense: information technology can only produce economies of scale in higher education by increasing uniformity and decreasing choice. That doesn't mean that information technology has no potential uses in higher education, but it does require us to back up and ask what real problems we want the technology to solve. One of these problems is surely the tension between the liberal arts and vocational models of education.
John Seely Brown and Paul Duguid, Universities in the digital age, in Brian L. Hawkins and Patricia Battin, eds, The Mirage of Continuity: Reconfiguring Academic Information Resources for the 21st Century, Washington, DC: Council on Library Resources, 1998.

Brown and Duguid apply to higher education the valuable concept of a "community of practice": shared culture and activity among people who have a kind of knowledge in common. On this theory, learning something isn't just acquiring knowledge or skills; it is also a matter of acquiring an identity and joining a community. This suggests a new way to organize universities in terms of communities of practice that keep in touch over the Internet and administer their own training and accreditation mechanisms. The resulting approach usefully connects educational institutions to the professions that students hope to join.
David Noble, Digital diploma mills: The automation of higher education, October 1997.

Noble is a critic of the hidden control agendas in automation, and he views the digital university as simply one more chapter in a long history of workplace conflict over technology. The point is not that the machinery itself is inherently bad, but that the machinery tends to be shaped and promoted as part of a larger package of changes driven by those in power. In this article, he raises warnings based on his analysis of technology initiatives at UCLA that he believes can result in faculty losing control over the classroom materials that they have worked to create. Noble's essay is accompanied by a long commentary of my own that seeks to head off some frequent misunderstandings about the relationship between technology and institutions.
M. M. Scott, Intellectual property rights: A ticking time bomb in academia, Academe 84(3), 1998, pages 22-26.

The university is an ancient institution still largely driven by ancient customs, but the Internet often forces institutions to revisit issues that have received little attention in living memory. One of those issues is intellectual property. So long as professors prepare their lectures and other classroom materials using common resources like the library and cheap technologies such as desktop computers, nobody needs to worry about who owns the materials that result. But if economies of scale lead to elaborate courseware whose production requires a huge drain on university resources, the spectre arises of complex wrangling over copyright and other kinds of intellectual property control. Such issues should not be faced reactively at the last minute, or in quiet committees that spring faits accomplis on faculty whose attention is usually buried in their own teaching and research topics.
Week 9 (May 30th). Local communities.

The word "community" carries a lot of warm fuzzies, and accordingly it also carries many definitions. Myths of an ideal community since lost are central to several traditions, including the academic field of sociology and the historical memory of the United States. It is only by recovering from these myths that we can investigate how local geographic communities change in the context of the Internet and other new information and communication technologies. A local geographic community is not itself an institution, but it includes many specific institutions and its members are also wrapped up in a global network of institutions that intersect in every household and main street.
Gary Chapman and Lodis Rhodes, Nurturing neighborhood nets, Technology Review, October 1997, pages 48-54.

Community groups have tried to promote economic development and other good things by providing poor communities with access to the Internet. "Access" here is not just keyboard time: it also includes training and social support. This article describes one such project in Austin, Texas.
Willard Uncapher, Electronic homesteading on the rural frontier: Big Sky Telegraph and its community, in Marc A. Smith and Peter Kollock, eds, Communities in Cyberspace, Routledge, 1999.

This remarkable study traces the rise and fall of an early network for rural schools in Montana, placing the technology in the context of rural society and its place in a global economy. Despite our stereotypes of rural life, the people that Uncapher studied were quite aware of their place in a global economy and the symbolic and practical roles that information technology could play.
Jan A. English-Lueck, Technology and social change: The effects on family and community, Paper presented at the COSSA Congressional Seminar, 19 June 1998.

This is a brief speech based on the author's ethnographic studies of the early-adopter culture of Silicon Valley. How does family life change when technology enables boundaries between work and home to break down, and when family members are joined together continuously by those same technologies? Some of the changes are obvious and logistical, but others are cultural -- in conceptions of oneself, others, and work.
Philip E. Agre, Building an Internet culture, Telematics and Informatics 15(3), 1998, pages 231-234.

Policies for promoting a networked society often focus on technology to the exclusion of all other issues. This brief essay gathers several useful ideas about how to develop a culturally appropriate policy for encouraging adoption of the Internet. One idea is to start with existing social networks, because people mostly want to communicate with the people they already know. Another is to build institutional capacity by trying experiments, publicizing the ones that work, and waiting for the technology to become cheaper. Adopting the Internet means building an Internet culture, and that ultimately is the same as building a healthy society of decentralized power and initiative.
Elfreda A. Chatman, The impoverished life-world of outsiders, Journal of the American Society for Information Science 47(3), 1996, pages 193-206.

People who have established positions within a set of institutions (as student, voter, bank account holder, driver, and so on) usually cannot understand what it is like to be located "outside" of those institutions, clueless about their workings and cut out of their flows of information. Chatman describes this situation among many of the poor, and draws conclusions about the kinds of information services that would be required to solve the problem.
Week 10 (June 6th). Code and law.

In our final week, we will investigate the interactions between information technology and law. Information technology is shaped and regulated by law to some degree, and so is the industry that produces it. Computer databases are used heavily in the legal profession, and computer networks are a powerful tool for organizing to change the law. Information technology also, as I've mentioned above, serves as a kind of law. Electronic mail, for example, works in some ways and not others. This implicit law of technology can supplement the law of courts and legislatures, or it can supplant it.
Batya Friedman and Helen Nissenbaum, Bias in computer systems, in Batya Friedman, ed, Human Values and the Design of Computer Technology, Cambridge University Press, 1997.

No technology is neutral, but information technology is especially non-neutral. The point is not that information technology as such has any opinions; in fact it is singularly malleable. The point, rather, is that every piece of software and hardware, and every data format and communications protocol, embodies ideas about people and their lives. And these ideas can be biased. A technology can encode discriminatory rules, or its functioning may discriminate against those who cannot understand it. Friedman and Nissenbaum are ethical philosophers, and they provide a taxonomy for understanding the different kinds of bias that computers can embody.
Joel R. Reidenberg, Lex Informatica: The formulation of information policy rules through technology, Texas Law Review 76(3), 1998, pages 553-593.

Reidenberg is a law professor who sees the intrinsic law of the Internet as a modern equivalent of the "lex mercatoria" by which far-flung networks of medieval merchants regulated their dealings across borders and without recourse to governments and their courts. Lex informatica, as he calls it, is a form of self-regulation by which information technology orders human dealings across borders without the need for laws and lawsuits. He explores how lex informatica might be consciously designed to complement formal law as a means of regulating technologically mediated activities.
Lawrence Lessig, The path of cyberlaw, Yale Law Journal 104, 1995, pages 1743-1755.

Lessig is also a law professor whose background is in constitutional law. He sees the implicit rules of technology as a constitution of "cyberspace", and in this brief paper he explores what attitude we should take to the evolution of law in a world of rapid technical change. Should the law change rapidly as well? He argues not, on the grounds that the English system of common law (also used in the United States) works by discovering order slowly through experience with particular disputes.
David G. Post, Governing cyberspace, Wayne Law Review 43, 1996, pages 155-171.

Post presents yet one more view of the proper relationship between information technology and law. For Post, the problem with law is typically that there is too much of it, and he wants to generalize the system by which governments are disciplined through the freedom of their people to vote with their feet. If you don't like the laws in California, you can move to Oregon. In cyberspace, if such a thing could be said to exist, rule-setting could be regulated through the ability of people to choose which rules they want to live by. The ideal, in Post's view, is a highly decentralized system in which order emerges from people's local interactions and choices, rather than from a centralized authority.
Newspaper articles:

Lee Berton, Many firms cut staff in accounts payable and pay a steep price, Wall Street Journal, 5 September 1996, pages A1, A6.

Douglas A. Blackmon, In the new economy, who are the hunters and who the hunted?, Wall Street Journal, 12 April 2000, pages A1, A16.

Michael A. Cusumano, That's some fine mess you've made, Mr. Gates, Wall Street Journal, 5 April 2000, page A26.

Yochi J. Dreason, It's great all being connected; until, that is, something goes wrong, Wall Street Journal, 1 January 2000, page R38.

Yochi J. Dreason, Student, teach thyself, Wall Street Journal, 1 January 2000, page R46.

Terzah Ewing and Silvia Ascarelli, One world, how many stock exchanges?, Wall Street Journal, 15 May 2000, pages C1, C20.

Steven Greenhouse, E-mail lessens the drudgery for secretaries, New York Times, 24 April 1996, pages B1, B6.

Saul Hansell, Hackers' bazaar: Online auction services put haggling back into sales, New York Times, 2 April 1998.

Saul Hansell, Clash of technologies in merger, New York Times, 13 April 1998, page C4.

Alexandra Harney, Up close but impersonal, Financial Times, 10 March 2000, page 16.

Greg Ip, Archipelago to set up new stock market, Wall Street Journal, 15 March 2000, pages C1, C20.

Greg Jaffe, Entrepreneurs, generals join forces to launch Web sites for soldiers, Wall Street Journal, 24 March 2000, pages B1, B4.

Holman W. Jenkins, Jr., Some things are worse than a woolly Web, Wall Street Journal, 16 February 2000, page A27.

Vincent Kiernan, Internet-based "collaboratories" help scientists work together, Chronicle of Higher Education, 12 March 1999.

Microsoft's real world [editorial], Wall Street Journal, 5 April 2000, page A26.

Thomas S. Mulligan, Schwab, markets battle centralization of system, Los Angeles Times, 1 March 2000, page C4.

Jeff D. Opdyke, US investors are already going global, Wall Street Journal, 15 May 2000, pages C1, C20.

Simon Romero, Weavers go dot-com, and elders move in, New York Times, 28 March 2000.

Jacob M. Schlesinger, Puzzled investors ask: Will the real economy step forward?, Wall Street Journal, 22 March 2000, pages A1, A12.

Michael Schroeder and Randall Smith, Sweeping change in market structure sought, Wall Street Journal, 29 February 2000, pages C1, C22.

Stephanie Simon, Internet changing the way some lawyers do business, Los Angeles Times, 8 July 1996, pages A1, A15.

Amy Stevens, Clients second-guess legal fees on-line, Wall Street Journal, 6 January 1995, pages B1, B6.

Bob Tedeschi, Internet reshapes the construction industry, New York Times, 21 February 2000.

Bob Tedeschi, Creating marketplaces for business-to-business transactions, New York Times, 24 January 2000.

Shawn Tully, The B2B tool that really is changing the world, Fortune 141(6), 20 March 2000.

John W. Verity, Invoice? What's an invoice?, Business Week, 10 June 1996, pages 111-112.

Kenneth R. Weiss, A wary academia on the edge of cyberspace, Los Angeles Times, 31 March 1998, pages A1, A23.

Bernard Wysocki, Jr., The big bang: Some industries may find themselves blown apart by the digital age, Wall Street Journal, 1 January 2000, page R34.