Technology and Human Responsibility

When science is governed by a conviction that the world is a machine, the distinction between science and technology naturally grows tenuous. Indeed, the influential philosopher Daniel Dennett has argued even of biology that it “is not just like engineering; it is engineering. It is the study of functional mechanisms, their design, construction, and operation.” And the University of Texas historian of science and technology David Channell argues that we should no longer think of technology as applied science; rather, “science is just applied technology.”

The study of technology is therefore essential to an understanding of what science is becoming today. You might say that all the work of The Nature Institute relates to technology — that is, our concern is to rise from a technological or mechanistic view of the world to a living, qualitative, and contextual understanding of it. To achieve this, we must understand the character of technological thinking as deeply as possible and learn how to transform it. Here is some of our work aiming in this direction:

NetFuture Archive

In 1995, Stephen Talbott started the online publication NetFuture: Technology and Human Responsibility. It became a publication of The Nature Institute in 1998, and a total of 189 issues were published. The articles, mainly written by Stephen Talbott, focus heavily, but not exclusively, on technological issues and the contrast between mechanistic and organic thinking. NetFuture gained wide influence as (in political scientist Langdon Winner’s words) “one of the few places on the Net where wisdom finds a voice.”

NetFuture is now maintained as an archive that includes all articles and a topical index.

The Future Does Not Compute
Transcending the Machines in Our Midst

The networked computer and digital technologies in general have rapidly come to define the quintessential “machine” assumed by the theoretical constructions of mechanistic science. In his 1995 book, The Future Does Not Compute, Steve Talbott looks at a broad range of issues, including:

  • how computers and the Net can distort the education of the child;

  • the relation between technology and environmental concerns;

  • the power of computer-based organizations to sustain themselves in a semi-somnambulistic manner, free of conscious, present control;

  • the tendency of the “global village” to dissolve real villages;

  • the role of computers in supporting group activity;

  • the hollowing out of language by technology;

  • how to understand computers within the context of the broad evolution of human consciousness;

  • the connections between high technology and a new kind of mysticism.

See the book’s main page for the full text of the book, along with an annotated table of contents and excerpts from reviews.

Devices of the Soul
Battling for Our Selves in the Age of Machines

In this “urgent and important book” (according to Michael Pollan), Steve Talbott challenges us to step back and take an objective look at the technology driving our lives. In the course of this exploration, Talbott illustrates that we’re forgetting one important thing — our Selves, the human spirit from which technology stems. Find out more about Devices of the Soul by visiting our bookstore.

Technology and the Handicapped

Our booklet, Extraordinary Lives: Disability and Destiny in a Technological Age, explores the role of technological assists in the lives of the handicapped, and by this means throws light on the larger role of technology in modern society. Written by Steve Talbott, the booklet is part of our series of Nature Institute Perspectives.

A Few Places to Start

From among the several hundred articles on various aspects of technology that have appeared in NetFuture and elsewhere, the following rather arbitrary selections may suggest wider horizons to explore:

“Of Machines, Organisms, and Agency,” by Stephen L. Talbott. In Context #35 (Spring 2016).
Whether we look at them at the molecular level or as we naturally encounter them, organisms appear to be agents carrying out intentions, even if not consciously or in anything like a human manner. But what do we mean by “agency” and “intention”?

“When Engineers Take Hold of Life: Synthetic Biology,” by Craig Holdrege. In Context #32 (Fall 2014).
What happens when genetic engineers, becoming yet more ambitious, begin to envision the synthesis of altogether new life forms, using Lego block-like “BioBricks”? The ambition may be foolish, but huge resources are now being devoted to it, with grave implications for the biological future.

“Computers, the Internet, and the Abdication of Consciousness,” an interview with Steve Talbott for the C. G. Jung web page, conducted by Dolores Brien.
In her introduction to the interview, Brien writes, “The thrust of Stephen Talbott’s deeply thought and deeply felt work is to awaken us from our psychological somnambulism vis-à-vis the technology which permeates our personal life and culture.”

“The Trouble with Ubiquitous Computing,” Part 1, Part 2, and Part 3, in NetFuture.
By letting their work develop out of a one-sided preoccupation with the technological milieu rather than immersion in the meaningful contexts affected by their inventions, high-tech engineers inflict technological “answers” upon us without any serious reference to the supposed problems they are the answers for. Anything that can be automated should be automated — so runs a common sentiment within the high-tech world. What is right about this, and what is just plain foolish?

“Children of the Machine,” chapter 14 in The Future Does Not Compute.
Through education based on computer programming, the child loses — never having fully developed it in the first place — that fluid, imaginative ability to let experience reshape itself in meaningful ways before she carves out of it a set of atomic facts.

“Who’s Killing Higher Education? (Or is It Suicide?)” in NetFuture #78.
For a long while now we have slowly been reconceiving education as the transfer of information from one database or brain to another. In the end, we will realize that this makes not only the teacher but also the student obsolete.

“Is Technological Improvement What We Want?” in NetFuture #38.
Technical improvements in the intelligent machinery around us tend to represent a deepened threat in the very areas we began by trying to improve. This, so long as we do not recognize it, is the Great Deceit of intelligent machinery. The opportunity to make software more friendly is also an opportunity to make it unfriendly at a more decisive level.

Finally: go to the NetFuture topical index for a list of several dozen subject headings, each of which links to the appropriate articles.