
Computer Science and the Need for a Human Centered Science

by Joseph Weizenbaum - 1981


A copy of the speech delivered at the symposium by Joseph Weizenbaum, professor of computer science at Massachusetts Institute of Technology and presently visiting professor at the University of Hamburg in West Germany, was unfortunately not available when this issue of the Record went to press. Because of the importance of Professor Weizenbaum's remarks, however, we are providing the following summary of his talk, along with the complete text of "a science fiction fable" written by Professor David Bohm in response to Professor Weizenbaum's presentation.

We have yet, said Joseph Weizenbaum, to grasp the full seriousness of the critical times in which we live. A fundamental and spreading misconception of the role and place of technology in human life poses, he said, an imminent threat to the human future. In his own field of computer science, Weizenbaum stated, this misunderstanding has attained its most extreme and most dangerous manifestation. Computer science, he said, originally began as a human-centered science with the intent that computers under human control would serve human purposes. What went wrong, he argued, was that attempts to capture some aspects of the human in the machine produced a conception of a machine-like human. It was as though one aspect of the human mind, the computer aspect, were externalized and embodied in a machine that was then taken as the model for the entire human mind, so that the mind came to be viewed as itself only a computer—and, at that, an inferior computer in comparison with an increasingly sophisticated, silicon-based computer technology. This reduction of the human mind to the machine, Weizenbaum noted, had long been immanent in Western thinking, but the computer gave it a concrete embodiment that to many people has become utterly convincing.

The results of this misconception, Weizenbaum held, have pushed human concerns more and more to the periphery of our social policies and our views of the world. It has, for example, he pointed out, become increasingly axiomatic in the Western university that every aspect of human life and thought is computable, not only in principle but in fact. Those dimensions of the human being and those human problems not amenable to computer solutions are regarded simply as unimportant or nonexistent. There has even been a growing tendency among those involved in the artificial intelligence movement, he said, to speak contemptuously of the human being, to describe the human brain as merely "a meat machine," and to view human intelligence as deserving to be replaced by what is seen as the more effective, silicon-based intelligence of the computer.

At the same time, Weizenbaum said, we entrust increasingly crucial decisions to computers that we understand and control less and less, and whose capacity for error—as recent reports of mistaken nuclear alerts and banking system errors attest—has acquired truly life-threatening proportions. The task before us, Weizenbaum argued, is once again to look for the human meanings in the ways the information-processing capacities of the computer are used. Humans, he said, can care and choose; computers can decide but not choose, and they do not care.

In the discussion that followed, Philip Siekevitz commented that "an even worse example of the same thing comes from biology." "For example," Siekevitz continued, "sociobiology has said that there is a gene for altruism, but the sociobiologists wish to anthropomorphize in order to reduce the human to what we would call the nucleotides on a DNA string—which have no relation at all to altruism. They wish to go back to the biological, but they misread even it completely. As to what we can do, there is a very strong reaction against this among biologists."

David Bohm suggested that a reason for the growing blind faith in the nonhuman is that man is caught in jealousy and various forms of irrationality. There is, as a consequence, a climate of total despair in which "people no longer believe that man can handle himself rationally and are clutching at straws in turning to artificial intelligence."

"The psychoanalytic response to our situation," said Rollo May, "ought to be, and hopefully, perhaps can be, our own absorption of the degree of despair that is present. The problem is our own denial of the situation. I can feel it in myself," he continued, "for very good reasons. But our chance for survival is nil unless we can face the despair without denial."

"I agree," said Weizenbaum, recalling the earlier statement of Huston Smith. "It's not what we do, but what we are, and what we are is related to what we see. The emphasis should be on the we and on the I. Everything starts with you and me. This radiation effect of starting with me is the only hope."

Cite This Article as: Teachers College Record, Volume 82, Number 3, 1981, pp. 521-522. https://www.tcrecord.org, ID Number: 969

About the Author
  • Joseph Weizenbaum
    Massachusetts Institute of Technology
Joseph Weizenbaum is professor of computer science at M.I.T. and is currently a guest professor at the University of Hamburg in Germany. He is author of Computer Power and Human Reason: From Judgment to Calculation, and with W. Handler is editor of Display Use for Man-Machine Dialogue.