<HTML>
<HEAD>
<TITLE>Abstracts for Four Papers I'm Gonna Write Someday</TITLE>
</HEAD>
<BODY>
<H1>Abstracts for Four Papers I'm Gonna Write Someday</H1>
<CITE>Brian Harvey<BR>University of California, Berkeley</CITE>


<P>Copyright (C) 1989 by Brian Harvey.  Permission is hereby granted to anyone
to reproduce this paper, provided that it is reproduced in its entirety,
without editing, and including this notice.


<H2>I.  A Professional Ethics Course Wouldn't Have Helped Robert Morris</H2>

<P>In recent months everyone with an axe to grind has been using the November
1988 Internet Worm as a grindstone.  Security freaks call for more security;
freedom freaks call for less security; decentralists call for less reliance
on computer systems; antimilitarists call for less military reliance on
computers; manufacturers of security devices call for people to buy their
products.  Regrettably, some people at CPSR meetings are jumping on the
bandwagon and using this incident to argue for professional ethics classes.

<P>I'm all for professional ethics classes.  But I don't think such a class at
Cornell would have prevented this incident.  Such classes should, and
generally do, examine issues that are morally difficult for professionals
working in the field.  What would an ethics course for computer scientists
be about?  Probably a major focus, for example, would be on the extent of
military funding of computer science research.  This is a question of real
importance not only for working professionals but for the graduate students
who would be enrolled in the course, who may not like working on weapons
research but who do want research assistantships.  If 80% of all computer
science research is funded by the DoD, this poses a problem for antiwar
computer scientists.

<P>Would a computer science ethics course deal with privacy?  Probably, but
I hope not at the level of simple exhortations to respect it.  If I were
teaching such a course, I'd begin by calling the assumptions about
privacy into question.  For example, let's say the police want to build
a spiffy data base system to keep track of criminals.  What's so bad about
that?  I do think it's bad, often, but I don't think it's obvious why.  I
think that the answer requires a lot of specific historical knowledge about
the political role of the police in the United States, and the recurring
real abuses of police power.

<P>If you asked Robert Morris whether computer professionals should respect
people's privacy, I bet he'd say yes, sincerely.  He would then go on to
say that the Internet worm wasn't an invasion of privacy, but "just a joke."
I propose to take this claim seriously.  I argue that the relevant ethical
issue is this:  Playing practical jokes on one's friends is different from
playing practical jokes on strangers.  It's not that one is always okay and
the other always not okay, but the standards are different.  Practical jokes
are about trust and testing trust.  The degree of trust one can expect from
friends is higher than the degree it's reasonable to expect from strangers.
This would be a terrific issue to raise in an ethics class for 12-year-olds.
(I'm not being sarcastic; when I was a 12-year-old I attended a school with
required ethics classes.)  It's unlikely that a teacher of graduate students
would think to raise it.

<P>I believe it is a serious problem in our society that adolescence commonly
lasts into the mid-20s and beyond.  The reasons have to do with a lack of
serious adult values, the commercial glorification of youth, a tight economy
in which adult life often truly is bleak and joyless, state-sponsored
lotteries, and many other things.  Professional ethics classes, though, do
not address this problem.


<H2>II.  Moral Dilemmas Are Not Ethics</H2>

<P>The model for professional ethics courses is medical ethics courses.  The
latter often revolve around dilemmas, that is, around issues that are
genuinely controversial among honest, well-motivated doctors.  Abortion,
euthanasia, whether to offer an honest diagnosis if you think it'll hurt
the patient's health: all of these questions in which life and death are
literally at stake are no easier for ethical philosophers than for medical
practitioners.

<P>The purpose of a medical ethics course is not to encourage doctors to be
ethical.  That is taken for granted, as a precondition of the course.
Nor is the purpose of the course to call attention to obscure ethical
questions.  Every medical student knows about these questions, as does
everyone who reads newspapers.  The purpose of the course is to provide
the students with knowledge of the range of arguments that have been made
about the difficult questions, so that they do not begin their careers with
one-sided views out of ignorance of alternatives.

<P>In computer science our situation is not like that of the medical profession.
Among our colleagues the very idea of social responsibility is open to
question.  "First, do no harm" is not controversial among doctors, but
some computer programmers are perfectly comfortable building the tools for
arbitrageurs and other social parasites.  "Suppose your employer orders you
to release a product known to have bugs because the deadline is approaching..."
This is an ethical dilemma?  It wouldn't be, in a profession with a sense of
ethics.

<P>The medical ethics course is useful as an adjunct to the real ethical
education of medical students, which happens in hospital wards.  Everyone
involved understands that the course is an adjunct.  Everyone understands
that ethics is about empathy, human respect, and courage more than it's
about intellectual resolution of moral puzzles.

<P>In computer science, solving puzzles is central to our work.  It is all too
easy to see social responsibility as just another kind of puzzle, to be
solved by the same techniques of formal reasoning we use with other puzzles.
A dilemma-based computer ethics course too easily lets us off the hook.
Instead our ethics courses must be about ethics!  That is, they must force
students to confront the existence of good and evil, to choose between
selfishness and community spirit.  Very few computer scientists explicitly
choose evil, but many prefer to pretend that there is no choice to make.


<H2>III.  There Is Nothing that Everyone Needs to Know about Computers</H2>

<P>I have been arguing for several years with people who believe that to be
employable, one must be "computer literate" -- skilled in some aspect or
other of computer use.  In the context of social responsibility there
seems to be a different argument, asserting that one cannot be an effective
citizen in a democracy without a technical understanding of the political
issues involving computers.  How will people know which way to vote on
Star Wars, if they don't understand programming methodology?

<P>This version of the "computer literacy" argument is also nonsense.  It's
a losing battle.  Computers are not the only technology that comes to the
attention of voters.  Freon, oil spills, nuclear power, genetic engineering,
the prime interest rate, the use of standardized tests that may or may not
discriminate against some group in college admissions, research on animals,
potential AIDS drugs, biochemical versus psychodynamic approaches to mental
illness, teaching foreign-born students in English or in their native
languages, what the Founding Fathers really meant about bearing arms: are
the voters to be "literate" about all of these?

<P>How, in fact, do I decide to believe the scientists who tell me that people
evolved from animals, and not the ones who tell me that nuclear power is
safe?  I have no technical knowledge about either issue.  Supposing that I
were forced to take "biology literacy" and "nuclear power literacy" courses,
how would I decide whether or not to trust the teachers of those courses?
The answer is that my beliefs are based on nontechnical aspects of the
issues.  For example, I know that there is money to be made in nuclear power,
but I don't see anyone profiting from the theory of evolution.  I know that
the supposedly neutral Nuclear Regulatory Commission conspired with the
plant owners to withhold information about the Three Mile Island failure;
I don't know of any such scandal among evolution theorists.  I know that
the nuclear power industry got Congress to pass a law exempting them from
civil damage suits, and I understand what this means about their own
confidence in their operations.  I know that the spokespeople for evolution
include exemplary human beings like Stephen Jay Gould, who also finds time
in his schedule to work against racism; those who speak for nuclear power
are more likely to be sleazeballs who also argue for nuclear weapons.

<P>What the voters need is "political literacy": knowing how to read the
newspaper without technical knowledge of the subject under discussion.
They need the intellectual weapon of class analysis.  They need the
commitment to remember last year's scandals to help them understand
this year's.  They need the sophistication to understand dialectical
tension, in which two contradictory views can both be aspects of the
truth, without dissolving into relativism, in which everything and nothing
is true.


<H2>IV.  Ethics Is Learned in the Laboratory</H2>

<P>What is the policy about game-playing at your school's computer lab?
Some students want to play computer games.  Other students (perhaps the
same students at another time) want to get their assigned work done.  Does
some adult facility manager decide the rule?  (No games 8am to 11pm, let's
say.)  Then, do paid adult staff members police the rule?  Or are students
part of the process of setting the rule and enforcing it?

<P>What happens when a student shows an interest in developing system
software?  Is s/he encouraged?  Given access to source files?  Allowed
to install the new version for general use?  Or informed that students
can't be trusted to write software lest it be full of trapdoors?

<P>Is the computer lab always open?  Is it closed at night because there's no
money for staff to prevent equipment theft?  Is there a way students could
organize cooperatively to staff the lab?  Are they encouraged to do so?

<P>When one student complains about another student violating the privacy of
his or her files, how is the issue resolved?  (What about faculty or staff
violating the privacy of student files?  Is that an issue?)

<P>The computer lab is the best place to begin professional education in
social responsibility.  The crucial point is to build a sense of
community.  Faculty should be part of this community also, but decisions
about things like game policy should be truly democratic.  It's the students
who face the consequences, and they can understand the issues.

<P>(I guess I am arguing for Carol Gilligan's relationship-based view of
moral development, as against Lawrence Kohlberg's rule-based view, which
is embodied in the presentation of moral dilemmas in ethics classes.)

<P><ADDRESS>
<A HREF="index.html"><CODE>www.cs.berkeley.edu/~bh</CODE></A>
</ADDRESS>
</BODY>
</HTML>