Science is grey, not black and white.

tl;dr Blaming scientists for the knowledge they produce is a pointless task fraught with insoluble problems.  We would do better to police the use of knowledge rather than its production.

Over at The Scientist, Heather Douglas has put forth her opinion on the moral responsibility of scientists in a piece entitled “The Dark Side of Science”.  Her argument is fairly straightforward:  science can have bad consequences, and if we as scientists can foresee those consequences, we have a moral responsibility not to do the research, or to hide its results away.  She says:

 As scientists develop ways to generate sequences of base-pairs ever more cheaply and efficiently, the opportunity for the malicious or the simply unreflective to play with pathogens to see what kind of traits arise looms larger.  And it is not just technological know-how that can be problematic.  The detailed knowledge of cellular or genetic functioning can have worrisome implications as well.  Knowledge of what makes a virus more transmissible can assist us in detecting when a virus might be more prone to producing an epidemic, but it could also be used to make viruses more virulent.

In sum, scientists are responsible for both what they intend to achieve and that which is readily foreseeable, as we all are.  There is nothing inherent in becoming a scientist that removes this burden of responsibility.  The burden can be shared—scientists can come together to decide how to proceed, or ask for input from ethicists, social scientists, even the broader public.  Alternatively, scientists could decide (and it has been proposed) that some forms of regulation—either in the selection of projects or in the control and dissemination of results—be imposed on the field of synthetic biology, to reduce the risks.  The more oversight scientists submit to, the less responsibility they bear, but it comes at the cost of the freedom to choose the type of work they can do and how they do it.  This is the essential tension:  as long as there is freedom of research, there is the responsibility that comes with it.

Well, now.
There are two big problems with this piece.  The first hinges on the idea that we can control the production of knowledge, as though it were a tap that only we have our hands on.  My comment on her piece was biting, but I’m not particularly inclined to apologize for that:
Ah, yes, because if we decide not to do research due to the possibly dangerous side-effects, that means that no one else can discover that knowledge either!  Whew, what a relief.  It would be terribly frightening if the scientific method were available to just *anyone*, lab researcher or terrorist bent on biowarfare alike.
Even putting aside the problem of intention vs outcome, as Brian mentioned, I would refer you to the computer security industry and the disastrous failure of security through obscurity policies.  In the days of $1000 genomes and DIYbio, the idea that we can stuff the genie back in the bottle if we just cover our ears and loudly declare “Nah nah nah nah, not researching that!” is ludicrous.

Science is a distributed process, and knowledge is out there for anyone who cares to look for it.  If I “foresee” that researching zebra finch song production (as an example of research I don’t do) will lead to bioweapons production in the Middle East, I can … what?  Send out a strongly worded email to every lab working on zebra finches across the world and hint that zebra finch songs hold the key to the apocalypse?  Secretly poison every zebra finch in existence to stop research on this ubiquitous lab animal?  I suppose that it’s not entirely impossible that I could convince every legitimate scientific lab on earth to stop working on zebra finch songs – I’d have to schlep all over the world to do it, and somehow convince every PI working on it to stop (academic cat herding, anyone?) – but even if I did, what then?  Zebra finches aren’t that hard to find or work on, so preventing this bioweapons outcome would require not only forcing mainstream labs across the globe to stop working on the problem, but also forcing them to maintain absolute secrecy about why.  And this all depends crucially on human beings not being curious.  That hasn’t worked out very well in the past.

The issue is that there’s no single body that is responsible for science, and no way to control the production of knowledge.  This works in our favour when it comes to religious groups and other nutjobs who wish to squash the advancement of human knowledge;  it doesn’t when we would like to put the cat back in the bag.  So, in this regard, Douglas’ argument is finished before it begins, because what she is asking for is impossible on the face of it.  It might be possible to slow things down, and hiding details may be a responsible thing to do in some circumstances (though I’ve yet to see a convincing case for this), but if the knowledge is valuable it is free for anyone to find.

The second problem with Douglas’ argument is the idea of “foreseeable”.  She’s rather vocal about this:

Many of the comments seemed to miss the key point that scientists are responsible for foreseeable consequences only – and I would want to restrict that to specifically and clearly foreseeable consequences.  This bounds the scope of responsibility, and in the way that is done in everyday life (and in courts of law – the reasonable person standard for foreseeability is the common standard for issues of negligence).

This acts in both the past and the future.  In the past, retroactively deciding what was or was not foreseeable in the production of human knowledge is going to be an impossible task.  Science is, by definition, the process of creating knowledge and understanding that does not yet exist;  how that will interact with the body of knowledge we currently have and the body of knowledge that we have yet to produce is not foreseeable by any reasonable person.  No reasonable person has a system for deciding what will be the outcome of research – if we did, we wouldn’t need to do the research.  The reasonable person standard doesn’t apply here, because by definition you are retroactively assigning probabilities to events that effectively had a uniform distribution (when you produce the research, all things are possible).  Oh, but doing research into virulence is more likely to produce bioweapons, you say!  Really?  Or is it just as likely to produce a way to stop them (more below)?  Note that I’m speaking about basic research here, not applied research directly focused on producing weapons.  If your intention is to produce a weapon, you bear the responsibility of creating that weapon;  if your intention is to create new knowledge, assigning responsibility for what is done with that knowledge is a lot less clear.

In short, I would argue that “specifically and clearly foreseeable” is a red herring;  my argument is that producing new knowledge always has consequences that are not specifically and clearly foreseeable.  She uses a legal definition that doesn’t apply, and then wants us to abide by it.  Ah, you say, but some consequences of research are easily foreseeable:  if you produce a new viral strain with higher virulence, then a bioterrorist could use that to infect a population and kill millions.  But my response now concerns the future:  unintended consequences cut both ways.  Producing a viral strain with higher virulence could be the thing that cracks the problem and lets us produce a vaccine or cure for the virus in question;  only by understanding how something works can we stop it.  And if producing that knowledge could have saved millions of lives but Douglas stops it, does she accept responsibility for the deaths?  If millions die and someone later uses that viral strain to produce a cure, do we get to punish Douglas because she stopped us from finding that cure when it was “specifically and clearly foreseeable”?  After all, I’m a reasonable (and reasonably educated) person, and I can conceive of many ways that such research could have helped.

Herein lies the core of the problem:  science is grey, not black and white.  There’s a lot of science that has both good and bad applications.  The laws of classical mechanics apply to nearly every scale of our daily life and make it better in many ways, from the curve of a thrown ball in sports to the launching of satellites into space, but they have also produced nearly all of the tools of conventional war (bullets to bombs).  The discovery of the structure of the atom has left us with nuclear power and nuclear weapons.  Douglas might say that I’m misconstruing her argument, that it’s only “specifically and clearly foreseeable” if it’s on a small scale and really obvious.  But again, I say that this is misdirection:  show me basic research, not intended to produce a weapon, whose consequences are unambiguously negative.  And when there’s both good and bad, black and white, how do we decide what shade of grey is acceptable?  If it could kill millions (atomic weapons) but also provide enough power to help stop climate change (which could impact billions), who gets to decide what is acceptable before we research it?

The fact is, we can’t stop knowledge from being produced.  But knowledge has no moral value or valence in and of itself;  knowing how to make a gun doesn’t force you to construct one and shoot someone with it.  This analogy is apt in another way:  I’m greatly in favour of gun control, and my belief from reading and thinking about the issue is that people cannot be trusted to use guns properly without regulation.  But in scientific terms, this means that we regulate the application of knowledge, not its production.  We don’t tell scientists what they can research;  we as a society decide what we’re going to do with what comes out of research.  What I’m proposing is more difficult than what Douglas is proposing, because we don’t get to just cover our eyes and pray that no one else is smart enough to figure out what we’d rather keep hidden.  On my approach, we have to have hard conversations about how best to deal with and regulate knowledge when it arrives.  And I’m sorry if that scares her, but that’s the way it’s going to have to be.
