Category Archives: Personal opinions

Congratulations, Alberta! Uh, I think….

Well, the CBC has called it:  Alberta has elected a Progressive Conservative government.

I never thought that I would be even vaguely happy to hear that my home province has elected a Tory government; they’ve been in power for over 40 years now and I would have welcomed a change, but I am glad that the homophobic, racist, anti-science Wildrose party hasn’t taken power. And the fact that Alberta has now elected its first female premier is something that shouldn’t be ignored or minimised.

What I’m still upset about, though, is that Alberta has decided to put the Wildrose party in second place, crushing the Liberals and NDP by comparison. Danielle Smith wouldn’t be out of place running alongside Sarah Palin, and now she’s forming the official opposition in Alberta?

For shame.


Hurrying to go to better things.

Over at Why Evolution is True, Jerry Coyne addresses an article by Andrew Riggio, in which Riggio questions the thought processes of a man named Paul Lord, who thanked God for saving him from a tornado that struck in Oklahoma, killing several (including 3 young girls) but sparing him. Noticing that (unsurprisingly) the comments have exploded into a sprawling mess, Coyne pulls out a few for special attention, including this one:

Kleb  •  22 hrs ago
Wooooooow. Bitter much? The author’s argument presupposes that from God’s point of view death is bad. People of “true faith”, as his last sentence mentions, are equally grateful to God for His providence in death as in life. Look at the great heroes in Christianity. When they died they weren’t bawling and begging God to spare them, they were profoundly relieved to be joining Him and, at the same time, deeply grateful for the ride they had been on in this world. From a Christian perspective, then, there is no inconsistency here. The survivor is grateful for the life God has given him here, as he should be, but that doesn’t mean he isn’t also looking forward to meeting the Lord.

Whoa!  Should we be grateful to God for asking 6 million Jews to join Him during the Holocaust?

Seeing this reminded me of something I read a while ago in John Aberth’s great book Plagues in World History.  Faced with the crushing mortality and morbidity of the First Plague (the Justinian plague that ran from roughly 541 to 750 CE), Christian preachers responded the only way that they could, from the pulpit.  In reply to the bubonic scourge, Aberth notes this particular line of thinking:

By the seventh century, sermon cycles were being compiled to be recited on a regular basis whenever plague struck a region as part of the Church’s now standard response to urge its flock to repent in the face of God’s wrathful chastisement;  this at least is the overarching theme of four homilies composed at this time in Toledo, Spain, which, as expected, are replete with quotations from the Old Testament.  Yet, one sermon, the third in the series, adopts a strikingly different tone by employing the carrot rather than the stick [...].  In a remarkable passage, one that seems to be inspired by the New Testament, in particular the letters of St. Paul, the preacher now dangles the promise of immortality during the Christian afterlife or resurrection in order to help his listeners conquer their fear of imminent death from the “groin disease”:

But what should we say?  You who take fright at this blow (not because you fear the uncertainty of slavery, but because you fear death, that is, you show yourselves to be terrified), oh that you would be able to change life into something better, and not only that you could not be frightened by approaching death, but rather that you would desire to come to death.  When we die, we are carried by death to immortality.  Eternal life cannot approach unless one passes away from here.  Death is not an end, but a transition from this temporary life to eternal life.  Who would not hurry to go to better things?  Who would not long to be changed more quickly and reformed into the likeness of Christ and the dignity of celestial grace?  Who would not long to cross over to rest, and see the face of his king, whom he had honored in life, in glory?  And if Christ our king now summons us to see him, why do we not embrace death, through which we are carried to the eternal shrine? For unless we have made the passage through death, we cannot see the face of Christ our king.

(emphasis mine)

Kleb the commenter has one thing right: a belief that death is acceptable and even preferable to this life, whether from God’s point of view or from the worshipper’s, is certainly not a new phenomenon. I wonder at the power this line of thinking must have held when those who contemplated it were faced with a disease that can kill 60-90% of those it infects and may have wiped out as many as 25 million people during the Justinian plague alone. In the meantime, though, I’ll be thankful for the efforts of modern medicine and science, which have brought the plague to its knees. (Even if we are on the verge of squandering that advantage and resurrecting the plague’s power through antibiotic abuse; that’s an entirely different post.)

Rejection Watch Vol. 1, Supplementary Online Material

Photo by Troy C. Boucher Photography, used under a CC license.

Submissions for Rejection Watch have dropped off, which doesn’t really surprise me; the traffic on this blog isn’t quite strong enough to sustain a feature like that (yet!), though I don’t regret the attempt. The submissions I did get were fantastic, and if anyone out there still wants to send me material, I’ll be happy to resurrect the feature whenever they do. In the meantime, though, a couple of relevant posts from around the web have cropped up in the last day or so, and I feel like they make great supplementary reading for those of you reeling from academic rejection.

Rosie Redfield (she of debunking-#arseniclife-fame) over at RRResearch posts her rage over a crappy review of her postdoc’s paper:

We finally (after two months) got the reviews back for the postdoc’s manuscript about DNA uptake bias.    It’s a rejection -  the reviews were quite negative.  The first reviewer was very unfair; they didn’t find any fault with the methods or data or analysis, but they attacked our brief discussion of the functional evolutionary context of uptake bias.  This is all too common for my papers.  The reviewer is so hostile to the idea that bacteria might take up DNA for food that they don’t focus on the science.  Because the paper was rejected we don’t get to do an official response to the reviews, so I’m relieving my frustration by responding to them here.

She goes on to do a detailed, blow-by-blow response to the objections of the two reviewers.  The whole thing is a great read, even if you’re not in this field;  the feeling of ‘oh, that happened to you too?’ is too good to pass up.

Meanwhile, over at The Bug Geek, Crystal has found a new pit of despair:

So you know that I handed my draft manuscript in to my advisor last week.  He sent back a document covered in red ink. Then my labmates pointed out all the dumb things I did, and showed me all the cool things I COULD have done but didn’t.

My advisor, a real funny guy, said, “You should make a new graph about the revision process,” and I was all, “Ha ha ha that’s so funny.”

The graph she makes is pretty awesome, but one of the things that struck me was that even the most well-meaning revisions from people close to you (advisors, labmates, colleagues you respect) can cut deeper than the blunt hammerings of an anonymous reviewer with a grudge. I think this is because the two inspire different emotions: rage for reviewers, despair for labmates. When we have a personal relationship with those who have dripped red ink on our work, it’s hard to avoid the attack on your sense of self: this person knows me, and didn’t think my work was perfect, so there must be something wrong with me. I should have done better; I screwed up. These are people that you (usually) like, that you want to look smart in front of. Contrast Crystal’s feelings of despair with Rosie’s feelings of rage; when anonymous reviewers trash our material, unless we think that they’re right we can work up a really good mad and use it as fuel to revise. In the academic setting, it feels to me like rage is a more productive emotion, a provocation to defiant action (‘I’ll show you, Mr. Anonymous Reviewer who will never read this paper again!’), while despair has a soporific effect that leaves us drained and dragging ourselves through the revisions.

Of course, this is just a sweeping generalisation based on my own experience that is almost certainly wrong in some fashion.  But hey, maybe you can leave an anonymous review?  Then I’ll show you.


Science journalism blows it, dolphin rape edition.

A few weeks ago I got into a discussion on Twitter with Ananyo Bhattacharya, online editor of Nature News and writer for The Guardian’s science section, after he put out a call asking for ways to improve science journalism. During that conversation, I argued that one way to do this is to create a culture of journalism that treats scientific knowledge and expertise as core values[1]. Ananyo seemed unimpressed with my viewpoint, and suggested that the main point of science journalism was to pry into the dark corners and root out biases, fraud, and the like in science. He views scientific communication and scientific journalism as two distinct things (and thinks that journalists doing ‘PR for science’ is ‘drippy’). Indeed, when asked directly during a Royal Institution forum on science journalism whether journalists should read the original papers behind the stories that they write, he dismissed the idea:

“If the question is ‘must a good science journalist read the paper in order to be able to write a great article about the work’ then the answer is as I said on Tuesday ‘No’. There are too many good science journalists who started off in the humanities (Mark Henderson) – and some who don’t have any degrees at all (Tim Radford). So reading an academic research paper cannot be a prerequisite to writing a good, accurate story … So I stick to the answer I gave to that question on the night – no, it’s not necessary to read the paper to write a great story on it (and I’ll also keep the caveat I added – it’s desirable to have read it if possible).”

He further suggests, in the same comment (original source), that if journalists had to read original papers then no one could report on particle physics[2].

I’m not going to try and hide my bias here: I don’t like Ananyo’s viewpoint on this. I don’t think that it will lead to good writing, either of the communication or journalistic variety, but more importantly I think that forcing journalists to read the papers before they write an article might have stopped stupid @#$@ like what happened today from happening at all.

The story: I received an e-mail this morning from Dr. Bill Sherwin, a member of the Evolution and Ecology Research Centre (E&ERC) here at my current institution, the University of New South Wales. Bill is one of the authors on a new paper coming out in the Proceedings of The Royal Society (B), entitled ‘A novel mammalian social structure in Indo-Pacific bottlenose dolphins (Tursiops sp.): complex male-male alliances in an open social network’. The paper is a nice little exploration of the characteristics of social networks in dolphins found in Western Australia; in essence, they were testing whether two hypotheses about the nature of these social networks were tenable given the data they’ve observed. In particular, they tested whether dolphins show signs of engaging in ‘community defence’, where higher order alliances of dolphins form to patrol and defend a larger community range, similar to chimpanzees, or if it follows a ‘mating season defence’ model where male groups shift their defence to smaller ranges or sets of females when it’s mating season. The comparison to terrestrial species with complex social cognition (such as primates and elephants) is an interesting one, because it provides yet more insight into the relationship between the development of complex cognitive faculties and social relationships.

So far, so good. Bill gave a simple explanation of the paper in an email that he sent out to the E&ERC this afternoon:

We put out a paper that said “dolphin male alliances are not as simple as other species”, but it has stirred up quite a lot of interest, because somewhere in it, the paper mentioned “bisexual philopatry”, which when translated out of jargon means  “males stay near where they were born, AND females stay near where they were born” – nothing more or less than that.

‘Quite a lot of interest’ is one way to put it. ‘Idiots crawling out of the woodwork’ is another. Here are the headlines of four stories that were written about this paper:

Dolphins ‘resort to rape’: Dolphins appear to have a darker side, according to scientists who suggest they can resort to ‘rape’ to assert authority. [The Telegraph]

Male dolphins are bisexual, US scientists claim. [news.com.au]. (Note that this is an Australian website, and Bill is Australian).

Male bottlenose dolphins engage in extensive bisexuality. [zeenews.com]

And by far the best of the lot (guess who it’s from?):

The dark side of Flipper: He’s sexual predator of the seas who resorts to rape to get his way. [That's right, The Daily Mail].

……..

Are you kidding me? If the ‘writers’ of these articles had read the paper, they would have noticed that it contains nothing about the sexual behaviour of the dolphins they studied, bisexual or otherwise, aside from brief mentions of the possible consequences of social networks for reproductive success. It certainly didn’t mention anything about bisexual behaviour, homosexual behaviour, or rape. Now, it’s well known that dolphins engage in homosexual behaviours, and I’ve seen papers arguing that they use sexual coercion as well (Rob Brooks confirms this). But these topics have nothing to do with this paper at all. Even a cursory glance through the original source would have killed these headlines – and the first few paragraphs of the Mail story – which aren’t just miscommunication but border on outright fabrication. The articles themselves are weird mixes of sensationalist headlines and regurgitated paraphrasings of the much better Discovery News piece that they are treating as the primary source. Here’s the problem, though: it’s Discovery News that makes the original mistake about ‘bisexual philopatry’, interpreting it as bisexual behaviour (hot male dolphin-on-dolphin action, as it were). A reporter who had read the original source could have corrected that mistake fairly easily, or could even have been driven to ask further questions. Without that, however, the press cycle grinds mercilessly forward to Flipper the bisexual rapist.

For my part, I was happy to see that James Randerson’s informal survey of science and health writers showed that many of them do read the original papers. And the kind of people who write things about science that I trust, whether they’re professionally trained in science or not, are not the sort of people who do boneheaded things like this. Ananyo might retort that ‘asking questions’ is enough (he suggested as much in his comment above). Matt Shipman said much the same thing in the piece that Ananyo was commenting on. Yet of all people, Ananyo should be wary of this answer, with his focus on investigative science journalism. A scientist writing an email or doing a phone interview can tell you just about anything that you want to hear; a press officer can write a terrible press release; a wire service will probably distort what comes down the line. But a scientific paper is the One, True Source. It is a public record of what was done, and it is the first and best place to start for answers about a study or a scientific topic[3].

Don’t mistake my criticism of Ananyo’s position on reading scientific papers as a general attack on science journalism. I think that there’s a lot of great science journalism out there, and that there are even more great science journalists and communicators. Despite the perennial swirl of internet discussion on the topic, I don’t actually think that the whole field is hopelessly broken like some seem to. I just happen to believe that scientific papers, the products of our time and energy as researchers, form an integral part of the process of talking about science (and it’s part of the reason for my support for Open Access publishing). And I think that disgraceful trainwrecks like the reporting on Bill’s paper are a perfect illustration of the need for these papers to be a part of that process.

[Update: Rob Brooks has also discussed this issue over at TheConversation].

——-

[1] Because of Twitter’s space constraints, this was misconstrued to mean that I was agitating for all science journalists to have a Ph.D. in a scientific discipline. Though I wouldn’t be upset if this happened, that’s not what I meant: it is more than possible to have a deep love and knowledge of science without having a degree in a scientific field. Hell, Carl Zimmer probably knows more about viruses and evolutionary biology than I do, and his only training is an undergraduate degree in English. My argument is only that having scientific training increases the probability of a writer or journalist having a good grasp of how science works, not that it’s the only way for that to happen. I will continue to argue, though, that those with a love of science (professional or amateur) will, on average, produce better science writing and science journalism than those without.

[2] He also claimed that most of the people asking journalists to read papers are biologists and medical people, who write easier-to-understand papers. I would have to turn this back on him: if biology and medical papers are so easy to understand, why shouldn’t journalists read them every time?

[3] Yes, there’s no guarantee that what is written in the paper is true. But the chances of detecting fraud are essentially zero if you don’t read the paper to begin with, and if you’re a journalist looking to catch the next Stapel, chances are that you’ll have to wait for the scientific community to find him and tell you about it anyways.


Tell Ontario teachers that they should ban pickles, too.

Ontario teachers:

The Ontario English Catholic Teacher’s Association says computers in all new schools should be hardwired instead of setting up wireless networks.

It also says Wi-Fi should not be installed in any more classrooms.

In a position paper released on Monday, the union — which represents 45,000 teachers — cites research by the World Health Organization.

Last year the global health agency warned about a possible link between radiation from wireless devices such as cellphones and cancer.

Some believe wireless access to the Internet could pose similar risks.

What the WHO actually says:

Are there any health effects?

A large number of studies have been performed over the last two decades to assess whether mobile phones pose a potential health risk. To date, no adverse health effects have been established as being caused by mobile phone use.

What a smart commenter (ve5cma) pointed out:

Headline should read:

“Teachers’ Union Falls for Junk Science”

Sub head:
Standing within sight of a 50,000 watt radio station transmitter, the head of the teachers’ union complained about the 4 watt WiFi router.

What I’m doing right now:

[Image found at guyism.com]

I knew that when the WHO classed cell phones as “possibly carcinogenic” (a classification so soft that it includes pickles as possible carcinogens) people would crawl out of the woodwork using that as an excuse to ban anything electronic that scares them, and Wi-Fi seems like an obvious target. And let’s face it, Ontario has had problems with this before. So I guess I can’t say that I’m too surprised something like this happened, but I sure am disappointed. No one, including the WHO, has been able to find a link between cancer and cell phones. So how does the Ontario English Catholic Teachers’ Association think that a radiation emitter being held against your head and failing to cause cancer is a good reason to ban Wi-Fi throughout schools? There’s no reason to believe that this kind of radiation has any effect on biological tissue (even if it’s not physically impossible), and the available evidence is strongly against the idea.
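The commenter’s point is easy to check with arithmetic. Here’s a back-of-envelope sketch (my own illustration, not anything from the WHO report or the union’s position paper) using the standard free-space inverse-square law, S = P / (4πr²). It takes the 4 W and 50,000 W figures from ve5cma’s comment at face value (a typical router actually transmits closer to 0.1 W) and ignores antenna gain, duty cycle, and walls, so treat the numbers as order-of-magnitude only:

import math

def power_density(watts, metres):
    # Free-space power density (W/m^2) at a given distance:
    # S = P / (4 * pi * r^2); ignores antenna gain and reflections.
    return watts / (4 * math.pi * metres ** 2)

wifi = power_density(4, 1.0)            # router at 1 m (4 W is the comment's figure)
station = power_density(50_000, 100.0)  # broadcast transmitter at 100 m

print(f"Wi-Fi router at 1 m:        {wifi:.2f} W/m^2")    # ~0.32 W/m^2
print(f"50 kW transmitter at 100 m: {station:.2f} W/m^2")  # ~0.40 W/m^2
print("Midday sunlight:            ~1000 W/m^2")

In other words, hugging your router exposes you to roughly the same radio-frequency power density as standing a block from a broadcast tower, and both are a tiny fraction of the power density of ordinary sunlight (albeit at very different frequencies).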

It’s just sad that a group of people responsible for teaching science to children can fail so badly at basic scientific literacy. For shame, Ontario English Catholic Teachers’ Association, for shame.

Update: Orac at Respectful Insolence hits the same notes with a lot more depth.


Sight-reading my science

Sight Reading, by skelly98; used under a CC license.

Parents often say things like “when you’re older, you’ll be glad we made you do this”.  They’re also often wrong about that, but occasionally they get it right.  In my case, one of the few things that I agree with unreservedly is that I did indeed come to appreciate the time I spent learning to play the piano.  At least, I agree in hindsight;  as a child, in the future tense, I most certainly did not experience wild joy from sitting in front of our old, battered upright piano, stuck down in the basement and banging away for hours.  In younger times, I much preferred reading to the endless repetition of scales and pieces, but this was a preference that did not endear me to my mother.  

So, I developed a compromise system that appeased the attentive ear sitting upstairs and awaiting the next masterpiece in A-flat. Lessons were held at the home of Mrs. Birch, my Jekyll / Hyde piano teacher, who was a perfect candidate for Kindly Grandmother Jekyll of the Year until a student unwittingly sat themselves down at her Yamaha baby grand and unleashed Generalissimo Hyde; these lessons inevitably involved carting several books to and from her house, which required a dedicated book bag that sat beside the piano. This was a perfect place to stash whatever book I was reading while I whipped off a quick left-handed play-through of whatever Bach fugue or Mozart piano concerto I was mostly ignoring, after which I would haul out my book and greedily mow through as many pages as I felt I could get away with before the warden would get restless upstairs.

############

If you don’t play an instrument, you might not be familiar with the concept of sight-reading.  Learning to “read” music, to turn the notes on the page into a series of motor commands that lead to music coming from the instrument you are playing, is an important skill for any musician to develop and sight-reading is just the logical endpoint of that skill.  Sight-reading involves playing music placed in front of you for the first time as though you’ve been playing it for years;   at least, that’s what it’s supposed to be like, but it often involves a fair bit more squinting, scrambling, and muttered cursing than you might reasonably expect, especially when some bastard just handed you Mozart’s Rondo alla Turca and you’ve never played it before.  Experienced sight-readers will also probably agree that it requires the ability to see things coming, because if your eyes aren’t a few notes out in front of the notes that you’re playing right at this moment, the next thing that you’re going to be doing is trying to extricate your fingers from whatever hellish gang-sign-slash-car-wreck they’ve managed to tangle themselves into when you stopped paying attention.

############

Mrs. Birch wasn’t just a stern piano instructor; she was also a tattle-tale. If she felt that you weren’t up to snuff, she would take the accompanying parental figure aside when they came back to pick you up and tell them that you clearly hadn’t been practicing enough. Since the result was a crackdown that I wished to avoid, I was left with an extremely strong incentive to be good enough at the lessons to avoid such an outcome. Unfortunately, my equally strong disincentive to practice in favour of my much-preferred reading left me in a bit of a pickle, since it was hard to look like I’d practiced when I really hadn’t. Thus, I became really good at sight-reading.

#############

When it comes to science, my problem isn’t that I don’t like to practice. In fact, I read and think about science all the time. But the trouble is, I have a terrible memory. I know that this admission blows a hole in the scientific mystique, because if you watch television, every scientist on TV (besides being an intellectual giant) has what seems like unbelievably perfect recall. Arguably, it’s their defining characteristic, since they’re rarely shown performing other skills like critical thinking or deep analysis of a problem – with notable exceptions, of course. On the other hand, like any good story there’s a fairly large grain of truth to it; those prodigious memories do (mostly) exist, and I’m sure that you even know someone like that. Many of the best scientific minds I know do, in fact, have fantastic recall of the things in their field, good enough that it makes me feel inadequate to be in the same room with them. Even the ones who can barely remember to tie their own shoes without a Post-it note on their laces can rattle off details of papers they read as an undergrad 25 years ago without pausing for breath.

Of course, some perspective is important here, both in assessing others and in assessing oneself. When looking at the way other scientists recall information and data, it’s important to remember that they’ve spent their entire adult lives on these topics. Sometimes, this is really obvious. Academia is a training program in narrowing your focus until you’re the world’s foremost expert in an area of knowledge so tiny that sometimes you’re also the only person in the world who cares; anyone who has gone through the Ph.D. process and received a degree at the end is going to be able to spout reams of facts about their chosen topic, even if only from sheer self-defence. And, when you feel insecure, it’s also easy to suffer from a perverse confirmation bias, where you only remember the times that other people sound smart and make you feel stupid by comparison. In academia, there are always plenty of smart people to make you feel inadequate, much like I imagine women feel when reading fashion magazines, flipping through page after page of ads that make them feel like they have to measure up to the impossible standards depicted therein. And, finally, it’s easy to exhibit a sampling bias that borders on half-baked solipsism: since you can peer inside your own thoughts and see all of the failures of your own memory and cognition in real time but the thought processes of others are opaque, it’s possible to forget (or disbelieve) that others can feel like that too.

With those caveats firmly in place, it is still pretty clear to me that I’m not that scientist. You know the one: the diamond-tipped bit on the drill of science, driving a hole into our uncertainty and powering through to the truth in their field. These are the sorts of scientists whose praises are rightfully sung for their life-long dedication to a field; to pluck a name out of the air at random, I’m thinking of people like Frans de Waal, who has spent decades expanding our knowledge of primate social behaviour and evolution. This dedication to primate behaviour has rewarded him handsomely with world-wide recognition as one of the foremost names in this field, which is as it should be [1]. But I’m just not that guy. I can’t face the idea of an entire career spent drilling down into one topic; there’s too much out there, and I want to play in more than just one sandbox. You can see this in my academic history, where I’ve wandered from computer science (undergrad), detoured briefly into classics, gone back to psychology (undergrad and M.Sc.), moved on to behavioural ecology (Ph.D.), and now dropped into the depths of evolutionary biology to work on the dynamics of viruses and bacteria. Unfortunately, this sort of academic field-hopping is viewed with suspicion, at best (“Narrow and deep is good. Shallow and broad is usually not appropriate”). And it doesn’t really maximize my production of papers, which means that if I want to maintain an academic career I will probably need to settle down soon. In truth, I think I’m getting there, because I keep coming back to questions relevant to behavioural ecology even when studying pathogens [2].

In sum, the path my career has taken and my cognitive limitations have left me with this basic truth: I’m a scientific sight-reader. What does that mean? Lacking prodigious recall but possessing Renaissance-man tendencies, it means that I’m always in the soup. For one thing, I’m always having to look things up, even surprisingly basic things. Usually, it’s to confirm to myself that my memory hasn’t played tricks on me (a problem that arises because I don’t tend to use the same techniques and knowledge repeatedly), but sometimes it’s simply because my background is shallow, not deep. Sometimes, I’ve missed things. In meetings or conversations, I have to think hard, because I need to be a step ahead of the conversation if I’m going to be of any use to it. I need to find hooks to the knowledge that I do have, ways that I can draw analogies to and from things I know, applications to a problem that come from my background. I’m always squinting, mumbling, and cursing my way through the fog of uncertainty, scrambling to stay one step ahead before I lose the plot. Like many things, this is both a blessing and a curse. On the positive side, though I’m not a world expert in most of what I do, I can usually contribute something just by virtue of having that broad toolkit. My years of experience with programming have led me to carve out a nice niche as a simulation guy in behavioural ecology – a “computer jockey”, as my Ph.D. advisor put it. That, my study of behaviour, and my background in statistics did get me the postdoc I’m holding now; I may be master of none, but I’m still a jack of all trades. On the other side of the coin is the problem of depth. I often need to rely on other, smarter people to make sure that I’m not making basic mistakes. This isn’t all bad, as I love spending time picking the brains of those smart people and working with them. But it certainly does not promote self-confidence, and I always feel like I’m one step away from being exposed as a fraud (though it’s probably sub-clinical).

Am I a poorer scientist for it?  In these days of increasing specialisation and balkanisation of scientific fields, there are many who would say yes.  And perhaps they’re right.  My interests tend toward the interdisciplinary, an idea to which much lip service is paid and little support seems to be given.  It’s certainly caused me no end of troubles, and it will probably keep causing more.  But I remain stubbornly convinced that there are benefits, too. Perhaps being forced to constantly leap about to stay in front has made me pay attention, if nothing else.  I don’t know where this will lead, but I do know one thing:  even if I’m not destined to change the world with my science, there’s nothing else on this earth that I would rather be doing with my life.

#############

Recitals and exams, those were the parts of playing piano that I always hated the most.  They drew most strongly on the skills I had avoided, including long hours of careful repetition and memorisation down to the last note.  Not to say that good piano players are robots;  quite the contrary, that laser-like focus provides the raw material for some of the greatest musicianship known.  That sort of depth frees you to be creative by moving your cognition about what you’re doing to a more sophisticated level, somewhat like having a large vocabulary frees you to be able to read almost anything without needing to parse the text every time.  

I was never going to be a musician.  This doesn’t mean that my time was wasted, though, or that all was lost.  Years later, when I was hopelessly in love with the woman who would later become my wife (and she was as yet completely uninterested in me), I sat down with her at the piano and played an arrangement of a song I loved that I had never seen before.  I had played other arrangements of this piece and loved them, but this particular set of squiggles on the musical page was new to me.  Yet when I sat down at the piano and she snuggled in beside me, I was inspired to play that song as though I had been practicing it all my life.  And through the years, we’ve  talked about the history of the piece, and I’ve told her about some of the science – the mathematics, physics,  psychology, and biology – that informs music and musical experience.   My knowledge may not be deep, but its breadth can lead to unforeseen recombination;  that piece of music and my playing it for her became part of a tapestry of woven skill and knowledge that helped form my relationship with the person I love.  And, at the risk of post-hoc justification, I wouldn’t have it any other way.  

——–
1. To be clear, I’m referring to his work only; I don’t know Dr. de Waal personally, though I have absolutely no reason to believe that he’s anything other than a perfectly nice person.
2. This is something I’ll be blogging more about as I develop the line of research and hopefully present it at ISBE in August.

Just a picture of a flower.

Just a picture of a flower that I took in Centennial Park (in Sydney) a little while ago.  I liked it, nothing more than that.

Science is grey, not black and white.

tl;dr Blaming scientists for the knowledge they produce is a pointless task fraught with insoluble problems.  It will work better if we police the use of knowledge rather than its production.

Over at TheScientist, Heather Douglas has put forth her opinion on the moral responsibility of scientists in a piece entitled “The Dark Side of Science”.  Her argument is fairly straightforward:  science can have bad consequences, and if we as scientists can foresee those consequences, we have a moral responsibility to not do the research or to hide the results of this research away.  She says:

 As scientists develop ways to generate sequences of base-pairs ever more cheaply and efficiently, the opportunity for the malicious or the simply unreflective to play with pathogens to see what kind of traits arise looms larger.  And it is not just technological know-how that can be problematic.  The detailed knowledge of cellular or genetic functioning can have worrisome implications as well.  Knowledge of what makes a virus more transmissible can assist us in detecting when a virus might be more prone to producing an epidemic, but it could also be used to make viruses more virulent.

In sum, scientists are responsible for both what they intend to achieve and that which is readily foreseeable, as we all are.  There is nothing inherent in becoming a scientist that removes this burden of responsibility.  The burden can be shared—scientists can come together to decide how to proceed, or ask for input from ethicists, social scientists, even the broader public.  Alternatively, scientists could decide (and it has been proposed) that some forms of regulation—either in the selection of projects or in the control and dissemination of results—be imposed on the field of synthetic biology, to reduce the risks.  The more oversight scientists submit to, the less responsibility they bear, but it comes at the cost of the freedom to choose the type of work they can do and how they do it.  This is the essential tension:  as long as there is freedom of research, there is the responsibility that comes with it.

Well, now.

There are two big problems with this piece. The first hinges on the idea that we can control the production of knowledge, as though it were a tap that only we have our hands on. My comment on her piece was biting, but I’m not particularly inclined to apologize for that:

Ah, yes, because if we decide not to do research due to the possibly dangerous side-effects, that means that no one else can discover that knowledge either! Whew, what a relief. It would be terribly frightening if the scientific method were available to just *anyone*, lab researcher or terrorist bent on biowarfare alike.

Even putting aside the problem of intention vs outcome, as Brian mentioned, I would refer you to the computer security industry and the disastrous failure of security-through-obscurity policies. In the days of $1000 genomes and DIYbio, the idea that we can stuff the genie back in the bottle if we just cover our ears and loudly declare “Nah nah nah nah, not researching that!” is ludicrous.

Science is a distributed process, and knowledge is out there for anyone who cares to look for it.  If I “foresee” that researching zebra finch song production (as an example of research I don’t do) will lead to bioweapons production in the Middle East, I can … what?  Send out a strongly worded email to every lab working on zebra finches across the world and hint that zebra finch songs hold the key to the apocalypse?  Secretly poison every zebra finch in existence to stop research on this ubiquitous lab animal?  I suppose that it’s not entirely impossible that I could convince every legitimate scientific lab on earth to stop working on zebra finch songs – I’d have to schlep all over the world to do it, and somehow convince every PI working on it to stop (academic cat herding, anyone?) – but even if I did, what then?  Zebra finches aren’t that hard to find or work on, so stopping research on this bioweapons outcome would somehow require forcing mainstream labs across the globe to maintain absolute secrecy while also forcing them not to work on this problem.  And this all depends crucially on human beings not being curious. That hasn’t worked out very well in the past.

The issue is that there’s no single body that is responsible for science, and no way to control the production of knowledge.  This works in our favour when it comes to religious groups and other nutjobs who wish to squash the advancement of human knowledge;  it doesn’t when we would like to put the cat back in the bag.  So, in this regard, Douglas’ argument is finished before it begins, because what she is asking for is impossible on the face of it.  It might be possible to slow things down, and hiding details may be a responsible thing to do in some circumstances (though I’ve yet to see a convincing case for this), but if the knowledge is valuable it is free for anyone to find.

The second problem with Douglas’ argument is the idea of “foreseeable”.  She’s rather vocal about this:

Many of the comments seemed to miss the key point that scientists are responsible for foreseeable consequences only– and I would want to restrict that to specifically and clearly foreseeable consequences.  This bounds the scope of responsibility, and in the way that is done in everyday life (and in courts of law– the reasonable person standard for foreseeability is the common standard for issues of negligence).

This acts in both the past and the future. In the past, retroactively deciding what was or was not foreseeable in the production of human knowledge is going to be an impossible task. Science is, by definition, the process of creating knowledge and understanding that does not yet exist; how that will interact with the body of knowledge we currently have and the body of knowledge that we have yet to produce is not foreseeable by any reasonable person. No reasonable person has a system for deciding what will be the outcome of research – if we did, we wouldn’t need to do the research. The reasonable person standard doesn’t apply here, because by definition you are retroactively assigning probabilities to events that effectively had a uniform distribution (when you produce the research, all things are possible). Oh, but doing research into virulence is more likely to produce bioweapons, you say! Really? Or is it just as likely to produce a way to stop them (more below)? Note that I’m speaking about basic research here, not applied research directly focused on producing weapons. If your intention is to produce a weapon, you bear the responsibility of creating that weapon; if your intention is to create new knowledge, assigning responsibility for what is done with that knowledge is a lot less clear.

In short, I would argue that “specifically and clearly foreseeable” is a red herring;  my argument is that producing new knowledge always has consequences that are not specifically and clearly foreseeable.   She uses a legal definition that doesn’t apply, and then wants us to abide by it. Ah, you say, but some consequences of research are easily foreseeable:  if you produce a new viral strain with higher virulence, then a bioterrorist could use that to infect a population and kill millions.  But my response is now in the future:  unintended consequences cut both ways.  Producing a viral strain with higher virulence could be the thing that cracks the problem and lets us produce a vaccine or cure for the virus in question;  only by understanding how something works can we stop it.  And if producing that knowledge could have saved millions of lives but Douglas stops it, does she accept responsibility for the deaths?  If millions die and someone later uses that viral strain to produce a cure, do we get to punish Douglas because she stopped us from finding that cure when it was “specifically and clearly foreseeable”?  After all, I’m a reasonable (and reasonably educated) person, and I can conceive of many ways that such research could have helped.

Herein lies the core of the problem: science is grey, not black and white. There’s a lot of science that has both good and bad applications. The laws of classical mechanics apply to nearly every scale of our daily life and make it better in many ways, from the curve of a thrown ball in sports to the launching of satellites into space, but they have also produced nearly all of the tools of conventional war (bullets to bombs). The discovery of the structure of the atom has left us with nuclear power and nuclear weapons. Douglas might say that I’m misconstruing her argument, that it’s only “specifically and clearly foreseeable” if it’s on a small scale and really obvious. But again, I say that this is misdirection: show me basic research, not intended to produce a weapon, whose consequences were unambiguously negative. And when there’s both good and bad, black and white, how do we decide what shade of grey is acceptable? If it could kill millions (atomic weapons) but potentially provide power enough to help stop climate change (which could potentially impact billions), who gets to decide what is acceptable before we research it?

The fact is, we can’t stop knowledge from being produced.  But knowledge has no moral value or valence in and of itself;  knowing how to make a gun doesn’t force you to construct one and shoot someone with it.  This analogy is apt in another way:  I’m greatly in favour of gun control, and my belief from reading and thinking about the issue is that people cannot be trusted to use guns properly without regulation.  But in scientific terms, this means that we regulate the application of knowledge, not its production.  We don’t tell scientists what they can research, we as a society decide what we’re going to do with what comes out of research.  What I’m proposing is more difficult than what Douglas is proposing, because we don’t get to just cover our eyes and pray that no one else is smart enough to figure out what we’d rather keep hidden.  In my way we have to have hard conversations about how best to deal with and regulate the knowledge when it arrives.  And I’m sorry if that scares her, but that’s the way it’s going to have to be.


Jargon: the new black sheep.

tl;dr:  Reducing jargon is good, but some jargon words are important and we need to keep them.  We should choose the words that give the public the best chance of understanding real science.  

The conversation about jargon in scientific communication has been thrust before my eyeballs a fair bit lately, starting with the Southern Fried Science piece – inspired by this paper – that got picked up by Boing Boing (which is where I saw it) and spawned an editable Google Doc spreadsheet of bad jargon words that should be replaced by “better choices”. Then I noticed Christie Wilcox mention the issue while coining a term that I quite like: “jargon walls”. Her description of the issue is, frankly, awesome, and so I’m going to quote it directly to set the stage:

Right now, science is almost entirely a one-way conversation. Scientists, as a group, pride themselves on doing cutting-edge research and publishing it in the top-tier journals of their field – then most feel that their part in the conversation is over. The problem is, these publications aren’t really communicating science to anyone but other scientists. Articles are kept locked behind expensive paywalls, and even those that are published in open access journals are still inaccessible, as they lie behind what I like to call jargon walls.

It’s not that non-scientists are too stupid to get science. Far from it. The average person simply doesn’t have the specific vocabulary to understand a scientific paper. I’m not stupid, yet when I take my car in to the mechanic, I don’t have the specific vocabulary to understand exactly what is making my check engine light keep turning on.

This jargon wall breeds distrust. Do I overall trust mechanics to know how to fix my car? Sure. But when one starts going on and on about how my timing belt needs adjustment, my fuel injectors need to be replaced, and there’s an oil leak in my engine that needs fixing, do I fully trust that he’s not just making up problems to get me to pay more for repairs? Not for a second.

And just before Christmas, another piece showed up in my Twitter feed from Deep Sea News by para_sight, with the tagline

Scientists must use the #language that we ALL possess, not the one only scientists possess

So many smart people have commented on this issue (and there are a lot of links I’ve forgotten here, so please forgive me!) and agreed that scientists have to speak clearly and use more accessible language that the misgivings I felt when reading these blog posts and papers really bugged me. I commented on the aforementioned Boing Boing post at the time to express some of my shaky feelings on the matter:

Differences in interpretation are often ideological.  Someone mentioned climate change deniers above, screeching over “manipulation”.  What happens if scientists work really hard to change the words they use to exactly match the public use (which is a pointless moving target anyways; an example is the use of the word scheme, which as I’ve recently discovered means “government program” here in Australia), and climate change deniers and their ilk misinterpret the new word usage to mean whatever they want it to anyways?  You’re not going to solve the linguistic version of confirmation bias by changing the word you use – they’ll just move the goalposts and kick again.

Having said that, I’m not against the idea of reducing friction with the public by trying to avoid potential confusions.  It’s a worthwhile goal.  But I do think that you have problems coming from both directions;  ideologues will misinterpret you no matter what you do, and *every* field has its own jargon, usually for a reason. Solving that will probably require a more sophisticated approach, likely involving some combination of word normalisation as the paper suggests, and better education about common variants as other commenters have argued for.

But while I still think that these things are true, they didn’t quite get at the heart of what was really bugging me about the anti-jargon push. And then today, while reading para_sight’s piece, it finally hit me. What really bothers me about this issue is that not all jargon is bad. Some jargon words are really important, and we need to fight for them. Some we can abandon, but some must be defended.

What is jargon, anyways?

Let’s put the discussion on an even footing by beginning with a definition. What is jargon? The term usually implies one of two overlapping but distinct definitions. The first is terminology especially defined in relation to a specific activity, profession, group, or event, and is what I think of when I use the term jargon. But the other common definition (e.g. #6 on this OED page) centers on the exclusionary nature of jargon: “Applied contemptuously to any mode of speech abounding in unfamiliar terms, or peculiar to a particular set of persons, as the language of scholars or philosophers, the terminology of a science or art, or the cant of a class, sect, trade, or profession”. It is this second definition, I believe, that most people are working from when they call for scientists to avoid jargon. To an extent, I agree with this argument. A lot of the words that scientists use in technical publications don’t belong in communications aimed at the general public. para_sight points out that tongue-twisters like “Ekman transport”, “population dynamics”, or “thermodynamic-anything” just get in the way. At best, you end up spending half of your piece defining them, and at worst you don’t define them and the audience simply abandons you in frustration.

Evolution is a jargon word.

But not all jargon words are bad, and this is the point that I feel has been missed in the recent articles on this subject.  If you go back and look at those lists (here, here), you’ll notice some big words have been left off.  It’s never explained why, but nobody suggests that we stop using words like evolution or climate change.  Why? Evolution is, by either definition above, a jargon word.  It is terminology especially defined in relation to the field of biology, and while it is not unfamiliar in terms of exposure (most people have heard the word), it is certainly unfamiliar to many in terms of meaning.  I’d like to tell you how many, but I can’t find those numbers;  I can find plenty of numbers on how many people believe in evolution, but none on how many people can define evolution.  Even qualitatively, though, a simple scan of the internet will show you that the word is used about as many ways as there are web pages, and many of those usages are partly or entirely wrong.  Worse, creationists and people like them will deliberately misuse words like evolution in an attempt to redirect the conversation.  Virtual gallons of digital ink have been spilled trying to sort out the meanings of the word “evolution” and explaining its importance.

Evolution.  Natural selection.  Climate change.  Stem cell.  Herd immunity.  Quantum mechanics.  Pretty much anything from medicine.  These are all jargon words.  Why aren’t they on the lists of words-to-avoid?

They’re jargon, but they’re important jargon.

It may seem obvious, but we don’t avoid using the word evolution; we use the word because it’s an important word, even if it is jargon. It is the central idea of biology (must … resist … overused Dobzhansky … quote) and instead of using a different word or just blithely re-explaining the word every time we use it, as biologists we spend a lot of time trying to get people to understand and remember this particular piece of jargon. This is the battle we have chosen to fight: we’ll try to avoid using words like regression or sensitivity, but we will not give up on words like evolution.

Climate change / global warming faces the same challenge. Somerville and Hassol note that “… many people confuse climate change with the ozone hole. They incorrectly identify the ozone hole, aerosol spray cans, toxic waste, nuclear power, and the space program as causes of global warming”. Climate change is, in fact, jargon just like evolution or Ekman transport, but climatologists and other scientists involved in the fight to bring about change on this topic show no signs of avoiding the word. Instead, they take every chance they can to go head-on with deniers and explain patiently, over and over, what the word really means. They are trying, in other words, to teach the jargon to the audience.

So what?

The examples I’ve used are obvious and unlikely to cause dispute.  I’ll be really surprised if people show up at this blog post to tell me that we should stop using the word evolution because it’s jargon or because the public doesn’t understand it.  Yet I raise them because they’re an example of my point, which is that not all jargon is bad.  Some jargon must be retained and explained, no matter how many times we have to do it or how many problems it causes.  And so I would like to propose an unpopular idea:  that the message of reducing jargon is good, but incomplete.  It is incomplete because, in coordination with other members of the team (science communicators, social scientists, policy makers, etc.), we must have discussions about which words are unnecessary jargon and can be avoided, and which are necessary and must be retained.  This won’t be easy, because it will require not only detailed conversations among many parties with an axe to grind, but it will also require data and reasoning about which fights can be won (or should be fought even if we can’t win them).

I think the first discussions could be had about the list published at Southern Fried Science. [The list itself, with the suggested “better choices” for each word, appears in their post.]

If I had any power over this, I would agree that some of these are unnecessary. I can accept the “better choices” associated with words like assay, power, recent, or sensitivity. Words like these, with obvious alternative meanings that conflict and cause confusion without a corresponding benefit to keeping the word, seem like obvious choices to me. But others on this list do not. Gene is a good example. Not only do I think that the “better choice” is most definitely not better – how is the public going to know what a “coding region” is without a lot more explanation, anyways? – but the word itself is so powerful within biology that explaining it and fighting for it brings tangible benefits. In fact, I propose that we should value particular pieces of jargon in proportion to how much power they will grant the public in understanding the scientific world around them. The scientific literature is not going to abandon the word gene any time soon, so if we spend our time explaining what a gene is to the public, I would argue that it will pay off by giving inquisitive laypeople a boost over the jargon wall. They can learn about the statistical meaning of confidence if ever they need to, but knowing the technical meaning of gravity will pay dividends in understanding the world around us that a fight over the meaning of the word primer won’t.

Back in the heyday of ScienceBlogs.com, before Pepsigate and the Mass Exoduses of mid-2010, there was a series of posts that attempted to explain basic concepts in science.  A few  of these directly address the ideas I’m talking about in this post, such as Larry Moran’s article on “What is Evolution?”, and I believe that this is one of the strengths that blog writing can bring to the table:  a platform for working scientists to directly address the public about the things that the public should (or might want to) know about science and the world around them.  As part of an overall strategy of scientific outreach, we can use blogs to fight for the jargon that is important, while learning the techniques of good science communication from good science communicators.


Ben Stein is a sleaze…

Image by themadlolscientist.

I focus on science on this blog, but sometimes something ticks me off so much that I have to take advantage of this platform I’ve created for myself to climb on my soapbox and shout at the world.

Today, that “something” is Ben Stein. I’ve never made any secret of the fact that I think that Ben Stein is a useless human being. His actions during the Expelled fiasco were enough to write him off permanently for me, but as I discovered a few days ago, apparently he can sink even lower. How? Well, I’ve been catching up on podcasts from the burgeoning Nerdist empire, among which is a fun podcast by Riki Lindhome (actress and one half of the awesome comedy song act Garfunkel and Oates) called “Making It”. On the fifth episode, Riki is interviewing the actress Diora Baird (imdb) when Diora drops a bit of a bomb about our favourite game show host and creationist pundit: Ben Stein propositioned her to be his mistress. You can hear the podcast here; the fun starts at about 9:45 into the episode, and I’ve transcribed the best part:

RL: What, did he offer to pay for you or something?

DB: Yeah.

RL: Really?

DB: Oh, yeah. Secret girlfriend!

RL: Is he married?

DB: Umm-hmm.

RL: Oh, God.

DB: Good times.

RL: [retching noise]

DB: That’s Hollywood for you, but I never …

RL: He doesn’t seem like that kind of guy [ed: *snort*], I mean, not that I’ve met him…

DB: I think he has some sort of, uh, agreement, I have no idea, I just know that he has a lot of money.  But I was so naive, I met him at the mall.

RL: Really?

DB: [laughs] Yeah, I met him at the mall. Of course, because when I came out here I didn’t have a car, so I would just, like, take the bus to the Beverly Center and hang out.

RL: [laughs] Totally.

DB: And he came up to me, and would take me to dinner.  I was so naive, I was like “he’s like my grandfather”.

RL: [mocking] “He’s not trying to sleep with me at all“.

DB: Uh, no!  And he would take me all these fancy places, and show me things in Hollywood that I would never have known about otherwise.

RL: Oh my God.  That shows, like, the true naiveté that you, like, thought it was, like, “oh, he just likes spending time with me!”.

DB: Completely.  It’s amazing that I wasn’t sold into white slavery …

RL: [laughing] … by Ben Stein.

DB: I have no idea how that did not happen.

Insert your own joke about Ben Stein’s money here.

Diora and Riki get a good laugh out of the whole affair and poke some fun at how naive she was as an aspiring actress in Hollywood, but the implications of what she was saying floored me. Ben Stein, who seems to consider himself a righteous man lamenting the descent of the world into hell, is going to lecture us in between attempts to get young actresses into bed with him? Where does he find the time?

Of course, maybe Diora made the whole thing up. But if Ben Stein wants, he can sue Diora Baird for libel; in the meantime, the story is on the record for everyone to listen to. Perhaps this story explains why Stein recently defended Herman Cain against the pile-up of adultery accusations: maybe he’s preparing his own defence.
