Monthly Archives: January 2012

Sight-reading my science

Sight Reading, by skelly98; used under a CC license.

Parents often say things like "when you're older, you'll be glad we made you do this".  They're also often wrong about that, but occasionally they get it right.  In my case, one of the few such claims I agree with unreservedly is that I did indeed come to appreciate the time I spent learning to play the piano.  At least, I agree in hindsight;  as a child, looking forward rather than back, I most certainly did not experience wild joy from sitting in front of our old, battered upright piano, stuck down in the basement, and banging away for hours.  I much preferred reading to the endless repetition of scales and pieces, a preference that did not endear me to my mother.

So, I developed a compromise system that appeased the attentive ear sitting upstairs and awaiting the next masterpiece in A-flat.  Lessons were held at the home of Mrs. Birch, my Jekyll-and-Hyde piano teacher, a perfect candidate for Kindly Grandmother Jekyll of the Year until a student unwittingly sat themselves down at her Yamaha baby grand and unleashed Generalissimo Hyde.  These lessons inevitably involved carting several books to and from her house, which required a dedicated book bag that sat beside the piano.  This was a perfect place to stash whatever book I was reading while I whipped off a quick left-handed play-through of whatever Bach fugue or Mozart piano concerto I was mostly ignoring, after which I would haul out my book and greedily mow through as many pages as I felt I could get away with before the warden got restless upstairs.

############

If you don’t play an instrument, you might not be familiar with the concept of sight-reading.  Learning to “read” music, to turn the notes on the page into a series of motor commands that lead to music coming from the instrument you are playing, is an important skill for any musician to develop and sight-reading is just the logical endpoint of that skill.  Sight-reading involves playing music placed in front of you for the first time as though you’ve been playing it for years;   at least, that’s what it’s supposed to be like, but it often involves a fair bit more squinting, scrambling, and muttered cursing than you might reasonably expect, especially when some bastard just handed you Mozart’s Rondo alla Turca and you’ve never played it before.  Experienced sight-readers will also probably agree that it requires the ability to see things coming, because if your eyes aren’t a few notes out in front of the notes that you’re playing right at this moment, the next thing that you’re going to be doing is trying to extricate your fingers from whatever hellish gang-sign-slash-car-wreck they’ve managed to tangle themselves into when you stopped paying attention.

############

Mrs. Birch wasn’t just a stern piano instructor, she was also a tattle-tale.  If she felt that you weren’t up to snuff, she would take the accompanying parental figure aside when they came back to pick you up and tell them that you clearly hadn’t been practicing enough.  Since the result of this was a crackdown that I wished to avoid, I was left with an extremely strong incentive to be good enough at the lessons to avoid such an outcome. Unfortunately, my equally strong disincentive to practice in favor of my much-preferred reading left me in a bit of a pickle, since it was hard to look like I’d practiced when I really hadn’t.  Thus, I became really good at sight-reading.

############

When it comes to science, my problem isn't that I don't like to practice.  In fact, I read and think about science all the time.  The trouble is, I have a terrible memory.  I know that this admission blows a hole in the scientific mystique, because every scientist on TV (besides being an intellectual giant) seems to have unbelievably perfect recall.  Arguably, it's their defining characteristic, since they're rarely shown performing other skills like critical thinking or deep analysis of a problem – with notable exceptions, of course.  On the other hand, like any good story, there's a fairly large grain of truth to it;  those prodigious memories do (mostly) exist, and I'm sure that you even know someone like that.  Many of the best scientific minds I know do, in fact, have fantastic recall of the things in their field, good enough that it makes me feel inadequate to be in the same room with them.  Even the ones who can barely remember to tie their own shoes without a Post-it note on their laces can rattle off details of papers they read as an undergrad 25 years ago without pausing for breath.

Of course, some perspective is important here, both in assessing others and in assessing oneself.  When looking at the way other scientists recall information and data, it's important to remember that they've spent their entire adult lives on these topics.  Sometimes, this is really obvious.  Academia is a training program in narrowing your focus until you're the world's foremost expert in an area of knowledge so tiny that sometimes you're also the only person in the world who cares;  anyone who has gone through the Ph.D. process and received a degree at the end is going to be able to spout reams of facts about their chosen topic, if only from sheer self-defence.  And, when you feel insecure, it's also easy to suffer from a perverse confirmation bias, where you only remember the times that other people sound smart and make you feel stupid by comparison.  In academia, there are always plenty of smart people to make you feel inadequate, much like I imagine women feel flipping through fashion magazines, page after page of ads demanding they measure up to impossible standards.  And, finally, it's easy to exhibit a sampling bias that borders on half-baked solipsism:  since you can peer inside your own thoughts and see all of the failures of your own memory and cognition in real time, while the thought processes of others are opaque, it's possible to forget (or disbelieve) that others can feel like that too.

With those caveats firmly in place, it is still pretty clear to me that I'm not that scientist.  You know the one: the diamond-tipped bit on the drill of science, boring into our uncertainty and powering through to the truth in their field.  These are the sorts of scientists whose praises are rightfully sung for their life-long dedication to a field;  to pluck a name out of the air, I'm thinking of people like Frans de Waal, who has spent decades expanding our knowledge of primate social behaviour and evolution.  This dedication to primate behaviour has rewarded him handsomely with world-wide recognition as one of the foremost names in the field, which is as it should be [1]. But I'm just not that guy.  I can't face the idea of an entire career spent drilling down into one topic;  there's too much out there, and I want to play in more than just one sandbox.  You can see this in my academic history, where I've wandered from computer science (undergrad), detoured briefly into classics, gone back to psychology (undergrad and M.Sc.), moved on to behavioural ecology (Ph.D.), and now dropped into the depths of evolutionary biology to work on the dynamics of viruses and bacteria.  Unfortunately, this sort of academic field-hopping is viewed with suspicion, at best ("Narrow and deep is good.  Shallow and broad is usually not appropriate").  And it doesn't really maximize my production of papers, which means that if I want to maintain an academic career I will probably need to settle down soon.  In truth, I think I'm getting there, because I keep coming back to questions relevant to behavioural ecology even when studying pathogens [2].

In sum, the path my career has taken and my cognitive limitations have left me with this basic truth:  I'm a scientific sight-reader.  What does that mean?  Lacking prodigious recall and Renaissance-man tendencies, it means that I'm always in the soup.  For one thing, I'm always having to look things up, even surprisingly basic things.  Usually, it's to confirm to myself that my memory hasn't played tricks on me (a problem that arises because I don't tend to use the same techniques and knowledge repeatedly), but sometimes it's simply because my background is shallow, not deep.  Sometimes, I've missed things.  In meetings or conversations, I have to think hard, because I need to be a step ahead of the conversation if I'm going to be of any use to it.  I need to find hooks to the knowledge that I do have, ways that I can draw analogies to and from things I know, applications to a problem that come from my background.  I'm always squinting, mumbling, and cursing my way through the fog of uncertainty, scrambling to stay one step ahead before I lose the plot.

Like many things, this is both a blessing and a curse.  On the positive side, though I'm not a world expert in most of what I do, I can usually contribute something just by virtue of having that broad toolkit.  My years of experience with programming have let me carve out a nice niche as a simulation guy in behavioural ecology – "computer jockey", as my Ph.D. advisor put it.  That, my study of behaviour, and my background in statistics did get me the postdoc I'm holding now; I may be master of none, but I'm still a jack of all trades.  On the other side of the coin is the problem of depth.  I often need to rely on other, smarter people to make sure that I'm not making basic mistakes.  This isn't all bad, as I love spending time picking the brains of those smart people and working with them.  But it certainly does not promote self-confidence, and I always feel like I'm one step away from being exposed as a fraud (though it's probably sub-clinical).

Am I a poorer scientist for it?  In these days of increasing specialisation and balkanisation of scientific fields, there are many who would say yes.  And perhaps they’re right.  My interests tend toward the interdisciplinary, an idea to which much lip service is paid and little support seems to be given.  It’s certainly caused me no end of troubles, and it will probably keep causing more.  But I remain stubbornly convinced that there are benefits, too. Perhaps being forced to constantly leap about to stay in front has made me pay attention, if nothing else.  I don’t know where this will lead, but I do know one thing:  even if I’m not destined to change the world with my science, there’s nothing else on this earth that I would rather be doing with my life.

############

Recitals and exams: those were the parts of playing piano that I always hated the most.  They drew most strongly on the skills I had avoided developing, the long hours of careful repetition and memorisation down to the last note.  That's not to say that good piano players are robots;  quite the contrary, that laser-like focus provides the raw material for some of the greatest musicianship known.  That sort of depth frees you to be creative by moving your thinking about what you're doing to a more sophisticated level, somewhat as a large vocabulary frees you to read almost anything without needing to puzzle out the text word by word.

I was never going to be a musician.  That doesn't mean my time was wasted, though, or that all was lost.  Years later, when I was hopelessly in love with the woman who would become my wife (and she was as yet completely uninterested in me), I sat down with her at the piano to play an arrangement of a song I loved.  I had played and loved other arrangements of this piece, but this particular set of squiggles on the musical page was new to me.  Yet with her snuggled in beside me, I was inspired to play that song as though I had been practicing it all my life.  Through the years since, we've talked about the history of the piece, and I've told her about some of the science – the mathematics, physics, psychology, and biology – that informs music and musical experience.  My knowledge may not be deep, but its breadth can lead to unforeseen recombination;  that piece of music, and my playing it for her, became part of a tapestry of woven skill and knowledge that helped form my relationship with the person I love.  And, at the risk of post-hoc justification, I wouldn't have it any other way.

——–
1. To be clear, I’m referring to his work only; I don’t know Dr. de Waal personally, though I have absolutely no reason to believe that he’s anything other than a perfectly nice person.
2. This is something I’ll be blogging more about as I develop the line of research and hopefully present it at ISBE in August.


Is it the 1960s again?


Ritual Sacrifice of the Gummulate Tribe! by Grizdave, used under a CC license


Found in a textbook today ([1], pp. 14-15), immediately following a discussion of Ebola and Lassa fever infections in humans:

While having the death of a host individual occur as the product of an encounter with a pathogen may seem like a dire outcome, this outcome represents a mechanism of defence operating at the level of the host population.  If a particular infectious agent is something against which members of the host population could not easily defend themselves, then it may be better to have that particular host individual die (and die very quickly!) to reduce the possible spread of the contagion to other members of the population.

In other words, if it looks like you’ve been infected by something nasty, you sacrifice yourself to stop its spread for the good of the other members of your population.

Look, I’ll be the first to admit that I hold a dim view of multi-level selection, but I’d be really surprised if anyone in the MLS camp were to make an argument as simple-minded as this.  Virulence is a complex topic, certainly, but the above paragraph could have been lifted from a previously-unknown book by Wynne-Edwards in the 1960s and no one would know the difference.  How is it that people are still getting away with stuff like this forty years after it was first shredded by the likes of George Williams and John Maynard Smith?

—-

1. Christon J. Hurst. Defining the ecology of viruses. In Christon J. Hurst, editor, Studies in viral ecology, volume 1, chapter 1, pages 3–40. John Wiley and Sons, Inc., 2011.


Rejection Watch Vol. 1(3): Dave Walter

Dr. David Walter is a current member of the Department of Biological Sciences at the University of Alberta (where I did my undergrad and M.Sc.;  unfortunately, I was in the Psychology department then, and never met Dave) and he’s also an advisor to the Royal Alberta Museum on mite behaviour, ecology, and identification.  You can also find him blogging at Macromite’s Blog where he has some quite amazing pictures.  Dave sent me this great story of the perils of naming new species for Rejection Watch:

——-

This only counts as a near rejection, so you may decide to reject it yourself, but your tale of meeting your nemesis at a poster at a conference reminded me of a similar encounter. I'm an acarologist, and long ago I got used to having my papers sent back with the 'not of interest to our readership' theme. I soon found the journals that would find mites of interest, or learned to hammer through the few papers that more general journals would accept, and, other than a few bent nails that couldn't be straightened, I have had a reasonably successful career.

My graduate training was in both ecology and systematics, but at heart I was an ecologist and considered the taxonomy part just plain hard work with no reward. Still, when you need a name to hang some behaviour on, you may have to describe new species. Very early in my first postdoc I found that I was up to my 13th new species description. That seemed a bad sign, and having just read an article in Smithsonian magazine about fear of the number 13, I was inspired to name my new species 'triskadecaphobiae'. Well, the first thing that went wrong was that the word was too long to fit on a slide label, but by the time I figured that out, the paper was already off to the Annals of the Entomological Society of America.

Eventually the reviews came back on official forms (this was a long time ago) and the second problem appeared – whoever typed the form had misspelled triscadecaphobiae. Still, one review was okay, but the other is etched in my mind and went more or less like this: ‘Normally I would suggest accepting such a paper with minor revision, but because of the author’s obvious scientific immaturity as evidenced by his choice of a scientific name, I recommend rejection. A scientific name should march down the ages as a testimony to the good taste of the author …’ and so on for several stinging sentences.

Later, it turned out that a friend of mine was visiting the lab of the referee at the time my paper had arrived and the ref had come storming out of his office red-faced, waving my paper, and shouting ‘who is this arsehole Walter and who does he think he is?’ and possibly other less kind things that I have forgotten. It also turned out that another friend – we had been graduate students together – was a postdoc in this lab. I had previously named another species in honour of this student by appending the Latin for ‘belonging to’ (-ianus) to his name. This is perfectly correct (and went to a French journal, so they noticed nothing amiss), but of course, sounds rather asinine. ‘X-ianus’ was, basically, puerile and I suppose every time my friend saw the name he was annoyed and let people know it. So, I was reaping what I had sowed.

The editor of the journal, a famous entomologist who I suppose should remain nameless, was very nice and asked me for my opinion. I thought about the name for a while, considered the problem with the labels and a future of misspellings (not to mention my reputation), and suggested that I name the species after the editor – by appending an ‘i’ to his name. That worked perfectly and I was able to get past #13 and add another line to my CV. It was 20 years before I found the mite again, but it turns out to be common in the northern Great Plains, so the name has turned out to be both useful and honours a great entomologist.

I still manage to sneak a pun into a paper every now and then, and at least one species name has made a list of such irreverences: Funkotriplogynium iagobadius, named after the King of Funk, James Brown (the genus was someone else's, and the species came from one of my postgraduate students following in my mould; I just went along). So, I guess I've never really learned to grow up completely, but I have become more circumspect (and insidious).

Even better, I ran into the referee in front of a poster at an Ent Soc meeting a couple of years later and stopped to introduce myself and admit to triskadecaphobiae. He turned out to be delightful and we subsequently enjoyed a productive correspondence. It turns out his comments to the editor were much less vitriolic than his comments to the author, and he was simply taking the opportunity to take me down a peg. Still, I wonder what will happen when I get to my 13th new genus …

—-

I want to thank Dave for sending me that, and remind you that Rejection Watch is driven entirely by reader submissions, so if you've been holding on to yours until now, send it in!  That email address again is rejectionwatch@gmail.com, so send me your best academic rejection story now, and I'll throw in this free juicer*!


*Limit of 0 juicers per person

Fixing HieroTeX on OS X Lion.

I have some weird hobbies, including puttering with ancient languages.  As part of that, I'd like to be able to typeset Egyptian hieroglyphic (not 'hieroglyphics':  pet peeve of mine.  Hieroglyphic is the script, hieroglyphs are the characters;  saying 'hieroglyphics' is like saying 'I speak Frenchs') in LaTeX using TeXShop for OS X.  There's a great package for doing this, HieroTeX, and TeXShop actually comes with clear and usable instructions for installing it to work on a Mac (see the instructions in ~/Library/TeXShop/Engines/Inactive/Hiero).  Unfortunately, there's one problem:  HieroTeX, as it is currently compiled and distributed, is a PowerPC application that requires Rosetta to run.  Since Rosetta has been removed in Lion, the first time you try to typeset anything using TeXShop and HieroTeX, the process will fail with a pop-up telling you that the executable file sesh is no longer supported because it is a PowerPC application.

I’ve managed to recompile the sesh executable from source on a Lion machine.  If you’re running TeXShop on Lion, this may help you.

Here’s how to fix this.  The short version:

  1. FOLLOW THE TEXSHOP DIRECTIONS FIRST.
  2. Download sesh.zip from here, and unzip the file (let’s say onto your Desktop).
  3. Place the new sesh into /usr/texbin (you will likely be asked for your admin password):   cd ~/Desktop; sudo cp sesh /usr/texbin
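A quick sanity check, if you want one: the stock file command reports a binary's architecture, so you can confirm that you've swapped the old PowerPC sesh for an Intel build.  The output lines in the comments below are what I'd expect, not a guarantee:

file /usr/texbin/sesh
# old build:  Mach-O executable ppc
# new build:  something like Mach-O 64-bit executable x86_64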

This isn’t guaranteed to work at all, but I’ve tested it with TeXShop 3.06 running on Lion (10.7.2).  Lion should be the only version of OS X that requires this fix (I think).

The long way is to recompile sesh from source yourself.  If you want to do that, you'll need to grab the source from here.  Get the file called HieroTex.tar.gz and unzip it; it will create a source tree with the sesh source in the Seshnesu subdirectory.  To build it, use make sesh (or muck with the variables in HieroTex/variable.mk first, though it worked for me without changes).  Unfortunately, compiling the unmodified source will fail because the code uses malloc.h, which is outdated and won't be found on Lion.  To fix this, grep the source files and replace every instance of #import <malloc.h> with #import <stdlib.h>; if stdlib.h has already been imported elsewhere in a file, just delete the malloc line instead.  It should compile now, and you can move it to the texbin directory as above.  After that, TeXShop should successfully typeset HieroTeX.

A word of warning:  the first round of typesetting after installing HieroTeX and replacing sesh will take forever as the fonts get sorted out.  Just go get yourself a drink and wait.
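If you'd rather script the malloc.h search-and-replace above than hunt through the files by hand, something like the following should handle it in one pass.  This is a minimal sketch rather than a tested recipe: it assumes you unpacked the source to ~/Desktop/HieroTex, and it uses the BSD sed that ships with Lion (which needs an explicit backup suffix for in-place edits).  A blanket substitution can leave a file pulling in stdlib.h twice, but that's harmless, since the header guards against multiple inclusion.

cd ~/Desktop/HieroTex
# Point every malloc.h reference at stdlib.h across all C sources and
# headers in the sesh source tree.
find Seshnesu -name '*.[ch]' -exec sed -i '.bak' 's|<malloc\.h>|<stdlib.h>|g' {} +
# Clean up the backups sed leaves behind, then build.
find Seshnesu -name '*.bak' -delete
make sesh

If you like, here's a simple file to test the install that I got from the HieroTeX manual: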

\documentclass{article}
\usepackage{hiero}
\begin{document}
\begin{hieroglyph}
A1 \end{hieroglyph}
\end{document}

If this file is successfully typeset by TeXShop, you’re up and running!


Just a picture of a flower.

Just a picture of a flower that I took in Centennial Park (in Sydney) a little while ago.  I liked it, nothing more than that.

Science is grey, not black and white.

tl;dr Blaming scientists for the knowledge they produce is a pointless task fraught with insoluble problems.  It will work better if we police the use of knowledge rather than its production.

Over at The Scientist, Heather Douglas has put forth her opinion on the moral responsibility of scientists in a piece entitled "The Dark Side of Science".  Her argument is fairly straightforward:  science can have bad consequences, and if we as scientists can foresee those consequences, we have a moral responsibility not to do the research, or to hide the results away.  She says:

 As scientists develop ways to generate sequences of base-pairs ever more cheaply and efficiently, the opportunity for the malicious or the simply unreflective to play with pathogens to see what kind of traits arise looms larger.  And it is not just technological know-how that can be problematic.  The detailed knowledge of cellular or genetic functioning can have worrisome implications as well.  Knowledge of what makes a virus more transmissible can assist us in detecting when a virus might be more prone to producing an epidemic, but it could also be used to make viruses more virulent.

In sum, scientists are responsible for both what they intend to achieve and that which is readily foreseeable, as we all are.  There is nothing inherent in becoming a scientist that removes this burden of responsibility.  The burden can be shared—scientists can come together to decide how to proceed, or ask for input from ethicists, social scientists, even the broader public.  Alternatively, scientists could decide (and it has been proposed) that some forms of regulation—either in the selection of projects or in the control and dissemination of results—be imposed on the field of synthetic biology, to reduce the risks.  The more oversight scientists submit to, the less responsibility they bear, but it comes at the cost of the freedom to choose the type of work they can do and how they do it.  This is the essential tension:  as long as there is freedom of research, there is the responsibility that comes with it.

Well, now.

There are two big problems with this piece.  The first hinges on the idea that we can control the production of knowledge, as though it were a tap that only we have our hands on.  My comment on her piece was biting, but I'm not particularly inclined to apologize for that:
Ah, yes, because if we decide not to do research due to the possibly dangerous side-effects, that means that no one else can discover that knowledge either!  Whew, what a relief.  It would be terribly frightening if the scientific method were available to just *anyone*, lab researcher or terrorist bent on biowarfare alike.
Even putting aside the problem of intention vs outcome, as Brian mentioned, I would refer you to the computer security industry and the disastrous failure of security through obscurity policies.  In the days of $1000 genomes and DIYbio, the idea that we can stuff the genie back in the bottle if we just cover our ears and loudly declare “Nah nah nah nah, not researching that!” is ludicrous.

Science is a distributed process, and knowledge is out there for anyone who cares to look for it.  If I “foresee” that researching zebra finch song production (as an example of research I don’t do) will lead to bioweapons production in the Middle East, I can … what?  Send out a strongly worded email to every lab working on zebra finches across the world and hint that zebra finch songs hold the key to the apocalypse?  Secretly poison every zebra finch in existence to stop research on this ubiquitous lab animal?  I suppose that it’s not entirely impossible that I could convince every legitimate scientific lab on earth to stop working on zebra finch songs – I’d have to schlep all over the world to do it, and somehow convince every PI working on it to stop (academic cat herding, anyone?) – but even if I did, what then?  Zebra finches aren’t that hard to find or work on, so stopping research on this bioweapons outcome would somehow require forcing mainstream labs across the globe to maintain absolute secrecy while also forcing them not to work on this problem.  And this all depends crucially on human beings not being curious. That hasn’t worked out very well in the past.

The issue is that there’s no single body that is responsible for science, and no way to control the production of knowledge.  This works in our favour when it comes to religious groups and other nutjobs who wish to squash the advancement of human knowledge;  it doesn’t when we would like to put the cat back in the bag.  So, in this regard, Douglas’ argument is finished before it begins, because what she is asking for is impossible on the face of it.  It might be possible to slow things down, and hiding details may be a responsible thing to do in some circumstances (though I’ve yet to see a convincing case for this), but if the knowledge is valuable it is free for anyone to find.

The second problem with Douglas’ argument is the idea of “foreseeable”.  She’s rather vocal about this:

Many of the comments seemed to miss the key point that scientists are responsible for foreseeable consequences only– and I would want to restrict that to specifically and clearly foreseeable consequences.  This bounds the scope of responsibility, and in the way that is done in everyday life (and in courts of law– the reasonable person standard for foreseeability is the common standard for issues of negligence).

This acts in both the past and the future. In the past, retroactively deciding what was or was not foreseeable in the production of human knowledge is going to be an impossible task.  Science is, by definition, the process of creating knowledge and understanding that does not yet exist;  how that will interact with the body of knowledge we currently have and the body of knowledge that we have yet to produce is not foreseeable by any reasonable person.  No reasonable person has a system for deciding what will be the outcome of research – if we did, we wouldn't need to do the research.  The reasonable person standard doesn't apply here, because by definition you are retroactively assigning probabilities to events that effectively had a uniform distribution (when you produce the research, all things are possible).  Oh, but doing research into virulence is more likely to produce bioweapons, you say!  Really?  Or is it just as likely to produce a way to stop them (more below)?  Note that I'm speaking about basic research here, not applied research directly focused on producing weapons.  If your intention is to produce a weapon, you bear the responsibility of creating that weapon;  if your intention is to create new knowledge, assigning responsibility for what is done with that knowledge is a lot less clear.

In short, I would argue that "specifically and clearly foreseeable" is a red herring;  my argument is that producing new knowledge always has consequences that are not specifically and clearly foreseeable.   She uses a legal definition that doesn't apply, and then wants us to abide by it. Ah, you say, but some consequences of research are easily foreseeable:  if you produce a new viral strain with higher virulence, then a bioterrorist could use that to infect a population and kill millions.  But my response now looks to the future:  unintended consequences cut both ways.  Producing a viral strain with higher virulence could be the thing that cracks the problem and lets us produce a vaccine or cure for the virus in question;  only by understanding how something works can we stop it.  And if producing that knowledge could have saved millions of lives but Douglas stops it, does she accept responsibility for the deaths?  If millions die and someone later uses that viral strain to produce a cure, do we get to punish Douglas because she stopped us from finding that cure when it was "specifically and clearly foreseeable"?  After all, I'm a reasonable (and reasonably educated) person, and I can conceive of many ways that such research could have helped.

Herein lies the core of the problem:  science is grey, not black and white.  There's a lot of science that has both good and bad applications.  The laws of classical mechanics apply to nearly every scale of our daily life and make it better in many ways, from the curve of a thrown ball in sports to the launching of satellites into space, but they have also produced nearly all of the tools of conventional war (bullets to bombs).  The discovery of the structure of the atom has left us with nuclear power and nuclear weapons.  Douglas might say that I'm misconstruing her argument, that it's only "specifically and clearly foreseeable" if it's on a small scale and really obvious.  But again, I say that this is misdirection:  show me basic research that wasn't intended to produce a weapon yet had unambiguously negative consequences.  And when there's both good and bad, black and white, how do we decide what shade of grey is acceptable?  If it could kill millions (atomic weapons) but potentially provide power enough to help stop climate change (which could potentially impact billions), who gets to decide what is acceptable before we research it?

The fact is, we can’t stop knowledge from being produced.  But knowledge has no moral value or valence in and of itself;  knowing how to make a gun doesn’t force you to construct one and shoot someone with it.  This analogy is apt in another way:  I’m greatly in favour of gun control, and my belief from reading and thinking about the issue is that people cannot be trusted to use guns properly without regulation.  But in scientific terms, this means that we regulate the application of knowledge, not its production.  We don’t tell scientists what they can research, we as a society decide what we’re going to do with what comes out of research.  What I’m proposing is more difficult than what Douglas is proposing, because we don’t get to just cover our eyes and pray that no one else is smart enough to figure out what we’d rather keep hidden.  In my way we have to have hard conversations about how best to deal with and regulate the knowledge when it arrives.  And I’m sorry if that scares her, but that’s the way it’s going to have to be.


A perfect companion to Rejection Watch…

If you’ve been enjoying the Rejection Watch series on this blog, you might want to head over to The Journal of Universal Rejection and submit your latest masterpiece.  From their home page:

The founding principle of the Journal of Universal Rejection (JofUR) is rejection. Universal rejection. That is to say, all submissions, regardless of quality, will be rejected. Despite that apparent drawback, here are a number of reasons you may choose to submit to the JofUR:

  • You can send your manuscript here without suffering waves of anxiety regarding the eventual fate of your submission. You know with 100% certainty that it will not be accepted for publication.
  • There are no page-fees.
  • You may claim to have submitted to the most prestigious journal (judged by acceptance rate).
  • The JofUR is one-of-a-kind. Merely submitting work to it may be considered a badge of honor.
  • You retain complete rights to your work, and are free to resubmit to other journals even before our review process is complete.
  • Decisions are often (though not always) rendered within hours of submission.

And after you’ve submitted don’t forget to check out their blog, which houses a fabulous collection of rejection letters from their esteemed editors.

Rejection Watch Vol 1(2): Rob Williams

Welcome to issue 2 of Rejection Watch,  now in the less-than-three-hours-to-read format.  Rob Williams writes:

——

My favourite review (now, with the benefit of hindsight) is of a paper that estimated the abundance of pelagic sharks in continental shelf waters of British Columbia, Canada. One review was great. The other reviewer obviously wasn't familiar with the statistics underlying line transect surveys, but rather than admitting that, s/he wrote, "How do you know that you didn't just see one shark over and over again?" So, our shark would have to have been hovering a mile ahead of us, anticipating where our next (randomly selected) trackline would take us, and speeding up ahead to be seen again, repeating the process 100 times. Like Jaws, but this time hell-bent on being seen, not exacting revenge. But, one bad review meant that the paper was rejected. Fortunately, we told the editor how silly this concern was, and the paper was accepted after revision.

—–

Rob sent a link along to the paper he’s talking about, and you can find it here.  I was so amused by the idea of a super-troll shark racing around in an effort to subvert the scientific process that I googled “Sneaky Shark”, with predictable results.

I’d like to thank Rob for sending in the first reader-submitted Rejection Watch!  I already have another ready to go, I’ve been promised that one is incoming in the next few days, and I’m also collecting the short rejection blurbs I’ve gotten on Twitter for the first special Twitter edition of Rejection Watch coming soon (thanks @hylopsar for suggesting it).  So, email me your stories at rejectionwatch@gmail.com, or if you’ve only got 140 characters in you tweet me @BehavEcology and we’ll share the pain of rejection together.

Unidentified: any spider people recognise it?

Spider people, leave me a comment if you know anything about this spectacular individual, found in Sydney while I was walking to lunch:

Unidentified spider I saw on my way to lunch...

Captured with my iPhone 4S...


Update: Twitter responds!


And a quick email to one of UNSW's spider gurus, Michael Kasumovic, confirms that this is probably a juvenile Nephila plumipes.  Thanks, guys!



Rejection Watch, Vol. 1(1): Me.

Since I put out the call for Rejection Watch a few days ago, I've gotten a great response on Twitter and have already received a reader submission that is in the can and ready to go.  But, since this is my feature and my blog, it does make sense that I should go first!  So, I'll relate something that happened to me in the course of my Ph.D. and during my defence;  it's not the classical scenario of "I got rejected for a great/ridiculous/unbelievable reason", but instead the more postmodern twist of "I tried to reject someone else, it came back to bite me, and then things really went pear-shaped".  It's also longer than will likely be the norm for this space.  If you're looking for the more classical – and shorter – scenario, I've got a great submission from Rob Williams coming up in issue 2 of Rejection Watch;  look for it on fine blogs (mine) everywhere (here) soon.

And in the meantime, if you're a scientist, stop reading this and send me your best rejection.  The rules are simple (see the announcement for details), and you can make it as long or as short as you want.  Funny, terrible, crushing, life-changing, whatever it is, click the link – rejectionwatch@gmail.com – and send me your story!

—–

Names have been obliterated to protect the guilty, namely me.  

tl;dr:  Things go badly wrong for the author.  

Early in 2010, I was asked to review a paper for the Journal of Theoretical Biology which modeled a question directly related to what I was working on for my Ph.D., and which came from a lab whose work I enjoy. Obviously, I jumped at the chance.  Reviewing the paper, however, turned out to be more challenging than anticipated, because I immediately ran into a conceptual problem with their model that I felt scuttled the whole affair.  In essence, I felt that they weren't modelling what they thought they were modelling.  On the other hand, these were some pretty smart people that I was criticising (including a pretty big name in evolutionary biology), so I spent a very long time convincing myself that I hadn't gotten things backwards.  Agonised, I went back and forth about it for a couple more days, and finally wrote that I thought the paper was unpublishable unless the authors changed it to address my objection (essentially by redoing the whole thing) or could provide a justification for why I was wrong.

Sure enough, the revision came back and the authors rejected my view on the matter;  they politely conveyed that they felt I was wrong about this being a problem, and that I had misinterpreted the logic of the entire class of models that this work was based on.  Since this was a fairly binary, yes/no disagreement, I decided that it was time to seek out the advice of others.  I fired off a quick e-mail to my Ph.D. advisor and got an equally quick reply:  "Nope, I agree with them.  You're wrong."

Hmm.

I thought about it some more, and though I still felt that I was right, the numbers weren't on my side.  Since all of the smart people in the room felt differently, I had to concede that perhaps I had made a mistake.  I sent in my review of the revised paper and withdrew my objection;  the paper was duly published, and after nursing my bruised ego for a few days I promptly forgot about the matter.

Fast-forward six months to the ISBE conference in Perth last year.  The head of the lab that had written – and who was last author on – the paper I had tried to reject is attending the conference and giving a poster;  I wander by his poster without realising it, and he makes a point of grabbing me to chat.  I’ve never met this guy before (we’ll call him Big Cheese, BC for short) and have no idea what to expect.  He’s also … shall we say, not from around these parts? … and he’s pretty intense,  so I’m having a bit of trouble reading his tone when he tells me that “we need to sit down and talk, the two of us”.  I, of course, immediately say “Oh, sure” before I can figure out how to get away with “Hey, look over there!” followed by a quick exit out the side window.

Having agreed to meet with him, I spend the night wondering what in the world he wants to talk to me about and hoping that it's just about our shared scientific interests.  Instead, it catches me completely flat-footed when the first thing BC says as he sits down is "It was you who reviewed our paper, wasn't it."  Not even a question, really, just a statement of fact.  He's still looking pretty intense,  I still can't read his body language, and I'm pretty sure I'm about to set a record for being the first guy knifed at a behavioural ecology conference.   It doesn't seem like the odds of me getting away with denying it are all that good, though, so I fess up.

Turns out, all is well:  he’s one of the nicest guys you can imagine, and his intensity is actually just enthusiasm mixed with a tiny dash of thick accent.  We chat excitedly about science for over an hour, and part ways on good terms.  I finish out the conference feeling pretty good about things and promptly forget about the whole thing once again.

You’d think that it would have ended there, but there’s an odd coda to this show.  Fast-forward another eight months, and I’m scrambling to get my Ph.D. defence in order because I’ve landed a post-doc in Australia and I have to get this degree thing wrapped up.  BC is now the external examiner on my committee, which makes sense for two reasons.  First, he’s actively publishing in the field that I’m defending my thesis on.  Second, having learned my lesson from the review I’ve since written a paper employing a model similar (in basic logic, though not application) to the one that I tried to reject when BC wrote it.  I feel that there are differences between the two efforts that rescue my paper from my own objections, but I’m well-prepared for this to be a significant talking point during my defence.  In fact, I’ve spent the better part of a month reading and preparing to defend my position on the matter.  So you can imagine my shock when I get written comments back from the other examiners and another member of my committee is rejecting my paper for exactly the same reason that I rejected BC’s paper the first time around!

At this point, I’m about to give up entirely, and so I settle on what I believe to be a time-honoured strategy in academic circles:  when the matter comes up during my defence,  I plan on winding up both BC and the other committee member on the subject, pointing them at each other, and just leaving the room.  I can just come back and pick up the pieces after they’ve hashed it out, goes my thinking.  I spend the night before my defence planning to both defend and deflect on this subject (and I was already pretty emotional to begin with) so it’s quite the anti-climax when I find out that through the magic of bureaucratic mis-scheduling, BC is not going to be tele-conferencing into the defence after all.

Le sigh.

Thankfully, I’m prepared to defend my choices anyways, and I make it through the defence only a little worse for wear.  Also, I discover that I’m pretty good at verbal tap-dancing, and I can wave my hands with the best of them.  But if you’re a Ph.D. student reading this and looking forward to your own defence one day (assuming you live in a place that has oral defences;  if you don’t, I respectfully hate you), take a lesson from my ups and downs:  reviews can come back to haunt you at the most unexpected of times, so watch what you write!

——

Once again, Rejection Watch will be a semi-regular feature driven by reader submissions.  So, get yours in now by emailing me at rejectionwatch@gmail.com, and then you can feel free to mock me in the comments below.