Friday, March 27, 2015

Authorship and Intellectual Significance

I have been thinking about this subject for a while. How is coauthorship determined for a scientific paper? Are there any standards set by the community by which everyone abides? The motivation for this was the recent publication of two papers by my former group at NIST, on which I was not listed as a coauthor. NIST has decided to enforce strict guidelines for authorship - one must make a "significant intellectual contribution" to the work. (This guideline is supposed to apply in general, but is often not followed.) Of course, "significant intellectual contribution" is vague and ill-defined. A significant contribution that is not "intellectual" can be overlooked for authorship purposes. Technicians and staff are incredibly important to the generation of scientific knowledge, but are not awarded authorship status because their contributions are often not "intellectual." As a note, this often leads to authorship discrimination based on formal educational attainment - PhDs can often be given coauthorship while someone with a bachelor's degree who does very similar support work on a project is not.

"Significant" is then next term to discuss. If we talked about the project and the proper way to analyze data, does that count? What about deriving an equation from a known source? Simply being in a position of power on a given project, but contributing essentially nothing to experimental design, sample preparation, and data analysis? Some would argue that these are not significant. Some might draw an arbitrary line (well, we all have to draw an arbitrary line somewhere).

Problems arise when different institutions have different standards for coauthorship. If one individual works in a group with laxer standards, that individual will produce more papers and their CV will look better. Because the general standards for judging scientists often involve total number of publications, this can be problematic for scientists competing for jobs. Simply working in an institution with more stringent publication standards - including standards for how many publications should be written from a given data set (some groups publish essentially the same data in multiple journals for slightly different audiences) - can hinder an individual's career for good.

When I was at NIST, the management was cracking down and enforcing more stringent standards for authorship. This is fine for an institution that provides full-time, permanent jobs for its staff - they will not need to worry about finding a new job and competing in the market. However, for postdoctoral researchers who will be let go after two years no matter what (as happens now that money has been removed from the system), it can be a disaster.

A final note, as to what motivated me to think about these issues, was reading a recent paper published by a former colleague in a very high impact journal in our field. This colleague asked me to derive an expression so he could compare it to experiments. I did this, went through the derivation with him, and pointed out how it differed from some other groups' work, which he had been using to fit the data, because of the unique geometry of his samples. (I actually know that the changes were crucial because the original version of his work, which was initially rejected by Nature, is still on the arXiv, and it includes the old expression. The version published in Physical Review Letters has the new expression, based on my input.) In addition, the supplemental material refers to work that I performed, both intellectually and physically, some of which was published elsewhere. And, while I was aware that the work was being published, I was not given access to the prepublication versions. Now, we can argue about the intellectual significance and merit of my contributions. I, personally, don't particularly care about the acknowledgement so long as the science gets done - but I am realizing that I might be too easy-going (for my own career) on the topic.