You Keep Using That Word,
I Do Not Think It Means What You Think It Means
Inigo Montoya, The Princess Bride
Greetings. Today I want to talk about citation impact factor. And what is citation impact factor, you ask? Citation impact factor is what scientists use to assess their academic worth and scientific “impact.” It is a measure, I suppose, of a scientist's “value” as a scientist. The more citations you have, the more valuable you are as a scientist.
Citation impact is a very simple measure. You write an article, you publish it, and you count how many other scientists “cite” your article (i.e., include it in the reference section of their own article or book). That’s all there is to it. So, for example, if I write an article on labour process theory, publish it in the journal Work, Employment and Society (Sosteric, 1996), and that article goes on (for a time) to become one of the top ten most cited articles in the journal (as it did), gathering as many as (gasp) 99 citations, then it is judged to be a high impact article.
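In fact, the measure is so simple that it fits in a few lines of code. The following Python sketch (using hypothetical article names of my own invention, not real data) shows everything the measure amounts to: tallying how often each article appears in other articles' reference lists.

```python
# Minimal sketch of per-article citation counting.
# Each article maps to the list of articles it references;
# an article's "impact" is simply how many of those lists it appears in.
# All article names below are hypothetical placeholders.
from collections import Counter

def citation_counts(reference_lists):
    """Count how often each article appears in others' reference sections."""
    counts = Counter()
    for references in reference_lists.values():
        counts.update(references)
    return counts

papers = {
    "Article A": ["Article C"],
    "Article B": ["Article C", "Article A"],
    "Article C": [],
}

counts = citation_counts(papers)
print(counts["Article C"])  # Article C appears in two reference lists: 2
```

That tally is the whole enterprise; everything else is interpretation layered on top of it.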
If it is a high impact article, I get to point to it and say, “I wrote something important.” If it is not, all I can do is hang my head in academic shame and obscurity. I’m not cited, therefore I’m not important. Without citations, I am relegated to the dustbins of this planet's scientific history.
As boring as it is, this seems sensible, logical, and valid enough. Citations are a measure of usage, distribution, and impact. The more people use my research in their own, the more it is distributed, the more impact I have. The more impact I have, the more prestigious I become. But the logic, though seemingly sound, doesn’t hold, and the measure itself is not empirically valid.

For example, if you unpack the notion of “impact” a bit and ask “impact on what,” the validity of the whole enterprise comes into immediate question. What can you really say about an article that has been cited 99 times? Does that mean it has been read 99 times? Hardly! Many scholars cite articles based on only a quick read of the abstract, some cite articles they have never read, and many more read things they never cite at all. I’ve read thousands of books and articles in my lifetime and I've cited only a handful. It hardly seems valid to suggest that “number of citations” means anything at all in these circumstances.

And these are just the most obvious problems. Even if you get past this rather glaring "fail" to the point where you would suggest that a count of 99 citations actually means something, you still have to answer the question “means what?” And that’s not so easy to answer either. Do 99 citations mean your ideas are part of the intellectual fabric of this planet? Maybe! But then again, maybe not. You can’t tell that from the number alone. In order to determine actual impact, you would have to do some complicated (and expensive) qualitative research, and if you did, I’m pretty sure you’d find that sometimes an idea is incorporated into the intellectual fabric of this planet (or at least your discipline) after 99 citations, and sometimes not. You might also find that some of the 99 citations are criticisms and dismissals. If scholars cite your article just so they can express their hatred and disdain, how does that count towards prestige and impact?
Clearly, there are significant problems. Anybody being honest about using citations would have to admit that as an operationalization of an academic’s scholarly worth, the measure is unreliable and invalid. And of course, if the measure is both unreliable and invalid, it's totally and utterly useless.
What I've said so far, brief as it may be, is enough to call into question the validity of citation counting; but some people might want to grasp and claw at the measure anyway. There is, however, a final consideration that makes it clear just how ridiculous citation counting is, and that is the fact that sometimes scientific ideas can have major impact without ever getting cited at all. Consider this. I was heavily influenced by the thinking of Karl Marx, but I don’t cite him in every article I write. I don’t feel I have to, but his influence is undeniably there all the time.

And if that is not enough to cause you to call into question the citation analyst’s little radar, consider this. A few years ago, I wrote an article on forms of emotional abuse commonly found in schools (Sosteric, 2012). I published the article in Socjourn. In the time that it has been on this site, many people have commented on it, and many hundreds of thousands of people (parents and teachers) have read it. Just last year it was picked up by the Educational Testing Service in the U.S.A. to be used in their teacher certification materials. The contract I signed specified 50,000 teachers over the next ten years! This means that my ideas on emotional abuse in schools will be influencing the next generation of teachers as they complete their programs and pass certification. Clearly this article on abuse in schools is having a major impact on society, and possibly even on academic thinking; but as far as I know, it hasn't been cited even once.
So where does all this leave citation impact factor? If you ask me, it is an unreliable, invalid, ridiculous, and useless measure. And frankly, I'm not the only one to say so (Adam, 2002; Baum, 2013; "Not-so-deep impact," 2005; "The ratings game," 2010). When you start to go deep, the issues pile higher. Citation counts may give the university administrator something to point to when they're looking at you for a promotion (which, to be honest, is what they are really all about), but other than that they have no utility at all. You won't get a fair representation of your impact on science by counting citations, and you certainly won’t get any indication of your impact on society; so, don't even bother to try. Citation analysis might sound all quantitative and empirical and hard, but if you ask me it is a weak, flaccid distraction. Citation counting is a vacuous and meaningless measure that speaks more to academic ego than it does to actual contributions. If you want my advice, just give it up.
Adam, D. (2002). The counting house. Nature, 415(6873), 726-729. doi:10.1038/415726a
Baum, J. A. C. (2013). The excess-tail ratio: Correcting journal impact factors for citation distributions. M@n@gement, 16(5), 697-706.
Not-so-deep impact. (2005). Nature, 435(7045), 1003-1004. doi:10.1038/4351003b
The ratings game. (2010). Nature, 464(7285), 7-8. doi:10.1038/464007b
Sosteric, M. (1996). Subjectivity and the labour process: A case study in the food and beverage industry. Work, Employment and Society, 10(2).
Sosteric, M. (2012). The emotional abuse of our children: Teachers, schools, and the sanctioned violence of our modern institutions. The Socjournal, March.
Written by Mike Sosteric (Dr. S.)