One of the silliest things that academics do is compare the number of A publications people have. The idea that a career can be summarized by a single number is clearly absurd. Even if research publications are all that count (and I think that is obviously nonsense), A publications aren't all the same. While it is fair to say that, on average, better journals publish better papers, there are plenty of bad papers published in the best journals and much better papers published in lesser journals. Furthermore, the top journals have clear biases, often towards extending what we already know. Indeed, the "ideal" publishing strategy is to make each paper just good enough to get over the line, i.e., not as good as it could be, given that anything 'extra' should be kept for another paper. Anything controversial should also be abandoned, as it probably won't get past peer review. Plus, let's be honest, some papers in the best journals have results, whether through fraud or accident, that don't pass the sniff test.
So ranking academics by publications is clearly nonsense. It is part of the problem with business academia highlighted by Honig and colleagues in their paper in the Academy of Management Perspectives. They address a litany of problems, and the vast majority of their points are correct. That said, I am going to be a bit contrarian and suggest that such nonsense as counting publications has a role when there are other problems in the system. I don't think we can just let academics do whatever they want and call it academic freedom. There needs to be some method of review. Counting publications is clearly a ludicrous one, but academics need to come up with something better if we want to get rid of it, not just point out the obvious problems. Something beats nothing, and you can't improve things without a better plan, even if the current plan is silly.
I would also say that counting publications helps shake up the status quo. Other ways that academics judge academics are often even worse than counting. We judge researchers on the university granting their PhD, whether they know the big shots in the field, their extensive knowledge of obscure jargon, and other such snobbery. Mike Lounsbury makes an important point: "Data on publications in 'high quality' journals could be used by previously marginalized scholars" (Honig et al., 2018, p. 416). Hard-working, bright people, often non-Western scholars, can break into the elite if there is an objective measure rather than simple snobbery based upon who one knows. (Although obviously knowing people helps get one published, even in "double blind" systems.)
Counting publications is clearly absurd, but so is letting the old guy in the corner tell us who is worthy of respect. We need to move on from counting publications, but we can't go back to the old snobbery; we need better systems. Let us work on changing the ridiculous system of counting publications, but the only way to do that is by coming up with something better.
Read: Benson Honig et al. (2018). Reflections on scientific misconduct in management: Unfortunate incidents or a normative crisis? Academy of Management Perspectives, 32(4), 412–442.