
Improving scholarly journals — Part 1

- February 19, 2009

I like scholarly journals — call it a character flaw. Back in the day I would go over to the library five days a week and spend an hour or so just walking around in the current periodicals room, picking up, looking through, and sometimes even reading journals in a wide array of fields, most of which I knew nothing about. I was interested in how scholars in various fields do and present their work, and the journals gave me a great window into that process. Over the years, my career has pretty much revolved around journals — getting published in them and rejected by them, reviewing submissions for them, serving on their editorial boards, helping select their editors, and even editing a couple of them myself.

I think I can assert without fear of contradiction that for some time now there’s been widespread dissatisfaction with our journals; perhaps dissatisfaction has been widespread from the very start. These dissatisfactions have come in so many shapes and sizes that I won’t even try to itemize them. Fortunately, a lot of smart people are giving serious thought to the means by which scholars communicate with one another and to how such communication might be improved.

The days of scholarly journals as they’ve been traditionally understood may well be numbered, but let’s assume for now that they’re going to continue to be a, or even the, primary means of scholarly communication for some time to come. How then to improve them, from the sometimes contradictory perspectives of authors, reviewers, readers, and editors?

These concerns are the focus of a section in the January 2009 Perspectives on Psychological Science titled “Issues in Publishing, Editing, and Reviewing.” Included in that section are nine brief pieces that contain a number of interesting ideas. In this post and in a follow-up or two, I’ll describe some of these ideas and, drawing on my own experience, provide some reactions to them. I invite you to weigh in on these issues, too.

I’ll begin with a short piece by Denise C. Park, “Publishing in the Psychological Sciences: Enhancing Journal Impact While Decreasing Author Fatigue,” here, gated. Park prefers the way neuroscientists do journals to the standard operating procedure in psychology. She outlines a series of steps that would bring to her home discipline, psychology, what she sees as the strong points of neuroscience journals. More specifically, she offers a seven-point set of proposals. Here I’ll deal with just the first two points, saving the rest for later. I’ll italicize Park’s suggestions and follow each immediately with my responses.

* Shorten articles. Institute a specified word count and require a concise statement of the problem under study. I agree, but with reservations. First the agreement part: Sigelman’s Law states: “Most books should have been an article; most articles should have been a research note; and most research notes should never have been published.” In editing my own work, I find that successive drafts get shorter and shorter and thus more clearly focused. Many — in my experience, most — political scientists take far too long to get to the point rather than telling readers right off the bat what central question they’re trying to answer. They also try to cite everything that’s ever been written on their subject, whether it’s directly relevant or not, rather than presenting a focused literature review. They use footnotes that are far too numerous and far too lengthy to incorporate material that’s of secondary or tertiary importance. And they present their findings in mind-numbing detail rather than homing in on the results that speak directly to their central question. Now the reservations part: For one thing, some fields of some disciplines are just wordier than others are. For example, for political theorists subtle linguistic distinctions are the coin of the realm. By contrast, for quantitatively oriented political scientists much of the action is in the tables or (if Andrew has his way) the figures. So word limits that are applied across subfields produce some inequities. Moreover, word limits can produce a telegraphic style of writing that’s just not pleasant to read. I suppose that I’m exposing myself as a member of the old school when I say that ceteris paribus I much prefer an engagingly written article to one that crams in a lot of results in machine-gun fashion.

* Use Web-based information more freely. The most effective place to present and maintain [information necessary for replication] is not in a print journal, but on a journal-sponsored Web site. Again I agree, but again with reservations. One reservation is that I want an article to be an integral unit. I don’t want to replicate your work, but if I want to know, for example, how you operationalized a variable, I don’t want to have to go looking for that information. It makes good sense to lay off “the details” to a Web site. But there’s no fixed definition of what “the details” are. I also worry about the impact of technological change on this process. In my scholarly lifetime, I’ve gone from preserving data on paper to punch cards (and if you dropped the box, you were in deep doo-doo) to tapes (boy, did I hate tapes) to PCs that ran on CP/M, to DOS-based text files, to SPSS system files, to Stata dta files, and (to say the same thing in a different way) I’ve put data on 5-1/4 inch diskettes, 3-1/2 inch diskettes, hard drives, memory sticks, and various other places — and as soon as I decide to archive my stuff one way, technology overtakes me, so that eventually I find myself unable to access the stuff that I took such pains to preserve. So we can provide a “permanent” Web home for storing information about our articles? The whole notion of permanence is so shaky in a field that changes as rapidly as information storage and retrieval that I don’t have a whole lot of confidence in the permanence of ostensibly permanent Web links.