Preliminary Findings from the Review, Promotion, and Tenure Study

Support for the open access movement has grown in recent years, and today more than a quarter of scholarly literature is freely available. Yet, despite years of advocacy work and countless policies and mandates promoting openness, most researchers still do not feel compelled to make their research outputs publicly available. Why is this the case? What barriers stand in the way of creating real change?

In a study by the ScholCommLab, Juan Pablo Alperin, Carol Muñoz Nieves, Lesley Schimanski, Erin McKiernan, and Meredith Niles suggest one possible explanation: the state of the review, promotion, and tenure (RPT) protocols at today’s research institutions. These documents, which are meant to provide faculty with the guidelines needed to achieve career success, are among the key incentive structures that drive their research dissemination strategies. But as the team’s findings reveal, surprisingly few of them mention open access at all—and even fewer provide the tools needed to pursue it.

In this interview, the study’s principal investigator, Dr. Alperin, shares some of the preliminary results and what they could mean for the future of scholarly research.

Tell me a bit about the study. What did you investigate and where are you at now?  

We’ve recently finished the first phase of the research: we collected more than 850 RPT guidelines from colleges and universities across the United States and Canada and assessed the degree to which they included guidance specific to open access, research metrics, and forms of research dissemination. Basically, we found that only about 5% of the institutions studied made explicit mention of open access in their guidelines, and, in several of those few cases, the mention served only to caution faculty about open access journals, which are sometimes incorrectly seen as being of lower quality than subscription journals.

We have submitted a comprehensive literature review of these issues for publication, and we are about to submit another manuscript with these and other results. While we wait for these to come out, we have published the dataset containing the analysis of terms and concepts. We’ve also started a second phase, in which we will survey faculty at each of the institutions in our study to find out more about their perceptions of the guidelines and how they use them to inform their work.

Why did you decide to study RPT guidelines? How do they factor into the question of openness?

We spend a lot of time at events related to open access and scholarly communications and it seemed that everywhere we went the role of incentives kept coming up as the biggest barrier to openness. Since the RPT process is one of the places where these incentives are formalized, it made sense to study them. As we did a review of the literature, we found that people at every level of universities—faculty, deans, and vice provosts—have all called for changes in the process to incentivize different forms of scholarship.

“People at every level of universities—faculty, deans, and vice provosts—have all called for changes in the process to incentivize different forms of scholarship.”

Are you surprised by what you found so far?

I was surprised to see that mentions of “open access” were almost non-existent. We knew we weren’t going to find it everywhere, but to find it mentioned by only six institutions out of a hundred-and-twenty-something? And to see that most of the mentions only discussed open access as a word of caution against predatory and low-quality publishing? That was disappointing. It shows that there’s room to do better.

Another thing that surprised me was how little the term “impact factor” was mentioned in the documents. In the open access world, we talk about the impact factor as acting like a sort of boogeyman, but in our study we found that the actual term was used by only 20% of the institutions—much less than you would expect given how often it is discussed as the source of all research evaluation problems. It was unsurprising, however, to see that it was present in a higher proportion of documents from research-intensive institutions (the so-called “R-types”), where it appeared in about one third of the documents. This shows us that we need to be cautious about how the discourse of research-intensive universities dominates our thinking about incentive structures across all institutions.

Although the impact factor was not so pervasive, words related to metrics (such as citations, h-index, and rejection rates) appeared a little more often—in the documents of almost half the institutions. Again, this varied by institution type, with metrics mentioned in the documents of almost three quarters of R-type institutions. Similarly, we found them to be much more present in the documents of physical science and mathematics departments than in those of the social sciences and humanities.

So, while it was not surprising to see that the RPT documents of research-intensive institutions and of physical science and math departments included more mentions of impact factors and metrics than others did, we were surprised by the extent to which this difference dominates the narrative across the board.

Tell me a bit more about the second phase of the study. What do you hope to achieve in the coming months?

It’s still unclear to what extent people are actually using these documents to make decisions around what they should do for their career. After spending a lot of time with these documents, we have a sense that there’s a lot of culture that happens around them, rather than in the documents themselves. That’s why we’re doing the second part of the study.

“After spending a lot of time with these documents, we have a sense that there’s a lot of culture that happens around them, rather than in the documents themselves. That’s why we’re doing the second part of the study.”

The documents use a lot of vague language that allows for many different interpretations; a large majority of them, for example, mention the words “public” and “community.” At the same time, around 90% of them specifically mention peer review, concrete output types like articles and books, and publication venues like journals and university presses. So, while the documents leave room for interpretation when it comes to openness, they also provide specifics around traditional output types. We know from everything we hear day to day, as well as from the literature, that researchers still don’t feel they should be more open or more public. So we’re trying to better understand people’s relationships to the documents: Are they using them as guides? Are they really consulting them?

Are all aspects of the guidelines pretty vague? Or just the parts that discuss openness?

This is the part that’s interesting. The trifecta that normally gets valued in the RPT process is teaching, research, and service. But it’s known, both in the literature and in what we hear anecdotally, that research is really the core.

We’ve been analyzing the documents for the presence of terms like “public,” “community,” and “quality,” and we found them used in a very large percentage of the documents. But when we looked at the words around them, we found they’re surrounded by nonspecific qualifiers: words like “significant,” “substantial,” and “demonstrable.” The guidelines offer no specifics about what openness should look like. But the documents do have specifics around research outputs, around how to succeed in that realm. We see terms like “article,” “book,” and “refereed,” specific terms that guide what research should look like.

“The trifecta that normally gets valued in the RPT process is teaching, research, and service. But it’s known, both in the literature and in what we hear anecdotally, that research is really the core.”

But isn’t the number of research institutions with open access policies increasing? Where is that support actually playing out, then, if not within the RPT protocols themselves?

This is another reason why we thought it was important to tackle incentives. There has been an uptick in the number of universities with open access policies. But even so, there’s been little uptake among researchers. They don’t really see the value of openness yet, or they don’t prioritize it. To me, that means something else needs to change. Policies alone may not be enough. Tackling incentives might be a way of pushing academics to really think about these issues.

So where does this leave us? What’s next for you?

I want our research to inform conversations and actions about shifting the way we value open research and public engagement, so that we can move toward greater openness. I’m hoping that our work can provide some evidence for the vice presidents, provosts, deans, and faculty who feel incentives are impeding change. I want it to provide the information that’s needed to make effective change happen. That is, I want to see this work being taken up by the people who are trying to make a difference.


Want to learn more about the RPT project? Data from the study can be found at Harvard Dataverse and a preprint is available at Humanities Commons. Or, sign up for our newsletter for ScholCommLab news, research updates, and more.