Governing Magazine survey says… Go to a Policy School (and pay close attention in your Public Management and Policy Process classes)!
Dutch photographer Jan Banning has traveled the world documenting the consequences of war, the homeless and impoverished, and victims of human trafficking. Asked to photograph a story on the administration of international development aid, something he thought to be “un-photographable,” Banning and a journalist set out to visit hundreds of local government offices worldwide. Between 2003 and 2007, they met civil servants in eight countries on five continents. “Though there is a high degree of humour and absurdity in these photos,” Banning says, “they also show compassion with the inhabitants of the state’s paper labyrinth.”
The European Union recently commissioned a study to “reflect on the state of the discipline and general trends within the discipline and in practice” of public administration (brought to you by the EU’s “Coordinating for Cohesion in the Public Sector of the Future” Group–or COCOPS). The subsequent report produced a ranking of public administration/management journals from a survey of European scholars, which asked respondents to rank, in order of preference, the journals to which they would submit a good paper.
At my own school, faculty have vigorously (and in a healthy manner, I might add) debated the relative importance of journal rankings. And this debate is certainly not isolated to my current place of employment. But one might question whether any of this debate really matters. Once a given metric becomes an established point of reference among those judged on that metric, is there any reason to believe that any other metric (qualitative or quantitative) will adequately replace it?
For instance, the Journal Citation Reports (JCR) and Google Scholar Metrics are two rather widely accepted quantitative metrics of journal prominence in a given field. JCR, in particular, has been used for years and is prominently featured as the metric of choice on most social science journals’ websites.
Below, I show tables derived from the COCOPS study, JCR, and Google Scholar Metrics. I have eliminated distinctively “policy”-oriented journals from lists in the “Public Administration” category in both JCR and Google Scholar. Even keeping in mind the obvious European bias in the COCOPS report, an almost identical list would emerge based on five-year impact factor or Google Scholar metrics. In ALL three lists, the top five journals in the field of public administration are PA, PAR, JPART, Governance, and PMR.
Note that some journals do not yet have a five-year impact factor score (e.g., IPMJ). Nonetheless, it seems to me that there are a couple of things you could derive from the COCOPS report: (1) traditionally accepted quantitative rankings are endogenous to choice; or (2) they aren’t a bad rubric for some fields; or (3) both.
E.g., scholarships, vouchers, or subsidizing private preferences?
As we at Bureauphile were shocked to learn earlier in the week, S. 679 successfully passed a House vote on July 31, 2012. We have done some very preliminary analysis of the impact of S. 679, which we presented to some of the Senate committee staff responsible for writing the bill at the “Appointee Politics and the Implications for Government Effectiveness” Workshop in Alexandria, VA, on May 4th.
RegBlog, a blog dedicated to regulatory news and analysis (and a favorite of bureauphiles everywhere), has a two-week series of guest blogs honoring the legacy of James Q. Wilson. The series continues through July 19th, so be sure to check in. In the meantime, here are some of the highlights from the past week (after the jump):
Here are some notable features of bureauphilia from the academic world: