Substantive Dimensions of the Deliberations
We encourage contributors to the Discussion Board to publicly identify themselves by registering and logging in prior to posting. However, if you prefer, you may post anonymously (i.e., without having your post attributed to you) by posting without logging in. Anonymous posts will display only after a delay to allow for administrator review. Contributors agree to the QTD Terms of Use.
Instructions
To participate, you may either post a contribution to an existing discussion by selecting the thread for that topic (and then clicking on "Post Reply") or start a new thread by clicking on "New Topic" below.
The transition to Stage 2 of the deliberations is currently underway but will take some time to complete. In the meantime, we very much welcome additional contributions to the existing threads in this forum.
-
Alan Jacobs
University of British Columbia - Posts: 38
- Joined: Fri Feb 26, 2016 9:59 pm
[From Steering Comm.] Inviting input on specific transparency practices
Here's one area in which we would like to invite some additional input: What are some specific transparency practices and tools that you think are of value, and why? We're interested in hearing about transparency practices in the broad sense -- including not just data access but also, for instance, transparency about how we've gathered the empirical information on which we rely, about how we have analyzed or interpreted that information, or about how our own subjectivity or positionality may have shaped our interpretations.
So suppose that we bracket, for a moment, questions of what editorial policies/standards ought to look like, and think about research practice -- things that we already do or could readily do. Thinking about your own research, for instance, what specific kinds of things do you do to make your findings more interpretable or transparent to your readers? When advising students about how to make their own research methods and findings credible and explicit, what do you advise them to do, and why? When you read the work of others in your field, what forms of openness do you appreciate? Have you seen innovative ways of being transparent that may not be widely known but are worth broader consideration? Can you point us to transparency practices in cognate disciplines that political scientists should take note of (as Regina Bateson has helpfully done in directing us to the field of oral history)?
Alan and Tim
-
SheenaGreitens
Re: Inviting input on specific transparency practices
I'd also second the reference to existing guidelines on oral history.
Best,
Sheena Chestnut Greitens
-
Mala Htun
Univ of New Mexico - Posts: 16
- Joined: Wed Apr 06, 2016 9:20 am
Re: Inviting input on specific transparency practices
I think we need to underscore the importance of proper and consistent footnotes as a transparency practice. Ideally, these would include page numbers where the information came from, which too many texts these days do not do.
mala
-
Marcus Kreuzer
Villanova University - Posts: 26
- Joined: Sat Apr 16, 2016 9:48 am
Re: Inviting input on specific transparency practices
Here are some of the practices that I observed in work I am currently doing on the origins of proportional representation. I discuss them in order from what I would consider least to most transparent:
1) single test: the scholar just explores the empirical implications of his or her own theory, paying no attention to any alternative explanations. The analytical transparency of such tests is low because there is no dialogue with alternative explanations. (Rogowski 1989; Alesina & Glaeser 2005)
2) single test + literature review: as in 1), except that the literature review points to some anomalies in competing explanations. Very often scholars use an outlier, or a formal model, to discredit an alternative explanation and then proceed to test only the empirical implications of their own theory. (Ahmed 2014)
3) asymmetrical dual testing: the scholar goes further than just identifying empirical or theoretical flaws in the alternative hypothesis. He or she might also replicate some status quo explanations to demonstrate their lack of robustness, or might engage in more detailed, case-study-based process tracing. Here the test hypothesis and the alternative hypotheses are both subjected to empirical verification, even though such testing is more extensive for the test hypothesis than for the alternatives. (Calvo 2005)
4) symmetrical dual testing: two explanations are both closely reviewed for their theoretical flaws and subjected to equal empirical verification. (Leeman & Mares 2014)
These are just some of the practices that I observed within the PR literature. The explicitness of, and degree of engagement with, alternative explanations is a crucial element in generating strong tests and valid causal inferences. I am sure that the inventory of practices for constructing such strong tests is far more extensive than what I was able to observe in this small literature. But how tests are configured is clearly an important element of test construction and thus warrants closer attention.
-
Guest
Re: Inviting input on specific transparency practices
I see the issue in two ways:
a) checking the quality of the reasoning of the paper and/or replicating it
From this standpoint, having access to the transcript of an interview is pointless, because a transcript can be faked and would be really hard to cross-check in cases of anonymity. Access to the raw data when a digital recording is available would be helpful, but it would require a new way of quoting that includes the time stamp in the recording. For low-security environments the latter seems doable, and having access to the digital recording of an interview linked to a quote in the paper could be quite interesting. BUT this could be done only in situations in which anonymity is not required. In such situations the burden on the researcher would not be too heavy: upload a recording, identify the minute of the quote, and add it.
b) generating an archive for future meta-research
If we started collecting in a vast online repository all transcripts of structured interviews on topic X, I could see the possibility of new meta-studies using machine coding emerging. In a sense, the repository would crowd-source interviews. Each researcher can do at most a few hundred, but over the years, by uploading them, we could create an archive. But this has nothing to do with replication; it is a crowd-sourcing project with all the usual sampling limits that such projects have.
-
Marcus Kreuzer
Villanova University - Posts: 26
- Joined: Sat Apr 16, 2016 9:48 am
Re: Inviting input on specific transparency practices
malahtun wrote:FOOTNOTES!
I think we need to underscore the importance of proper and consistent footnotes as a transparency practice. Ideally, these would include page numbers where the information came from, which too many texts these days do not do.
mala
Two additional suggestions on footnotes:
1) Historians' more discursive footnotes are their multi-purpose transparency tool. But they also are lengthy, and lengthier than many social science journals are willing to accept. So it might be helpful if journals explicitly encouraged longer footnotes.
2) What about exempting footnotes from the word count for a journal submission? I think this is the idea behind active citations, which would be moved online. But what about regular citations? If journals want better-documented research transparency, they also have to make more space available. Otherwise research transparency comes at the expense of regular content.
-
easchatz
[Steering Committee] Inviting input on specific transparency practices
Ed Schatz
From Aisha Ahmad:
"A Better Standard for Security Scholars:
"....Rather than releasing interview notes (which is dangerous and inappropriate), I propose that the correct course of action on qualitative data transparency should be centred on three key areas: (1) research ethics, (2) data security, and (3) positionality. To that end, I heartily support a new requirement that leading journals require ALL researchers to include a brief online methodological appendix responding to these three key issues.
"1. ETHICS: Researchers working with human subjects should be required to provide a methodological appendix that discusses their research ethics process, as well as formally disclose any ethical or security challenges faced during their research process. Too many scholars suppress this information; disclosure should be made mandatory.
"2. DATA SECURITY: Researchers with confidential respondents should be required to discuss how they ensured the security of their data, with special consideration to the challenges of protecting information in our digital age. This conversation is essential to ensure the safety of respondents in the long-term, and will help future scholars to think sensibly about these issues.
"3. POSITIONALITY: Researchers from ALL backgrounds should be required to comment in a methodological appendix how their intersectional positionality (gender, race, etc.) affected both their research process and results. I am appalled that it is more often female and minority scholars who discuss their positionality, whereas very few of my male and white colleagues in security studies feel obligated to do this. All qualitative data is affected by the researcher's positionality, whether the scholar is working with government elites or impoverished refugees; addressing this issue explicitly should be a professional requirement for all publication in leading journals."
-
Marcus Kreuzer
Villanova University - Posts: 26
- Joined: Sat Apr 16, 2016 9:48 am
Re: [Steering Committee] Inviting input on specific transparency practices
DA-RT front-ends research transparency by trying to make the review process more rigorous. At a recent CES roundtable, Gary Marks (UNC, Chapel Hill) pointed out that traditionally research is evaluated after publication, through further replication and by re-testing it against new evidence. Put differently, he contends that the public marketplace of ideas, rather than the more secretive review process, should be where research transparency is assessed.
Sociological Science is a new open-access journal that expands on Gary Marks's suggestion by seeking to make the post-publication review process more public. The journal has a comment-and-reaction feature that invites fellow scholars to share their assessments of an already published article. I think this has great promise because it crowd-sources the assessment of scholarly work and improves the flow of communication between scholars and their audience. In doing so, it should improve the self-corrective capacity of scholarship and, in the rare cases of genuinely shoddy scholarship, would also increase reputational costs. Obviously, there are also some issues to be worked out to avoid unprofessional, or worse, postings.
Sociological Science is interesting in another way: it has reduced the gate-keeping function of reviewers and journal editors. https://www.sociologicalscience.com/for-authors/editorial-and-review-process/
I copy here from their submissions guidelines:
"The editorial process at Sociological Science departs from the current common practice in sociology journals. The dominant model de facto requires a majority (or sometimes even a unanimity) of reviewers to “vote” in favor of a paper before it will be published. Reviewers – who have little if any accountability to the journal or the author – can wield enormous influence over both the fate and content of papers. This practice has led to a bias toward errors of omission, to multiple rounds of revisions, and to ever-increasing review times.
Sociological Science, by contrast, will concentrate the evaluative function in the hands of the editors, held accountable to readers and authors. Carefully chosen specialists will decide whether a paper will be accepted. Authors will never be asked to “revise and resubmit” their papers; after the initial review, all subsequent decisions will be made by the Editorial Staff without soliciting further reviews. Though external reviews may be solicited, these reviewers will not be asked to recommend a course of action on the paper, but instead be asked to identify the submission’s strengths and weaknesses, and identify potential areas for debate. Finally, authors will be notified of the journal’s editorial decision within 30 days of submission."
-
Sarah Parkinson
Johns Hopkins University - Posts: 12
- Joined: Mon Apr 18, 2016 4:32 pm
Re: [From Steering Comm.] Inviting input on specific transparency practices
-
Sean Yom
Temple University - Posts: 3
- Joined: Wed Apr 20, 2016 10:43 pm
Re: [From Steering Comm.] Inviting input on specific transparency practices
We generally don't like to talk about, say, how many times we were wrong, whether we changed an argument in response to data, how many times we went back-and-forth between competing camps, how many times we found an interviewee completely by chance, how many times we had to reword something to publish it even if that was not in the original research design, how many times we reframed our knowledge to squeeze out some new insight that came from a conference discussant, how many times we stopped writing mid-sentence because of some new sudden realization, and everything else behind the scenes that somehow affected our personal journey to the conclusions we wish to publish -- the very conclusions we expect others to replicate, but without knowledge of what we did behind those scenes.
Instead, what we publish invokes not just what we *did* but also the image of what an idealized researcher would *do* in order to reach the same results. Justify some problem, propose some hypotheses or ideas, sift through some data, make the big conclusion, and make readers appreciate the contribution to knowledge. I often hear really interesting papers at APSA that read like this (call it the standard journal article format), and then meet the authors at the bar later -- and they tell me what they *really* did behind the scenes: how many times they wrote and rewrote a case study, how many times they changed the entire meaning of a passage or text, how they decided to include or exclude entire reams of data to reframe the study. They tell me of steps taken that were neither elegant nor idealized, but that still played a role in reaching the causal explanation we are supposed to read, imbibe, and reproduce.
I wonder if this prevailing gap is healthy.
-
Alan Jacobs
University of British Columbia - Posts: 38
- Joined: Fri Feb 26, 2016 9:59 pm
Re: [From Steering Comm.] Inviting input on specific transparency practices
-
Guest
Re: [From Steering Comm.] Inviting input on specific transparency practices
Let me begin by noting that the push for transparency is made using a reference point of dishonesty: "If they share their data, they can't cherry-pick"; "If they share their process, we can replicate for accuracy"; "Quant work has to publish data and code" [because otherwise they could make up anything!]. It is both funny and sad that this debate has taken place on the heels of the LaCour scandal with no reference to it whatsoever. Researchers who want to invent data are going to do so -- that is an issue of ethics and morality, and sadly not one that institutional rules and requirements are going to solve.
I propose that the DA-RT requirements are problematic for two related reasons. First, they are going to be most harmful, burdensome, and potentially dangerous for the most honest scholars, not the least honest. Inventing an interview with a former warlord won't get you killed. Publishing the names and quotes of people who were arrested during an insurrection might.
Second, and this gets back to Sean Yom's point: the push for transparency is ironically being made in a context in which institutionalized dishonesty about our work is the norm. We pretend that our theorizing happens in an armchair, that our DV data collection sequentially follows our IV data collection, and that our evidence is confirmatory. We do not talk about failures, model changes, or exceptions to our rules. Least of all do we talk about the projects we tried that failed. I enjoin anyone reading this to think about a project -- or just a hypothesis -- that never came to fruition because the evidence was not there. Now imagine how many other scholars have wasted their time on the same project. If we want to institutionalize transparency, this is the place to start. Yet we are actively discouraged from talking and writing about the iterative nature of the work that does succeed, to say nothing of our ever-growing garbage bins of the work that does not.
The discussions around the initiative suggest that access to others' hard-earned qualitative data will advance the quality and efficiency of scholarship by making evidence more readily available. I argue instead that it will incentivize haste and the misuse of qualitative evidence by people who do not know the cases.
If we must desperately search for institutions to change and areas where increased transparency is truly beneficial to all scholars, incentivizing honesty about our process is considerably more pressing than forcing, not even honesty, but time-consuming transcription of our data. When I see the APSR publish an article about the legwork that went into a failed project, in which the conclusion is "We thus accept the null," then I will believe our discipline has had a victory in the domain of transparency.
Comparativist Ph.D. Candidate – TopTen U.
-
Guest
Re: [From Steering Comm.] Inviting input on specific transparency practices
I remain puzzled by calls for scholars to release interview transcripts, photos of their archival materials, or field notes. Not only is there a risk that these materials will be misused by others who do not know the cases, but such calls suggest that we as researchers can know and anticipate all future threats to our interlocutors. Further, providing access to field notes or materials collected in the field can prevent a researcher from being able to return to that field site. The discipline as a whole places little value on the creation of a dataset, and the prospect of being asked to turn over materials in a way that would prevent future research with a group or at a particular location will incentivize certain types of research over others. I fear that requests for these materials will incentivize those of us still in graduate school to abandon our qualitative and mixed-methods work in favor of quantitative work that fits under the myth of the laboratory model of research.
On transparency tools that would be of value: it would be useful and practical to have a standard for citing archival material. While acknowledging that different archives have different organizational systems, it should be clear from the footnotes which archive a record comes from, the record group or collection identifier, the type of document, its date and author, and the document identifier if known. Instead of asking scholars who use archives in their work to take photos or link to key evidence, we should make it very easy for other scholars to identify precisely which archive and document we are working with.
- IR Ph.D. Candidate – TopTwenty U
-
Sara Niedzwiecki
University of New Mexico - Posts: 2
- Joined: Thu Apr 07, 2016 2:28 pm
Re: Inviting input on specific transparency practices
Marcus Kreuzer wrote:malahtun wrote:FOOTNOTES!
I think we need to underscore the importance of proper and consistent footnotes as a transparency practice. Ideally, these would include page numbers where the information came from, which too many texts these days do not do.
mala
Two additional suggestions on footnotes:
1) Historians' more discursive footnotes are their multi-purpose transparency tool. But they also are lengthy, and lengthier than many social science journals are willing to accept. So it might be helpful if journals explicitly encouraged longer footnotes.
2) What about exempting footnotes from the word count for a journal submission? I think this is the idea behind active citations, which would be moved online. But what about regular citations? If journals want better-documented research transparency, they also have to make more space available. Otherwise research transparency comes at the expense of regular content.
I agree with the importance of footnotes. They can include detailed information about our field research. It therefore seems crucial, as Marcus Kreuzer suggests, that journals either increase the word limit to accommodate footnotes or that detailed footnotes go online. I suggest two more pieces of information that should be included in footnotes in the specific context of interview-based field research:
First, direct quotes from interviews should also be included in the original language in which the interview was conducted. There is a loss of transparency when interviews are translated without including the direct voice of the interviewee.
Second, when anonymity is not required, when the interviewee occupies a position of power, and when it does not go against IRB rules, authors should include complete information about the interviewee, including contact information. This has the potential not only to increase transparency but also to pave the way for future researchers.
-
Guest
Re: [From Steering Comm.] Inviting input on specific transparency practices
In the African country where I usually do my field research, I always give interviewees the option of anonymity, as our protocols require. My subjects almost never ask for anonymity, but in a few instances I have chosen to remove some identifying information (such as combinations of surname plus employer) that would make it easy for them to be harassed. The need for this sometimes does not become apparent to me until I begin writing. I always try to provide enough information about the person and place to convey that the interview did actually take place. In other African countries where I have worked, some of my subjects are genuinely at risk and they know it. Those citations might appear more suspect and would undermine my chances of complying with DA-RT.
I think our profession is rigorous enough that if I misrepresented some of this information in a published article, I would eventually be challenged by another scholar or by an ambitious grad student with conflicting information. (Remember the Yanomamo?) The DA-RT proposal needs to value the iterative nature of research evaluation; publication of an article is not the end of scrutiny, it attracts scrutiny. I very much worry that DA-RT will further discourage research in risky environments -- including the aforementioned country, where hardly anyone does field work.
-
Mneesha Gellman
Emerson College - Posts: 11
- Joined: Thu Apr 07, 2016 8:20 pm
Re: [From Steering Comm.] Inviting input on specific transparency practices
To be solution-oriented: I agree that active citations, carefully specified footnotes, and methodological appendices are potential tools to address some of the concerns that DA-RT raises. My forthcoming book, Democratization and Memories of Violence: Ethnic Minority Rights Movements in Mexico, Turkey, and El Salvador, includes, among other information, a list of the interview questions I asked across all three countries, as a way to allow a "peek under the hood" at how qualitative interview data were gathered and therefore contributed to my argument. But providing full transcripts of the 150 interviews used as the basis of the argument would not only be logistically and financially difficult; it would also not be the right ethical choice for furthering transparency about my argument. Instead, I lay out conceptually how I measure each variable in the project. Readers can disagree with the choices I make, but the argument is provided in conceptual detail and in plain sight. Can that be a worthy alternative solution to the transparency dilemma?
-
Paolo Spada
Southampton - Posts: 11
- Joined: Wed Apr 06, 2016 8:24 pm
Re: [From Steering Comm.] Inviting input on specific transparency practices
I went to take a look at the reproducibility project and listened to this podcast http://www.econtalk.org/archives/2015/1 ... ek_on.html which explains how hard it actually is, even in very simple and straightforward lab-experimental psychology research, to define what replication means without talking to the author of the study.
My current very partial and vague take is that some of the practices that might be interesting to look at are:
1) Conversations like this one. Would it be possible, at least for major journals, to start an e-deliberation after each new issue, in which the authors are invited to answer questions from readers and some preset commentators? It seems to me that a lot of the fact-checking, and many of the problems generated by the distortions of a peer-review methodology that is prone to groupthink fallacies, could be addressed by simply starting a broader conversation.
2) Better training. This is obvious; maybe one of the outputs of QTD could be a syllabus on these issues that covers the full variety of perspectives and that we can teach in our universities.
3) More reproducibility projects and more career brownie points for starting one. Currently reproducibility is valued only for hostile takedowns, and that makes no sense.
4) More opt-in repositories of research protocols, questionnaires, and surveys. We are building one in my field for surveys on Democratic Innovations, and even the partial one we have is a great tool when creating surveys. This summer I had access to all the questions ever asked in a Citizens' Assembly, and it was a breeze to create a new survey for the UK Citizens' Assemblies we were going to implement. In our field this is easy, because there are very few innovations and the research is just starting; but even in mature fields, simply collecting research protocols and questionnaires that were used and passed IRB review might be an extremely valuable product. Having all the potential variations and translations of the same questions might also open up interesting secondary research.
5) Opt-in repositories of data for secondary research purposes, not fact-checking. In many cases interviews are low-risk and there is full consent to publish them. Creating an online repository for such interviews, in a way that is very cheap and simple and completely under the control of the author, might generate interesting data. This could open up the possibility of really interesting meta-studies. And the repositories could slowly promote better practices for data storage and interview protocols by showing that, with small changes, great meta-studies can be implemented.
6) A new literary genre that stresses reproducibility might also be interesting to explore. It seems to me that there is space for special issues of journals that focus on describing and updating the state of a research question or debate by replicating all the major studies around that question, or by allowing researchers who use methods not conducive to simple replication additional space to re-present their research in light of the additional work they have done and to enter into direct debate with other researchers on the same topic or field site. Maybe these special issues could be the ones to pioneer attached e-deliberations (as described in 1).
-
Tim Buthe
HfP/Technical Univ of Munich & Duke University - Posts: 32
- Joined: Fri Feb 26, 2016 11:39 pm
Re: [From Steering Comm.] Inviting input on specific transparency practices
-
Guest
Re: [From Steering Comm.] Inviting input on specific transparency practices
Mendeley is a competitor of Zotero that initially focused on curating and sharing references, but it now offers a dataverse: https://data.mendeley.com/
I do not use it, and I am not trying to sell it (I understand your concerns, Tim; I should have added a more extended explanation), but I think there might be something to learn from these freemium attempts to create a community around data, papers, and references.
While updating this post I also did a Google search and discovered that Harvard has a Dataverse project: http://dataverse.org/
I assume it is linked to Gary King's project: http://gking.harvard.edu/files/dvn.pdf
That dataverse seems completely dedicated to quantitative data. Still, if a working group looks into dataverses in the next phases, it is an example to keep in mind, and Gary King's article describes a set of principles for a dataverse.
I also stumbled on a literature about dataverses covering all sorts of media and content beyond numbers: http://www.informationr.net/ir/13-1/paper333.html
It seems that some of the discussions we are having are linked to discussions in the literature on library management (I am not sure of the discipline's name). So if dataverses are explored in the next phases of QTD, we should probably involve some of these researchers who specialize in storing a variety of media. As much as I have confidence that the journals in political science are going to do their best, libraries have been dealing with multiple media for around 50 years, and so we should probably tap into the experience of the discipline that evolved around data storage in libraries. Basically, we should do a literature review in the appropriate field, or we risk being self-referential. These reflections are just the result of three minutes of research on Google.
There might be a model there for an APSA opt-in dataverse, or for journals, and these examples might offer lessons about what NOT to do. Libraries store all sorts of material, and they may have dealt years ago with some of the legal and ethical problems we are only now raising. I am sure that some of the problems we are raising have a specificity that requires a lot of reinvention, but there are examples we can start from for an opt-in dataverse.
-
Tim Buthe
HfP/Technical Univ of Munich & Duke University - Posts: 32
- Joined: Fri Feb 26, 2016 11:39 pm
Re: [From Steering Comm.] Inviting input on specific transparency practices
-
Joachim Blatter
University of Lucerne - Posts: 1
- Joined: Mon Apr 11, 2016 3:36 am
Re: [From Steering Comm.] Inviting input on specific transparency practices
I should have posted my comment as a reply to this thread, since it is one of the more productive in this endeavor to debate the merits, costs, and sensible practices of transparency.
Currently, it is published under this header as a discussion on its own: "Access to the broader evidentiary record in Congruence Analysis" by Guest » Tue Apr 12, 2016 12:15 pm
There, I point to my own practice of providing a longer paper that had been produced as an intermediate step in the research process as an online appendix to a journal article. In the online appendix the various steps necessary for a congruence analysis can be laid out in a much more systematic and extensive way compared to the shorter journal article.
When I posted that comment, I did so primarily to warn against demanding too much from qualitative scholars. But after following the debate in the newsletters of the APSA sections and on this webpage, I would like to change sides and stress that I believe editors should demand more from case study scholars than they currently do.
If editors proceed in a sensitive and pragmatic way, this does not endanger qualitative research, but strengthens its reputation.
More precisely, I suggest that an online appendix of a case study should include the following:
a. an extended description of how the evidence has been gathered (a brief one should be in the article).
b. an interview methods appendix (as laid out by Bleich/Pekkanen)
c. an extended documentation of all sources that have been used (in order to keep the list of references in the article small)
PLUS
d. an extended description of the case(s), which depends on the method one applies for linking evidence to theory. If the within-case analysis follows the congruence analysis method, this means that we have to document all empirical evidence that is in line with, but also all evidence that is in contradiction to, a theory. And we have to do this for each theory equally.
In the article, by contrast, we can focus on the most striking evidence, ideally the ones with discriminatory force between theories. Furthermore, in the article we can have a much better balance between "correspondence" and "coherence" - the two quality criteria for knowledge claims in qualitative research.
The first three elements contribute to production transparency; the fourth element not only secures analytic transparency but also fairness towards the applied theories -- something that is often far more lacking than transparency.
I do not think that further data should be demanded - certainly not the field notes or transcripts of interviews.
Kind regards,
Joachim Blatter
-
Guest
Re: [From Steering Comm.] Inviting input on specific transparency practices
I think it is easy to foresee that there is an exploratory business model behind these "free" online databases and repositories, and that the academic journal space is no longer uncontested.
The main argument in favor of academic journals was the higher quality of peer review. If some other mechanism can be found to ensure quality in a more efficient and open way, the journals are in big trouble, because the current system is full of problems, not only because of the impossibility of replicating quantitative results but also because of various publication biases.
It seems to me that a working group could think about the future of peer review and how these new players (Academia.edu is another one) will affect it. The research question could be: are there new and better ways to ensure the quality of published research?
-
Jonathan Laurence
Boston College - Posts: 1
- Joined: Thu May 19, 2016 2:00 pm
Re: [From Steering Comm.] Inviting input on specific transparency practices
The conversations I have with diplomats and civil servants almost always begin with their request that the interview not be for attribution. Politicians and religious leaders, however, are often keen on spreading their message and their names -- and would probably enjoy the amplifying effect of whatever search engine hosts such a transparency archive.
I would vote against required archiving because it removes one of the main reasons people talk to us. Academics benefit from the soft bias of low circulation -- by which I mean the assumption among interview respondents that speaking with a university researcher is very different from speaking with a journalist. Even with journals now keyword-searchable, academic articles are rarely read by more than a few hundred people -- and even that small audience is too much for some respondents who prefer to remain anonymous. Given the infamously intrusive NSA, moreover, knowing that one's name or title would be entered into an American database could give second thoughts to anyone with sensitive social or political responsibilities.
It is already hard to get an appointment with key respondents. I fear that formalizing this process would further diminish our access to decision-makers and reduce their candor if we make it past the front door.
-
Amel Ahmed
University of Massachusetts, Amherst - Posts: 2
- Joined: Wed May 18, 2016 9:20 am
Re: [From Steering Comm.] Inviting input on specific transparency practices