Substantive Dimensions of the Deliberations
-
Aisha Ahmad
University of Toronto - Posts: 1 - Joined: Thu Apr 07, 2016 1:45 pm
Transparency in Security Studies
I am an International Security scholar who works in active conflict zones, with a particular focus on modern jihadist insurgencies. I research the intersection of criminal business networks and militant armed groups in civil wars. My research uses a mixed-methods approach: I conduct interviews with clandestine actors, and I also run large surveys for regression analysis. I have published both quantitative and qualitative articles, including in International Security and Security Studies. My work is positivist, with a heavy rational choice theoretical orientation, and I write from that perspective.
About DA-RT:
I wholly support the DA-RT initiative in requiring all quantitative researchers to release their datasets, coding, and replication materials at the moment of publication. This is especially important for security scholars, as our research often speaks directly to policy-makers. Withholding quantitative replication data is unethical and dangerous.
With respect to our qualitative research, however, the DA-RT initiative has failed to address serious ethical and security concerns around releasing field notes. There are three key points I offer in this regard:
1. RESEARCH ETHICS BOARDS: Qualitative research conducted by security scholars often involves interviewing high-risk individuals. This work necessarily requires extreme levels of confidentiality, especially as some of our respondents may be actively involved in violent or illicit activities. In such cases, our Research Ethics Boards very rightly require that we take great measures to ensure the confidentiality of such participants. Interviews with higher-risk respondents often cannot be done with any technology in the room, such as a computer for typing notes. In many cases, only hand-written notes, with no identifiable information recorded, are ethical and appropriate.
Even post-interview transcription has restrictions, as digital security can be compromised by hackers and government agencies; this is especially important when both states and non-state actors are actively seeking information about certain militarized individuals and groups. A full digital transcript that records dates, locations, and other details that could be used to piece together the respondent's identity could put the individual and the entire research team at risk. Without a sound guarantee that the notes will be protected, it would be impossible to meet the requirements of confidentiality set forth in the ethics review process.
2. POLICY IMPLICATIONS: If security specialists are not given the right to complete our field work in a safe and ethical fashion, then we will stop doing this type of work entirely. Frankly, the consequences of losing the type of knowledge we bring to the table will be extreme. All security specialists who do field work in war zones are producing results that have immediate policy implications. We regularly advise governments on the impact of their decisions on the outcomes of violent insurgencies.
What we bring to the table is a granular, ground-level perspective, which complements system-level, large-N studies. For example, statistical studies show us how drone strikes affect insurgent groups with different organizational structures, but without rigorous qualitative research that investigates the internal organization of each of those groups, those results are rendered unusable. These are insights that policymakers need in order to devise effective solutions for some of the most pressing international security crises in the world today.
3. LOST DATA: Those of us who have spent years working in conflict zones have already made promises of confidentiality to our respondents, which we are bound to keep. For example, I promised every Somali business elite I met in Mogadishu that no one else would ever see my notes. Keeping this information in confidence is therefore an ethical requirement, stipulated by my Research Ethics Boards. We who have spent years in the field hold extraordinary troves of qualitative data that were acquired before the DA-RT initiative began, and from which we have yet to publish. If publication were made contingent on releasing these confidential notes, years of hard-fought research that could help to save human lives would be laid to waste. Countless future publications would be left in desk drawers, never to be seen.
A Better Standard for Security Scholars:
I am in complete agreement with the overarching principle that all qualitative social scientists (most especially those working on international security) should be held to a higher standard of transparency. The DA-RT initiative, however, misses the most obvious and shocking lack of transparency of which qualitative researchers are actually guilty. Rather than releasing interview notes (which is dangerous and inappropriate), I propose that the correct course of action on qualitative data transparency should be centred on three key areas: (1) research ethics, (2) data security, and (3) positionality. To that end, I heartily support a new requirement that leading journals have ALL researchers include a brief online methodological appendix responding to these three key issues.
1. ETHICS: Researchers working with human subjects should be required to provide a methodological appendix that discusses their research ethics process, as well as formally disclose any ethical or security challenges faced during their research process. Too many scholars suppress this information; disclosure should be made mandatory.
2. DATA SECURITY: Researchers with confidential respondents should be required to discuss how they ensured the security of their data, with special consideration to the challenges of protecting information in our digital age. This conversation is essential to ensure the safety of respondents in the long-term, and will help future scholars to think sensibly about these issues.
3. POSITIONALITY: Researchers from ALL backgrounds should be required to comment in a methodological appendix on how their intersectional positionality (gender, race, etc.) affected both their research process and their results. I am appalled that it is more often female and minority scholars who discuss their positionality, whereas very few of my male and white colleagues in security studies feel obligated to do so. All qualitative data are affected by the researcher's positionality, whether the scholar is working with government elites or impoverished refugees; addressing this issue explicitly should be a professional requirement for publication in leading journals.
Thank you for taking this initiative.
[5/19: QTD moderator change in the topic/thread title, with prior approval from the original poster, from "Security Research and DA-RT" to "Transparency in Security Studies" to reflect the broader focus of both the original post and the thread]
-
Tim Buthe
HfP/Technical Univ of Munich & Duke University - Posts: 32 - Joined: Fri Feb 26, 2016 11:39 pm
Transparency in Security Studies
I would like to invite others (both in security studies and beyond) to weigh in, particularly on the question: Which of the issues identified by Aisha are particular to security studies or raise distinctive concerns for security studies scholars (and thus might require discussion in a separate working group organized by subject matter), and which of these issues are broader (and should probably be discussed in broader fora)?
-
William J. Kelleher, Ph.D.
Independent Scholar - Posts: 19 - Joined: Thu Apr 07, 2016 4:38 pm
Re: Security Research and DA-RT
-
Tim Buthe
HfP/Technical Univ of Munich & Duke University - Posts: 32 - Joined: Fri Feb 26, 2016 11:39 pm
Re: Security Research and DA-RT
-
Guest
Re: Transparency in Security Studies
-
Guest
Re: Transparency in Security Studies
-
Caroline Beer
University of Vermont - Posts: 2 - Joined: Tue Apr 26, 2016 10:59 am
Re: Transparency in Security Studies
-
Kai Thaler
Harvard University - Posts: 2 - Joined: Fri Apr 08, 2016 3:37 am
Re: Transparency in Security Studies
In response to Tim's question about disclosure of sources and the ability to challenge or evaluate research: fraud and the manufacturing of data are always a risk (for both quantitative and qualitative research), but in cases where human subjects constraints or other issues mean that authors are unable or unwilling to provide detailed information on sources, peer review is the appropriate check. Reviewers with deep contextual or subject knowledge can evaluate the data included in a paper and seek to parse out whether anything seems out of the ordinary. There will always be differences of interpretation, but it should be possible to assess the general plausibility of the reported information.
-
Jesse Driscoll
UCSD - Posts: 5 - Joined: Tue Apr 26, 2016 4:19 pm
Re: Transparency in Security Studies
I have personally benefited from thinking more about positionality as a result of Aisha Ahmad's exceptional post. As young professionals working in many of the same conflict zones (Somalia and Central Asia), Ahmad and I are likely to continue attending the same conferences and citing (and assigning, and potentially refereeing) each other's work. We both think a lot about the privilege that makes our work possible. I think we both share an appreciation that doing this work ourselves, and training graduate students to try to do a better version of what we do, carries unusual challenges that are rarely appreciated by most of our colleagues. We are both improvising what it means to be public policy-minded academics against a quickly shifting technological and political frontier. In the spirit of this forum, therefore, I will push back a bit against a few of her rhetorical flourishes. I worry that some of the language she uses might divide scholars who value qualitative work at a time when there are advantages to a unified front. Normative proclamations about what researchers “should be obligated” to do or what “should be mandatory” can have unintended consequences.
POSITIONALITY
I was once given the following piece of advice by Barbara Walter, my primary academic mentor at UCSD: “Take whatever part of this thing [in this case, my book manuscript] that is closest to your heart and just cut it out. What remains on the page will be truer to more people because you won’t be a part of it.” I took this advice and, in the version I submitted to the press, moved most of the first-person voice to a methods appendix of my book. About a year later, when I got anonymous referee reports back from the press, one of my two referees suggested that I move the personal material up to the front of the book, since it is one of the main reasons someone might read my work or assign it to graduate students considering fieldwork in very dangerous settings. I took the reviewer's advice and changed the book again. Some of the material migrated closer to the front of the book; some of it stayed in appendices.
The main inference I draw from this experience is that there is actually no shared standard for what high-quality self-examination on positionality ought to look like. This is especially true in security studies. I am inclined towards sympathy with the community of scholars that believes these issues are best addressed through the lens of “evaluating respondent bias” rather than with words like “subaltern.” Invocations of positionality are also difficult to square with the valuable norms of double-blind peer review. At best, I think a positionality requirement would be a misallocation of labor, both by authors and by editors. At worst, it would invite journal editors to make somewhat arbitrary judgement calls according to a sliding scale. Judgments on the aesthetic quality of contributions on a scholar's positionality are almost certain to be highly subjective. Editorial statements in the recent Comparative Politics Newsletter have convinced me that top journal appendices are probably the wrong forum for this important set of conversations. I think conferences and books are probably the right forum.
DATA SECURITY
A strong set of data security norms for scholars attempting to do the kind of work that Ahmad does is vital. In principle, these norms could come into conflict with a strict reading of the letter of the DA-RT regime. But so long as there is transparency about the method of collection, and agreement within the field that conducting interviews with jihadists and criminals is a valuable thing for people in our field to do, the spirit of the regime is very likely to prevail over a malicious reading of the letter. Editors at top journals clearly appreciate that protecting sources and keeping credible promises of confidentiality are important to a certain kind of work. The fact that Sarah Parkinson and Anastasia Shesterinina managed to get their articles published in the APSR is evidence that excellent qualitative work in the security subfield can find a home in the flagship journal of the discipline. I’m not sure what evidence exists that DA-RT is going to lead to apocalyptic “lost data” scenarios, though of course I understand (and empathize with) the source of the concern.
THE RHETORIC OF POLICY RELEVANCE
Some of us do work that involves cultivating strategies to have extended conversations with paramilitary militia members, criminals, or jihadists. Obviously these conversations are sensitive. Keeping names out of them can be a practical precondition of collecting the data in the first place. Ahmad argues that this kind of work is vital. The scholars who do it can advise governments and save lives. If this kind of work can save lives, and saving lives is ethically important, then it probably follows that cultivating a cadre of specialists like Ahmad (who actually know how to do this work safely and might be able to pass along best practices to others) ought to be valued. It is a potentially powerful argument. Still, I worry that “selling” the policy relevance of our work in this way — implying that policymakers need micro-data that only we are in a position to provide — is not a good rhetorical strategy.
One disadvantage to making this claim is that it probably isn’t true that national security policymakers read our top journals or consult with us to make policy, as Ahmad implies. Policy impact, when it happens (which is not often), tends to be extremely indirect. Most of us who conduct qualitative research in violent places do not save lives and should not pretend that we do in order to score debate points. A second disadvantage is that it probably isn’t true that qualitative political scientists have a comparative advantage over SOCOM in gathering microdata on the internal organization of rebel groups in war zones. We may have a comparative advantage in area “feel,” a cultivated sense of the area that includes a working model of how different proper nouns relate to one another, and this can be valuable to military audiences. But the claim that our secret data or our appreciation of local nuance ought to be given special accommodation by our top scientific field journals (which is why we are talking about this in the context of DA-RT) does not follow. A third potential disadvantage is that once the question is articulated in terms of guiding efficacious policy on the disruption of different network configurations of terrorist cells, we wade into a set of empirical debates where all of the relevant terms tend to be defined and judged by the causal inference crowd. These are, however, minor concerns.
The big concern, for me, is that using this kind of rhetoric to justify our contribution to science (especially in a public forum like this one) can contribute to the perception that academics working on topics relevant to security studies are, functionally, NATO spies conducting information operations. I did not take this problem very seriously when I was a graduate student. Now that I am on the other side of the desk, I find myself thinking about this problem a lot. Completely solving that problem is probably not possible. (To the extent that it is, it will eventually just lead to a reciprocal, much stickier, set of problems with American security services.) But if we treat qualitative work and ethnographic methods as a kind of arms race in which we imitate the practices of guerrilla journalists, it is going to raise flags. Proclaiming that we are policy-relevant because we assist policymakers strikes me as contributing directly to the problem, since it raises skepticism about our true motives and can erode the political neutrality that is supposed to set academics apart from state agents. Asking politically sensitive questions in post-conflict authoritarian states entails real risks. The risks are borne not only by us as researchers, but by our survey enumerators, translators, host families, research assistants, and their families. Alex Sodiqov is not the only one whose life was changed when the Tajik government decided that his qualitative research was indistinguishable from espionage. A lot of members of the Russian and Chinese state security services, and probably a lot of members of Al Shabaab, don’t see much of a distinction anymore between the American academy and the national security state.* In order to be as transparent as possible (and to avoid charges of Orientalism...), most of us have an instinct that it is appropriate to involve locals in our research. But in certain war zones or authoritarian regimes, the locals are functionally pre-positioned hostages. It might be more important, on balance, to take steps to shield locals from predatory prosecution and to allow them to maintain plausible deniability.
I am hopeful that all of this will gradually work itself out. As best practices evolve with the march of technology, I would be very surprised if we find ourselves in an equilibrium where none of our graduate students want to “go get the real story” or hang out on the front lines of revolution. War is intoxicating to young people. (It was to me.) Frankly, I worry more about limiting the costs associated with a generation of guerrilla security studies scholars trying to pick fights with the security infrastructure in faraway states.
ETHICS
I have published on research ethics elsewhere (http://www.amazon.com/Ethics-Experiments-Scientists-Professionals-Experimental/dp/1138909165), in the first person, and doing so has led me to wonder whether there could be unanticipated costs associated with a norm of appending an “ethics appendix” to every piece of social science research. What would these appendix essays look like in security studies? At first, probably, what we would see is stock paragraphs cut and pasted from IRB protocols. With time, if the norm were implemented, it would probably evolve into a platform for encoded kinds of positionality. It has been noticed that, in the discipline of political science, “when one invokes the language of ethics, one is engaging in interest group politics.” In order to imagine how a norm of appending ethics essays to every top journal submission would evolve over time, I decided to map divisions in the discipline spatially, into a 2x2 matrix.
On one dimension (the horizontal) is the perceived division between the “quantitative haves” and the “qualitative have-nots.” In caricature: the former see themselves in solidarity with a march of science led by economists and data scientists, while the latter often see themselves in solidarity with those who care deeply about the correct representation of powerless people and tend to cultivate relationships with anthropologists, area studies departments, and self-defined “post-positivists.” The other dimension (the vertical) is the relationship between the social scientist herself and political power, broadly defined. Again, at some risk of crude caricature: a social scientist can see herself in solidarity with the state, or with insurgents, but usually not with both at the same time. What is at stake, some would say, is the normative value of increasing the R^2 on building a more efficacious American war machine. This is analytically distinct from methodological disputes about the correct way to measure key concepts in our science; both left-wing and right-wing variants of the latter critique exist. Some reasonable people wonder whether it is ethically appropriate to assume a “right to treat” at all, and whether the wrong kinds of observations might commit representational harm against our human subjects.
I do not think it is a good idea to force scholars to commit, in print, to a position on either side of either one of these debates as a precondition of publishing in a top journal in our field.
CONCLUSION
There is little consensus in political science on what ought to constitute “good work.” As a result, publication at top journals is often perceived as a numbers game. One must be very good, but also somewhat lucky, drawing sympathetic reviewers through a process that sometimes feels arbitrary. The key to large numbers of top publications, therefore, is volume, volume, volume — which puts a big premium on certain types of research. This skews the discipline in unhealthy ways. High quality qualitative work, when it requires very large (and emotionally intense) sunk costs up-front, can be a relatively risky professional venture in this setting. I would be surprised if implementing DA-RT changes any of this.
Jesse Driscoll
Assistant Professor of Political Science
School of Global Policy and Strategy
jdriscoll@ucsd.edu
======= == ======= == ======= == ======= == ======= ==
* It is very difficult for qualitative researchers to credibly commit to collecting data for only one purpose. Later, after researchers are safely removed from their field sites, data collected for one set of questions can be re-purposed for topics relevant to national security in their home state. A working model of how different countries' practical politics work (how different proper nouns fit together) is a long-term national security asset that can be acquired through years of fieldwork. As I understand it, this is how “area studies” used to be justified when it was subsidized during the Cold War.
[Cosmetic Modifications To Post By Author 4/29/2016]
-
Tim Buthe
HfP/Technical Univ of Munich & Duke University - Posts: 32 - Joined: Fri Feb 26, 2016 11:39 pm
Re: Transparency in Security Studies
1) What do you consider to be the most important transparency issues for security studies research that require discussion during the 2nd stage of the QTD process?
2) Is security studies scholarship as such a good focus for a working group? Or are the transparency concerns of security studies scholars more sensibly addressed in broader, cross-cutting working groups? What should be the foci of the working groups that you would consider most important for security studies scholars?
Please feel free to weigh in here or by posting a reply on the general thread about how the working groups should be defined.