I.1. Ontological/Epistemological Priors
-
Catherine Boone
London School of Economics - Posts: 6
- Joined: Thu Apr 07, 2016 3:33 pm
Dishonesty in research raises concern
In April 2014, The Citizen, a newspaper in Dar es Salaam, ran an article under the headline "Dishonesty in research raises concern." The authors explained how heightened competition for jobs, publications, and research funding could compromise researchers' ethical standards and incentivize them to skew or deliberately slant research results. Unfortunately, US-based political science has learned the hard way that this is not a concern only for producers and consumers of research in Tanzania.
There is a problem, but some of the current DA-RT remedies are not only painfully misguided, but also obviously aimed at the wrong targets.
The current debate over whether field research notes should be submitted for review by journal editors is an absurd extreme in the quest for a silver bullet that will extinguish all possibility that a researcher will somehow intentionally corrupt his or her research results. For the advocates of this innovation, I would like to ask: "Who do you think produces these interview notes?" Do courts of law accept as definitive evidence the accused's own record of his or her activities on the night of the crime? Shall we ask field researchers to wear body cameras as they do their interviews, so that these recordings can later be submitted for review by suspicious journal editors? The focus on "turning over" field research notes for inspection is either a disingenuous attack on those who generate and use field interviews as a research method, or a clear indication of a profound lack of understanding of this way of gathering and using information from the field.
It seems quaint to repeat Steve Van Evera's (1997) appeal to reason, but unfortunately, this really needs to be the starting point of our DA-RT deliberations: “Infusing social science professionals with high standards of honesty is the best solution.” (Van Evera 1997:48).
Catherine Boone, Professor, LSE
-
Ingo Rohlfing
Cologne Center for Comparative Politics, Universität zu Köln - Posts: 20
- Joined: Tue May 24, 2016 5:45 am
Re: Dishonesty in research raises concern
On that note, I find it curious that, in some cases, people argue against DA-RT and at the same time praise the paper by Lieshout, Segers, and van der Vleuten (Lieshout, Robert H., Mathieu L. L. Segers and Anna M. van der Vleuten (2004): De Gaulle, Moravcsik, and the Choice for Europe: Soft Sources, Weak Evidence. Journal of Cold War Studies 6 (4): 89-139). In this paper, they take a very close look at the sources that Moravcsik used in his seminal book and find that his key argument does not hold (at least not for the episode they are looking at). Now, this can be turned into a point against DA-RT because they managed to scrutinize Moravcsik's use of sources without DA-RT standards. However, I rather wonder how much easier it would have been with easier access to sources (no charge against Moravcsik here, because he mostly seems to have referenced sources properly).
Of course, source transparency can be challenging and sometimes too costly to achieve, but it is worth pursuing to the extent it is possible. Admittedly, nobody can tell at the moment what is "possible", but I do not believe that current practice is the maximum of transparency that can be achieved.
-
Marcus Kreuzer
Villanova University - Posts: 26
- Joined: Sat Apr 16, 2016 9:48 am
Re: Dishonesty in research raises concern
Having said this, I do not think that replacing DA-RT with a self-enforcing appeal for "higher standards of honesty" is the solution either. As the Dar es Salaam news story illustrates, such standards are ineffective in the face of the competitive pressures scholars face for grants, tenure, and employment. We might be a little bit "saintlier" than investment bankers because our dishonesty is not rewarded with huge bonuses. But we are still sufficiently mortal that more effective transparency guidelines (defined far more broadly than the current ones) would benefit us. Ingo's reply provides one of many possible illustrations of such potential benefits.
cboone wrote: In April 2014, The Citizen, a newspaper in Dar es Salaam, ran an article under the headline "Dishonesty in research raises concern." The authors explained how heightened competition for jobs, publications, and research funding could compromise researchers' ethical standards and incentivize them to skew or deliberately slant research results. Unfortunately, US-based political science has learned the hard way that this is not a concern only for producers and consumers of research in Tanzania.
There is a problem, but some of the current DA-RT remedies are not only painfully misguided, but also obviously aimed at the wrong targets.
[....]
It seems quaint to repeat Steve Van Evera's (1997) appeal to reason, but unfortunately, this really needs to be the starting point of our DA-RT deliberations: “Infusing social science professionals with high standards of honesty is the best solution.” (Van Evera 1997:48).
Catherine Boone, Professor, LSE
-
Jane Mansbridge
Harvard Kennedy School - Posts: 8
- Joined: Sat Apr 23, 2016 4:53 pm
Re: Dishonesty in research raises concern
If there is a problem in qualitative research, it would be useful to focus on cases in which there has been dishonesty in that field. I do not happen to know of any, but am happy to be educated. If there are very few, it might be worth considering the costs and benefits of any form of proposed policing. As in current claims of "voter fraud" in the US, one could possibly do more harm by creating problematic restraints on the overwhelming majority in order to catch the few or none actually engaging in fraud.
-
Nancy Hirschmann
The University of Pennsylvania - Posts: 6
- Joined: Wed Apr 06, 2016 1:59 pm
Re: Dishonesty in research raises concern
-
Catherine Boone
London School of Economics - Posts: 6
- Joined: Thu Apr 07, 2016 3:33 pm
Re: Dishonesty in research raises concern
-
Deborah Avant
University of Denver - Posts: 1
- Joined: Wed May 04, 2016 8:12 am
Re: Dishonesty in research raises concern
Dishonesty is a problem in many places but it is very difficult to devise procedures that cannot be gamed by those intent on dishonesty. Rather than focusing only on the procedures, and making them more and more complex, we could also aim to better teach what is honest and ethical research. I think more discussion of honesty and a strong commitment to good citations is about the best we can do.
-
Guest
Re: Dishonesty in research raises concern
-
Jesse Driscoll
UCSD - Posts: 5
- Joined: Tue Apr 26, 2016 4:19 pm
Re: Dishonesty in research raises concern
DeborahAvant wrote: Dishonesty is a problem in many places but it is very difficult to devise procedures that cannot be gamed by those intent on dishonesty. Rather than focusing only on the procedures, and making them more and more complex, we could also aim to better teach what is honest and ethical research. I think more discussion of honesty and a strong commitment to good citations is about the best we can do.
Hear hear.
I also want to echo the point articulated by Jane Mansbridge above: I am not convinced, at all, that fabrication scandals in qualitative research "are even a thing," as my undergrads would say. The DA-RT debates have often been framed in a way that makes it seem as if there's a huge "data validity" crisis in the discipline. I think it's appropriate for qualitative researchers to say "we reject the premise that there's a crisis. And to the extent there is one, it's not coming out of our shop. The incentives of our part of the guild did not produce LaCour. Look inward, leave me out of it, and notice that an implied claim of fabrication is one of the most insulting things you can accuse a scholar of."
I don't believe that the goal of DA-RT is to dis-incentivize qualitative work, and (perhaps unlike some here) I would be surprised if these norms are implemented in a way that causes a decline in the publication of high-quality qualitative work in top journals. It would be a disappointing surprise if I were wrong, of course. What worries me are *other trends* that are sometimes thrown on the campfire, when we have circled the campfire in the first place to talk about DA-RT. I think we all worry, reasonably, that we are being slowly replaced by technologists and that we are fighting an uphill battle to get a wedge in with limited space at top journals. Three things are frustrating to me. First, there is clearly no shared standard for what ought to constitute “good work” in the discipline. This makes getting an article into print at a top journal feel very arbitrary, dependent on whether you are matched with a sympathetic reviewer. Second, very small deviations from an ever-evolving frontier of empirical best practices are now often used as evidence of bad design or sloppy thinking, and in this framework (in caricature, perhaps, but it has certainly been my experience) practically anything can become a reason to reject at top journals. Third, the optimal response to these paired constraints is to do the kind of work that *always* allows you to have 1-3 papers under review, all the time -- you can't really control demand, only supply. The answer, like internet dating, becomes volume, volume, volume, volume. This skews the entire discipline in favor of a certain kind of work, and away from the kinds of very costly up-front sacrifices inherent in fieldwork-heavy qualitative work. And none of this, incidentally, has very much to do with the DA-RT regime.
What I think we are all grappling with is that there are a lot of people who are powerful in the discipline who have no desire to understand, or really (feel that they ought to have to pretend to) value, what it is that people like me and Anastasia Shesterinina and Dipali Mukhopadhyay and Will Reno and others do, how we have evaluated best practices in the past (why some books/papers rise and others are ignored), etc. It is not clear that we have a better option than to try to write eloquently and hope audiences will find the smell of truth in our work. It is the labor we have chosen to engage in, and we can be proud of it. I am.
I wrote a bunch of other stuff but upon further reflection decided to remove it; I don't think it is relevant to this conversation. But I would still really like to know: Is there evidence (that, like Jane, I have also missed) for the qualitative fabrication meme?
Jesse Driscoll
Assistant Professor UCSD
-
Tom Pepinsky
Cornell University - Posts: 1
- Joined: Thu Apr 07, 2016 1:37 pm
Re: Dishonesty in research raises concern
I'd like to weigh in specifically on the premise of dishonesty as a motivation for transparency, which I see as the basis for this particular discussion thread. I can understand why this has animated much of this discussion, but I would like to propose that we separate the discussion of transparency from the discussion of dishonesty. I think that conceptualizing "transparency-enhancing" practices as a procedure to "guard against dishonesty" is a mistake. It allows a careless consumer of this discussion to infer---incorrectly---that the only reason one could object to a transparency initiative is that one supports research fraud. If there is value in taking further steps in favor of transparency in qualitative work, it does not come from policing potentially dishonest practices or research fraud, but rather from shedding more light on how researchers construct their arguments.
A more constructive engagement would start from the premise that as political scientists, we are involved in a communal exercise of creating knowledge. That communal exercise requires us to communicate with one another. I recommend that we jettison any justification for or objection to greater transparency that rests on a logic of policing some kind of misbehavior. If one is to build an argument that current best practices---like citations and bibliographies, which seem quite functional to me---do not suffice, then the case ought to be made in terms of improving communication. Objections ought to be made along similar lines, without conceding that the purpose of transparency is to reduce fraud.
-
Sandra Resodihardjo
Radboud University - Posts: 3
- Joined: Thu Apr 28, 2016 8:42 am
Re: Dishonesty in research raises concern
DeborahAvant wrote: Dishonesty is a problem in many places but it is very difficult to devise procedures that cannot be gamed by those intent on dishonesty. Rather than focusing only on the procedures, and making them more and more complex, we could also aim to better teach what is honest and ethical research. I think more discussion of honesty and a strong commitment to good citations is about the best we can do.
I concur with Deborah's assessment. Based on many years of experience on the Board of Exams at two different universities, I've noticed that those who are really bent on being dishonest are quite creative in finding ways to beat the system (e.g., plagiarizing books instead of articles because the plagiarism software does not have access to books). Of course, we still catch them, as they do not seem to realize that we notice how well they suddenly write.
The discussion about research data management and which data needs to be made public is a good discussion, but we need to be careful not to go overboard with policies that hamper our research (e.g. I am against making interview transcripts public for numerous reasons). We need to take a step back and ask ourselves 'What do we want to achieve here?' Once we have clarified that, we need to consider whether the proposed policies actually help to achieve these aims.
Deborah's right that one way to achieve these aims is to teach (and discuss amongst ourselves) what honest and ethical research is and to maintain a strong commitment to good citations. I still come across articles where citations are sorely lacking. In those instances, we are expected simply to trust the researchers that what they wrote is correct. Instead of having to trust the researcher blindly, we need these researchers to do their job properly. That does not mean uploading every source used for the article. It does mean proper citations so readers can go to the original source if they want to.
-
Mark Beissinger
Princeton University - Posts: 1
- Joined: Fri Nov 18, 2016 4:31 pm
Re: Dishonesty in research raises concern
If we are talking about real transparency in research, large-n researchers would need to provide extensive documentation on every single coding in their datasets. This is simply not being asked because it is not practicable--even though the real instances of fraud that we are aware of have come from falsified codings in large-n data sets. For qualitative researchers, however, some proponents of DA-RT are asking a great deal more--to post all fieldnotes on which any "coding" is made. We should not be holding qualitative scholars to a different standard. This will simply make the publication of qualitative research all the more difficult. Yes, people should document their material to a reasonable degree; this is simply common sense, and journals have generally been holding qualitative researchers to this standard. But requirements to publish research notes place a burden on qualitative researchers that is way beyond what anyone else in the profession is being asked to fulfill.
-
Mneesha Gellman
Emerson College - Posts: 11
- Joined: Thu Apr 07, 2016 8:20 pm
Re: Dishonesty in research raises concern
I hope that as the dust settles in these debates we can begin to have conversations about revaluing ethics at every level of our work, to address concerns about dishonesty through norms of honesty, rather than through a forced protocol to disclose information (such as interview transcripts, etc.) that for scholars like me would actually entail a violation of the ethics that guided fieldwork in the first place.
jdriscoll wrote: DeborahAvant wrote: Dishonesty is a problem in many places but it is very difficult to devise procedures that cannot be gamed by those intent on dishonesty. [....]
Hear hear.
[....]
Jesse Driscoll
Assistant Professor UCSD
-
Guest
Re: Dishonesty in research raises concern
The push toward stricter transparency standards seems to be an attempt to further solidify the preferential position of quantitative work in the profession. This discussion board reveals that the proposed DA-RT “reforms” were not designed with close attention to the implications for qualitative scholars. Quantitative research is increasingly portrayed as the ideal type of political science scholarship. Designing a series of transparency requirements better suited to quantitative research, but applied to all forms of research, is one way to depict quantitative research as inherently “more transparent” (along with the implicit assertions that it is also “more scientific” or “more rigorous”).
This cannot be separated from the higher demand on graduate students to publish rapidly and to have statistical skills. Publication can go faster if you can access data from anywhere (no field work), design and re-design models, or use the same data set to ask a series of questions (for multiple papers). Furthermore, this type of work is explicitly encouraged by faculty. On a separate but related note, graduate students who can teach statistics are somewhat better assured of continued funding (in my experience). At the university level, faculty lines are easier to get for professors who are more technical or policy-oriented (incidentally, mostly quantitative scholars).
To be clear, I don’t think this is some kind of conspiracy to further solidify quantitative research as the “one true way” to conduct political science scholarship, but I do see the potential for these guidelines to limit the possibilities for qualitative scholarship moving forward. However, I see this as part of a larger trend toward privileging quantitative research in the discipline, which is in itself part of a larger trend toward the scientization of the social sciences in general.
With regard to transparency in qualitative research: scholarly debate and engagement with other area experts is a good way to encourage transparency in contexts where it may not be feasible or desirable to share all components of the project (the idea of wearing a body camera during interviews illustrates the limits of transparency in these cases). I was under the impression that in the social sciences, which despite efforts to the contrary remain highly subjective, the community of scholars is responsible for reading and engaging critically with knowledge produced in the field, with the expectation that no single interpretation of an event or causal process is completely accurate or comprehensive.
-
Timothy Pachirat
UMass Amherst Political Science - Posts: 2
- Joined: Thu Dec 01, 2016 3:10 pm
Re: Dishonesty in research raises concern
Please excuse my lateness to the party. (This digital carriage turns into a pumpkin at midnight tonight, I believe.)
Excuse, also, the length of this post. Rather than disassemble my thinking and ship it off to different conversation topics, I’ve kept it all intact in one place. But much of what I post here is also germane to the “Ethnography” and “Fieldnotes and Data Repository” sections of this site.
In this post, I would like to recycle portions of my essay, "The Tyranny of Light," in order to connect the purported title of this section--"Ontological/Epistemological Priors"--with the curious fact that the only thread under this section with a pulse is titled "Dishonesty in Research Raises Concern."
But first, I want to make the simple yet increasingly overlooked point that DA-RT does not equal transparency and transparency does not equal DA-RT. (I honestly feel like the preceding sentence should be read out loud in a super-fast, breathless voice like the disclaimers at the end of car dealership ads on the radio before the further reading of any post on this QTD site.) Instead, what DA-RT has done (and continues to do) is to catalyze very specific and very particular kinds of discussions about transparency (and ethics). There is nothing wrong with these specific and particular discussions in and of themselves, but they become unfortunate to the extent that they exert a kind of discursive alchemy under whose spell we soon forget that we are playacting on a stage not of our own making, and that the props provided are just that: provided props. DA-RT becomes unfortunate to the extent that people misrecognize it for transparency (and ethics) itself, rather than naming it for what it is: a specific, and frankly partisan, enactment of transparency (and ethics). This is why, in this post, I will try to use the clunky phrase “DA-RT catalyzed discussions of transparency” rather than just “transparency.” (End of super-fast, breathless disclaimer.)
So then: why is it that DA-RT catalyzed discussions of transparency have moved us so quickly from “ontological & epistemological priors” to “dishonesty?”
As Tom Pepinsky rightly points out in this thread, it's not immediately obvious why DA-RT catalyzed discussions about transparency lead us to arguments about “dishonesty” instead of any number of other generative conversations we might be having.
And yet, I think a closer look at the history of DA-RT catalyzed discussions of transparency helps to illuminate:
1) why we find ourselves discussing dishonesty (a dead-end, I think: more on this later!) instead of any number of other things, and
2) how DA-RT sets the agenda (second face of power, folks!) for discussions--no matter how inclusive--that implicitly privilege some social research ontologies over others, invocations of neutrality notwithstanding.
So then, why the transposition of “ontological & epistemological priors” into debates about “dishonesty?”
It is crucial to understand that, on its proponents’ own account, the original motivation for both DA-RT and for the APSA Ethics Guidelines Revisions that authorized the DA-RT committee to do its work derives directly from concerns about replicability in empirical research conducted within positivist logics of inquiry (for a granular account of DA-RT’s origins, see Dvora Yanow and Peregrine Schwartz-Shea, link at end of this essay). Specifically, “APSA’s governing council, under the leadership of president Henry E. Brady, began an examination of research transparency. Its initial concerns were focused on the growing concern that scholars could not replicate a significant number of empirical claims that were being made in the discipline’s leading journals.” As the dominant DA-RT narrative has it, this emerging crisis of replicability in positivist political science was soon found to also exist, in different registers, for a range of scholars “from different methodological and substantive subfields.” Thus, while the DA-RT narrative acknowledges its specific and particular origins in concerns over replication of empirical studies conducted within positivist logics of inquiry, it moves quickly from there to claiming a widespread (discipline-wide?) set of shared concerns about similar problems across other methodological and substantive subfields (citations for the quotations in the original essay accessible via the link provided at the end of this post).
It’s not surprising, to me, that when concerns about replicability travel from their original ontological homes to forms of social research premised on ontologies that don’t privilege replicability as a “gold standard,” we begin to detect a shift in language from “replicability” to “dishonesty.” This is what some people might refer to as a problem with specifying DA-RT’s “scope conditions” (fancy jargon for, “Yo, your money don’t work around these parts.”) Because DA-RT is unwilling or unable to recognize and acknowledge the particular ontology that gave it birth--indeed, quite the opposite, it explicitly purports to be ontologically motherless--all kinds of weird things start happening when it is forced to pack a suitcase and board a flight to ontologically distant territories. Weird things, for example, like an unhealthy and unhelpful obsession with honesty and dishonesty.
On this thread, Jane Mansbridge, Jesse Driscoll, Nancy Hirschmann, and others have asked for specific examples of qualitative work charged with dishonesty. I can’t think of an especially great example in political science proper, but if we look just beyond the disciplinary castle walls, a lightning-rod, shrapnel-flinging exemplar of a case comes readily to mind: sociologist Alice Goffman’s 2014 University of Chicago Press book, On the Run.
I’m currently writing a seven-act play that includes a mock trial of this book, so let me just give you a taste of some of the charges that have been publicly leveled against On the Run by academics and the reading public:
At the heart of this controversy [over Alice Goffman's book] are the fundamental limitations of ethnography as a mode of inquiry. Ethnography can look like an uncomfortable hybrid of impressionistic data gathering, soft-focus journalism, and even a dash of creative writing.
-- Leon Neyfakh, Slate
That those flaws [in Alice Goffman’s book] managed to go unnoticed for so long reflects a troubling race-related blind spot among academic and media elites. The failure of On the Run is not only the failure of an individual book and an author, but of the system that produced them.
-- Paul Campos, Chronicle of Higher Education
And, perhaps most tellingly:
Qualitative “research” is useless because there is no way to tell if what is claimed is a reflection of reality or simply the “researchers” gullibility and biases, or even if it’s all a fabrication…. At least [quantitative research] can be put to the test in replication studies, as is increasingly done in social science. To use a book like Alice’s as a guide to understanding social problems is to put enormous trust in her judgment and honesty—even when she openly admits to being a politically motivated advocate. There’s no way to verify many of her claims.
-- Anonymous comment, Marginal Revolution.com
In list form, these charges look something like this:
- inventing events
- embellishing facts
- changing her versions when challenged
- making egregious factual and methodological mistakes
- refusing to provide the means to verify any empirical claims, including failing to file her dissertation, keeping it under wraps years later, destroying her data, etc.
- misrepresenting her site
- misrepresenting her relationship to informants
- participating in a conspiracy to commit murder
Now, an actual discussion of the merits of On the Run is beyond the scope of this post (for that, please read my play!), but suffice to say two things here: none of the current proposals put forward by DA-RT would do anything to address any of these concerns. As I wrote in “Tyranny of Light,” and as Catherine Boone, Mneesha Gellman, and others have pointed out in this thread, the idea of a foolproof “verification” device for ethnography is patently absurd. It is the performative equivalent--as Mark Beissinger superbly points out on this thread--of making numerical datasets openly available without disclosing the micro-decisions that went into the alchemy of coding that turned the messiness of the world into those “transparent” numbers. As Beissinger puts it: “If we are talking about real transparency in research, large-n researchers would need to provide extensive documentation on every single coding in their datasets. This is simply not being asked because it is not practicable--even though the real instances of fraud that we are aware of have come from falsified codings in large-n data sets.”
So then, would the depositing of Alice Goffman’s fieldnotes for On the Run in a publicly available database placate the critics of her book? Unlikely.
But, really, why stop with requiring ethnographers to post their fieldnotes, diaries, and personal records? Why not also require the ethnographer to wear 24 hour, 360 degree, Visual and Audio Recording Technology (VA-RT) that will be digitally livestreamed to an online data repository and time-stamped against all fieldwork references in the finished ethnography? Would the time-stamped, 24 hour, 360 degree VA-RT then constitute the raw “data” that transparently verifies both the “data” and the ethnographer’s interpretation and analysis of those data? VA-RT for DA-RT!
VA-RT dramatizes a mistaken view that the ethnographer’s fieldnotes, diaries, and personal records constitute a form of raw “data” that can then be checked against any “analysis” in the finished ethnography. The fallacy underlying the mistaken proposal that ethnographic fieldnotes, diaries, and other personal records should be posted to an online repository derives from at least three places.
The first is an extractive ontology inherent in a view of the research world as a source of informational raw material rather than as a specifically relational and deeply intersubjective enterprise. Fieldnotes, and even VA-RT, will always already contain within them the intersubjective relations and the implicit and explicit interpretations that shape both the substance and the form of the finished ethnographic work. Quite simply, there is no prior non-relational, non-interpretive moment of raw information or data to reference back to. What this means is not only that there is no prior raw “data” to reference back to, but that any attempt to de-personalize and remove identifying information from fieldnotes in order to comply with confidentiality and human subjects concerns will render the fieldnotes themselves unintelligible, something akin to a declassified document in which only prepositions and conjunctions are not blacked out.
Second, fieldnotes, far from being foundational truth-objects upon which the “research product” rests, are themselves texts in need of interpretation. Making them “transparent” in an online repository in no way resolves or obviates the very questions of meaning and interpretation that interpretive scholars strive to address.
And third, neither fieldnotes nor VA-RT offer a safeguard “verification” device regarding the basic veracity of a researcher’s claims. The researcher produces both, in the end, and both, in the end, are dependent on the researcher’s trustworthiness. For it would not be impossible for a researcher to fabricate fieldnotes, nor to stage performances or otherwise alter a VA-RT recording.
The notion of a “data repository,” either for ethnographic fieldnotes or for VA-RT, is dangerous both because it elides the interpretive moments that undergird every research interaction with the research world in favor of a non-relational and anonymized conception of “information” and “data,” and because it creates the illusion of a fail-proof safeguard against researcher fabrication where in fact there is none other than the basic trustworthiness of the researcher and her ability to communicate that trustworthiness persuasively to her readers through the scaffolding and specificity of her finished work.
(An aside that could really be a whole different post: it’s fascinating to compare Goffman’s On the Run with another highly celebrated recent ethnography, Matthew Desmond’s Evicted. Responding in part to the concerns about the “verifiability” of Goffman’s book, Desmond had his own book extensively “fact-checked.” And yet, in his quest for facticity, Desmond writes himself completely out of his own ethnography, so much so that when he appears in the book as an actor, he refers to himself in the anonymized third-person, as “a friend” who helped one of his informants move. We are left to ask: which account is ultimately more trustworthy and persuasive? Desmond’s exhaustively “fact-checked” book in which the reader has no sense at all for how his own presence and positionality shaped the research world he interacted with to get his “facts,” or Goffman’s non fact-checked book in which she offers a careful and detailed (transparent?!) account of her own involvement with her research world and “subjects?” A useful thought exercise indeed.)
“Ok, ok,” you say. “We get it. Conversations about dishonesty are a dead-end. What instead?”
Indeed, what instead? If we were to push back against the agenda-setting second face of DA-RT’s power, if we were to push back against the weirdness that comes with DA-RT’s cross-border ontological travels, what are some things that a non-DA-RT catalyzed conversation about transparency and ethics might lead us to talk about?
In this thread, Tom Pepinsky suggests that instead of talking about dishonesty we should be talking about “improving communication.” To that unobjectionable goal, I would add that we might also be talking about:
1) our relationships with and obligations to the individuals, social worlds, and ecosystems we study and interact with, whether they live in the middle of civil wars or rural Wisconsin (Cramer; Parkinson and Wood; Pachirat)
2) “the systems of patron-client ties, nepotism, and old boy’s networks that keep our institutions [read, “Political Science”] stuck in time, resistant to change from within, and impervious to social problems from without” (Fujii);
3) “asymmetric conditions of knowledge production in the field” (Htun); and
4) a “broader, long-standing, and ultimately unresolvable struggle over foundational questions that social scientists cannot ‘sidestep’” (Sil, Guzman, and Calasanti) and that cannot be managed by any procedural antipolitics machine, no matter how self-consciously inclusive the people in charge of that machine might be.
This alternative conversation agenda is just a start. It is an agenda for conversation, of course, that has been central to the work many of us have been doing long before DA-RT arrived on the scene. It is an agenda for conversation that our work--with its attention to reflexivity, positionality, and power--makes it impossible for us not to talk about. We can only hope, as Jesse Driscoll puts it so eloquently on this thread, that others might also be persuaded and compelled by “the smell of truth” in such an alternative agenda.
Timothy Pachirat
Link to my Tyranny of Light essay, along with highly recommended essays by: Kathy Cramer; Sarah Parkinson & Elisabeth Wood:
http://scholar.harvard.edu/files/dtingl ... 2015-1.pdf
Link to other referenced essays by Lee Ann Fujii; Mala Htun; and Rudra Sil, Guzman Castro, and Anna Calasanti:
https://dialogueondartdotorg.files.word ... ng2016.pdf
Link to essay by Dvora Yanow and Peregrine Schwartz-Shea:
https://www.cambridge.org/core/journals ... 27FF2B7287
-
Rudra Sil
University of Pennsylvania - Posts: 6
- Joined: Thu Apr 07, 2016 4:50 pm
Re: Dishonesty in research raises concern
Scholars more invested in qualitative research have been approaching the DA-RT debate from a defensive standpoint, trying to argue against being forced to comply with expensive, time-consuming procedures in order to publish research in the flagship journals of the discipline. The real issue is the locus of "dishonesty" or, in some cases, of "laziness" or unwarranted "short-cuts." In quantitative work, uploading data-sets and the relevant code into repositories is not terribly problematic, but neither does it get at where the locus of transparency ought to be: THE CODING DECISIONS. This is not so much a crucial issue for pre-formed quantitative data supplied by third-party organizations (such as World Bank data on GDP per capita or percentage of workforce in agriculture). We can certainly critique the origins of that data as well, but it is all open-source material that can be easily checked and deciphered.
Much more problematic is the decision on how to code contested historical or interpretive information into numbers that allow for the application of one model or another. The assumptions that go into each and every coding decision, along with countless decisions on which sources to rely on in the process, are not going to be subject to careful analysis and potential replication simply by following DA-RT procedures. And, to the extent one wishes to take on this challenge, this can be done WITHOUT standardized submission procedures or publication norms that may have unanticipated consequences and create unnecessary anxieties and divisions (as demonstrated in Marcus Kreuzer's effort in his 2010 APSR article to systematically and carefully evaluate coding choices in the work of Cusack, Iversen, and Soskice by using a broader sample of historical sources).
-
William J. Kelleher, Ph.D.
Independent Scholar - Posts: 19
- Joined: Thu Apr 07, 2016 4:38 pm
Re: Dishonesty in research raises concern
cboone wrote: Do courts of law accept as definitive evidence the accused's own record of his or her activities on the night of the crime? Shall we ask field researchers to wear body cameras as they do their interviews, so that these recordings can later be submitted for review by suspicious journal editors? The focus on "turning over" field research notes for inspection is either a disingenuous attack on those who generate and use field interviews as a research method, or a clear indication of a profound lack of understanding of this way of gathering and using information from the field.
Catherine Boone, Professor, LSE
In a post entitled “Methods War: How Ideas Matter within Political Science,” at
http://www.e-ir.info/2016/11/24/methods ... l-science/
Nov 24, 2016, Patricia Woods gives an excellent statement in defense of qualitative/interpretive methods. But in the comments an important question is raised by "Shannon" which Patricia has not answered. So here is my suggested solution:
First, I heartily agree with everything Patricia says in defense of qualitative research, including the too obvious power play of the Physics Envy bunch to dominate the political science profession. But Shannon raises an important issue.
The problem here is, when research notes must be kept private to protect the persons involved, how can another scholar scrutinize the claims made based on the proprietary data?
In other words, how can the political science profession guard against fraudulent claims being accredited as valid?
In law, sometimes evidence is scrutinized “in camera.” That is, the lawyers present their sensitive evidence to the judge in chambers. The same rules of evidence are applied in this proceeding as would be applied in a public trial. In this way, bad or fraudulent evidence can be detected and exposed. Claims based on bad evidence can be dismissed, and the ruling can be made public without revealing the nature of the bad evidence.
Before an in camera proceeding is conducted, the lawyers have to make their case as to why it is necessary. The political science profession could devise a set of rules for ordering in camera proceedings where a sound case has been made for its necessity. Experienced retired judges can be hired to conduct the proceedings, and apply the well-established rules of research, such as for data collection, and the common law rules of evidence.
Of course, variations on this model are possible. But the point is, the profession can guard against fraudulent research claims made in its name without prejudicing qualitative research in total.
William J. Kelleher, Ph.D.
-
Guest
Re: Dishonesty in research raises concern
"Please answer two questions.
1. Can you provide a link to a political science journal that requires qualitative scholars to attach field notes as a condition for submitting an article for review?
2. Can you provide a link to a political science journal that requires qualitative scholars to attach field notes as a condition for publication?
I am not looking for articles by others that repeat the claim that journals do these things, I am looking for an actual valid link to an actual political science journal that has either of these requirements.
Thank you in advance for responding."
Two days later, Roger posts:
"The silence here is deafening. Professor Woods, Professor Kelleher, or others who have claimed that political science journals require submission of field notes, please provide an active link to a single journal that does this."
Note that these posts predate Bill Kelleher's comments here.
So to all readers of this board, I pose Roger's question to you -- and extend it to writings by DA-RT advocates as well. Please offer evidence that DA-RT or any JETS journal has this policy.
FWIW, my reading of the evidence over the last two years is that the only people who have *ever* made this claim are people arguing against DA-RT. In other words, it's a ghost story.
[quote="Bill Kelleher"][quote="cboone"] Do courts of law accept as definitive evidence the accused's own record of his or her activities on the night of the crime? Shall we ask field researchers wear body-cameras as they do their interviews, so that these recordings can later be to submitted for review by suspicious journal editors? The focus on "turning over" field research notes for inspection is either a disingenuous attack on those who generate and use field interviews as a research method, or a clear indication of a profound lack of understanding of this way of gathering and using information from the field.
Catherine Boone, Professor, LSE[/quote]
In a post entitled “Methods War: How Ideas Matter within Political Science,” at
http://www.e-ir.info/2016/11/24/methods ... l-science/
Nov 24, 2016, Patricia Woods gives an excellent statement in defense of qualitative/interpretive methods. But in the comments an important question is raised by "Shannon" which Patricia has not answered. So here is my suggested solution:
First, I hardily agree with everything Patricia says in defense of qualitative research, including the too obvious power play of the Physics Envy bunch to dominate the political science profession. But Shannon raises an important issue.
The problem here is, when research notes must be kept private to protect the persons involved, how can another scholar scrutinize the claims made based on the proprietary data?
In other words, how can the political science profession guard against fraudulent claims being accredited as valid?
In law, sometimes evidence is scrutinized “in camera.” That is, the lawyers present their sensitive evidence to the judge in chambers. The same rules of evidence are applied in this proceeding as would be applied in a public trial. In this way, bad or fraudulent evidence can be detected and exposed. Claims based on bad evidence can be dismissed, and the ruling can be made public without revealing the nature of the bad evidence.
Before an in camera proceeding is conducted, the lawyers have to make their case as to why it is necessary. The political science profession could devise a set of rules for ordering in camera proceedings where a sound case has been made for its necessity. Experienced retired judges can be hired to conduct the proceedings, and apply the well-established rules of research, such as for data collection, and the common law rules of evidence.
Of course, variations on this model are possible. But the point is, the profession can guard against fraudulent research claims made in its name without prejudicing qualitative research in total.
William J. Kelleher, Ph.D.[/quote]
-
Guest
Re: Dishonesty in research raises concern
The DA-RT Ad Hoc Committee proposed precisely that fieldnotes, diaries, and other personal records be made publicly available because they are "qualitative source materials." The specific wording reads: “The document’s contents apply to all qualitative analytic techniques employed to support evidence-based claims, as well as all qualitative source materials [including data from interviews, focus groups, or oral histories; fieldnotes (for instance from participant observation or ethnography); diaries and other personal records.…]”
See DA-RT Ad Hoc Committee. 2014. “Guidelines for Data Access and Research Transparency for Qualitative Research in Political Science, Draft August 7, 2013.” PS: Political Science and Politics 47 (1): 25–37.
Your assertion of dishonesty and your two questions conflate what the DA-RT Ad Hoc Committee proposed with what journals are or are not currently implementing. The entire point of DA-RT is to get journals to sign on to a set of common criteria (see the JETS statement).
The threat of a requirement that qualitative researchers post their fieldnotes, diaries, and other personal records is not dishonest; it is real: the original DA-RT proposal, referenced above, advocates for requiring “all qualitative source materials [including data from interviews, focus groups, or oral histories; fieldnotes (for instance from participant observation or ethnography); diaries and other personal records]” to be made public. Whether or not individual journals will sign on to that and other DA-RT policies is still being negotiated, journal by journal, and advising them to think twice before doing so is exactly one of the reasons for this QTD forum.
[quote="Guest"]I would like to discuss an instance of dishonesty on this message board. Many people have now repeated the claim that DA-RT requires submission of field notes as a condition for review or publication at journals. I put forward a proposition: the claim is not true. Moreover, in the article that Bill Kelleher references, and on which he has commented, a person named Roger asks the following question.
[....]
William J. Kelleher, Ph.D.[/quote][/quote]
-
Guest
Re: Dishonesty in research raises concern
Mr. Kelleher and Guest 2,
I want to make certain that I understand your claims.
First, neither of you was able to cite a single actual journal that requires submission of field notes as a condition of submission or publication.
Second, "Guest 2" conveniently left the following information out of the cherry-picked quote from the cited article...
"While the new standards encourage as much data sharing as possible, they should not be viewed in all-or-nothing terms: These activities often face friction, for example in the form of human subjects or copyright concerns. Sharing some data and being as transparent as possible, within those or other limits, will generally be better than nothing at all."
"A shared commitment to openness, however, does not oblige all research traditions to adopt the same approach. Rather, transparency should be pursued in ways and for reasons that are consistent with the epistemology of the social inquiry being carried out. There are several reasons why qualitative scholars should not (and sometimes simply could not) adopt the transparency practices employed by quantitative political scientists, but must instead develop and follow their own."
"Sometimes data are collected in circumstances that require discretion to protect the rights and welfare of subjects. This will, quite properly, limit transparency... As noted below, scholars should only make qualitative data (and information about the decisions and processes that produced them) available in ways which conform to these social and legal imperatives."
Then there is the entire section on "What limitations might there be on making qualitative data available?"
Before making claims about what journals are doing, it would be helpful to provide even a modicum of evidence that they are doing so.
Before generalizing about what "DA-RT" is about, it would be helpful to read more than one cherry-picked paragraph or sentence from documents that clearly provide for protection of materials such as field notes. By proceeding in this manner we can have more honesty and fewer ghost stories.
-
Guest
Re: Dishonesty in research raises concern
Your reasoning evidences a failure to understand the relationship between norms and policies. You asked, in your original post, for a single instance in which a DA-RT proponent advocated for the sharing of fieldnotes and other records produced by qualitative scholars. You further stated:
"FWIW, my reading of the evidence over the last two years is that the only people who have *ever* made this claim are people arguing against DA-RT. In other words, its a ghost story."
When another Guest (Guest 2) provided evidence of precisely this norm in the form of a citation linked to an article outlining official DA-RT recommendations as authored by an ad hoc DA-RT committee and published in a Political Science journal, you then accused her/him of "cherry picking." As I read her/him, Guest 2 never said that the DA-RT proponents didn't allow room for exceptions. The point that Guest 2 was making, as I take it, is that it is the introduction of the norm itself that is potentially dangerous. Saying that Guest 2 is cherry picking in this instance is a bit like saying that when a state imposes a new norm that all driver licenses include a default agreement by the applicant to be an organ donor (with possible exceptions for those with religious beliefs) and someone cites this new default without citing the exception for those with religious beliefs, that person is "cherry picking."
Your citing of all the (not so reassuring) language governing the exception for religious beliefs amounts to so much hand waving to distract from the real issue: the state has imposed a new norm that everyone is an organ donor by default, except for those who make themselves exceptions.
Folks on this thread have been raising concerns about the very real attempt to articulate a new default norm. A default norm that, as Guest 2 points out, is meant to “apply to all qualitative analytic techniques employed to support evidence-based claims, as well as all qualitative source materials [including data from interviews, focus groups, or oral histories; fieldnotes (for instance from participant observation or ethnography); diaries and other personal records.…]”
Many folks on this and other threads who are far more eloquent than I understand well the negative implications of this norm becoming a new default and its possible deleterious downstream consequences for policies, including but not limited to journal publication policies.
This default norm, and not your allegations of dishonesty and of ghost stories, is what this discussion is about. Saying otherwise--to return to the organ donor example--amounts to so much hand waving that foregrounds the exemptions for religious beliefs instead of what should be foregrounded: the state's attempt to roll out a new default norm that makes those exemptions a topic of conversation in the first place.
-
Guest
Re: Dishonesty in research raises concern
Two points.
1. Your reasoning equates what you call a proposal -- but what is actually a sentence taken from a much longer inquiry into whether and how greater transparency is possible in qualitative research -- with a norm.
A sentence in a document with many qualifications to that sentence is not a norm.
No one, moreover, has proposed this sentence standing on its own as a norm.
Is the DA-RT debate about norms? Very much so. It also explicitly allows for different types of researchers to develop their own norms.
2. You mention handwaving. With that critique in mind, please recall my original questions.
"1. Can you provide a link to a political science journal that requires qualitative scholars to attach field notes as a condition for submitting an article for review?
2. Can you provide a link to a political science journal that requires qualitative scholars to attach field notes as a condition for publication?"
I followed up with "I pose [these questions] to you -- and extend it to writings by DA-RT advocates as well. Please offer evidence that DA-RT or any JETS journal has this policy."
I ask you now to answer these without waving your own hands. Hint: The answer is "No."
--Original Guest (aka, the OG)
-
John Gerring
University of Texas at Austin - Posts: 2
- Joined: Sat Apr 09, 2016 8:28 pm
Re: Dishonesty in research raises concern
Evidently, there are ethical issues to resolve surrounding the confidentiality of sources. But these could probably be handled through systems in existence or under development. See…
Dessi Kirilova, Nic Weber, and Sebastian Karcher. 2016. “Rethinking Data Sharing and Human Participant Protection in Social Science Research: Applications from the Qualitative Realm.”
https://figshare.com/articles/Rethinkin ... lm/3823281
There are also practical hurdles, e.g., the time it would take (in an already time-intensive occupation) to redact fieldnotes so that informant identities are not revealed.
But perhaps there are situations in which human subjects could be protected and in which logistical obstacles would be outweighed by scholarly value added.
I am not thinking about discouraging outright fabrication. If people are determined to do this then they can fake their fieldnotes (though it seems like it might raise the barrier to fabrication just a little).
As several have pointed out, the main problem facing us is surely not fabrication of data. It is the difficulty of interpreting that data. And in this respect, it seems to me that access to fieldnotes might be helpful. Not that it would resolve all questions. The interpretive issues remain; as Geertz would say, it’s turtles all the way down. But if the published text is helpful in understanding and judging the ethnographer’s work, surely more text – especially text that was generated “on the spot” when the ethnographer was doing her/his work – should also be enlightening.
Consider the boon it has been to later generations to be able to pore over Malinowski’s fieldnotes. What if these had not been kept, or never made public? How much poorer would our knowledge of the early history of anthropology – and of the ethnographic process – be?
In recent years, ethnographic work in anthropology has gone a long way toward bringing the ethnographer into the ethnographic narrative. Published ethnographic studies are often accompanied by extensive discussion of how the ethnographer found his/her subjects, the nature of their interaction, his/her feelings about the process, and so forth. One might argue about whether this sort of detail belongs in a published piece of work. But it speaks to a growing consensus that the position of the ethnographer relative to his/her subject matters quite a lot, and is something that readers ought to be aware of as they judge the conclusions of a work (and try to understand the ethnographer’s perspective). For this purpose, fieldnotes are much more informative than a published work, as Malinowski’s journal demonstrates.
There aren’t a lot of examples of anthropologists purposefully sharing their fieldnotes. However, I find that there is a very old but still functioning repository devoted to maintaining fieldnotes of ethnographies focused on American Indians. I quote from a recent article on the subject https://ateliers.revues.org/3132:
"The National Anthropological Archives began as the Archives of the Bureau of American Ethnology (BAE), the Smithsonian’s first research bureau. The BAE was founded in 1879 by John Wesley Powell to promote anthropological research and to serve as the permanent repository for manuscripts and photographs concerning American Indians that had been collected earlier by the US geographical and geological surveys of the western United States, particularly those of Ferdinand V. Hayden and George M. Wheeler….
Today, our collections include more than 8,500 linear feet of manuscripts (2.5 km), 635,000 ethnographic photographs (including some of the earliest images of indigenous people worldwide), 21,000 works of native art, more than 11,000 sound recordings, and more than 8 million feet of original ethnographic film and video.
Although the BAE and NAA were originally seen as a place to deposit raw ethnographic field materials, the archives is increasingly seen as a place from which new knowledge arises. In 2006, our archives served more than 600 researchers, mainly anthropologists, and our web site was visited by more than 600,000 individuals. NAA researchers use our collections to conduct comparative ethnographic studies, diachronic research into social, cultural and linguistic change, and to review and reinterpret the data collected by other anthropologists, particularly those working in the same ethnographic region.
In addition, many of our researchers are writing biographies of anthropologists, histories of Western exploration; preparing new exhibitions and studying previous exhibitions with an eye toward issues of representation and “first encounter” narratives; and writing intellectual histories.
Although the majority of the researchers who visit our archives are ethnologists and archaeologists, field materials are increasingly consulted by non-anthropologists, particularly native peoples studying their own cultural heritage. Many of these native researchers are re-using anthropological field notes for genealogical research; for assistance in supporting the revitalization of endangered languages; and to seek historical support for Native land claims and mineral rights."
Far from denigrating the work of ethnography, any initiative to archive fieldnotes should, I think, be viewed as an attempt to publicize and celebrate ethnographic work and – more broadly – to generate and sustain a community of professional and lay scholars who do this sort of work.
Post Reply
-
Jessica Teets
Middlebury College - Posts: 6
- Joined: Fri Apr 08, 2016 10:46 am
Re: Dishonesty in research raises concern
pepinsky wrote:A more constructive engagement would start from the premise that as political scientists, we are involved in a communal exercise of creating knowledge. That communal exercise requires us to communicate with one another. I recommend that we jettison any justification for or objection to greater transparency that rests on a logic of policing some kind of misbehavior. If one is to build an argument that current best practices---like citations and bibliographies, which seem quite functional to me---do not suffice, then the case ought to be made in terms of improving communication. Objections ought to be made along similar lines, without conceding that the purpose of transparency is to reduce fraud.
I agree with Tom that deciding on some "best practices" that facilitate our work as scholars is how we should spend our time, rather than framing this discussion around dishonesty. I haven't seen many cases that I would call "dishonesty", but I have seen some work that does not properly evaluate alternative arguments and also some very brief "methodology" discussions. In these cases, it is hard for me to evaluate the author's argument and I am left to rely on my own biases for deciding how believable the argument is. I am at fault for making these same mistakes, partially because I forget to give myself enough space to do these things properly, and partially because of the ever-shrinking word count at journals. In that light, I would suggest that we think through what would help us better evaluate the claims made by our colleagues, such as more attention to weighing alternative arguments, and more space given to qualitative articles to explain research methodology and analysis (like an appendix that would not count toward the word count). I think a discussion of how to create best practices to better facilitate the evaluation and creation of knowledge could be a really positive use of our time, and I look forward to that discussion.
Post Reply
-
Guest
Re: Dishonesty in research raises concern
First, I think it is crucial to ask whether there is a crisis of transparency in qualitative research. It is not clear to me that the advocates of DA-RT have ever made the case that there is a problem of transparency within qualitative research; instead, they have simply transposed a quantitative template about what constitutes transparent data onto qualitative research. The failure of the DA-RT initiative and the JETS statement to reflect carefully on different research ontologies, and on how those ontologies shape the very nature of observations, evidence, and data, is deeply problematic.
Second, given that we have no clear examples of bad qualitative research due to a lack of transparency, perhaps one should ask what the exact value-added would be for qualitative work that we already know well. For example, why not ask whether James Scott’s widely acclaimed Weapons of the Weak would have been more effectively assessed, and therefore given greater value, if Scott had given us access to his fieldwork notes? Specifically, would access to Scott’s fieldwork notes have strengthened our assessment of the theoretical claims in his book or of the validity of his observations?
Anyone who reads Scott will see how precise and careful he is in his extensive footnotes, noting for example the exact Malay word that was employed in the conversations he uses in the book. It is not clear to me that Weapons of the Weak would have been more robust because of awareness of DA-RT guidelines on transparency. It already was transparent. Would it not have been more effective for the DA-RT proponents to point to Scott, or any other exemplary study, as models for qualitative research, rather than seeking rigid rules to impose on a diverse community of scholars? Did the proponents of DA-RT actually look carefully at qualitative research, especially ethnographic research, to see what the best practices in the field already are? My general point is this: is it not possible that qualitative research already has high levels of transparency in work that is highly regarded in the discipline? Is this not the place to begin the debate within qualitative research, rather than assume there is a crisis of transparency?
Third, the costs to qualitative researchers have been noted by many already, but ultimately this is where the dangers of DA-RT lie. Numerous posts have already noted the problem of excessive time spent on appendices such as TRAX and the costs to junior scholars that will result if this disciplining initiative is granted such a broad mandate. Crucially, the disincentives to carrying out qualitative research, which will inevitably be forced to comply with a higher and different standard than quantitative research (as Mark Beissinger notes in this thread), will become institutionalized in the discipline. If PhD students and junior scholars conclude that the costs of publishing qualitative research in top-tier journals are prohibitive both in terms of time and capacity, the outcome for a pluralistic discipline will be obvious to all. This is the worst-case scenario of the DA-RT initiative.
Fourth, as the QTD Steering Committee moves to make recommendations about increasing transparency, I urge that those recommendations center on the term “reasonable.” Tasha Fairfield and Kent Eaton have made a number of valuable comments about what “reasonable” means, including summaries of the types of interviews, or a sense of the universe of interviews that one was conducting, etc. Summaries of one’s types of sources, or a somewhat extended but still concise discussion of how one chose one’s cases (as Kenneth Roberts noted in his post), are helpful to the reader and provide broader context, but they are not excessively burdensome or impractical to the point of deterring actual research and publication.
Ultimately, I do very much hope that the QTD Steering Committee will reflect carefully on the effects their recommendations will have on the future pluralism of the discipline, but also reflect closely on what this debate says about the direction of political science and the kind of priorities that are being set by some. It is worth asking if a forceful emphasis on methodology, and specifically on detailed formalization of methodology, is where scarce resources (time, space, money) should be spent. I say this as a professor who teaches qualitative methodology and is fully committed to causal, explanatory social science.
Post Reply
-
Guest
Re: Dishonesty in research raises concern
[quote]If there is a problem in qualitative research, it would be useful to focus on cases in which there has been dishonesty in that field. I do not happen to know of any, but am happy to be educated. If there are very few, it might be worth considering the costs and benefits of any form of proposed policing. As in current claims of "voter fraud" in the US, one could possibly do more harm by creating problematic restraints on the overwhelming majority in order to catch the few or none actually engaging in fraud.[/quote]
Peri Schwartz-Shea and I have been studying IRBs for over a decade, and the point you make here, Jenny, has been ours as well: unlike in the world of medical and psychological experimentation (the sources of IRB policy), we have found no evidence of scientific fraud in the qualitative or interpretive social sciences. The methods textbooks that discuss research ethics, and other literature on the need for IRBs, commonly cite the work of one or more of three researchers -- Milgram, Zimbardo, and Humphreys -- as evidence and rationale for policing field research. Note that the first two did experiments; only Humphreys did field research, and all evidence suggests that he did not harm his 'subjects' in any way, even if the potential for harm was there. [Indeed, he was very careful to keep from harming them. We discuss this at length in a 2015 APSA paper.] None of the methods or research ethics literature finds any other examples, nor has our search produced any.
As to LaCour, the whole business started unraveling as Broockman and a colleague tried to replicate LaCour's research. Their discovery of flaws in that research led eventually to the revelations that he had fabricated the things you mention, in addition to others. The other major case in political science in recent years involved the interventions in the Montana, New Hampshire, and California elections. Here, too, the research was not qualitative but a field experiment.
Dvora Yanow [who hasn't figured out how to sign on to this thing, hence the 'guest' post]
Post Reply
-
Guest
Re: Dishonesty in research raises concern
Defining the Problem
As we learn from the great policy scholar Deborah Stone, how we define a problem frequently privileges a particular solution. Consequently, problem definition is political. Particular solutions often have vested interests behind them with access to various resources that can be leveraged to define a problem in a particular way. My approach to understanding DART comes from a similar perspective of asking how the problem is defined and who the stakeholders are behind the proposed solutions.
I believe this perspective on problem definition is a helpful one because, on first glance, it is far from clear what the problem is that DART purports to solve. As I understand the claims made on behalf of DART, there is concern about the integrity of academic work. Yet, I have not seen convincing evidence about the scope of this problem or why current practices of scholarly conduct and editorial discretion are somehow not up to the task. As someone who works with archival materials, I expect scholars to follow norms of academic citation, including precise notation regarding sources. I have not seen any evidence to suggest that such practices have led to widespread abuse or are somehow particularly vulnerable to fabrications of one sort or another. The History profession provides further evidence that disciplinary standards of citation and peer review perform well as safeguards against academic fraud in the use of archival materials.
I am willing to accept that some academic fraud exists. However, I do not see evidence that this problem is of such a scale or severity that would justify the fundamental changes in academic work proposed by advocates of DART. At the risk of sounding polemical, to my ear, the concern about academic fraud echoes debates about voting fraud or food stamp fraud. Although the possibility of fraud exists, the incidence is very low and current mechanisms of policing are up to the task of preventing or punishing instances when they occur. Pushing the comparison further, concerns about voting fraud and food stamp fraud are actually part of a larger political struggle over representation, voice, and access to resources.
Without ascribing motives, I see similarities with DART. This debate is not about fraud. Concerns over transparency and integrity are a proxy fight over a much larger issue: epistemology and what counts as social scientific knowledge.
Rival epistemologies
This is not the place to rehearse debates over quantitative versus qualitative methods. The point I wish to make is that qualitative scholars are engaged in a different kind of knowledge production from quantitative work, and these differences are particularly evident around issues of research materials and replicability. Scholars on both sides of the qual/quant divide need to recognize this in order to understand the strengths and limitations of different traditions in social scientific research. I think an exemplary way to do this is found in the work of Robert Mickey. I quote at length from his award-winning book Paths Out of Dixie, which examines how the democratization of authoritarian enclaves unfolded in three states of the Deep South of the United States. As a work of comparative historical analysis, Paths Out of Dixie is meticulously researched and copiously footnoted. Its aim is to “identify configurations of causal forces” in order to explain observed outcomes as well as “develop new concepts and generate theories that can be tested on additional cases” (p. 22).
As Mickey explains, his approach has limitations.
“The comparison of richly detailed narratives has some important disadvantages. One is that it cannot suffice as a testing ground for theory. The theoretical approach…is in part informed by the narratives; the narratives cannot then be said to test this account. Another is that narratives resist replication, in part because they are not assembled from systematically collected datasets. Rather, this study relies on process tracing to describe and explain democratization challenges and their consequences (ibid).”
As Mickey further elaborates, process tracing through comparative case analysis
“puts a premium on internal validity, but at the expense of external validity. Despite these (and other) problems, this brand of research is appropriate given our state of knowledge on subnational authoritarian politics…Theory generation remains a priority, and this research design advances this goal (p. 23).”
As this passage shows, Mickey is exceptionally clear (i.e., transparent) about his methods, their strengths, and their weaknesses. It is model social scientific reasoning.
But there is a deeper point here, and that is about knowledge production. Whereas the DART debate acknowledges differences between qualitative and quantitative research, the discussion largely elides the epistemological priors (and corresponding choice of materials and methods) scholars bring to the study of politics. More precisely, scholars have different views on what can be known and how. In part, this is because they are asking different kinds of questions.
To illustrate, and to bring the discussion back to the matter of DART, Mickey discusses in a footnote the distinction made by David Collier, Henry Brady, and Jason Seawright between “causal process observations” and “data-set observations.” The former is “an insight or piece of data that provides information about context, process, or mechanism” (Collier, Brady, and Seawright quoted in Mickey, p. 368n59). The latter “appear in quantitative format and constitute comparable observations gathered…as part of systematic comparison” (Mickey, p. 368n59). Both are data in the sense of discrete pieces of information, but they offer researchers leverage on different kinds of questions. Whereas causal process observations are building blocks for producing knowledge about configurations of causes, data-sets provide the raw materials for comparative statics. In the right hands, both kinds of data can produce social scientific insights that contribute to the accumulation of knowledge. In the wrong hands, both kinds of data can produce garbage.
My point is that proposals currently on offer in DART, such as active citations or a “transparency appendix”, are attempts to transform one kind of data, that used by qualitative researchers, into something that looks more like another kind of data, that used by quantitative researchers. The reason for this has nothing to do with transparency. It has everything to do with ongoing struggles within Political Science over what is considered social scientific knowledge. Debates over epistemology or methods belong in journal articles and books on the subject. It is disingenuous and dangerous to let these academic differences be adjudicated through a set of editorial rules that will systematically diminish the voice of qualitative researchers in the discipline.
Adam Sheingate
Professor of Political Science
Johns Hopkins University
References:
Collier, David, Henry E. Brady, and Jason Seawright, “Sources of Leverage in Causal Inference: Toward an Alternative View of Methodology,” in Henry E. Brady and David Collier, eds., Rethinking Social Inquiry: Diverse Tools, Shared Standards, 2nd ed. (Lanham, MD: Rowman and Littlefield, 2010).
Mickey, Robert, Paths Out of Dixie: The Democratization of Authoritarian Enclaves in America’s Deep South, 1944-1972 (Princeton: Princeton University Press, 2015).
Stone, Deborah, Policy Paradox: The Art of Political Decision Making (New York: W.W. Norton, 2012).
Post Reply
-
Guest
Re: Dishonesty in research raises concern
Why speculate about motives? You can't really claim to know what they were. It is very unlikely that everyone associated with DART even shared motives. Do you think that many people share motives with Moravcsik? Think about it.
Of course, this is an empirical question. Even a little bit of qualitative inquiry could reveal quite a lot -- like the fact that DART's website does not mention fraud as motivation. So why go straight to speculation?
Many if not most of the DART leaders appear to be qualitative scholars. Lots of people lost track of this fact in the post-Isaac hysteria. Why do you think that their goal was to "systematically diminish the voice of qualitative researchers in the discipline?" Or do you think that the quants tricked them? Seems unlikely.
Which leads to your claim that debates about how different types of scholars know things shouldn't be adjudicated "through a set of editorial rules." But that's exactly where these types of claims have been adjudicated for a long time -- and they will continue to be as long as scholars value peer-reviewed publications. This fact does not preclude other fora for working through these questions.
Your reaction disappointed me. In the weeks and months after DART, too many scholars sought refuge in scapegoats and too few took seriously the question of how to increase our intersubjectivity. Looking back, this was embarrassing. It is good that more forward-looking approaches like this website have emerged.
I'd sign my name, but I'm in a vulnerable professional position.
[quote="Guest"]Let me begin by thanking my colleagues who have devoted considerable time and thoughtful reflection on these issues. I am skeptical about DART and the motivations behind it, for reasons I discuss below. Nevertheless, I appreciate the effort on the part of those who are working very hard to address the implications DART has for qualitative research in Political Science.
[/quote]
Post Reply