Reconciling evidence-based practice, evidence-informed practice, and practice-based research: the role of clinical data-mining.
Author: Epstein, Irwin
Pub Date: 07/01/2011
Publication: Name: Social Work Publisher: National Association of Social Workers Audience: Academic Format: Magazine/Journal Subject: Sociology and social work Copyright: COPYRIGHT 2011 National Association of Social Workers ISSN: 0037-8046
Issue: Date: July, 2011 Source Volume: 56 Source Issue: 3
One can't talk about empowering social work without considering what disempowers social workers or how we disempower ourselves. One way is by constantly demeaning each other. Nowhere is the intramural practice of disparaging social work practitioners more pervasive than in the rhetoric of the evidence-based practice (EBP) movement. Alongside rigorously conducted research studies, one often finds disparaging references to practitioners' lack of critical thinking, the weakness of practice wisdom, and the dangers of interventions that have not first been tested via randomized controlled trials (RCTs) and meta-analyses.

Practitioner disparagement appears to be both the raison d'etre and the sine qua non of EBP rhetoric. Having taught social work research for four decades, I am painfully aware, however, that the easiest and most self-defeating strategy for winning over "research-reluctant" students is to harangue them about how "soft headed" social workers are (Epstein, 1987). Instead of producing an appreciation of what research can tell us and what it can't or building on practitioners' concern for clients, a pedagogy grounded in disparagement only produces further research alienation--except, perhaps, in those few who go on to teach research and extend this unfortunate pedagogical practice for yet another generation.


Although disparagement of practitioners by researchers did not begin with EBP, I was reminded of it when I received my latest issue of Research on Social Work Practice. Vainly hoping to see a review of my most recent book (be careful what you wish for), I settled on Stoesz's (2010) review essay on Gray, Plath, and Webb's (2009) new "critique" of EBP. Rather than dwell on the strengths and weaknesses of the book, Stoesz uses the opportunity to rail against social workers' sloppy record keeping, poor math skills, and seeking of "safe haven" in qualitative research and postmodernism. His catalogue of research calumnies is titled "Second-Rate Research for Second-Class Citizens" (Stoesz, 2010, p. 329). In his review, Stoesz righteously remarks,

EBP represents a challenge to social work as a
   prescientific activity, a muddle of humanism,
   psychoanalysis, and most recently postmodernism.
   For proponents of EBP, little of social work,
   as it has evolved, makes much logical sense; its
   validity is based ultimately on the "authority"
   of practitioners as opposed to any independent
   rational assessment of efficacy. In the absence
   of evidence, social work is a grab bag of good
   intentions ... just as likely to inflict harm as
   benefit. (pp. 329-330)

In the same issue, an invited article by Gambrill (2010)--perhaps the most prolific producer of pro-EBP and antipractitioner rhetoric--posits EBP as the "antidote to propaganda" and practitioner "palaver" in social work. Emphasizing the harm potential in social work services, Gambrill charges that social workers and those they serve are "bamboozled by false claims in professional journals and textbooks" (p. 303); that interventions not vetted by EBP methodology are "propagandistic" and "key in quackery, fraud and corruption" (p. 312); that criticisms of EBP are "propaganda ploys" and "distortions and misrepresentations"; and that EBP's critics "have no concern for truth, only to create credibility and for guile and charm" (p. 304). By her logic, criticism equals "censorship."

I cannot do justice to the prodigious effort (going back to Pope Gregory XV's first papal propaganda office in 1622), inflammatory language, and arguable logic that Gambrill (2010) invests in her representation of the dangers inherent in social work practices that are not guided by the EBP movement she champions. EBP is portrayed as a victim: "Distortion of the deeply democratic, participatory and transparent process and philosophy of EBP is an illustration of the power of propaganda to maintain authority-based decision-making" (Gambrill, 2010, p. 304).

How curious that the abuses of authority on which she makes her case were perpetrated not by social workers but by the medical profession and the pharmaceutical industry (for example, creating "pseudodiseases," lobotomizing patients, pushing questionable drugs). Isn't evidence-based medicine the paradigmatic model for EBP in social work and "Big Pharma" the Mecca of RCTs? And how ironic that social work's self-proclaimed opponent of "authority-based" social work uses rhetoric, language, and logic that is so familiarly authoritarian?

Writing in Social Work Research and offering a less florid and more "evidence-based" discussion of EBP, Pignotti and Thyer (2009) reported the results of a self-administered survey of the use of "novel unsupported and empirically supported therapies" (p. 5) by licensed clinical social workers and those social workers' attitudes toward EBP. I suspect that several of those "novel unsupported" therapeutic interventions would be classified by Gambrill as "quackery" and "fraudulent." Pignotti and Thyer don't go that far, but the abbreviation that they use for the scale that measures the use of non-evidence-based interventions (and presumably those who use them) is "NUTS." This sobriquet is scattered throughout the article. They could just as easily have named it the "NEBI" scale, but they didn't. Following Gambrill's logic, is "NUTS" a "propaganda ploy"? Or is it simply an abbreviation?

To their scholarly credit, and to my secret pleasure upon reading it, Pignotti and Thyer (2009) reported that practitioners who scored higher on pro-EBP attitudes were more likely to use not only EBP-supported interventions but "NUTS"-y ones as well. They also found that women are more likely to use "NUTS." Although they offer a face-saving explanation for the first set of findings, they wisely leave the second one alone.


How much more balanced and respectful is Haight's (2010) guest editorial in the April 2010 issue of Social Work devoted to integrative reviews. Instead of making a fetish of the RCT, Haight recognizes the valuable roles that various strategies of applied social science research can play in promoting a form of practice-research integration that is both methodologically pluralist and of broad use to practitioners. Her conception of evidence-informed practice (EIP) embraces a range of research methodologies that can "provide a richly contextualized analysis of social phenomena through practices such as sustained engagement and use of multiple methods--including direct observations, in-depth interviewing, and record reviews--[and] may be seen as the gold standard within this tradition" [italics added] (Haight, 2010, p. 102).

In that regard, her conceptual approach parallels Petr's (2008) more pluralist "multidimensional" EBP. Even the review of a new book by Solomon, Cavanaugh, and Draine (2009) on RCTs that concludes the integrative reviews special issue recognizes the problems of RCT implementation in field settings (Osteen, 2010) without trashing practitioners for sabotaging them (Rubin, 2006). Certainly Haight's (2010) discussion of EIP represents a less doctrinaire and more "practitioner-friendly" approach to practice-research integration than the prevailing rhetoric of EBP. However, even EIP as Haight defined it and multidimensional EBP as Petr operationalized it treat practitioners as essentially consumers and appliers of research rather than as legitimate producers of knowledge as well. A truly pluralistic EIP would do both.


Seeking ways to integrate practice and research has been the leitmotif of my career. Rather than blaming practitioners, however, it became clear to me over the course of a very enjoyable career journey that the route to successful practice-research integration was via an appreciative and respectful collaboration with practitioners and a mutual research exploration of questions that emerged from their practice. I called this more inductive approach practice-based research (PBR) (Epstein, 1995, 1996) and had the good fortune to be able to develop and "practice" it in a congenial academic environment (the Hunter College School of Social Work) and in a practice--research-oriented agency environment (the Mt. Sinai Hospital Department of Social Work Services). Mt. Sinai had a long tradition of supporting practitioner research and publication (Rehr, Rosenberg, & Blumenfield, 1998).

Though Mt. Sinai practitioners routinely generated vast quantities of patient and intervention data as a routine requirement of their practice, our earliest PBR studies involved the collection and analysis of original data. However, relatively late in my Mt. Sinai consulting experience and entirely by accident, I "discovered" clinical data-mining (CDM) as a PBR strategy (Epstein, 2001). Simply stated, CDM is the extraction, analysis, and interpretation of available clinical data for practice-knowledge building, clinical decision making, and practitioner reflection. CDM can be qualitative or quantitative, and it can be meaningfully combined with original data collection (Epstein, 2010).

In my first published article on CDM--subtitled "Mining for Silver while Dreaming of Gold" (Epstein, 2001)--I began by making a conceptual distinction between research-based practice (today we might call it EBP) and PBR. The latter refers to research studies conducted by practitioners that are intended to inform practice decision making and self-reflection. Admittedly not intended to prove the effectiveness or efficacy of intervention, CDM was intended to improve practice. As my subtitle acknowledged, CDM involved less than perfect, non-"gold-standard" studies, conducted by practitioners, compatible with their values but nonintrusive to practice.

Unlike RCTs, which minimize contextual differences to demonstrate the efficacy of interventions, PBR studies seek less summative objectives. Nonetheless, quasi-experimental quantitative CDM studies may use "gold-standard," cause-effect logic and may even approximate RCTs (Sainz & Epstein, 2001). Whatever their design, when conducted properly, CDM studies allow for unanticipated findings--positive, prosaic, and negative--but in no way are they intended to merely justify existing practice. Their ultimate purpose is to promote critical, "evidence-informed" reflection by practitioners, and that is what they do.

CDM studies are generally retrospective and always nonintrusive. The practices studied and outcomes achieved have already taken place. Because data are easily deidentified and are analyzed by the workers who provided services, CDM studies represent no challenge to practice norms, no ethical dilemma for practitioners, and no exploitation of their labors. Moreover, in conducting these studies, practitioners have demonstrated remarkable integrity and respect for sound research principles. Every CDM study proposed has been institutional review board approved.

Over the past decade, I have collaborated with hundreds of practitioners in child welfare, medical, and mental health settings in Australia, Hong Kong, Israel, New Zealand, Singapore, the United Kingdom, and the United States. Three CDM collections and several individual articles have been published in peer-reviewed journals. The practice-knowledge functions these studies perform directly parallel the "multiple roles of applied social science research" described by Haight (2010, p. 101)--everything from quantitative need studies, to studies of "program fidelity," to outcome studies, to qualitative studies of underserved client experiences and complex forms of practice. These studies have been conducted in child welfare, employee assistance, health, and mental health settings--wherever practitioners keep records. Most recently, I have begun to explore the potential of CDM for multidisciplinary studies conducted by social work and allied health practitioners (for example, nutritionists, music therapists, occupational therapists, physiotherapists, speech pathologists) on subjects as diverse as the intervention methods they use and the problems they address.

In addition, practice-oriented PhD students both here and abroad have begun to use CDM in their dissertation research, making use of more and more sophisticated qualitative, quantitative, and mixed-method approaches in analyzing routinely available clinical and administrative data. Their research concerns subjects ranging from neonatal birth defects in the United Kingdom to "good death" in Hong Kong. Currently in the works is a CDM dissertation on the successes and failures of cognitive--behavioral treatment groups for depression among Hong Kong Chinese agency clients, combined with an RCT intended to provide and test a more culturally compatible intervention. I don't call that "censorship." I call it "research." The possibilities of CDM research seem endless, as long as we are intellectually open enough to entertain them.

What is striking about my CDM consulting experience is both the enthusiasm with which allegedly "research-phobic" practitioners take to systematically reflecting on their practice via their own data and the integrity with which they approach their findings--whether those findings are predictable, inspiring, or disappointing. Despite (or possibly because of) the prevailing practitioner-bashing climate of EBP, practitioners appear to find the CDM experience empowering and affirming even when their findings do not support their preconceived practice wisdom.

Admittedly, CDM is not without its methodological problems. These include missing data, nonstandardized measures, absent key variables, and validity and reliability issues, among others. For these reasons and others, academic researchers have historically rejected using available agency data for research purposes (Shyne, 1960). However, problems of missing data, validity, and reliability are not unique to CDM. On the other hand, CDM studies have distinct advantages over RCTs in that experimental attrition is never a problem and they do not require enormous research grants. Maybe that is the problem for academic researchers.


In an article recently published in Social Work in Health Care, I call for a more "harmonious" and less acrimonious relationship between social work researchers and practitioners (Epstein, 2009). Much as I did in my 2001 "Mining for Silver" article, Gray et al. (2009) ground their postmodernist "critique" of the EBP enterprise in a discussion of the RCT as EBP's "gold standard." Likewise, their concluding chapter, devoted to the "practitioner's perspective" on EBP, is consistent with what practitioners have told me in every CDM workshop I have given in every country in which I have given it--that is, that they resent the threat to their professional autonomy and creativity that EBP rhetoric and the "manualization" of interventions represent. Here, no doubt, Gambrill would claim "oversimplification" and "distortion," but it describes what practitioners report.

Gray et al. (2009) provided little by way of a research alternative for practitioners other than adopting a critical, postmodernist stance. Though they make no reference to CDM, they offer a compelling political and metaphorical justification for it:

What is common to these [EBP] approaches is their attempt to get social workers to engage in "research-based practice" ... and its attempt to move social workers beyond using research or evaluating their own practice in the style of the empirical-clinical practitioner approach, to being able to locate and critically appraise empirical evidence.... All this is being done without any concerted engagement with practitioners at the coalface [italics added] (Gray et al., 2009, p. 95).

CDM directly engages practitioners in a research process at the coalface, whereas current EBP rhetoric reinforces a disempowering dichotomy between those who produce knowledge and those who are supposed to apply it. In this hierarchy, practitioners are truly "second-class citizens" in the knowledge-production project of social work. Sustaining the hierarchy only enhances the research alienation of practitioners that research academics like Gambrill, Gray, Rubin, Solomon, Stoesz, and Thyer, and I have struggled with for decades.

It's a credit to the openness of Oxford University Press editors that my CDM (Epstein, 2010) and Solomon et al.'s (2009) RCT "Pocket Guides" can sit side-by-side on the bookshelves of students, practitioners, and academic researchers alike. Though Stoesz (2010) is writing about practitioner opposition to EBP, and I anticipate academic opposition to CDM, I conclude this commentary with the one quote from his review with which I totally agree:

Because [research] paradigms consume entire
   careers of groups of scientists, their up-ending
   represents a fundamental threat to the intellectual
   community. Thus, the status quo resists paradigm
   shifts, tooth and claw. (p. 329)

Dare we hope for more from our EBP academic colleagues?


Epstein, I. (1987). Pedagogy of the perturbed: Teaching research to the reluctants. Journal of Teaching in Social Work, 1, 71-89.

Epstein, I. (1995). Promoting reflective social work practice: Research strategies and consulting principles. In P. Hess & E. Mullen (Eds.), Practitioner-researcher partnerships: Building knowledge from, in, and for practice (pp. 83-102). Washington, DC: NASW Press.

Epstein, I. (1996). In quest of a research-based model for clinical practice: Or, why can't a social worker be more like a researcher? Social Work Research, 20, 97-100.

Epstein, I. (2001). Using available information in practice-based research: Mining for silver while dreaming of gold. Social Work in Health Care, 33(3/4), 15-32.

Epstein, I. (2009). Promoting harmony where there is commonly conflict: Evidence-informed practice as an integrative strategy. Social Work in Health Care, 48, 216-231.

Epstein, I. (2010). Clinical data-mining: Integrating practice and research. New York: Oxford University Press.

Gambrill, E. (2010). Evidence-informed practice: Antidote to propaganda in the helping professions? Research on Social Work Practice, 20, 302-320.

Gray, M., Plath, D., & Webb, S. A. (2009). Evidence-based social work: A critical stance. New York: Routledge.

Haight, W. L. (2010). The multiple roles of applied social science research in evidence-informed practice [Guest Editorial]. Social Work, 55, 101-103.

Osteen, P. (2010). [Review of the book Randomized controlled trials: Design and implementation of community-based psychosocial interventions, by P. Solomon, M. M. Cavanaugh, & J. Draine]. Social Work, 55, 189.

Petr, C. G. (Ed.). (2008). Multidimensional evidence-based practice: Synthesizing knowledge, research, and values. New York: Routledge.

Pignotti, M., & Thyer, B. A. (2009). Use of novel unsupported and empirically supported therapies by licensed clinical social workers: An exploratory study. Social Work Research, 33, 5-17.

Rehr, H., Rosenberg, G., & Blumenfield, S. (Eds.). (1998). Creative social work in health care: Clients, the community and your organization. New York: Springer.

Rubin, A. (2006). Foreword. In L. Alexander & P. Solomon (Eds.), The research process in the human services: Behind the scenes (pp. xii-xiv). Belmont, CA: Thomson-Brooks/Cole.

Sainz, A., & Epstein, I. (2001). Creating experimental analogs with available clinical information: Credible alternatives to "gold-standard" experiments? Social Work in Health Care, 33(3/4), 163-183.

Shyne, A. W. (1960). Use of available material. In N. A. Polansky (Ed.), Social work research (pp. 106-124). Chicago: University of Chicago Press.

Solomon, P., Cavanaugh, M. M., & Draine, J. (2009). Randomized controlled trials: Design and implementation of community-based psychosocial interventions. New York: Oxford University Press.

Stoesz, D. (2010). Second-rate research for second-class citizens [Review of the book Evidence-based social work: A critical stance, by M. Gray, D. Plath, & S. A. Webb]. Research on Social Work Practice, 20, 329-332.

Original manuscript received May 4, 2010

Accepted May 24, 2010

Irwin Epstein, PhD, is H. Rehr Professor of Applied Social Work Research, School of Social Work, Hunter College, City University of New York, 129 East 79th Street, New York, NY 10025; e-mail: A version of this article was presented at the Joint World Conference on Social Work and Social Development, Hong Kong, June 10, 2010.
