Evidence-based practice and social work: an illustration of the steps involved.
Maxine L. Weinman
Health and Social Work, 32(2), May 2007. National Association of Social Workers. ISSN 0360-7283. Copyright 2007 National Association of Social Workers.
The current focus of the health care system involves a commitment to the process of considering evidence when making decisions (Donald, 2002). The principles of an emerging paradigm referred to as evidence-based practice (EBP) are ideally suited to fulfill this commitment. "Evidence-based practice is the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of clients" (Gibbs & Gambrill, 2002, p. 452). Theoretically, EBP works by converting information from thousands of studies into risk estimates, thereby providing a framework for addressing questions of the form "what is the chance that" certain things harm or benefit people, according to a fair and scientifically rigorous process (Donald).
EBP offers social work practitioners and administrators a philosophy of practice that is compatible with obligations described in the profession's code of ethics and educational accreditation policies and standards (for example, obtaining informed consent and drawing on practice- and policy-related research findings) (Gambrill, 2003). Unfortunately, however, Gambrill noted that the philosophy and technology of EBP described in health care are typically not described in the social work literature. "If social workers do not take the time to comprehend the systemic, client-oriented vision of EBP, then we risk losing critical opportunities to decrease gaps between professional, ethical obligations and accreditation standards and what is done in everyday practice" (Gambrill, p. 18). Thus, in an era in which social workers must defend their domain from encroaching disciplines, adding the role of evidence-based practitioner to the plethora of other social work roles is fundamental. Using the vaccine safety dilemma as an example, we present the five steps involved in EBP and demonstrate their utility to social work practice.
Step 1: Converting Client Needs into Answerable Questions
The first step of EBP involves converting client information needs into answerable questions (Sackett, Richardson, Rosenberg, & Haynes, 1997; Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000). Questions that lend themselves to searching for the best evidence must be specific enough to generate an answer in an electronic search, by designating the patient population or problem, identifying the intervention or area of interest, identifying a comparison intervention or status if applicable, and designating measurable outcomes (Gibbs, 2003; Melnyk & Fineout-Overholt, 2002). Features of a well-built question include the client type or problem, the intervention or what you (the social worker) might do, an alternate course of action, and what you (the social worker) would like to accomplish (Gibbs).
Social workers working in a health care domain are confronted with an array of client concerns and needs. One example of a client concern stems from the current health care debate about the safety of childhood vaccines. An increase in the number of vaccinations that children receive has yielded an increase in adverse events following vaccinations and thereby an increase in parental concern about the safety of vaccines (Centers for Disease Control and Prevention [CDC], 2002; Gellin, Maibach, & Marcuse, 2000). Using this as an example, the client type is children, the problem is adverse reactions to vaccines, the intervention is vaccines, an alternate course of action is no vaccines, and what you (the social worker) would like to establish is the probability of vaccine adverse reactions based on data from empirical studies. Incorporating these features to form an answerable question would result in the following: "If a child receives a mandatory childhood vaccine, what is the probability of the child's developing an adverse reaction to the vaccine?"
Step 2: Locating the Best External Evidence to Answer the Question
The second step of EBP involves searching for, with maximum efficiency, the best evidence with which to answer the question. This step requires technological access to bibliographic databases and the skills to search these databases efficiently and thoroughly (Gibbs & Gambrill, 2002). Fortunately, numerous EBP databases provide high-quality systematic reviews of research. For example, the Cochrane Library, a product of the Cochrane Collaboration (http://www.cochrane.org/), an international association of researchers involved in preparing, maintaining, and promoting the accessibility of systematic reviews of the effects of health care interventions, produces and regularly updates hundreds of systematic reviews on various topics. The Cochrane Database of Systematic Reviews can also be accessed through Medline (http://www.medline.com), the largest database of biomedical journal literature in the world, containing 11 million references and abstracts, by limiting the search to evidence-based reviews (Melnyk & Fineout-Overholt, 2002). In addition, the Campbell Collaboration (http://www.campbellcollaboration.org/) is a registry of more than 10,000 randomized and possibly randomized trials in education, social work and welfare, and criminal justice. Specific terms from the question that will effectively mark documents in an electronic search must be identified, although the terms may or may not work as markers in a search, and additional terms may need to be added. It is important to note that even the most efficient electronic searches can yield a negative finding, but a well-planned and extended search that finds nothing is a finding that must be conveyed to clients (Gibbs, 2003). Furthermore, Gibbs recommended searching with equal diligence for evidence that disconfirms or supports your beliefs.
Step 3: Critically Evaluating the Evidence
The third step of EBP involves critically evaluating the obtained evidence for its validity and usefulness to the client (Sackett et al., 1997, 2000). In critically evaluating the evidence, Gibbs (2003) recommended using a rating form, such as the Quality of Study Rating Form, which can be located in his book Evidence-Based Practice for the Helping Professions. This book also provides a Web site (http://www.evidence.brookscole.com/index.html) with useful information for formulating questions and searching for evidence. Other professionals suggest following evidence-based guidelines, which are rigorously designed recommendations for practice by a panel of experts (Melnyk & Fineout-Overholt, 2002). An extensive number of evidence-based guidelines produced by the Agency for Healthcare Research and Quality can be accessed at http://www.guidelines.gov.
Parental concern about the possible link between childhood vaccines and autism came to the forefront in 1998, when Dr. Andrew Wakefield of London, England, published a report in the Lancet suggesting that the measles, mumps, rubella (MMR) vaccine was the cause of autism in children (Wakefield et al., 1998). Wakefield's study had low generalizability, as there were only 12 participants, and the study did not include a control group, making it impossible to establish a cause-effect relationship between autism and the MMR vaccine. Subsequently, Taylor and colleagues (1999) analyzed the records of 498 children with autism and further examined the age of diagnosis in vaccinated and unvaccinated children. The findings from this study indicated that the onset of regressive symptoms of autism did not occur within a specified time of receiving the MMR vaccine. Interestingly, Dr. Paul Offit, chief of infectious diseases and director of the Vaccine Education Center at the Children's Hospital of Philadelphia, noted that parents embraced Wakefield's study with only 12 participants and no control group and rejected Taylor's 498-participant study with a control group (as cited in Howard, 2000).
To aid parents and caregivers in critically evaluating the evidence, social workers must possess basic knowledge of research methodology to evaluate a study's internal and external validity, appraise the evidence for different kinds of biases and confounding variables, and ultimately judge the power of the study by its ability to detect an effect of an intervention (Donald, 2003). Experimental studies attempt to establish cause-effect relationships between variables through assignment of participants to control and experimental groups. To strengthen the cause-effect relationship, experimental studies must control for confounds (outside variables that could affect a study's results). A failure to control for such confounds compromises the study's internal validity, which is the certainty with which a study's results can be attributed to the manipulation of study variables rather than possible confounds. External validity refers to the degree to which a study's results may be generalized to other populations and settings and is increased through proper sampling techniques and adequate sample size. Social workers can assist parents and caregivers in critically evaluating the obtained evidence by using a table or chart that includes the source; the hypothesis or purpose of the study; the design; the number of participants; and the findings.
Steps 4 and 5: Applying the Results of the Evaluation to Policy or Practice Decisions and Taking Appropriate Action
The fourth and fifth steps in the process of EBP require consideration of clients' similarity to those studied, client access to interventions described in the studies, and consideration of client preferences (Gibbs & Gambrill, 2002). A significant feature of EBP is client involvement as informed participants in the decision-making process (Gambrill, 2003), which is achieved by considering individual client differences with available research, personalizing evidence to specific clients, and encouraging client involvement in developing critical appraisal skills (Gambrill). EBP uses the same sources for clinical decisions and advice as in the past (for example, clinical experience, expert opinions, and published materials), but passes them through the filtering question "On what evidence is the advice based?" (Berg, 1998). Thus, an evidence-based consultation provides a recommendation and the supporting data rather than the recommendation only (Berg).
There are a variety of ways to approach parental fears regarding childhood vaccines. Concerned parents who express reluctance to vaccinate their children need a quantitative analysis of vaccine risks and benefits as well as professional understanding; knowledge of risk-benefit ratios must be paired with practical communication skills to ensure a good outcome (Ball, Evans, & Bostrom, 1998). The obtained and evaluated research becomes the social worker's answer to the posed question, and again, the finding that no research findings are available must be shared with clients. Ultimately, the social worker would present the findings to the parents or caregivers and thereby involve them in the decision regarding their child's vaccines.
SOCIAL WORK IMPLICATIONS
Intuition and unsystematic clinical expertise are insufficient for making decisions, particularly in the era of managed care (Gambrill, 2003; Guyatt & Rennie, 2002). Fortunately, the advent of large electronic databases of research on which health care providers can build evidence-based answers makes EBP feasible, even in this current health care climate. However, decisions cannot be based solely on evidence; the clinician also requires compassion, listening skills, and broad perspectives from the humanities and social sciences, qualities inherent to the social work profession (Gambrill; Guyatt & Rennie). Thus, as social workers continually strive for the most efficient strategies to address client needs and concerns, the role of evidence-based practitioner must be adopted. Social workers could successfully accomplish this additional mode of practice if they adopt a process of lifelong learning that involves continually posing specific answerable questions of importance to clients, searching electronically for the current best evidence relative to the question, critically evaluating the evidence while considering its applicability to the client, and then reporting the evidence, even if none exists, to the client (Gibbs, 2003; Sackett et al., 1997, 2000).
Original manuscript received August 29, 2003
Final revision received April 12, 2004
Accepted April 27, 2004
Ball, L. K., Evans, G., & Bostrom, A. (1998). Risky business: Challenges in vaccine risk communication. Pediatrics, 101, 453-458.
Berg, A. O. (1998). Dimensions of evidence. Journal of the American Board of Family Practice, 11, 216-223.
Centers for Disease Control and Prevention and the Food and Drug Administration. (2002, October 23). Vaccine adverse event reporting system. Retrieved June 18, 2003, from http://www.vaers.hhs.gov
Donald, A. (2002). Evidence-based medicine: Key concepts. Medscape General Medicine, 4(2). Retrieved June 18, 2003, from http://www.medscape.com/viewarticle/430709
Donald, A. (2003). How to practice evidence-based medicine. Medscape General Medicine, 5(1). Retrieved June 18, 2003, from http://www.medscape.com/viewarticle/448226
Gambrill, E. D. (2003). Evidence-based practice: Sea change or the emperor's new clothes? Journal of Social Work Education, 39, 3-24.
Gellin, B. G., Maibach, E. W., & Marcuse, E. K. (2000). Do parents understand immunizations? A national telephone survey. Pediatrics, 106, 1097-1102.
Gibbs, L. E. (2003). Evidence-based practice for the helping professions: A practical guide with integrated multimedia. Pacific Grove, CA: Brooks/Cole.
Gibbs, L., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12, 452-476.
Guyatt, G., & Rennie, D. (Eds.). (2002). Users' guides to the medical literature: A manual for evidence-based clinical practice. Chicago: American Medical Association.
Howard, M. (2000). Behind the vaccine controversy. Baby Talk, 65(9), 56-63.
Melnyk, B. M., & Fineout-Overholt, E. (2002). Key steps in implementing evidence-based practice: Asking compelling, searchable questions and searching for the best evidence. Pediatric Nursing, 22, 262-266.
Sackett, D. L., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (1997). Evidence-based medicine: How to practice and teach EBM. New York: Churchill Livingstone.
Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (2000). Evidence-based medicine: How to practice and teach EBM (2nd ed.). New York: Churchill Livingstone.
Taylor, B., Miller, E., Farrington, C. P., Petropoulos, M. C., Favot-Mayaud, I., Li, J., & Waight, P. A. (1999). Autism and measles, mumps, and rubella vaccine: No epidemiological evidence for a causal association. Lancet, 353, 2026-2029.
Wakefield, A. J., Murch, S. H., Anthony, A., Linnell, J., Casson, D. M., Malik, M., Berelowitz, M., Dhillon, A. P., Thomas, M. A., Harvey, P., Valentine, A., Davies, S. E., & Walker-Smith, J. A. (1998). Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. Lancet, 351, 637-641.
Megan Gossett, MSW, is a social worker, Houston. Maxine L. Weinman, DPH, LCSW, is professor and PhD program director, Graduate College of Social Work, University of Houston, 237 Social Work Building, Houston, TX 77204-4013; e-mail: MWEpstein@uh.edu. Address correspondence to Maxine L. Weinman.