Designing evaluation systems based on empirical evidence: William Trochim and his colleagues develop evaluation systems for huge and complex organizations, helping researchers, educators, and public employees to implement programs that address society's challenges.
Author: Bauman, Liz
Pub Date: 11/01/2008
Publication: Human Ecology, Cornell University (ISSN 1530-7069); Copyright 2008 Cornell University, Human Ecology
Issue: Nov. 2008, Volume 36, Issue 2
All of us--whether educators, researchers, health care providers, or just taxpayers--want to know that our time, effort, and money are well spent and will bring the results we desire. And the same can be said for the vast number of programs and activities carried out by schools, research institutions, community organizations, and state and federal agencies.


"While we hope that programs are selected for and survive using rational criteria, in many situations they probably survive because people like them, get used to them, or because institutional, political, or economic forces favor their survival," said William Trochim.

Trochim is a professor in the college's Department of Policy Analysis and Management and a national leader in designing evaluation systems that help assess how programs function and whether they are actually accomplishing their intended goals. He is leading an innovative effort to develop evaluation approaches based on evolutionary theory from the life sciences. Trochim argues that evaluation can play a key role both in developing program variations and in providing feedback that influences selection, just as natural selection does in biological evolution.

"Like evolutionary theory, evaluation can encourage program managers, decision-makers, and policymakers to use a 'trial-and-error' approach to evolving better programs that have a greater 'fitness' to their environment," he said. Trochim is creating such evolutionary evaluation systems and testing them in real-world contexts.
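The evolutionary analogy can be sketched in miniature: treat candidate program designs as a population, let evaluation scores act as the selection pressure, and let small variations supply the raw material for the next round. The sketch below is purely illustrative; the fitness function, the mutation rule, and every name in it are invented for the example and are not a description of Trochim's actual systems.

```python
import random

def evolve_programs(variants, fitness, generations=10, keep=3):
    """Illustrative variation-selection loop: score each program
    variant, retain the best performers, and spawn mutated copies."""
    population = list(variants)
    for _ in range(generations):
        # "Evaluation" supplies the selection pressure: rank by score.
        population.sort(key=fitness, reverse=True)
        survivors = population[:keep]
        # Variation: perturb survivors to produce the next generation.
        offspring = [mutate(p) for p in survivors]
        population = survivors + offspring
    return max(population, key=fitness)

def mutate(program):
    # Toy mutation: jitter one numeric "design parameter" of the program.
    changed = dict(program)
    key = random.choice(list(changed))
    changed[key] += random.uniform(-0.5, 0.5)
    return changed
```

Because survivors are always carried forward, the best design found never gets worse from one generation to the next, which is the "trial-and-error" guarantee the analogy rests on.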

The ecology of science

Trochim's work is at the intersection of science and human ecology.

"We live in a dynamic world, with complex systems of human organizations," he said. "I am an ecological systems thinker, and evaluation is a central human ecological function. It is essential to learning, because evaluation provides feedback about whether and how the things we create actually work."

As we look to science to try to solve the major problems our society faces, Trochim said we need to realize that "basically, science is a human social endeavor--and that is where human ecology becomes absolutely essential to its success in the 21st century."

Trochim, who has been on the Human Ecology faculty since 1980, has many roles at the college and beyond. He directs the Cornell Office for Research on Evaluation (CORE), a team that includes CORE manager Claire Hebberd and that works to develop evaluation systems for large and complex organizations and scientific endeavors. Trochim is the director of evaluation for the new Clinical and Translational Science Center, based at Weill Cornell Medical College in New York City. He is also the director of evaluation for Extension and Outreach at Cornell, as well as the principal investigator on a new grant to develop the next generation of evaluation approaches for assessing and improving science, technology, engineering, and mathematics (STEM) education and outreach programs funded by the National Science Foundation. Currently, Trochim is also serving as president of the American Evaluation Association.

"My life is a continuous triangle trip between Ithaca, New York City, and Washington, D.C.," he said with a laugh.

From basic biomedical science to the bedside and beyond

In the last few years it has become clear that massive investments in biomedical research have not translated into desired health outcomes, according to Trochim. On average, it takes 17 years from the time a new medical treatment or device is discovered until it is used widely in practice, and "that's almost certainly a lower-end estimate."

"That's a system problem," he said. "We have systems of researchers and systems of health care practitioners, but we haven't been successful in connecting them effectively." Earlier on, biomedical researchers attempted to better disseminate the information about new discoveries--"we shouted louder," as Trochim put it--but that didn't significantly improve the time it took to move discoveries from lab bench to bedside and beyond to health impacts.

In the past three years, the National Institutes of Health (NIH) created the Clinical and Translational Science Awards to form a network of 60 centers nationwide with the ultimate goal of enhancing research translation to improve public health. Weill Cornell Medical College received $49 million from the NIH to lead a center in New York City. Called the Clinical and Translational Science Center (CTSC), it is a multidisciplinary collaboration among biomedical research institutions on the city's Upper East Side, including Weill Cornell, Memorial Sloan Kettering Cancer Center, the Hospital for Special Surgery, Hunter College, and Cornell in Ithaca.

Each center was required to develop an evaluation system. Trochim and his colleague Cathleen Kane, the CTSC manager of evaluation, along with Julianne Imperato-McGinley, CTSC's principal investigator, worked to develop systems to evaluate what works in translating biomedical research into clinical practice. The CTSC is encouraging cross-institutional collaboration and trying to break down disciplinary and specialization silos. It is organized into 11 key functions concerned with everything from ethical and regulatory issues to novel and pilot research, clinical services, community engagement, and the education of the next generation of medical researchers.

"If we approach this only from a biomedical perspective, we won't get the translation of research to practice that we need," Trochim said. He and others expect that by connecting more directly with community and patient interests, scientists can be encouraged to apply their research to issues more directly relevant to society's needs, such as vaccines for infectious diseases or the problem of widespread tobacco use.

The center's staff is in the process of developing working relationships throughout New York City's diverse communities. Trochim and his colleagues will be asking practitioners and patients in community-based settings to articulate their health and medical needs and impediments to accessing health care.

"There is a pressing need for broad-based multidisciplinary collaborations that can fulfill the incredible promise of recent research advances in areas like genetics and bioinformatics, and efficiently translate them into real-world interventions that benefit the community," said David Skorton, president of Cornell University and professor of medicine at Weill Cornell Medical College.

Evaluating extension's efforts

Cornell Cooperative Extension (CCE) educators recognized that to better develop, implement, and assess their thousands of programs across New York State, they needed improved evaluation and feedback mechanisms. As director of evaluation for Extension and Outreach at Cornell, Trochim is working with Helene Dillard, director of CCE; Mike Duttweiler, assistant director of CCE; and Monica Hargraves, manager of evaluation. They have created a "system evaluation protocol" that facilitates essential evaluation steps, gathers information from all of CCE's efforts, and feeds it back to people beyond the local context, so that they can learn from each other.

"If people follow the steps in the protocol, they will generate higher-quality and more appropriate evaluations for their programs. We're trying to create a generic protocol that can be used in any evaluation context," Trochim said.

Pilot testing for the protocol began two years ago at CCE's New York City office with 24 programs. Last year, Trochim expanded the pilot testing to six more counties: Chenango, Jefferson, Onondaga, St. Lawrence, Tompkins, and Ulster.

To implement the new protocol, Trochim realized they needed to design a novel cyber infrastructure to communicate across program areas and around the state. Thus was born the "Netway," short for "networked pathways." (Pathway evaluation models show which inputs or activities lead to which results and impacts.) To date, staff members in these first six counties and New York City have created pathway models for 50 programs. According to Trochim, Netway allows users to see other people's models and use ideas from those to create new programs, activities, and outcomes.
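A pathway model of this kind can be thought of as a directed graph running from inputs and activities to outcomes and impacts. The sketch below is a generic illustration of that idea, with invented element names; it makes no claim about how the Netway itself is implemented.

```python
# A pathway (logic) model as a directed graph: each program element
# points to the results it is expected to lead to. Names are
# illustrative only, not drawn from any actual CCE program.
pathway = {
    "curriculum materials": ["workshops"],          # input -> activity
    "workshops": ["skills gained"],                 # activity -> outcome
    "skills gained": ["behavior change"],           # outcome -> outcome
    "behavior change": ["community health impact"], # outcome -> impact
}

def downstream(model, element):
    """Return every result an element ultimately leads to,
    following the pathway links transitively."""
    reached, frontier = set(), [element]
    while frontier:
        for nxt in model.get(frontier.pop(), []):
            if nxt not in reached:
                reached.add(nxt)
                frontier.append(nxt)
    return reached
```

Tracing `downstream(pathway, "workshops")` walks the chain from an activity through its outcomes to the final impact, which is exactly the question a pathway model is built to answer: what results should this piece of the program lead to?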

"The more users you get, like with Wikipedia or YouTube, the greater the dynamic interactions, so you can connect people doing programs who wouldn't find each other otherwise," he said. "We're trying to network an essential architecture of evaluation and program thinking for the 21st century."

Getting to the root of STEM

The first phase of Netway development attracted attention and a grant from the National Science Foundation (NSF). Trochim has just received a second NSF grant ($2.3 million for five years) to take his cyber infrastructure and evaluation systems to the next level: to improve science, technology, engineering, and mathematics (STEM) education nationwide.

"The NSF and Congress have become extremely interested in how science integrates with our society," said Trochim. Thus, the NSF is funding Materials Research Science and Engineering Centers to enhance the broader impact science has on society and to draw in the next generation of scientists.

Trochim and Jennifer Brown Urban, who received her PhD in the college's field of human development last year, had earlier applied their evaluation systems to the Cornell Center for Materials Research, one of these centers. Now, they will be partnering with the Cornell center to extend these approaches to some 27 centers and beyond. In the process, they will be developing a virtual evaluation protocol that any organization can use to build better evaluation systems on its own with the cyber infrastructure and resources developed at Cornell.

The NSF grant also funds work that begins this year in which Trochim and his team will build an evaluation system for Cooperative Extension's 4-H STEM education programs.

More sensible and sensitive evaluation

As president of the American Evaluation Association, Trochim is encouraging the evolution, understanding, and awareness of the field of evaluation in our society. He is especially trying to influence how the federal government evaluates its myriad programs.

Currently, he said, federal agencies don't have coherent evaluation systems. He explained that the Office of Management and Budget (OMB), which oversees every program in every federal agency, implemented a new evaluation system called the Program Assessment Rating Tool (PART) several years ago. The system was poorly conceived, especially in its evaluation requirements and guidelines, and met with considerable controversy and resistance, Trochim said. He is now working with the OMB to develop improved and more sensible evaluation guidelines and approaches. He sees the impact of this effort as considerable, because it cuts across the entire federal government.

"We bring in evolutionary and systems thinking in our efforts at Cornell to create better evaluation approaches that can be scaled to different sizes and types of organizations and are sensible and sensitive to the needs of different programs," he said.

In all his roles, Trochim is creating and evolving new methods and resources for evaluation. He is testing his new approaches in many practical and interdisciplinary contexts, while developing the technologies and systems to support them.

Trochim said that the college is the right place for him and his research in evaluation systems: "The College of Human Ecology represents a diversity of fields and specialties and has a strong emphasis on their use in real-world contexts. This is ideal for the kind of work we're doing in evaluation. Dean Alan Mathios and his staff have been incredibly supportive of this work and recognize its importance for the future of the college and our society."

For more information:

William Trochim

wmt1@cornell.edu