PEPH Evaluation Metrics Manual now available
By Kristi Pettibone
After more than three years of collaboration, the Partnerships for Environmental Public Health (PEPH) Evaluation Metrics Manual is finally available to the public, free of charge, on the NIEHS PEPH website.
“This manual represents a collective effort to identify and document examples of measures our grantees can use to assess and showcase their work in the community,” said Christie Drew, head of the Program Analysis Branch (PAB) in the NIEHS Division of Extramural Research and Training (DERT). NIEHS program and evaluation staff, grantees, contractors, and other interested parties worked together to develop, review, and revise the manual to make it user-friendly and helpful for a range of stakeholders involved in PEPH programs.
NIEHS has had a long-standing commitment to facilitate and engage community groups in environmental health science research. “You can’t do environmental health without the community,” said NIEHS/NTP Director Linda Birnbaum, Ph.D., during a forum in March on community-engaged research.
The PEPH program
In 2008, NIEHS established the PEPH program to formalize its commitment to, and outline a coordinated vision for, community and academic partnerships. Since then, more than 400 grantees have participated in activities designed to foster networking among grantees within the various NIEHS programs, including webinars and workshops on communicating PEPH findings and translating research into public health policy.
A key principle of the PEPH program is community engagement, and NIEHS reached out to the extramural community during the process of re-envisioning the program. In response to an NIEHS Request for Information in 2008, the community shared concerns about the lack of evaluation capacity and the need for tools and approaches to develop project-specific evaluation metrics for public health-related program areas. In response, NIEHS developed the PEPH Evaluation Metrics Manual.
Evaluation is vital
The PEPH Evaluation Metrics Manual provides ideas about how to measure and document success. It also aims to build a common evaluation language that grantees and program staff can use in discussing PEPH programs. NIEHS will also use the approaches in the manual to evaluate the success of the program as a whole.
After speaking with more than 50 grantees at training sessions over the last two months, Drew said, “Grantees understand the reasons evaluation is necessary for their programs. They tell us it helps them improve their programs, obtain additional funding, and identify opportunities for new partnerships. They just aren’t sure what they should be evaluating. One of the key messages in our training has been that programs should evaluate those things that are important to them.”
Development and expected use of the manual
NIEHS developed the manual with significant input from PEPH grantees, program staff, and experts in the field. Drew and her colleagues presented the manual for feedback at more than 30 professional meetings, reaching more than 250 individuals. Given that this is the most participatory program funded by NIEHS, staff thought it was critical to employ a participatory strategy to develop the tool.
NIEHS anticipates that the manual will be a living document and that it will need updating periodically. Opportunities for expansion include new evaluation topics, such as cost-benefit analyses and econometric evaluations, new examples of metrics drawn from the ever-expanding network of PEPH grantees, and new approaches used in programs, such as social media.
NIEHS program staff have been conducting training related to evaluation metrics and developing stand-alone materials that will be available to the public through the PEPH website and resource center. Staff are also available to conduct webinars related to the PEPH Evaluation Metrics Manual. For more information about the manual and developing metrics, visit www.niehs.nih.gov/pephmetrics.
Developing the manual was a truly collaborative process and the PAB team offers great thanks to the grantees, community partners, colleagues, and NIEHS staff who contributed.
(Kristi Pettibone, Ph.D., is a health scientist administrator in the NIEHS Program Analysis Branch and a co-author of the new manual.)
Evaluating PEPH programs
Typical approaches to evaluating research outcomes involve analyzing publications. However, because many PEPH programs do not publish findings related to their community engagement, the team worked with grantees and community members to identify appropriate metrics to demonstrate success in these areas. The manual describes the approach to developing metrics for five crosscutting areas: partnering, leveraging, disseminating findings, training, and capacity building.
Sample metrics from grantee programs include:
- Demonstrating success at identifying partners — The University of Cincinnati’s anti-idling campaign provided a description of the partners involved and the resources they bring to the project. Cincinnati Public Schools (CPS) provided access to students and schools, Cincinnati Health Department provided nursing services, a Councilwoman provided credibility and the ability to attract attention to the project, and the Hamilton County Department of Environmental Services provided training and information to CPS staff and students.
- Demonstrating that they communicated their findings in a variety of products — The Bay Area Breast Cancer and the Environment Research Center described the number and demographics of their social media audience. The center has more than 1,000 followers on Twitter and 864 Facebook friends. Followers are 70 percent female, and more than half are age 40 or older.
- Demonstrating the policy impacts of their advocacy — The Trade, Health, and Environment Impact Project at the University of Southern California documented its contribution to the formation of the San Pedro Bay Ports Clean Air Action Plan, which stated that the Ports of Los Angeles and Long Beach would reduce air pollution by 45 percent by 2011. The project also documented its involvement in the plan's passage, including its progressive ban on polluting trucks, which resulted in a 70 percent reduction in port truck emissions in the Port of Los Angeles in the first year.