TCI Featured Evaluator: Heather Lewis-Charp

by Heather Lewis-Charp / Dec 3, 2015

This post is part of a blog series called Transformative Change Initiative (TCI) Featured Evaluators, which includes interviews with members of TCI’s Evaluation Collaborative. This community of evaluators has a wealth of knowledge, experience, and insight into evaluation of the TAACCCT grants being implemented throughout the United States. Want to be profiled or know someone who would make a great feature? Email us at

Name: Heather Lewis-Charp
Current position: Senior Associate at Social Policy Research Associates (SPR)

Short bio: Heather Lewis-Charp has an M.A. in Education from the University of California, and over 17 years of experience evaluating workforce and education programs. She coordinates Social Policy Research Associates’ community college research area and is currently leading the Evaluation of the Michigan Coalition for Advanced Manufacturing (M-CAM), a round 3 TAACCCT grant. Her research interests include the role of community colleges and community-based organizations in creating career pathways and opportunities for low-income youth and adults.

Q. What are the design and predominant methods for your TAACCCT evaluation?
A. The evaluation of the Michigan Coalition for Advanced Manufacturing is comprehensive. The implementation study includes four rounds of site visits to each of the eight colleges, 16 longitudinal student case studies, and interviews with policy and systems leaders to assess systems change. We are using an assessment tool to assess fidelity to the TAACCCT model and a pre- and post-grant social network assessment to measure shifts in partnerships and the strength of ties between the colleges. We are also conducting a quasi-experimental impact study, using a difference-in-differences design.

One of the more distinctive features of the evaluation is how we have been able to partner with the colleges to improve outcomes tracking and to make the data more accessible, transparent, and useful. SPR worked with the eight community colleges to create a comprehensive ETO database to track program services and outcomes. We created a user guide for the database system, an FAQ document, and webinars on database features. We also hold monthly meetings with data entry staff to answer data questions and flag data quality concerns. Finally, we create bi-monthly data dashboard updates and monthly data snapshots to make data accessible so that college staff can use it in a meaningful way. These snapshots are also a tool college staff members use to regularly update internal and external stakeholders on the status of the project.

Q. What advice do you have for new TAACCCT evaluators?
A. My biggest piece of advice to new TAACCCT evaluators is to be flexible. For example, when SPR designed the M-CAM evaluation, we did not anticipate that the database would become such a huge part of the project. It became clear over time, however, that if we wanted accurate and meaningful information on program services and outcomes, then we would need to provide ongoing data quality support to the colleges. Thus, in consultation with Macomb Community College (the lead college), we recalibrated the evaluation and channeled more resources into the database and dashboards. This is just one example; in fact, we are constantly making minor shifts in the evaluation design and timeline to ensure that what we are producing is relevant and that it is helping to inform meaningful program improvement.

Q. What questions do you have for others about TAACCCT evaluation?
A. I am very curious about how evaluators are evaluating the systems change and collaborative aspects of the grant. How are you assessing changes in partnerships? How are you tracking shifts in policy at the college level?

Q. Do you have tools, reports or other products you are willing to share?
A. We can share our fidelity assessment tool and examples of the data dashboard.

Heather Lewis-Charp can be reached at