TCI Featured Evaluators: John and Maggie Cosgrove

by John and Maggie Cosgrove / Nov 24, 2015

This post is part of a new blog series called Transformative Change Initiative (TCI) Featured Evaluator, which includes interviews with members of TCI’s Evaluation Collaborative. This community of evaluators has a wealth of knowledge, experience, and insight into the evaluation of the TAACCCT grants being implemented throughout the United States. Want to be profiled or know someone who would make a great feature? Email us at

Name: Cosgrove & Associates, John Cosgrove & Maggie Cosgrove
Current position: Senior Partners Cosgrove & Associates, Inc.

J. Cosgrove, Short bio: Mr. John Cosgrove holds an MA in Sociology and has more than 30 years of community college experience in database design, program/grant evaluation, institutional research, strategic planning, and student outcome assessment. John also has similar experience as classroom faculty, including courses in Statistics and Research Methods. His firm understanding of the needs of community college students, and of how such needs shape instructional program design and related student support strategies, has allowed Mr. Cosgrove to help Trade Adjustment Assistance Community College and Career Training (TAACCCT) partners evaluate programs and use data for continuous improvement. Mr. Cosgrove’s work with community college institutional research, strategic planning, and the use of evaluation data for continuous improvement is nationally recognized. His approach to using assessment and evaluation to improve student learning outcomes is documented in “Designing Effective Assessment: Principles and Profiles of Good Practice” (Banta, 2009) and “Assessment Is More Than Keeping Score: Moving from Inquiry, Through Interpretation, To Action” (League for Innovation: Learning Abstracts 12.2, 2009).

M. Cosgrove, Short bio: Ms. Margaret Cosgrove holds an MS in Urban Affairs and Policy Analysis. She has served as an adjunct professor for the University of Missouri-St. Louis (courses include Research Methods and Statistics), and has extensive grant management, data analysis, and evaluation experience. Ms. Cosgrove has served as the lead researcher and managed all quarterly and annual reporting processes associated with several TAACCCT consortia. Working closely with grant leadership and faculty/staff, Ms. Cosgrove designed customized quarterly evaluation reporting processes that allowed colleges to more closely examine the relationship between program/strategy implementation and impact/outcome evaluation. In addition, Ms. Cosgrove is leading a statewide effort to promote the scaling of successful innovations through the use of Network Improvement Communities.


Q. What are the design and predominant methods of your TAACCCT evaluation?
A. We prefer a mixed-methods approach and, when possible, a quasi-experimental design with an appropriate comparison group. Given the complexities involved with TAACCCT grants and related strategies, we have found that TAACCCT projects tend to challenge traditional evaluation approaches. To help overcome such challenges we recommend Patton’s Developmental Evaluation approach (2011), which helps ensure that evaluation meets DOL reporting requirements and, equally if not more importantly, supports the use of evaluation data for continuous improvement and scaling. We also recommend Preskill and Gopal’s Evaluating Complexity as an excellent resource.

Q. What advice do you have for new TAACCCT evaluators?
A. First, it is critically important for the evaluator to pay attention to both implementation and outcome/impact evaluation. In our experience, grant implementers often need to modify and adapt grant programs and strategies over the course of implementation. In short, if one pays attention only to what was written in the grant SOW, one is likely to miss a number of key ingredients. Before evaluating outcomes and impacts, the evaluator must be sure of what actually took place. Look for unintended consequences.

Second, although the grantee is required to address DOL metrics, please, please, please work with the grantee to explore the difference between outcomes and impact, and stress the importance of exploring grantee questions that go beyond what DOL requires. Many TAACCCT grants are experimenting with significant changes to traditional program structures, delivery systems, and student support processes. By working with the grantee to outline important questions that go beyond DOL data metrics, the evaluator can help direct attention to what is being learned during the project. By focusing on what the grantee is learning, the evaluator and the grantee are more likely to discover what strategies actually took place, which strategies worked, and why certain strategies proved successful.

Q. What questions do you have for others about TAACCCT evaluation?
A. This is a great question. Efforts like TCI are creating opportunities for reflection and for sharing experiences, data, and lessons learned from the TAACCCT projects. Grantees and evaluators are often so busy “doing the grant” and examining project-specific data that they don’t schedule time for data reflection and interpretation. Keeping in mind the importance of determining what was learned, we would like to learn from others how they have balanced the need for DOL compliance reporting with the need to spend more time examining and discussing data/evaluation results in an effort to support continuous improvement and scaling.

Q. Do you have tools, reports or other products you are willing to share?
A. Since the evaluation team may not be part of the grant-writing stage, a disconnect between grant implementers and grant evaluators can occur as the project is starting. To help alleviate this problem we employ the IDID Model (Inquire, Discover, Interpret & Develop) to help ensure that grant designers, grant implementers, and evaluators stay connected.

The IDID Model

  • Inquire: What is it that we want to know, and how can we best define specific compliance and evaluation/policy questions?
  • Discover: What data are currently available? What data gaps exist and how will such gaps be addressed?
  • Interpret: How will the Evaluator partner with the Grant Team and Grant Stakeholders to interpret data on a continuous basis?
  • Develop: How will the Evaluator partner with the Grant Team to examine strategies for scaling and sustainability?

John Cosgrove can be reached at and Maggie Cosgrove can be reached at