We invite you to explore our case studies and learn how our evaluation coaching process has helped organizations develop and evaluate their own outcomes to refine their programs like never before.
Robert R. McCormick Foundation
As one of the largest philanthropic organizations in the nation, the Robert R. McCormick Foundation has a mission to foster communities of educated, informed and engaged citizens. Behavioral health and prevention services are among the many causes the Foundation supports.
There is often a gap between what a funder needs and its grantees' realities. The McCormick Foundation sought to bridge this gap by bringing together 29 social service agencies it funds and exploring how to best measure and improve evaluation within the child abuse prevention and treatment funding area. Foundation staff spoke with grantees one-on-one, introducing the idea of unifying outcomes by creating uniform evaluation tools, building evaluation communities and providing capacity-building support. With buy-in from the 29 grantees, the Foundation launched the Unified Outcomes Project, partnering with PIE for evaluation coaching.
We approached the Project in three distinct phases.
In Phase One, foundation personnel and our evaluation coach met with all 29 grantees to listen, learn and discuss how data were collected and reported. The group collectively agreed on three areas on which to focus their evaluation efforts. The grantees were then divided into three cohorts matching those areas—these cohorts became communities of practice.
We surveyed grantees about the assessment tools they most preferred. The results showed that the grantees were largely aligned on six common tools they were willing to use. Our evaluation coach then trained each cohort on implementing the tools and developing common protocols for all grantees to follow.
In Phase Two, we facilitated in-person meetings with each community of practice. Coaching support was offered at three levels of intensity. Grantees who chose the lowest level simply participated in cohort meetings throughout the year. The remaining grantees, in effect, all chose the most intensive level, which included cohort meetings, one-on-one coaching and support on additional items such as logic models and using data for program improvement.
In Phase Three, we added benchmarking grantee practices as the evaluation coaching and capacity building work continued. Based on the cohorts’ feedback, the Foundation revamped the grant application as well as the rubric for assessing the application.
Convening grantees and the Foundation created a spirit of collaboration that ultimately led to program improvement. The communities of practice and coaching wove grantees into a network of learning and shared reflection on evaluation. With unified measurement and assessment, developed directly by the stakeholders, now in place, the McCormick Foundation and its grantees have a new way to build consensus and deliver the greatest value to the populations they serve.
For a deeper dive, read the published research on this project or view the video below.
The Unified Outcomes Project
National Museum of Mexican Art
Chicago’s Pilsen neighborhood is the heart of the Mexican community and home to the National Museum of Mexican Art. The museum embraces its responsibility to educate its patrons about the breadth and depth of Mexican art, culture and history. Education is so important that one-fourth of the museum’s annual operating budget is allocated to it and one-third of its full-time staff are educators.
Like many nonprofits, the museum first engaged in program evaluation after receiving a grant with funds specifically set aside for external evaluation. However, with the exception of the Education Director, none of the staff had ever worked with an evaluator before. Previous evaluation activities were limited to qualitative questionnaires, so decision making was largely anecdotal.
PIE began working with the museum as an external evaluator, providing tools, database development, data collection, data entry, data analysis and reporting. Then, our evaluator moved into the role of coach, helping the staff collaboratively develop the skills and knowledge to conduct their own internal program evaluations on a limited budget.
The evaluation coaching approach began with on-site professional development for the museum’s education staff in a large group setting. During these workshops, the coach taught about the purpose of evaluation, how to create logic models, data collection methods and more.
Coaching also included one-on-one support with staff members. Developing rapport and trust with staff, our coach built the evaluation capacity of each individual, meeting them where they were in their understanding of the evaluation cycle and tailoring their training accordingly.
Evaluation coaching helped museum staff develop and internalize a culture of evaluation. They began to take ownership of their program reporting with confidence, and used what they learned to improve their grant writing. Today, the museum’s education department lives and breathes evaluation, allowing them to further deliver on their mission of celebrating and teaching the community the rich history of Mexican art.
For more information, access the published research article.
South Suburban Family Shelter
South Suburban Family Shelter provides services to over 1,000 adult victims of domestic violence and 100 children per year. The shelter takes a holistic approach to fighting domestic violence, with seven separate programs that provide community education, counseling and advocacy throughout the South Suburbs of Chicago, including a separate intervention specifically for abusers.
With seven separate programs, the agency faced a problem common to small nonprofits without an internal evaluation person: an unmanageable number of different mandated tools and processes for handling data. Staff struggled to track data across paper files, many internal tools, and funder/partner data systems. Without dedicated evaluation staff to organize the flow of data collection and reporting, the director of counseling became the de facto respondent to all data requests, spending approximately 40 hours per month on reporting.
The director first asked PIE to develop a data collection and reporting protocol for one specific tool. After discovering the clarity this new protocol provided, she asked PIE to provide a streamlined protocol for all data collection and reporting. We took a three-stage approach.
First, we conducted a logic model workshop for all staff and board members to identify their theories of change, aligning their philosophy of service with daily activities and SMART outcomes.
Next, we worked closely with the director to hold one-on-one discussions with each program to clarify expectations and strategically plan how each of the programmatic outcomes aligned with agency outcomes and mission. We also documented reporting requirements for each program.
Collaborating to define unified outcomes across the agency was a huge step in strategic planning and staff professional development. The logic models also provided institutional memory for the agency when the director retired and a new director was hired. Most impactful, the new automated reporting protocol saves the director approximately 35 hours per month that she previously spent tracking down data and organizing it into reports for funders and partners.
Loyola University Chicago
Since 1983, the Lloyd A. Fry Foundation has been committed to supporting organizations that tackle the toughest, most persistent problems in urban Chicago. The Foundation’s mission specifically states its goal to “build the capacity of individuals and the systems that serve them.” While the Fry Foundation awards grants across many program areas, funding programs that strengthen leaders in public schools is a cornerstone of the foundation’s activities.
The problems facing Chicago Public Schools (CPS) are complex and require strong, transformative leadership. In response to this need, Mayor Rahm Emanuel announced the Chicago Leadership Collaborative (CLC). The CLC, in partnership with ten universities, strives to recruit, train, support and retain effective principals who can meet the evolving challenges of the diverse school district.
Loyola University is one of these ten partners. Its three-year Principal Preparation Program is designed to give candidates the necessary skills and knowledge to graduate with an M.Ed. in Administration & Supervision with Principal Endorsement.
Due to its unique partnership with CLC, Loyola needed to develop formative evaluation tools to improve program quality and program documentation in alignment with CLC’s criteria. Loyola also sought to provide the support of an evaluation coach to its Leadership Coaches (principals who serve as mentors to the Program’s candidates).
With a grant from the Fry Foundation, PIE partnered with Loyola to:
- Define the purpose of the formative evaluation;
- Clarify assessment goals and objectives;
- Determine key assessment needs;
- Develop, pilot and refine formative evaluation tools;
- Collect data;
- Share results;
- And develop the evaluation feedback loop to ensure that evaluation became part of the program’s process rather than an external add-on activity.
In collaboration with Loyola faculty and Leadership Coaches, our work involved creating:
- A logic model to be used as a strategic planning tool;
- An e-Portfolio onboarding assessment;
- A site visit summary form;
- A mentor principal meeting log;
- Formative satisfaction questionnaires for special sessions;
- A syllabus with grading rubric for the internship;
- A sortable database of previous graduates for future longitudinal analysis;
- And an audit form for the candidate file.
All of the Leadership Coaches in the Principal Preparation Program have increased their ability to conduct formative evaluation with the tools we created. These tools automated reporting processes and systematized them into the programmatic calendar, creating a “pathway of convergence” where coaches, candidates, and faculty can instantly provide feedback and monitor progress.
In addition to developing the tools for internal formative data collection, our evaluation coach helped Loyola faculty and Leadership Coaches more clearly define the purpose of each tool, laying the groundwork for a systematized reporting system to be developed during years three and four of the Program.
In the first two years of program implementation, we integrated all interviews, document analyses, observations, and literature review into a comprehensive case study for Program stakeholders. Candidates, Leadership Coaches, and faculty have all used this case study to re-examine their role in relation to that of the student, parent, classroom teacher, and school initiatives as a way to create and identify shared purposes and practices connected through their leadership.
Finally, the faculty has found tremendous value in having PIE provide an outside perspective on the measurement and reporting necessary to create an evaluation cycle within the Program. Our drive to leverage technology created efficiencies in the onboarding process, moving it from paper to e-portfolios. We created a shared Box drive to expand the digital tools available to stakeholders, enabling them to quickly upload data as well as easily retrieve and view it.