One of the challenges of demonstrating success for STEM-focused programs is measuring impact. Funders of STEM programs find it difficult to compare peer programs when each nonprofit reports its impact using different language and different measures of success. Additionally, STEM outcomes span many criteria depending on subject area, intensity of programming, and proficiency level of beneficiaries.

YWCA El Paso del Norte’s After School STEM Program offers focused attention in math to school-age children, primarily in grades K-5, at local public schools. Instruction from highly qualified tutors is offered in an informal setting during the fall and spring academic terms as well as intersession, spring, and summer breaks. Participants who engage with the curriculum will demonstrate improved grades in math and attitudinal changes toward math (increased interest and reduced anxiety).

“Longitudinal analysis allows the agency to eliminate anomalous outcomes from its data reporting and provides funders, donors, clients, and other stakeholders with the most accurate program impact data. For instance, evaluations conducted during a previous grant cycle may have yielded more striking results for the student grade improvement outcomes, but over multiple cycles, longitudinal analysis provides the agency with a more realistic picture of the program’s impact on student academic outcomes.” – Gus Cohen, grant writer at YWCA El Paso del Norte

The After School STEM Program demonstrates high-quality impact measurement by conducting a longitudinal analysis with a sample of its students (over a third) to track grades and gains in math performance from the first nine-week grading period through the third nine-week period. The program uses a measurement that is proximal to the beneficiary population (students) and has high validity. Additionally, its pre/post measurement of whether students improved their grades fits closely with the STEM Proficiency outcome criteria. Based on this data, YWCA El Paso del Norte Region’s After School STEM Program demonstrates an efficacy rate of 74%, slightly below the benchmark range for STEM Proficiency (84-89%).
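To make the pre/post calculation concrete, here is a minimal illustrative sketch of how an efficacy rate like the 74% figure could be derived: the share of sampled students whose math grade improved between the first and third nine-week periods, checked against a benchmark range. This is not YWCA’s actual methodology or data; all student records and values below are hypothetical.

```python
# Illustrative sketch only: hypothetical data, not YWCA's actual records or methodology.

# Each record holds a sampled student's math grade (0-100) at the first
# and third nine-week grading periods.
students = [
    {"id": "S01", "first_period": 68, "third_period": 81},
    {"id": "S02", "first_period": 75, "third_period": 72},
    {"id": "S03", "first_period": 80, "third_period": 88},
    {"id": "S04", "first_period": 59, "third_period": 74},
]

# Efficacy rate: share of sampled students whose math grade improved
# from the first to the third nine-week period.
improved = sum(1 for s in students if s["third_period"] > s["first_period"])
efficacy_rate = improved / len(students)

# Hypothetical benchmark range for the STEM Proficiency outcome.
benchmark_low, benchmark_high = 0.84, 0.89

print(f"Efficacy rate: {efficacy_rate:.0%}")
if efficacy_rate < benchmark_low:
    print("Below the benchmark range for this outcome.")
elif efficacy_rate > benchmark_high:
    print("Above the benchmark range for this outcome.")
else:
    print("Within the benchmark range for this outcome.")
```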

IGP benchmarks give important context to outcomes and evaluation data. Without knowing the evidence quality of other STEM Proficiency programs, it might appear that the After School STEM Program is performing more poorly than its peers. When we layer in sector evidence-quality benchmarks, however, we see that its impact data is stronger than average: just 55% of programs in the sector use similarly robust evidence. The program’s success rate may actually be more accurate than that of programs with higher efficacy rates that implement lower-quality evaluation. Put simply, by demonstrating a high quality of evidence, the program gives its funders greater confidence in its success.

Establishing and maintaining high-quality measurement isn’t an easy lift for an organization and requires strategic planning. As Diana Hastings, YWCA Senior Programs Administrator, notes:

“Ongoing measurement and reporting require substantial personnel investment to be done correctly. They also require that personnel have a clear understanding of the program and the funder’s desired outcomes. For that reason, as the agency has begun to understand measurement more comprehensively for our various programs, we have started to budget for evaluation in grant proposals. Proper performance measurement requires a significant time commitment from staff. Consistent staffing with low rates of turnover facilitates a consistent process of ongoing performance measurement. Also, performance measurement will always be prioritized below direct client services. If staff have to be reallocated to ensure that childcare ratios are in compliance with licensing requirements, for instance, data collection and reporting are often the first things to suffer.”

To nonprofits with similar programming that are looking to improve the quality of their evaluation and measurement, Diana Hastings recommends that they “recognize that performance measurement and program evaluation are investments with the potential to yield returns. As resources allow, budget for evaluation in grant proposals, and work to fund positions that can facilitate data collection and reporting.”

Are you a nonprofit interested in learning more about measuring your impact? Or are you a funder who wants to learn how to better support grantees’ evaluation? Go to impactgenome.org to check out our Impact Genome outcomes taxonomy, explore levels of evidence, and see how your nonprofit or a program you fund compares to peer programs.