Improving Evaluation Rigor: A Framework for Data Quality and Use in Observational Studies
Thursday, November 13, 2025
10:20 AM - 10:25 AM CST
This presentation introduces a practical framework for improving data quality and data use in program evaluations built on observational study designs. The framework rests on two interdependent pillars: data accuracy and efficient data utilization. Although data scientists, evaluators, and methodologists often treat these domains separately, integrating them is essential for producing credible, actionable results. We propose a cohesive process that aligns data validation techniques with evaluation practice. Using cross-validation on existing program datasets, we demonstrate how targeted improvements, applied during data collection and preparation, can mitigate common issues such as missing values, fabricated responses, and inconsistent entries. The approach includes strategies for addressing both intentional and unintentional data gaps and improves reliability in settings where randomized controlled trials are not feasible. The framework offers a replicable model for strengthening the methodological rigor and practical impact of community-based and Extension program evaluations that rely on secondary or observational data.
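The abstract does not specify implementation details, but the classes of data-quality checks it names (missing values, fabricated responses, inconsistent entries) can be sketched concretely. The following Python is one illustrative screen of that kind; the item names, Likert scale bounds, and straight-lining threshold are hypothetical assumptions, not part of the presented framework:

```python
# Illustrative data-quality screen for observational program data.
# Item names, valid ranges, and thresholds below are hypothetical.

VALID_RANGE = (1, 5)  # assumed Likert scale bounds


def screen_record(record):
    """Return quality flags for one survey record.

    `record` maps item names to responses; None marks a missing value.
    """
    flags = []
    values = list(record.values())
    # Missing values (intentional or unintentional data gaps)
    if any(v is None for v in values):
        flags.append("missing")
    observed = [v for v in values if v is not None]
    # Inconsistent entries: responses outside the valid scale
    if any(not (VALID_RANGE[0] <= v <= VALID_RANGE[1]) for v in observed):
        flags.append("out_of_range")
    # Possible fabrication: identical answers to every item ("straight-lining")
    if len(observed) >= 3 and len(set(observed)) == 1:
        flags.append("straightline")
    return flags


def screen_dataset(records):
    """Count how often each quality flag occurs across a dataset."""
    summary = {}
    for rec in records:
        for flag in screen_record(rec):
            summary[flag] = summary.get(flag, 0) + 1
    return summary
```

In practice such a screen would run during data preparation, so that flagged records can be corrected at the source or handled explicitly in the analysis rather than silently biasing results.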