Project evaluation, carefully designed and conducted in collaboration with project developers, can help any broader or societal impact project improve its deliverables, achieve its goals, and measure the degree to which its targeted objectives have been met. For many funded projects, evaluation is a required component. Sometimes the evaluation must be conducted by an external (outside of UA) evaluator. Other times, it is sufficient to use an independent (not affiliated with your project or department/unit) internal evaluator.
Proposal developers should contact an evaluator at least three months in advance of the proposal’s due date—earlier if possible. A good evaluation plan should be closely aligned with the project’s goals and activities. To achieve good alignment, the evaluator needs time to review a draft of the proposal, ask questions, and develop a sound evaluation plan. With short notice, some evaluators may offer to provide a generic evaluation plan. However, seasoned proposal reviewers will give your proposal a more favorable review if it has a well-integrated, tailored evaluation plan.
Whether you are writing a new proposal or are interested in evaluating an existing program, Societal Impact can help you to get started with your evaluation planning. While we may not have the capacity to conduct a project evaluation, we can help you design your project evaluation and provide periodic check-ins, or we can help to connect you with an independent internal (UA) evaluator or external evaluator.
The best place to start with any project evaluation is by completing a logic model for the project. A logic model helps you identify the project's resources, activities, outputs (observable products or completed activities), and outcomes (short-, mid-, and long-term impacts). Once the logic model is complete, it can help you plan and monitor the project evaluation. Using the logic model as a guide, you will want to determine your evaluation questions, which will then inform an evaluation data matrix. Please explore the additional evaluation resources below, or request a consultation with our team.
Logic Model Template
Download the PDF
Indigenous Evaluation Toolkit
For more information about the Indigenous Evaluation Toolkit, visit indigenousphi.org
Download the PDF
Logic Models: A tool for designing and monitoring program evaluations
Download the PDF
EvaluATE: Evaluation Data Matrix Template
Download the PDF
Evaluation Questions Checklist for Program Evaluators
Download the PDF
The 2010 NSF User-Friendly Handbook for Project Evaluation
Download the PDF
Broader Impacts Project Evaluation
This website presents a number of tools for evaluating your broader impacts projects.
Evaluation Flash Cards: Embedding Evaluative Thinking in Organizational Culture
Download the PDF
Principal Investigator’s Guide: Managing Evaluation in Informal STEM Education Projects
Download the PDF
Evaluating Learning Outcomes from Citizen Science
Download the PDF
Finding and Selecting an Evaluator
Download the PDF
Evaluate Undergraduate Research
Visit the "Evaluate UR" website
NSF Framework for Evaluating Impacts of Broadening Participation Projects
Download the PDF
Program Evaluators
| Evaluator | Affiliation | Area of Specialization |
|---|---|---|
| Sanlyn Buxner | Department of Teaching, Learning, and Sociocultural Studies - University of Arizona | Extensive experience in federally funded grant program evaluation; science education projects |
| Melodie A. Lopez, Veronica Hirsch, Nicholas Wilson | Indigenous Strategies | Utilizes Indigenous frameworks in evaluation and program development, including tenets from Indigenous Data Sovereignty and the AIHEC Indigenous Evaluation Framework. We assess both quantitative and qualitative measures of the 7Rs (Respect, Relevance, Reciprocity, Responsibility, Relatedness, Relationships, and Redistribution). Evaluations are centered in Native nation building. |
| Carol Haden | | STEM programs |
| Jill Williams | Southwest Institute for Research on Women - University of Arizona | Process, impact, and outcome evaluations. Foci: diversity, inclusion, and equity-focused programs and initiatives; STEM educational programs. Methods: qualitative (interviews, focus groups, observation) and descriptive statistics. Approach: I'm most interested in conducting evaluations that are collaborative with the project team and develop in response to changing circumstances/needs. |
| Jo Korchmaros, Director (jkorch@arizona.edu); Beth Meyerson, Director of Graduate Studies (bmeyerson@arizona.edu) | Southwest Institute for Research on Women - University of Arizona | Sexual health; housing insecurity; substance abuse treatment; public health; social policy. We also offer a 100% online master's degree in Program Design and Evaluation. |