Large-scale evaluation efforts and their implications for the field
Abstract: The BUILD initiative is part of the Diversity Program Consortium, which the National Institutes of Health funded to increase diversity in biomedical research. This chapter aims to identify implications for the field from the multisite evaluation of BUILD initiative programs by reviewing the work undertaken by the authors of the other chapters in this issue. Given the complexities involved in multisite evaluations, innovative approaches and methods were used to balance the needs of each site with the overall objectives of the broader initiative. These approaches included a flexible orientation to the evaluation, mix...
Source: New Directions for Evaluation - August 8, 2022 Category: Universities & Medical Training Authors: Tarek Azzam Tags: ORIGINAL ARTICLE Source Type: research

Advice from local/site evaluators: How to manage “up” within a large‐scale initiative
Abstract: BUilding Infrastructure Leading to Diversity (BUILD), an initiative of the National Institutes of Health (NIH), provides grants to undergraduate institutions to implement and study innovative approaches to engaging and retaining students from diverse backgrounds in biomedical research. The NIH awarded BUILD grants to 10 higher education institutions in multiple states, including funding for local evaluations. This chapter presents findings from an online survey and interviews with 15 local evaluators from nine of the 10 BUILD sites. Participants shared their perspectives on the role of professional local evaluators...
Source: New Directions for Evaluation - August 8, 2022 Category: Universities & Medical Training Authors: Melanie Hwalek, Matt Honoré, Shavonnea Brown Tags: ORIGINAL ARTICLE Source Type: research

Theoretical and conceptual frameworks across local evaluation efforts in a nationwide consortium
Abstract: This paper describes the theoretical and conceptual frameworks used to guide the site-level evaluations of Building Infrastructure Leading to Diversity (BUILD) programs, part of the Diversity Program Consortium (DPC), funded by the National Institutes of Health. We aim to provide an understanding of which theories informed the evaluation work of the DPC and how the frameworks guiding BUILD site-level evaluations are conceptually aligned with one another and with the consortium-level evaluation. (Source: New Directions for Evaluation)
Source: New Directions for Evaluation - August 8, 2022 Category: Universities & Medical Training Authors: Christina A. Christie, Carmel R. Wright Tags: ORIGINAL ARTICLE Source Type: research

Understanding the context and appreciating the complexity of evaluating the Diversity Program Consortium
Abstract: The National Institutes of Health (NIH) made a sizeable investment in developing a scientific approach to understanding how to best increase diversity in the NIH-funded workforce by fostering inclusive excellence at a national scale through the Diversity Program Consortium (DPC). This chapter provides an overview of the context in which the consortium-wide evaluation study has taken place to provide readers with an understanding of its level of complexity. This evaluation effort is the first large-scale, national, systemic, longitudinal evaluation of harmonized interventions focused on undergraduate biomedical rese...
Source: New Directions for Evaluation - August 8, 2022 Category: Universities & Medical Training Authors: Lourdes R. Guerrero, Teresa Seeman, Heather McCreath, Nicole M.G. Maccalla, Keith C. Norris Tags: ORIGINAL ARTICLE Source Type: research

Describing engagement practices for the Enhance Diversity Study using principles of Tailored Panel Management
Abstract: The purpose of this chapter is to examine engagement strategies used in a large, multisite evaluation study through the lens of Estrada, Woodcock, and Schultz's (2014) tailored panel management. The evaluation, called the Enhance Diversity Study (EDS), is part of an effort funded by the National Institutes of Health (NIH) to increase diversity in NIH-funded research. The chapter discusses engagement with a large national cohort of student participants and outlines survey administration complexities, tailored engagement approaches, and annual survey response trends. It shows how the EDS expanded Estrada and colleagu...
Source: New Directions for Evaluation - August 8, 2022 Category: Universities & Medical Training Authors: Karina D. Ramirez, Cynthia J. Joseph, Hansook Oh Tags: ORIGINAL ARTICLE Source Type: research

The funders’ perspective: Lessons learned from the National Institutes of Health Diversity Program Consortium evaluation
Abstract: Advancing diversity in the biomedical research workforce is critical to the ability of the National Institutes of Health (NIH) to achieve its mission. The NIH Diversity Program Consortium is a unique, 10-year program that builds upon longstanding training and research capacity-building activities to promote workforce diversity. It was designed to rigorously evaluate approaches to enhancing diversity in the biomedical research workforce at the student, faculty, and institutional level. In this chapter we describe (a) the program's origins, (b) the consortium-wide evaluation, including plans, measures, challenges, an...
Source: New Directions for Evaluation - August 8, 2022 Category: Universities & Medical Training Authors: Kenneth D. Gibbs, Christa Reynolds, Sabrina Epou, Alison Gammie Tags: ORIGINAL ARTICLE Source Type: research

Implementing case study design to evaluate diverse institutions and STEM education contexts: Lessons and key areas for systematic study
We describe lessons learned from the case study design used for the evaluation of BUILD that apply to administrators of STEM initiatives who are interested in case study methods and to evaluators who are familiar with case studies and tasked with program evaluation of a multisite STEM program. These lessons include practical considerations for logistics and the importance of clarifying the goals of the case study design within the larger program evaluation, fostering the continuation of knowledge within the evaluation team, and embedding trust building and collaboration throughout all stages of the case study. (Source: N...
Source: New Directions for Evaluation - August 8, 2022 Category: Universities & Medical Training Authors: Krystle P. Cobian, Damani Khary White-Lewis, Sylvia Hurtado, Hector V. Ramos Tags: ORIGINAL ARTICLE Source Type: research

A meta-analysis approach for evaluating the effectiveness of complex multisite programs
Abstract: The National Institutes of Health (NIH) created the Building Infrastructure Leading to Diversity (BUILD) initiative to incentivize undergraduate institutions to create innovative approaches to increasing diversity in biomedical research, with the ultimate goal of diversifying the NIH-funded research enterprise. Initiatives such as BUILD involve designing and implementing programs at multiple sites that share common objectives. Evaluation of initiatives like this often includes statistical analyses that combine data across sites to estimate the program's impact on particular outcomes. Meta-analysis is a statistical ...
Source: New Directions for Evaluation - August 8, 2022 Category: Universities & Medical Training Authors: Catherine M. Crespi, Krystle P. Cobian Tags: ORIGINAL ARTICLE Source Type: research
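To illustrate the kind of cross-site pooling such an approach involves, below is a minimal sketch of a DerSimonian-Laird random-effects meta-analysis in Python. The site effect estimates and standard errors are invented for illustration only and are not drawn from the BUILD evaluation or from this chapter's analyses.

# Minimal sketch: DerSimonian-Laird random-effects meta-analysis of
# hypothetical site-level effect estimates (illustrative values only).
import numpy as np

# Per-site standardized effect estimates and their standard errors (hypothetical).
effects = np.array([0.25, 0.10, 0.40, 0.18, 0.32])
std_errs = np.array([0.12, 0.15, 0.10, 0.20, 0.14])

within_var = std_errs ** 2
w_fixed = 1.0 / within_var  # inverse-variance (fixed-effect) weights

# Cochran's Q and the DerSimonian-Laird estimate of between-site variance tau^2.
fixed_pooled = np.sum(w_fixed * effects) / np.sum(w_fixed)
q = np.sum(w_fixed * (effects - fixed_pooled) ** 2)
df = len(effects) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights account for both within- and between-site variance.
w_random = 1.0 / (within_var + tau2)
pooled = np.sum(w_random * effects) / np.sum(w_random)
pooled_se = np.sqrt(1.0 / np.sum(w_random))

print(f"tau^2 = {tau2:.4f}")
print(f"pooled effect = {pooled:.3f} (SE {pooled_se:.3f})")
print(f"95% CI = [{pooled - 1.96 * pooled_se:.3f}, {pooled + 1.96 * pooled_se:.3f}]")

The random-effects model is shown here because multisite programs like BUILD typically differ in implementation, so allowing between-site heterogeneity (tau^2) is usually more defensible than assuming a single common effect.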

An Evaluation Roadmap for a more effective government
Abstract: Consistent with the American Evaluation Association's (AEA) mission, the Evaluation Roadmap for a More Effective Government is a document approved by AEA members and the Board that outlines a vision for the role of evaluation in the federal government. The Roadmap describes steps to build government capacity for strengthening the practice of evaluation throughout program life cycles and for decision-makers to use evaluation to inform policymaking. The Evaluation Roadmap was first developed in 2009 and revised by AEA in 2019 to share the lessons learned in agencies that have applied evaluation. Several age...
Source: New Directions for Evaluation - April 29, 2022 Category: Universities & Medical Training Authors: American Evaluation Association Evaluation Policy Task Force Tags: ORIGINAL ARTICLE Source Type: research

Issue Information
(Source: New Directions for Evaluation)
Source: New Directions for Evaluation - April 27, 2022 Category: Universities & Medical Training Tags: ISSUE INFORMATION Source Type: research

Editors’ notes
(Source: New Directions for Evaluation)
Source: New Directions for Evaluation - April 27, 2022 Category: Universities & Medical Training Authors: Melvin M. Mark, Nicholas R. Hart Tags: EDITORIAL Source Type: research

The importance of implementation: Putting evaluation policy to work
Abstract: Federal agencies are increasingly expected to write and implement guidance for program evaluation, also known as evaluation policies. The Foundations for Evidence-Based Policymaking Act required such policies for some federal agencies, and guidance from the White House Office of Management and Budget outlined an expectation that all agencies develop evaluation policies. Before these expectations, many federal agencies were already developing such policies to suit organizational needs and contexts. This chapter details findings from interviews with stakeholders at ten federal agencies and offices that developed and ...
Source: New Directions for Evaluation - April 27, 2022 Category: Universities & Medical Training Authors: Leslie A. Fierro, Alana R. Kinarsky, Carlos Echeverria-Estrada, Nadia Sabat Bass, Christina A. Christie Tags: ORIGINAL ARTICLE Source Type: research

The future of evaluation policy
Abstract: We highlight some key issues regarding evaluation policy, including themes that emerged across chapters of this volume. These topics include what an evaluation policy is, the kind of content that evaluation policies can have, learning agendas (which are an increasingly common component of evaluation policies, especially at the U.S. federal level), the processes by which evaluation policies are developed and implemented, the role of relationships in evaluation policies, and the consequences of evaluation policy. We briefly highlight how the chapters in this volume offer guidance to those involved with developing, im...
Source: New Directions for Evaluation - April 27, 2022 Category: Universities & Medical Training Authors: Melvin M. Mark, Nicholas R. Hart Tags: ORIGINAL ARTICLE Source Type: research

Putting it all together: The case of the U.S. Department of Labor's evidence-building strategy
Abstract: This chapter describes how and why the U.S. Department of Labor (DOL) structured and implemented a comprehensive evidence-building strategy in the years ahead of the federal legislation that now requires many of the same key components. In 2010, the Chief Evaluation Office was established in DOL at the departmental level to coordinate evaluation strategy and evidence building and to promote an organization-wide culture of learning. This represented a new approach intended to elevate the priority on evidence, improve the scope and quality of evaluations and research, and expand the use of evidence. The DOL strategy ...
Source: New Directions for Evaluation - April 27, 2022 Category: Universities & Medical Training Authors: Molly Irwin, Demetra Smith Nightingale Tags: ORIGINAL ARTICLE Source Type: research

Learning agendas: Motivation, engagement, and potential
Abstract: In 2017, the U.S. Commission on Evidence-Based Policymaking recommended that federal agencies produce strategic plans focused on research and evaluation, referred to as learning agendas. This requirement was later incorporated into the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act) for the 24 largest federal agencies. Prior to the Evidence Act, only a few federal agencies had experimented with learning agendas, a relatively new concept in the evaluation literature. Learning agendas hold potential for supporting organizational strategic planning that focuses on the generation of relevant know...
Source: New Directions for Evaluation - April 27, 2022 Category: Universities & Medical Training Authors: Kathryn Newcomer, Karol Olejniczak, Nicholas Hart Tags: ORIGINAL ARTICLE Source Type: research