Research & Evaluation

Research and evaluation are essential to continue expanding the reach and impact of infant and early childhood mental health consultation (IECMHC). By gathering, analyzing, and reporting data, programs can learn about what they have been doing well, as well as what can be improved or made more efficient. Further, evaluations are essential for educating others – including policymakers, funders, and families – about the value of IECMHC.

The foundation of a good program evaluation is a well-defined program that is implemented with fidelity. The Center of Excellence has compiled a wide array of resources to help ensure that programs are familiar with IECMHC Basics and can conceptualize and articulate their program design so that it is implemented as intended. These foundational pieces set the stage for a strong program evaluation, but the resources in this section – including step-by-step guidance – will help programs progress in their evaluation efforts.


Advancing the Evidence Base for IECMHC: Guidance for Evaluators and Program Partners

This guide is intended to help programs design and implement an approach to evaluation that is intentional, realistic, and driven by their priorities. Also check out the accompanying Interactive Evaluation Plan Worksheet.


Program Evaluation Tools

Program Planning and Reflection Tool (PPRT) This program-level online assessment tool helps users evaluate progress in key IECMHC program implementation areas while providing guidance on how to put these elements into place. An entire domain is devoted to program evaluation and covers: creating an evaluation team; identifying research questions; designing the evaluation; data collection & analysis; and communicating results. Users can complete an interactive version online, or simply download PDF versions for strategic planning. Access PDF files here: Program Evaluation PPRT Module only | Entire PPRT (all five modules).

Early Childhood Mental Health Consultation: An Evaluation Tool Kit. This tool kit provides guidance, tools, and resources to assist in designing and implementing program evaluations. For researchers, it also offers guidance on research design and evaluation, determining evidence of effective practices, and decisions about tools and approaches for partnering with program staff.

Advancing the Evidence Base: Guidance for Evaluators and Program Partners This guide illustrates the different purposes for which data may be gathered and analyzed in the context of IECMHC, organized by the goals of the program. It is intended to help programs design and implement an approach to evaluation that is intentional, realistic, and driven by their priorities.

Evaluation Plan Template This document outlines the elements of a comprehensive IECMHC evaluation plan. Teams may write an evaluation plan as they design their own evaluations and seek funding for it, or they may request written evaluation plans if employing an external evaluator. This outline is intended to be used as a resource for teams seeking to understand what to include in a written evaluation plan. It is not strict guidance; rather it provides a starting point that teams may adapt to suit their goals. Ultimately, the goal is to gain clarity and group agreement on the key questions to be answered, the methods to be used to answer them, and the resources that will be dedicated to the undertaking.

Step-By-Step Program Evaluation Design Guidance

CoE Cycle of Evaluation Graphic

Five Steps for Evaluating IECMHC. This comprehensive, step-by-step guidance helps programs design and build upon their IECMHC program evaluations. Within each expandable step below, users will find a diverse array of evaluation insights and resources to support IECMHC programs and program evaluators every step of the way.

To organize concepts for both program development and evaluation purposes, theories of change and logic models serve as “roadmaps” that graphically depict the connections between the community context, the actions to be undertaken, and the desired outcomes. Please see the documents below for examples of theories of change and logic models as well as guidance for creating your own.
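For teams that want to draft such a roadmap programmatically, below is a minimal sketch using Python's third-party graphviz package (an assumption on our part, not a CoE tool; it requires the Graphviz binaries to be installed). The stages and arrows are generic placeholders, not an official CoE theory of change.

```python
# Minimal sketch, assuming the graphviz package and Graphviz binaries
# are installed; stages below are illustrative placeholders only.
from graphviz import Digraph

logic_model = Digraph("iecmhc_logic_model", format="png")
logic_model.attr(rankdir="LR")  # draw the roadmap left to right

stages = ["Community context", "IECMHC activities",
          "Consultee outcomes", "Child & family outcomes"]
for stage in stages:
    logic_model.node(stage, shape="box")
for upstream, downstream in zip(stages, stages[1:]):
    logic_model.edge(upstream, downstream)

logic_model.render("iecmhc_logic_model", cleanup=True)  # writes a PNG
```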

The evidence base for Infant and Early Childhood Mental Health Consultation has grown significantly in recent decades. Learning about evaluations of other IECMHC programs helps teams to develop their own questions that build upon the foundational evidence.

  • The Evidence Base for How & Why IECMHC Works: This article reviews the evidence base for Infant and Early Childhood Mental Health Consultation (IECMHC) and organizes the literature based on alignment with the CoE’s theory of change. The authors describe the current state of evidence for IECMHC, identify gaps in the literature, outline future research directions, and provide a call to action.
  • Reflection Guide for Empirical Articles about IECMHC: Through a series of checklist-style reflection questions, this brief resource guides readers in best practices for interpreting research studies about IECMHC. You will be empowered to understand the strengths, weaknesses, main points, and implications of various empirical articles and, ultimately, apply the findings to your own work.
  • Evidence Synthesis: This brief summarizes the status of the evidence for IECMHC, including both peer-reviewed journal articles and evaluation reports. Future directions for research are provided.
  • Searchable Evidence Database for IECMHC: Using this resource, you can explore the evidence base for IECMHC. In this searchable tool, you can view summaries of all peer-reviewed articles that have been published about IECMHC. You can search the article summaries based on key study details (such as research design and setting), and you can also explore the evidence supporting each element of the IECMHC theory of change. 
  • Annotated Bibliography: Many key findings from innovative studies can be found in peer-reviewed journal articles, but many professionals do not have access to these journals. The Center of Excellence has created a comprehensive Annotated Bibliography that summarizes the important contributions of each known, published study of IECMHC, cutting across different settings and outcomes of interest. In addition, papers that describe IECMHC practices or theories but do not present new analyses are also included. This resource will be updated annually to provide consultants, administrators, students, and all other interested individuals with current research findings on IECMHC. The Annotated Bibliography is intended to build general knowledge about IECMHC, facilitate writing about IECMHC (for grant applications, reports, etc.), and highlight future directions for research on IECMHC. In this resource, you will have access to:
    • Descriptions of each article’s unique contribution to the empirical knowledge of IECMHC
    • APA-format citations for each article
    • Abstracts for each article
  • Infant & Early Childhood Mental Health Consultation: Advancing the Evidence Base (video) This is a panel presentation at the National Research Conference on Early Childhood 2022, moderated by Dr. Deborah Perry, Director of Research & Evaluation at the Georgetown University Center for Child and Human Development (GUCCHD). It features IECMHC research findings from Arkansas, Louisiana, and New York City.
  • Program Evaluation Reports:
  • San Francisco: This exploratory program evaluation of the Early Childhood Mental Health Consultation (ECMHC) initiative in San Francisco—a collaborative effort between Clarity and Indigo Cultural Center—was three years in the making, punctuated by the many disruptions caused by the COVID-19 pandemic. The report presents findings from two phases: Phase 1 reports on the On-Call model and approach, and Phase 2 explores experiences, impact, and feedback on the ‘traditional’ ECMHC model from the perspective of consultation recipients.
  • Alameda County: An evaluation that measured both IECMHC process and outcomes was conducted to evaluate IECMHC services in Alameda County. Quantitative and qualitative data were collected from 2017 through 2019. Key findings included statistically significant increases in consultant self-efficacy, director self-efficacy, classroom emotional climate, children’s attachment, children's self-regulation, and children’s initiative. There was a statistically significant decrease in children's risk of expulsion and consultant hopelessness. In line with this program’s theory of change, consultants who received a higher ‘dosage’ of training and technical assistance reported higher self-efficacy, which was positively associated with improvements in child outcomes and improvements in emotional classroom climate. Further, more training and technical assistance predicted higher fidelity to the intervention standards, stronger relationships with teachers, and better outcomes for directors.
  • This report presents outcomes from four years of Arizona’s statewide IECMHC program, Smart Support. In the large sample (over 1,000 children and 100 mental health consultants), there were positive outcomes across domains, including improved teacher-child relationships, classroom climate, and child social-emotional skills.
  • This report describes outcomes from 37 centers that engaged in Arkansas’ IECMHC program, Project PLAY, which is unique in that it prioritizes child care centers serving children in foster care. The evaluators reported significant improvements in the use of developmentally appropriate social-emotional supports, classroom environments, and children’s behavior.
  • Colorado’s home visiting programs funded by MIECHV have incorporated IECMHC for home visitors since 2016. The evaluation team gathered interview and survey data from home visitors, supervisors, and consultants to report on the role of the consultant and the way consultation is implemented in home visiting. Barriers and facilitators to implementing IECMHC in this context are explored.
  • This report describes the first randomized controlled evaluation of an IECMHC program, Connecticut’s Early Childhood Consultation Partnership (ECCP), a 12-week model in which school-based services were provided to infants, toddlers, and preschoolers. Results indicated that ECCP yielded significant improvements in child hyperactivity and oppositionality, as well as increased communication between school and home, but did not significantly affect classroom climate. Additionally, information was provided regarding the budget for ECCP in comparison to alternative responses to challenging behaviors.
  • These reports present evaluation findings from the fourth and fifth years of D.C.’s IECMHC program, Healthy Futures, in which IECMHC consultants were embedded in child development centers, primarily in low-income neighborhoods. Among other positive impacts, analyses indicated that there were significant improvements in children’s social-emotional skills and reduced expulsion rates after child-focused consultation, as well as significant improvements in teacher-child interactions and reductions in teacher turnover after programmatic consultation. A unique finding in Year 5 was the effect of dosage of consultation, with a full year of consultation significantly predicting classroom- and individual-level improvements in behavioral concerns. Process outcomes and lessons learned were also articulated. Sample evaluation tools are included in the Appendix of Year 4’s evaluation.
  • This report presents findings from the three-year evaluation of Illinois’ statewide IECMHC program. This program evaluation is unique because it is the first study to investigate IECMHC in multiple settings (home visiting and ECE), and to use both a mixed-methods matched-comparison group design and multilevel modeling. Twenty-three early childhood programs (center-based child care and home visiting) participated in the evaluation; data from children and consultees were collected at baseline, 6, 12, and 18 months after the start of program implementation. Teachers and home visitors in the intervention group demonstrated greater improvement in reflective capacity over time than those in the comparison group. Teachers with greater reflective capacity rated children’s behavior as demonstrating more strengths and fewer behavioral concerns, and overall teachers in the intervention group indicated a lesser impact of child challenging behavior on their learning and relationships. In terms of the classroom, teachers in the intervention group reported more growth in positive classroom behavior management and classroom equity than the comparison group. Specific to home visiting, parents whose home visitors received consultation reported greater satisfaction in their role as a parent than did the comparison group, and home visitors in the intervention group were observed to engage in more responsive behaviors over time in their interactions with families. A wealth of qualitative data from interviews with staff were also analyzed and reported.

The Maryland State Department of Education (MSDE) Division of Early Childhood (DEC) has been committed to funding IECMHC work since 2002, initially through a pilot project that spanned three jurisdictions in the state. Since 2009, following positive outcomes from the pilot project, MSDE has used Child Care and Development Block Grant (CCDBG) dollars to fund 11 programs that together cover all 24 Maryland jurisdictions regionally. Through ongoing partnership with the Parent, Infant, Early Childhood (PIEC) team at the University of Maryland School of Social Work’s Institute for Innovation and Implementation, the MSDE-funded IECMHC system has grown to include a statewide data management system, regular production of reports and legislative briefs, and ongoing training, professional development, and opportunities for reflective supervision.

In 2016, Maryland was selected as one of 14 pilot sites nationwide to receive expert technical assistance through the SAMHSA-funded Center of Excellence (CoE) for IECMHC. This work resulted in new state standards, published in 2020, that utilize a multidisciplinary approach to consultation for the state’s workforce, ensuring at least one clinically licensed consultant per program. This approach was an effort to align the existing workforce (largely individuals with experience and credentials in education) with the national model, which emphasizes advanced mental health degrees. During this time we also engaged in efforts to support the whole workforce through training in mental health principles and reflective practice.

MSDE’s investment in ongoing evaluation of IECMHC efforts includes a statewide database of IECMHC activities called the Outcomes Monitoring System (OMS). The PIEC team works with all Maryland IECMHC programs to improve their data collection and assess the impact of services. This level of data reporting supported the Maryland state legislature in introducing and passing a bill that increased state funding for these services from $1.4 million to $3.0 million annually, beginning in 2023.

In 2021, we initiated ongoing consultation with the Indigo Cultural Center to support the integration of equity and anti-racism approaches throughout our model; this work continues. In 2022, MSDE dedicated ARPA funds to increase service capacity as well as funding for infrastructure to support universal systems such as onboarding, training and coaching, and reflective supervision.

  • A mixed-methods evaluation was conducted for Michigan’s IECMHC program, the Childcare Expulsion Prevention Program (CCEP), which served center- and home-based childcare settings statewide. Results indicated some positive findings for children (e.g., decreased hyperactivity, improved social skills), parents (e.g., increased empowerment to advocate for their child, decreased work/school problems), and childcare providers (e.g., increased sense of competence to manage challenging behavior). Additionally, the team reported on fidelity to the model and parent and provider satisfaction with CCEP.
  • Digging Deeper: This study represents a collaboration among the Alliance for the Advancement of Infant Mental Health, Indigo Cultural Center, and the Reflective Supervision Equity Roundtable. The charge of this study was to use a critical and community-forward approach in shaping the field of reflective supervision (RS) in infant and early childhood mental health (IECMH) and to advance a new RS paradigm and framework for the Alliance, influenced by expansive anti-racist, Indigenous, and liberatory frameworks. The aim was to engage in research that is with, by, and for BIPOC community members and professionals in the field, with a focus on contributing to a sustainable social justice movement within IECMH.
  • Portland State Centering Racial Equity: In Fall 2020, the Oregon Early Learning Division (ELD) contracted with Portland State University’s Center for Improvement of Child and Family Services (CCF) to develop a foundational document that would guide development and implementation of a model for providing statewide Infant and Early Childhood Mental Health Consultation (IECMHC) services. PSU’s charge from the ELD was to gather information that prioritized and centered the needs, experiences, and strengths of children, families, and early child care and education (ECE) providers of color. Rather than replicating an existing model that may not have been developed for, by, or with Black, Indigenous and People of Color (BIPOC) communities, the ELD saw this as an opportunity to create a system grounded in racial equity. The report summarizes information collected from key systems stakeholders, particularly those representing minoritized communities, and provides detailed recommendations for implementing an equity-focused system of IECMHC in Oregon.
  • In 2016, the Pennsylvania ECMH Consultation Project contracted with the Georgetown University Center for Child and Human Development to conduct an external review geared toward situating the program in a national perspective. By analyzing Pennsylvania’s program against insights from the national “What Works” (2009) study, the authors identified strengths of the program as well as areas for continued growth, which were then pursued by program leadership. Strengths included a strong data collection system; one suggestion was to hire a reflective supervisor. In addition, two years of program evaluation data were analyzed. Findings included improved child behavior, reduced teacher stress, and increased teacher adherence to the Pyramid Model.
  • The Center of Excellence highlights reflective supervision consultation (RSC) as an important part of its Theory of Change, but few evaluations look at RSC in the context of IECMHC. This program evaluation is unique as this report summarizes a pilot implementation of the Pennsylvania Key’s statewide RSC model for IECMH consultants, specialists, supervisors, and program managers. Over a 12-month period, the Pennsylvania Office of Child Development and Early Learning (OCDEL) in partnership with The Alliance for the Advancement of Infant Mental Health and the Eastern Michigan University School of Social Work collected quantitative and qualitative data from a sample of IECMH professionals. The evaluators noted positive impacts of RSC on participants’ capacities for reflection; experience of their work; skills associated with their roles; and relationships with families, supervisors, and/or other professionals. Findings also suggest positive implications for service outcomes for infants, toddlers, young children and families.
  • Holding Hope (Washington): In 2017, under the direction of the Washington State Legislature, the Department of Children, Youth, and Families (DCYF) began planning for a statewide expansion of infant and early childhood mental health consultation. In 2019, the legislature provided funding for new state-supported IECMHC services to be implemented by DCYF in partnership with Child Care Aware of Washington (CCA of WA). By February 2021, the program was officially named Holding Hope. The purpose of this formative program evaluation was to assess the design and implementation of the IECMHC program through its first year, inform efforts to build scalable practices as the program expands across the state, and identify early successes that support positive and equitable long-term outcomes for the social-emotional health of children, families, and the child care providers who serve them. The evaluation employed developmental evaluation methods and participatory strategies to engage key stakeholders, closely involve the IECMHC team in interpreting results, and apply an equity lens in data collection and research.
  • To guide your research questions, identify and consult with stakeholders, including families, funders, and providers. Ideally, representatives from all interested parties should be included in all aspects of the evaluation. Families and community members from diverse populations can offer key insights to help formulate relevant questions and identify variables to measure.
  • Enhancing equity is a foundational goal of IECMHC; all evaluations should use their data to answer questions related to closing disparities, addressing biases, and providing culturally and linguistically appropriate care.
  • Primary questions will be driven in part by the funders, the policy climate, the model, and what can be measured accurately using reliable and valid tools. But useful program evaluations should also address questions that are relevant to the early childhood program, administrators, providers, and families. It is important to consider several key factors:
    • What questions MUST you answer (to meet reporting or other requirements)?
    • What questions would you LIKE to answer and WHY (how will you use the information)?
    • Expected effects—what is realistic, based upon your IECMH program implementation? Identify those aspects of early childhood practice that can be expected to change as a result of the “dose” of IECMHC your program is actually delivering. Consider collecting data about possible barriers to successful implementation as well.
    • What information can you readily collect? What reliable tools are available to collect these data?
  • For additional information on community-based participatory research specific to evaluations in tribal communities, please access A Roadmap for Collaborative and Effective Evaluation in Tribal Communities.
  • When developing the data collection plan, it is necessary to consider the participants that you will likely be able to engage (e.g., parents, home visitors, children, etc.) as well as the languages and cultural background of all participants. Evaluators should learn about the psychometrics of each measure (if available) and translation procedures, and should examine measures item-by-item to ensure that they are appropriate for the participants and measuring what is intended.
  • Measure PROCESS and OUTCOME variables
    • The Center of Excellence created a searchable tool for selecting Outcome Measures for IECMHC, which evaluators can use to explore outcome measures used in prior ECE-based IECMHC evaluations.
    • Process variables are critical in moving beyond asking whether consultation had an impact to answering WHY and HOW it had the effect that it did. Process variables can lead to fruitful conversations about strengths and possible areas for improvement for the consultant or the consultation program overall. They can also allow programs to explain null findings and explore predictors of success. Questions that can be answered with process variables include:
      • What is the “dose” of consultation provided to each consultee?
      • Is consultation being delivered with fidelity (i.e., consistent with the program’s written guidelines or program manual)?
        • For example, this ECMHC implementation guide was developed for Maryland’s statewide mental health consultation in 2011. Built upon the findings from the Georgetown What Works study, the guide provides detailed implementation guidance (including sample forms) to help field-based staff operationalize a set of research-based standards. In addition, each standard has a group of self-assessment indicators that can assist with fidelity monitoring and continuous quality improvement.
      • How strong is the relationship between the consultant and consultee?
      • How often does the consultant receive reflective supervision, and what is the quality of that supervision?
    • All evaluations should collect demographic information from all participants to allow for disaggregated analyses (one way to compute process variables and disaggregate them is sketched below).
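The sketch below illustrates one way these process variables might be computed and disaggregated, assuming pandas is installed. All file and column names ("consultation_visits.csv", "consultee_id", "visit_hours", "fidelity_checklist_score", "race_ethnicity") are invented for illustration, not drawn from any specific IECMHC data system.

```python
# Minimal sketch, assuming pandas and a hypothetical visit log;
# all file and column names are illustrative only.
import pandas as pd

visits = pd.read_csv("consultation_visits.csv")

# "Dose" of consultation: total hours delivered to each consultee.
dose = visits.groupby("consultee_id")["visit_hours"].sum()

# Fidelity: share of visits meeting the program's written standard,
# operationalized here as a checklist score of at least 80%.
visits["met_standard"] = visits["fidelity_checklist_score"] >= 0.80
fidelity = visits.groupby("consultee_id")["met_standard"].mean()

process_vars = pd.DataFrame(
    {"dose_hours": dose, "fidelity_rate": fidelity}
).reset_index()

# Disaggregated view: merge demographics, then compare dose across groups.
demographics = pd.read_csv("participants.csv")  # hypothetical demographics file
merged = process_vars.merge(demographics, on="consultee_id")
print(merged.groupby("race_ethnicity")["dose_hours"].describe())
```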
  • Consider Qualitative Data
    • Gathering qualitative data involves the collection of non-numerical information by various methods such as recording, transcribing, and analyzing interviews or holding focus groups with staff members, family members, or consultants. Qualitative data often take the form of personal stories or case studies that convey the details of an individual’s experience with IECMHC. These data can stand alone, or can help contextualize the results from quantitative analyses. Qualitative data are critical for weaving in a range of ways-of-knowing, honoring the wisdom of all stakeholders, and incorporating clinical insights and cultural values.
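As a small illustration of how coded qualitative data can be summarized, the sketch below tallies theme codes across transcripts. The transcript IDs and theme labels are invented examples; in practice, coding would be done by trained analysts, often in dedicated qualitative software.

```python
# Minimal sketch: once interviews are transcribed and coded, a simple
# tally can show how often each theme appears. All labels are invented.
from collections import Counter

coded_segments = [
    ("interview_01", "relationship with consultant"),
    ("interview_01", "teacher confidence"),
    ("interview_02", "relationship with consultant"),
    ("interview_03", "culturally responsive practice"),
    ("interview_03", "relationship with consultant"),
]

theme_counts = Counter(theme for _, theme in coded_segments)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} coded segments")
```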
  • Data collection and management
    • It is essential to have a well-organized plan for your data that has been approved by, or granted an exemption from, the relevant Institutional Review Board (IRB) prior to beginning data collection. The written plan should address:
      • What information will be collected
      • Who will ensure that consent is obtained from all participants
      • The schedule for data collection (e.g., baseline, and then after every three months of consultation)
      • Who will be responsible for collecting data
      • Where the centralized data will be stored; and
      • Who will take responsibility for ensuring data are recorded, compiled, and reported, including monitoring for missing data and ensuring that they are managed in a HIPAA-compliant manner.
    • Given the likelihood of barriers to data collection, including staff turnover, create a system for double-checking that data collection is happening on schedule for each participant. If you are gathering data pre- and post-intervention, it is essential to have procedures to ensure that post-intervention data are collected – analyses hinge upon having a sufficient quantity of data at the post-intervention time point.
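One lightweight way to implement such a double-check is a tracking sheet queried on a schedule. The sketch below assumes pandas and a hypothetical tracking file with one row per participant per scheduled wave; the wave labels and column names are illustrative.

```python
# Minimal sketch, assuming a hypothetical tracking sheet with one row per
# participant per scheduled wave ("baseline", "3mo", "post"); the
# "collected_date" column stays blank until the instrument is returned.
import pandas as pd

tracking = pd.read_csv("data_collection_tracking.csv",
                       parse_dates=["due_date", "collected_date"])

# Flag every scheduled observation that is past due and still missing.
overdue = tracking[tracking["collected_date"].isna()
                   & (tracking["due_date"] < pd.Timestamp.today())]
print(f"{len(overdue)} overdue observations")
print(overdue[["participant_id", "wave", "due_date"]])

# Post-intervention coverage: analyses hinge on having enough "post" data.
post = tracking[tracking["wave"] == "post"]
print(f"Post data collected for {post['collected_date'].notna().mean():.0%} of participants")
```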
  • Data analysis
    • Depending on the skills and expertise of the evaluation team, consider a formal relationship with a university-based researcher or similarly trained professional. The analyst should be able to select a statistical test or qualitative analysis strategy that fits the type, quantity, and quality of the data collected and answers the research questions (a minimal pre/post example is sketched after this list).
    • Create a data analysis plan that addresses:
      • The type of data analysis to be performed
      • Who will complete the data analysis
      • How often the data will be analyzed
      • The format and process for sharing preliminary and final results
      • Expectations for disseminating the findings
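As one concrete example of matching a test to a design, the sketch below runs a paired t-test on hypothetical pre/post scores using SciPy. The file and column names are invented, and a real plan might instead call for multilevel models when, for example, children are nested within classrooms.

```python
# Minimal sketch of one common pre/post analysis: a paired t-test,
# assuming SciPy and a hypothetical file with one row per teacher and
# illustrative column names ("pre_score", "post_score").
import pandas as pd
from scipy import stats

scores = pd.read_csv("teacher_self_efficacy.csv")
paired = scores.dropna(subset=["pre_score", "post_score"])  # complete pairs only

t_stat, p_value = stats.ttest_rel(paired["pre_score"], paired["post_score"])
mean_change = (paired["post_score"] - paired["pre_score"]).mean()
print(f"n = {len(paired)}, mean change = {mean_change:.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")
```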
  • Communicating Results
    • The contents of any evaluation report depend on its purpose and the intended audiences.
    • Evaluation data can be crafted into a message by blending science, marketing, communications, and graphical skills. The information can convey concrete take-away messages, comprehensible facts, and ideas for promoting consultation as a valuable service to young children, families, staff, and programs.
    • Make it accessible and meaningful! Can the public, policymakers, and other stakeholders, including families and providers, easily understand what the data mean and what the implications are for children, families, staff, programs, and ultimately community well-being?
    • All system stakeholders should have an opportunity to review the data and offer interpretations. Community members should be informed of not only the results, but also the potential implications of the findings.
    • Publication in a final report or professional journal should not be the primary means of disseminating community-based research. Ideally, results should also be published in user-friendly formats accessible to diverse populations.
  • Ongoing data collection is valuable not only for reporting to an external audience, but also for internal program improvement and development efforts, a practice known as Continuous Quality Improvement (CQI). Preparing interim and periodic reports provides ongoing opportunities for reflection, reviewing program performance, making mid-course corrections, and, hopefully, celebrating successes.
  • Examples of programs that effectively communicated their evaluation results:
    • Using Mental Health Consultations to Support SEL in Early Childhood
      • The Bipartisan Policy Center hosted a YouTube webinar called “Using Mental Health Consultations to Support SEL in Early Childhood” to highlight findings from the recent randomized controlled trial of IECMHC from Ohio. The webinar incorporated voices from the field – including a parent, center director, and consultant – into a discussion about the importance of consultation.

Additional Evaluation Resources

This product was developed [in part] under grant number 1H79SM082070-01 from the Substance Abuse and Mental Health Services Administration (SAMHSA), U.S. Department of Health and Human Services (HHS). The views, policies and opinions expressed are those of the authors and do not necessarily reflect those of SAMHSA or HHS.