Design and Implementation

We provide three examples of how drivers work in designing and implementing state early childhood programs. The first example focuses on implementation of developmental screening. The second example addresses Wisconsin’s implementation of the Pyramid Model. The third example—infant/toddler quality improvement—is adapted from the work of Paulsell and colleagues.[1]

A Statewide Developmental Screening Program

This developmental screening example is a composite, drawn from distinct state actions and decisions.

State level

  • A leadership team was formed to include senior staff from across agencies who were involved in the following projects: Individuals with Disabilities Education Act (IDEA) Part C and Part B, prekindergarten programs, the quality rating and improvement system (QRIS), a technical assistance organization (contractor), Head Start, child care licensing, home visiting, the workforce registry, and higher education (facilitative administration).
  • Ages and Stages Questionnaire: Social-Emotional (ASQ:SE) was selected as the screener and distributed to a network of family and community engagement organizations (technical leadership).
  • The state contracted with an intermediary organization (professional development provider) to train direct-service providers (staff selection).
  • For the initial rollout, the leadership team focused on programs that were most likely to be successful in integrating the Ages and Stages tool into their program policies and practices, namely, programs at the top two levels of the QRIS (facilitative administration and adaptive leadership).
  • The state developed partnerships with the medical community (pediatricians, family practice, and mental health) to support the use of developmental screening instruments (adaptive leadership).
  • Funding was allocated to provide substitutes so that direct-service providers could attend training on the ASQ:SE (decision-support data system and facilitative administration).

Intermediary level

  • Training was provided to TA organization staff on how the tool can be used to help enhance parents’ understanding of child development and to link families to community supports (training and technical leadership).
  • Health ambassadors were integrated into the state’s Help Me Grow program with the explicit goal of making more comprehensive referrals for families with young children, particularly into IDEA Part C (adaptive leadership).
  • A nonprofit organization equipped pediatricians and other health care providers with the knowledge needed to conduct screenings (facilitative administration).
  • One challenge—duplicative data entry for some health provider practices—was identified through the data and communication loops used by the intermediary (systems interventions).

Direct-service-provider level

  • Staffing schedules were adjusted so that teachers could attend training (facilitative administration).
  • Directors requested support for teachers in communicating with families about results (facilitative administration).
  • Modifications were made to parent and family handbooks and to talking points for parent tours and orientations to ensure that families knew about the use of developmental screening from the start (technical leadership).
  • Directors signed up their programs to participate in information sessions and trainings on developmental screening (facilitative administration).
  • Teachers built relationships with parents through ongoing communication. The establishment of a strong relationship makes it easier to share and receive news that can be difficult to hear (technical leadership).

Wisconsin’s Pyramid Model

Wisconsin established a goal of “comprehensive, cross-disciplinary professional development to support professionals working to ensure the social and emotional well-being of infants, young children, and their families.”[2] Using a three-year grant, Wisconsin selected the Pyramid Model, an evidence-based model for professional development that is implemented simultaneously at state, intermediary, and direct-service levels. Diverse early childhood settings adopted the model: child care, Head Start, and public schools.

Wisconsin’s 2015 annual report illustrates the role of drivers (even though they are not named as such).[3] Below are some of the specific ways drivers were part of planning and implementation at the state, intermediary, and provider levels.

Competency drivers

  • Recruitment
    • At the intermediary level, trainers were recruited through regional communities of practice and regional coaches.
  • Coaching and training
    • At the intermediary level, trainers and coaches were trained on Pyramid Model content and staff responsibilities.
    • At the intermediary level, regional communities of practice were established to support trainers.
    • At the direct-service level,
      • regional communities of practice supported the needs of direct-service providers,
      • coaching was provided to site staff (teachers and program leadership),
      • coaches used multiple sources of information for feedback to teachers, and
      • program leadership teams attended a two-day “implementation academy” to prepare them for implementation.

Organizational drivers

  • Facilitative administration
    • At the state level
      • A state leadership team (SLT) from different systems and disciplines designed and oversaw implementation. Members included individuals from the departments of Children and Families, Health Services (Birth to Three Program), and Public Instruction, as well as higher education (including the technical college system), the Workforce Registry, the Division for Early Childhood, the state early childhood association, the Head Start State Collaboration Office, the Head Start Training and Technical Assistance system, an infant mental health organization, a parent-support organization, professional development, an organization focused on preventing child abuse and neglect, and Pyramid Model coordinators and coaches.
      • The SLT met quarterly.
  • Systems intervention
    • At the state level
      • The SLT created optimism by publishing an annual report that included achievements and progress.
      • SLT members were responsible for advocating for support for the Pyramid Model within their departments or organizations.
      • The SLT nurtured external champions and opinion leaders. SLT members presented at numerous conferences and published a paper in a national journal. The SLT presented to the Legislative Committee on Supporting Early Brain Development. That committee developed policy suggestions including one to “integrate the Pyramid Model with coaching in the classroom into requirements for teacher, childcare provider, and home visitors.”[4]
  • Decision-support data systems
    • At the state level
      • Data related to implementation were presented and discussed at every meeting.
      • Quarterly leadership team meetings were held.
      • Rates of implementation were benchmarked and reported publicly in the annual report.
      • Site-level data were aggregated. This process revealed that it took one year of coaching for a teacher to reach fidelity on the Pyramid observation tool. These data also provided an opportunity for Wisconsin to compare its progress to national data.
    • At the direct-service level, internal coaches (those within the site) conducted observations and provided coaching until observation data showed that a teacher had achieved fidelity standards.

Leadership drivers

  • Technical leadership
    • At the state level, a tool was developed to assess implementation progress, inform decisions, and plan next steps.
  • Adaptive leadership
    • At the state level, the training content for the Pyramid Model was integrated into technical colleges’ early childhood curriculum.

Please see the Resources section of this guide for information on how to access Wisconsin’s Pyramid Model.

An Infant-Toddler Quality Improvement Initiative

In Measuring Implementation of Early Childhood Interventions at Multiple System Levels, a brief by Paulsell, Austin, and Lokteff, a chart illustrates the strategies, measures, and data collection methods of a statewide initiative to improve the quality of infant/toddler center-based care. The chart below was adapted from the chart in that brief.[5] A column was added to identify the driver category and specific driver associated with each measure.

Table 1. Adapted from Measuring Implementation of Early Childhood Interventions at Multiple System Levels

Direct-Service-Provider Level
Strategy: Teachers implement an intervention to improve the quality of center-based infant-toddler care.
Construct: Selecting teachers for implementation
  • Illustrative measure: Practitioner assessment
  • Data collection method: Staff survey
  • Driver: Competency driver: selection

Construct: Implementation of new strategies by teachers
  • Illustrative measure: Practitioner assessment
  • Data collection method: Staff survey
  • Driver: Competency driver: selection

Construct: Classroom quality
  • Illustrative measures: Infant/Toddler Environment Rating Scale, Revised Edition (ITERS-R)[a]; Classroom Assessment Scoring System (CLASS), Toddler Version[b]
  • Data collection method: Observation
  • Drivers: Competency driver: performance assessment (performance assessment is highly correlated with intended outcomes); Organization driver: decision-support data system (data are reliable; standardized tool is used)
Strategy: Center directors obtain grants to improve caregiving environments, purchase training for teachers on infant/toddler care, and facilitate access to onsite coaching and mentoring for infant/toddler teachers.
Construct: Delivery of training program
  • Illustrative measure: Implementation drivers: Assessing Best Practices training section[c]
  • Data collection method: Staff survey
  • Driver: Competency driver: training (performance assessment measures related to training)

Construct: Supervision or coaching
  • Illustrative measure: Satisfaction with training procedures and topics
  • Data collection method: Staff survey or interview
  • Drivers: Competency driver: coaching (satisfaction surveys from those coached); Organization driver: facilitative administration (solicits feedback)

Construct: Supervision or coaching
  • Illustrative measure: Supervision or coaching
  • Data collection method: Staff survey or interview
  • Driver: Competency driver: coaching (coaches directly observe practitioners)

Construct: Supervision or coaching
  • Illustrative measure: Frequency of in-class coaching
  • Data collection method: Staff survey or training log
  • Drivers: Organization driver: decision-support data system (data reporting built into routines); Organization driver: facilitative administration (solicits feedback from staff)

Construct: Supervision or coaching
  • Illustrative measure: Implementation drivers: Assessing Best Practices coaching section[d]
  • Data collection method: Staff survey
  • Driver: Competency driver: performance assessment

Construct: Supervision or coaching
  • Illustrative measure: Satisfaction with coaching; self-assessment of learning, behavior, and classroom changes
  • Data collection method: Staff survey or interview
  • Driver: Competency driver: coaching (satisfaction surveys from those being coached)

Intermediary Level—Implementing Agency
Strategy: Infant/toddler consultants assess caregiving environment using the ITERS-R, provide onsite coaching and mentoring for infant/toddler teachers and directors, and provide specialized training in infant/toddler development using a scripted curriculum.
Construct: Selection of quality improvement trainers
  • Illustrative measure: Trainer qualifications commensurate with those specified in the quality improvement (QI) program
  • Data collection method: Trainer survey, resume, or application materials
  • Driver: Competency driver: selection (prerequisites and qualifications for employment are related to the initiative)

Construct: Fidelity of program delivery
  • Illustrative measure: Content and dosage delivered as specified in the QI program
  • Data collection method: Observation or training logs
  • Drivers: Competency driver: performance assessment (performance measures extend beyond measurement of context and content; use of multiple data sources); Leadership driver: adaptive leadership (participating in and observing training)

State Level

Strategy: The state Office of Child Care contracts with intermediary organizations to provide coaching.
Construct: Adequacy of funding to fulfill program requirements
  • Illustrative measure: Funding sources and their adequacy to implement the model as specified
  • Data collection method: Document reviews or administrator interviews
  • Drivers: Leadership driver: adaptive leadership (soliciting feedback from practitioners and stakeholders); Organization driver: decision-support data systems (used to make decisions)

Construct: Alignment of training curriculum and characteristics of the service population
  • Illustrative measure: Documentation of model content, research base, psychometric data, and populations previously served
  • Data collection method: Document review
  • Driver: Leadership driver: adaptive leadership (alignment)

Construct: Frequency and content of TA
  • Illustrative measure: Frequency and content of TA and qualifications of TA providers
  • Data collection method: Staff pre- and post-training assessments or periodic TA needs assessments
  • Driver: Competency driver: training (outcome data collected and analyzed, and performance assessment measures related to training collected and analyzed)

[a] Harms, T. (2002). Infant/Toddler Environment Rating Scale (Rev. ed.). New York: Teachers College Press.

[b] Pianta, R. C., La Paro, K. M., & Hamre, B. (2009). Classroom Assessment Scoring System: Toddler Version. Unpublished instrument.

[c] Blase, K., van Dyke, M., & Fixsen, D. (2013). Implementation drivers: Assessing best practices. Chapel Hill, NC: Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill. Retrieved from https://www.researchgate.net/publication/307967873_Implementation_Drivers_Assessing_Best_Practices.

[d] Ibid.

 

[1] Paulsell, D., Austin, A. M. B., & Lokteff, M. (2013). Measuring implementation of early childhood interventions at multiple system levels. OPRE research brief 2013-16. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

[2] Wisconsin Legislative Council Staff Memorandum, September 29, 2014. Retrieved from http://goo.gl/6ujf3S.

[3] Wisconsin Early Childhood Collaborating Partners. (2015). Wisconsin Pyramid Model for social and emotional competence: 2015 annual report. Retrieved from http://www.collaboratingpartners.com/documents/final2015pmannualreport1.pdf.

[4] Wisconsin Legislative Council Staff Memorandum, September 29, 2014. Retrieved from http://goo.gl/6ujf3S.

[5] Paulsell, D., Austin, A. M. B., & Lokteff, M. (2013). Measuring implementation of early childhood interventions at multiple system levels. OPRE research brief 2013-16. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.