
Strengthen the Evidence for Maternal and Child Health Programs


Evidence Tools
MCHbest. Developmental Screening.

Strategy. Adaptation of Existing Screening Tools

Approach. Develop and implement standardized parent guidelines to increase the accuracy and utility of parent-performed developmental screening tests


Overview. Developmental screening tests, such as the Ages and Stages Questionnaires (ASQ) and the Parents' Evaluation of Developmental Status (PEDS), are crucial for the early detection of developmental disabilities (DDs) in young children.[1,2] However, these widely used parent-performed screeners have demonstrated only moderate accuracy in previous studies,[3] a challenge often linked to the reliability of parents as administrators.[1] Parents frequently report difficulty administering the tests because questions are confusing or because they have not adequately observed their child's performance.[1] To address these issues, states can implement standardized parent guidelines that provide easily understandable, comprehensive instructions on test administration, developed via expert consensus (e.g., the Delphi technique). The guidelines prioritize maximizing objective evidence for scoring: for example, they explicitly advise against scoring based solely on perceived proficiency, since relying on subjective evidence can lead to overestimation of abilities and increased false-negative rates. These detailed instructions cover critical aspects such as who administers the test and where, the length of the observation window (e.g., seven days), and precise scoring methods for items with numerical performance criteria or items that are difficult to administer. Parents who evaluated the guidelines' usefulness reported high value for the instructions covering difficult-to-score items, indicating that targeted guidance meets their needs.[1]

Evidence. Emerging Evidence. Strategies with this rating typically trend positive and have good potential to work...

Access the peer-reviewed evidence through the MCH Digital Library or related evidence source.

Outcome Components. This strategy has been shown to impact the following outcomes (Read more about these categories):

  • Quality of Care. This strategy promotes the degree to which healthcare services meet established standards aimed at achieving optimal health outcomes.
  • Timeliness of Care. This strategy promotes delivery of healthcare services in a timely manner to optimize benefits and prevent complications.
  • Patient Experience of Care. This strategy improves individuals' perceptions, feelings, and satisfaction with the healthcare services they receive.

Detailed Outcomes. For specific outcomes related to each study supporting this strategy, access the peer-reviewed evidence and read the Intervention Results for each study.

Intervention Type. Health Teaching (Education and Promotion) (Read more about intervention types and levels as defined by the Public Health Intervention Wheel).

Intervention Level. Individual/Family-Focused

Examples from the Field. There are currently no ESMs that use this strategy. Search similar intervention components in the ESM database.

Sample ESMs. Here are sample ESMs to use as models for your own measures using the RBA framework (see The Role of Title V in Adapting Strategies).

Quadrant 1:
Measuring Quantity of Effort
(“What/how much did we do?”)

  • Number of state MCH programs that formally adopt and implement standardized parent guidelines for developmental screening tools. (Measures the institutional adoption of the evidence-based practice)
  • Number of unique parent guideline packets distributed electronically or physically to families participating in universal developmental screening programs. (Measures the direct reach of the educational resource)

Quadrant 2:
Measuring Quality of Effort
(“How well did we do it?”)

  • Percent of healthcare practices utilizing standardized parent guidelines that report high fidelity (e.g., over 90%) to the implementation protocol. (Measures the quality and consistency of program delivery)
  • Percent of parents receiving the guidelines who report that the instructions are clear, usable, and easy to understand (e.g., meeting a specified reading level). (Measures the quality and usability of the educational materials)

Quadrant 3:
Measuring Quantity of Effect
(“Is anyone better off?”)

  • Number of developmental screening tools submitted by parents that are scored based predominantly on objective evidence, as defined by the standardized guidelines. (Measures the shift in parent behavior towards accurate, evidence-based reporting)
  • Number of children receiving developmental screening whose scores result in a correct positive identification (sensitivity) of a developmental delay, verified by subsequent assessment. (Measures the improved effectiveness and diagnostic utility of the screening process)
  • Number of children previously classified as false negatives who are subsequently identified as having a developmental delay after the implementation of the standardized guidelines. (Measures the reduction in misclassification errors resulting from the intervention)
  • Number of local health agencies or clinical partners who report increased confidence in the accuracy of parent-reported screening results for timely referral decisions. (Measures systems change and trust among downstream service providers)
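The first three Quadrant 3 measures above reduce to simple tallies over paired records: a parent-performed screen result matched with a later confirmatory diagnostic assessment. The sketch below is purely illustrative; the record structure and field names are assumptions for the example, not part of any real MCH data system.

```python
from dataclasses import dataclass

@dataclass
class ScreeningRecord:
    # Hypothetical fields: a parent-administered screen outcome paired with
    # a subsequent confirmatory diagnostic evaluation for the same child.
    screen_positive: bool   # screen flagged a possible developmental delay
    delay_confirmed: bool   # follow-up assessment confirmed a delay

def quadrant3_counts(records):
    """Tally true positives and false negatives, and derive sensitivity."""
    true_positives = sum(r.screen_positive and r.delay_confirmed for r in records)
    false_negatives = sum((not r.screen_positive) and r.delay_confirmed for r in records)
    confirmed = true_positives + false_negatives
    sensitivity = true_positives / confirmed if confirmed else None
    return true_positives, false_negatives, sensitivity

# Toy data: two correctly identified delays, one missed delay (false
# negative), and one correct negative screen.
records = [
    ScreeningRecord(True, True),
    ScreeningRecord(True, True),
    ScreeningRecord(False, True),
    ScreeningRecord(False, False),
]
tp, fn, sens = quadrant3_counts(records)  # tp=2, fn=1, sensitivity=2/3
```

A false-negative count computed this way before and after guideline rollout is also the input to the Quadrant 4 percent-decrease measure.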

Quadrant 4:
Measuring Quality of Effect
(“How are they better off?”)

  • Percent decrease in the reported rate of false-negative developmental screening results following the implementation of the standardized parent guidelines. (Measures the targeted improvement in diagnostic accuracy and reduction of critical error)
  • Percent of parents completing the screening who report high satisfaction with the clarity of the instructions and the overall administration process. (Measures the improvement in Patient Experience of Care related to the assessment process)
  • Percent increase in the observed reliability (e.g., inter-rater consistency) of the developmental screening tool results comparing parents in the intervention group versus the control/baseline group. (Measures the psychometric quality improvement of the administered tool)
  • Percent of children identified as needing follow-up assessment who receive comprehensive diagnostic evaluation within the recommended public health timeframe. (Measures the overall improvement in the timeliness of the developmental care cascade)
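Two of the Quadrant 4 quantities above have standard formulas: the percent decrease in a rate, and inter-rater consistency. The sketch below is a minimal illustration, assuming binary pass/fail item scores and using Cohen's kappa as one common consistency statistic; the studies behind this strategy may use different reliability metrics.

```python
def percent_decrease(baseline_rate, followup_rate):
    """Percent decrease in the false-negative rate after guideline rollout."""
    return 100.0 * (baseline_rate - followup_rate) / baseline_rate

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters' binary (0/1) scores on the same items.

    Assumes at least some expected disagreement (raters are not both
    constant), so the denominator is nonzero.
    """
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_a = sum(ratings_a) / n          # rater A's "pass" proportion
    p_b = sum(ratings_b) / n          # rater B's "pass" proportion
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)  # chance agreement
    return (observed - expected) / (1 - expected)

# A drop from a 20% to a 12% false-negative rate is a 40% decrease.
improvement = percent_decrease(0.20, 0.12)

# Agreement between a parent's item scores and a clinician's on five items.
kappa = cohens_kappa([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])
```

Note the distinction the Quadrant 4 measures rely on: a percent decrease compares the same rate at two time points, while kappa compares two raters at the same time point after correcting for chance agreement.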

Note. When looking at your ESMs, SPMs, or other strategies:

  1. Move from measuring quantity to quality.
  2. Move from measuring effort to effect.
  3. Quadrant 1 strategies should be used sparingly, when no other data exists.
  4. The most effective measurement combines strategies across all quadrants, with most in Quadrants 2 and 4.

Learn More. Read how to create stronger ESMs and how to measure ESM impact more meaningfully through Results-Based Accountability.

References

[1] Rah, S. S., Hong, S. B., & Yoon, J. Y. (2023). Development of Parent Guidelines for Parent-Performed Developmental Screening Tests. Journal of the Korean Academy of Child and Adolescent Psychiatry, 34(2), 141.

[2] Sheldrick, R. C., Marakovitz, S., Garfinkel, D., Carter, A. S., & Perrin, E. C. (2020). Comparative accuracy of developmental screening questionnaires. JAMA Pediatrics, 174, 366–374.

[3] Abdoola, S., Swanepoel, D. W., & Van Der Linde, J. (2022). A Scoping Review on the Use of the Parents Evaluation of Developmental Status and PEDS: Developmental Milestones Screening Tools. Volume 45, Issue 3. https://doi.org/10.1177/10538151221091202

[4] Halpin, P. F., de Castro, E. F., Petrowski, N., & Cappa, C. (2024). Monitoring early childhood development at the population level: The ECDI2030. Early Childhood Research Quarterly, 67, 1–12.

This project is supported by the Health Resources and Services Administration (HRSA) of the U.S. Department of Health and Human Services (HHS) under grant number U02MC31613, MCH Advanced Education Policy, $3.5 M. This information or content and conclusions are those of the author and should not be construed as the official position or policy of, nor should any endorsements be inferred by HRSA, HHS or the U.S. Government.