Head Start Reporting Overview
Head Start grantees face a reporting burden that is among the heaviest in federal grant programs. This reflects the program's comprehensive services model — because Head Start addresses education, health, nutrition, and family engagement simultaneously, reporting must cover all four domains plus fiscal management. Unlike simpler grant programs with a single annual report, Head Start requires multiple reporting instruments on different schedules, each with its own data elements and submission protocols.
Failure to submit required reports on time and with accurate data can result in compliance findings during federal monitoring, delays in continuation funding, and increased scrutiny from your OHS regional office. Data quality issues in the PIR can also trigger enhanced oversight.
The Program Information Report (PIR)
The PIR is the primary annual data collection instrument for Head Start and Early Head Start programs. Aside from financial reports, it is the single most important reporting requirement, serving both as a compliance tool and as the primary source of data OHS uses to understand program performance nationwide.
PIR Submission Timeline
The PIR data collection window typically opens in June and closes in August, though exact dates vary by year and are announced by OHS through Program Instructions. The data covers the most recently completed program year. Key timeline milestones:
- Data collection opens: Typically June. OHS publishes the PIR form and instructions. Programs begin entering data.
- Submission deadline: Typically August. The exact date is specified in the annual Program Instruction. Late submissions are flagged and may result in follow-up from the regional office.
- Correction period: OHS reviews submitted data for anomalies and may request corrections or clarifications after the submission window closes.
PIR Data Elements
The PIR collects data across multiple sections. Understanding what data is collected helps programs build systems to capture it accurately throughout the year rather than scrambling at reporting time:
| PIR Section | Data Elements |
|---|---|
| Program information | Program type, service area, funded enrollment, actual enrollment, program options, operating schedule |
| Enrollment and demographics | Total enrolled, eligibility categories (income, categorical, over-income), age at enrollment, race/ethnicity, primary home language, children with disabilities, homeless children, foster children |
| Family characteristics | Family size, family income, parent education levels, employment status, housing status, public assistance receipt, two-parent vs. single-parent households |
| Staff qualifications | Education credentials for all classroom staff (lead teachers, assistant teachers, home visitors), CDA credentials, bachelor's/associate's degrees in early childhood education, staff turnover rates |
| Health services | Health insurance status, children receiving medical/dental exams, immunization rates, children with health conditions, mental health consultation data, screening completion rates |
| Family engagement | Families receiving services (education, health, parenting, emergency/crisis intervention, housing assistance), family engagement activities, policy council participation |
| Transition | Children transitioning to kindergarten, school readiness assessment data, transition activities, coordination with receiving schools |
PIR Data Quality
OHS has increasingly emphasized PIR data quality. Common data quality issues that trigger follow-up include:
- Internal inconsistencies: Numbers that do not add up across sections (e.g., enrollment totals that do not match eligibility breakdowns)
- Significant year-over-year changes: Dramatic shifts in reported data from one year to the next without explanation may indicate data collection problems rather than actual program changes
- Unrealistic values: 100% screening completion rates, zero staff turnover, or other perfect scores that are statistically unlikely and suggest data fabrication or misunderstanding of definitions
- Missing data: Fields left blank or reported as zero when the program clearly provides those services
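The checks above lend themselves to automation before submission. The sketch below shows one way a program might pre-validate its own data; the field names are illustrative, not the official PIR element IDs, and real checks would cover many more fields.

```python
# Hypothetical pre-submission PIR checks. Field names are illustrative,
# not official PIR element identifiers.
def check_pir_consistency(record):
    """Run basic PIR data-quality checks; return a list of warning strings."""
    warnings = []

    # Internal consistency: eligibility categories should sum to total enrollment.
    eligibility_total = (record["income_eligible"]
                         + record["categorically_eligible"]
                         + record["over_income"])
    if eligibility_total != record["total_enrolled"]:
        warnings.append(
            f"Eligibility categories sum to {eligibility_total}, "
            f"but total enrolled is {record['total_enrolled']}"
        )

    # Unrealistic values: perfect rates often indicate definition problems.
    if record["screening_completion_rate"] >= 1.0:
        warnings.append("Screening completion rate of 100%; verify definitions")

    # Missing data: zero in a field the program clearly serves.
    if record["families_receiving_services"] == 0:
        warnings.append("Families receiving services reported as zero; confirm")

    return warnings
```

Running checks like these monthly, rather than only at submission time, surfaces data entry problems while they are still easy to trace.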
SF-425 Federal Financial Reports
Head Start grantees must submit the SF-425 (Federal Financial Report) on a quarterly basis. The SF-425 reports on federal funds drawn down, expenditures by budget category, unliquidated obligations, and the non-federal match applied during the reporting period. The quarterly schedule is:
| Reporting Period | Due Date | Submission Method |
|---|---|---|
| January 1 – March 31 | April 30 | HSES (Head Start Enterprise System) |
| April 1 – June 30 | July 30 | HSES |
| July 1 – September 30 | October 30 | HSES |
| October 1 – December 31 | January 30 | HSES |
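Each due date in the table is 30 days after the close of the calendar quarter, so the schedule can be derived rather than memorized. A minimal sketch:

```python
from datetime import date, timedelta

def sf425_due_date(quarter_end: date) -> date:
    """SF-425 is due 30 calendar days after the end of the reporting quarter."""
    return quarter_end + timedelta(days=30)
```

For example, the quarter ending March 31 yields an April 30 due date, matching the table above.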
The SF-425 is more than a financial form — it is a compliance document. OHS uses SF-425 data to monitor spending patterns, identify potential fiscal concerns, and verify that grantees are spending their awards at an appropriate pace. Significant underspending or overspending relative to the budget can trigger inquiries from your regional program specialist. For more detail on aligning your financial reporting with your budget structure, see the Budget & Financial Management guide.
Single Audit Requirements
Under 2 CFR 200 Subpart F, organizations that expend $750,000 or more in federal awards during their fiscal year must complete a Single Audit. Given that Head Start awards typically range from several hundred thousand to tens of millions of dollars, virtually every Head Start grantee meets this threshold.
Key Single Audit requirements for Head Start grantees:
- Timing: The Single Audit must be completed and submitted to the Federal Audit Clearinghouse within 9 months of the end of the grantee's fiscal year (not the federal fiscal year)
- Head Start as a major program: Head Start is almost always tested as a major program in the Single Audit due to the size of the award. The auditor will test compliance with HSPPS requirements and 2 CFR 200, using the OHS-specific compliance supplement
- DRS implications: Audit findings that indicate material weakness, significant questioned costs, or a going concern opinion can trigger DRS Condition 5 (fiscal findings). This makes the Single Audit not just a compliance requirement but a DRS risk factor.
- Corrective action: Findings from the Single Audit require a management response and corrective action plan. OHS monitors whether findings are resolved in subsequent audits.
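The threshold test and the nine-month clock are both simple arithmetic that a compliance calendar can compute. The sketch below is a rough illustration; the exact due-date convention used by the Federal Audit Clearinghouse should be confirmed against 2 CFR 200 Subpart F, and the `$750,000` threshold shown is the figure cited above.

```python
import calendar
from datetime import date

def single_audit_required(federal_expenditures: float,
                          threshold: float = 750_000) -> bool:
    """2 CFR 200 Subpart F threshold test on total federal expenditures."""
    return federal_expenditures >= threshold

def single_audit_due(fy_end: date) -> date:
    """Approximate due date: nine months after the grantee's fiscal year end.

    Day-of-month is clamped when the target month is shorter (e.g., a
    Dec 31 fiscal year end maps to Sept 30 of the following year).
    """
    month = fy_end.month + 9
    year = fy_end.year + (month - 1) // 12
    month = (month - 1) % 12 + 1
    day = min(fy_end.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)
```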
Health Screening Data and the 45/90-Day Requirements
The HSPPS establish specific timelines for health screenings that are tracked and reported both in the PIR and during federal monitoring:
- Within 45 days of enrollment: Complete developmental screening, sensory screening (vision and hearing), behavioral screening, and obtain or perform a health screening (including current immunization status, growth assessment, and detection of health problems)
- Within 90 days of enrollment: Obtain a determination from a health care professional as to whether the child is up-to-date on a schedule of age-appropriate preventive and primary health care, including medical and dental exams. Begin treatment for any identified health problems.
These timelines are among the most frequently monitored requirements during federal reviews. Programs must maintain systems to track each child's screening dates, referral dates, and treatment completion dates. A common monitoring finding is that while screenings were completed, the program failed to document follow-up on identified concerns within the 90-day window. For health screening compliance pitfalls, see the Common Mistakes guide.
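Because the 45- and 90-day clocks start from each child's enrollment date, the tracking system described above reduces to per-child deadline arithmetic. A minimal sketch, assuming the screening names and data shapes shown here (they are illustrative, not HSPPS terminology):

```python
from datetime import date, timedelta

# Days-from-enrollment deadlines per the 45/90-day requirements.
# Screening names here are illustrative labels, not official HSPPS terms.
SCREENING_DEADLINES = {
    "developmental_screening": 45,
    "sensory_screening": 45,
    "behavioral_screening": 45,
    "health_status_determination": 90,
}

def screening_deadlines(enrollment_date: date) -> dict:
    """Map each required screening to its calendar deadline for one child."""
    return {name: enrollment_date + timedelta(days=days)
            for name, days in SCREENING_DEADLINES.items()}

def overdue_screenings(enrollment_date: date, completed: set,
                       as_of: date) -> list:
    """Return screenings whose deadline has passed without completion."""
    deadlines = screening_deadlines(enrollment_date)
    return [name for name, due in deadlines.items()
            if name not in completed and as_of > due]
```

Running `overdue_screenings` across the full roster each week gives staff a follow-up list before a missed deadline becomes a monitoring finding.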
School Readiness Data and Reporting
The HSPPS require every Head Start program to establish school readiness goals aligned with the Head Start Early Learning Outcomes Framework (ELOF), implement ongoing child assessment, and report on children's progress toward school readiness. This is not a single report but an ongoing data collection and analysis process:
- Ongoing assessment: Programs must use research-based assessment instruments (common tools include Teaching Strategies GOLD, COR Advantage, and the Desired Results Developmental Profile) to collect data on each child's developmental progress at multiple points during the program year — typically three times (fall, winter, spring)
- Aggregate analysis: Programs must analyze aggregate school readiness data to identify patterns, strengths, and areas needing improvement at the program level — not just individual child progress
- Governance reporting: School readiness data must be presented to the governing body and policy council to inform program planning and improvement decisions
- Program improvement: Assessment data must drive individualization in the classroom and program-level improvements in curriculum implementation and teaching practices
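The aggregate-analysis step above amounts to rolling individual checkpoint scores up by domain and checkpoint. A minimal sketch, assuming a flat list of score records (the field names and domain labels are illustrative; real assessment tools export richer structures):

```python
from collections import defaultdict
from statistics import mean

def aggregate_by_domain(child_scores):
    """Average checkpoint scores per (domain, checkpoint) pair.

    child_scores: list of dicts like
        {"child_id": "...", "domain": "Language",
         "checkpoint": "fall", "score": 3.0}
    """
    buckets = defaultdict(list)
    for row in child_scores:
        buckets[(row["domain"], row["checkpoint"])].append(row["score"])
    return {key: round(mean(scores), 2) for key, scores in buckets.items()}
```

Comparing fall and spring aggregates per domain is one simple way to surface the program-level patterns the HSPPS expect leadership to act on.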
CLASS Data and Monitoring Data
While CLASS scores are collected during federal monitoring reviews (not submitted by grantees), programs should maintain their own internal CLASS observation data as part of their quality improvement system. Many programs conduct internal CLASS observations 2–3 times per year in every classroom to track progress and identify classrooms needing additional coaching support.
Internal CLASS data serves multiple purposes:
- Identifies classrooms at risk for low scores before federal monitoring
- Provides baseline data for coaching and professional development planning
- Demonstrates to federal reviewers that you have a quality improvement system in place
- Supports the governing body and policy council in understanding program quality
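Flagging classrooms for coaching from internal observation data can be as simple as comparing each domain score to a program-chosen floor. In this sketch the thresholds are hypothetical coaching triggers set by the program, not official OHS cut scores:

```python
def classrooms_needing_support(observations, thresholds):
    """Flag classrooms with any domain score below the program's floor.

    observations: {classroom: {domain: score}}
    thresholds:   {domain: minimum acceptable score}  (program-chosen,
                  hypothetical values; not official OHS thresholds)
    """
    flagged = {}
    for classroom, scores in observations.items():
        low = [d for d, s in scores.items() if s < thresholds.get(d, 0)]
        if low:
            flagged[classroom] = low
    return flagged
```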
Other Reporting Requirements
Beyond the major reporting instruments, Head Start grantees must also fulfill several additional reporting obligations:
- Real Property Status Report: If the grantee owns or has a long-term lease on facilities purchased or renovated with Head Start funds, periodic reporting on property status is required
- Incident reporting: Serious incidents including child injuries, vehicle accidents, facility emergencies, and other safety events must be reported to the regional office within specified timelines
- Community assessment updates: While not a formal "report," the annual update to the community assessment is a required documentation activity that feeds into program planning and the PIR
- Self-assessment: The HSPPS require an annual program self-assessment that evaluates the effectiveness of the program in meeting school readiness goals and implementing comprehensive services. Results must be shared with the governing body and policy council.
- Grant application amendments: Changes to the approved program design, budget modifications exceeding threshold amounts, and other substantive changes require prior approval through the grant amendment process in HSES
Building an Effective Data Infrastructure
Programs that struggle with reporting typically have data infrastructure problems, not reporting problems. The most effective approach is to build systems that capture data continuously throughout the year so that reporting periods are a matter of compilation and review rather than retrospective data collection. Key components include:
- Child-level management system: A database or software system (many programs use ChildPlus, PROMIS, or similar platforms) that tracks enrollment, attendance, eligibility, health screenings, assessment data, and family services at the individual child level
- Real-time monitoring dashboards: Ability to view key compliance metrics (enrollment vs. funded enrollment, screening completion rates, attendance rates) in real time rather than at reporting deadlines
- Staff data entry protocols: Clear procedures for who enters what data, when, and in what system. Data quality problems often trace back to unclear responsibilities and inconsistent entry practices.
- Data reconciliation: Monthly or quarterly data reconciliation to catch errors before they accumulate into PIR submission problems
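The reconciliation step above boils down to comparing the same counts pulled from two systems (say, the child-level database and the fiscal or attendance system) and surfacing disagreements. A minimal sketch, with metric names purely illustrative:

```python
def reconcile_counts(system_a: dict, system_b: dict) -> dict:
    """Compare metric counts from two systems; return metrics that disagree.

    Each input maps a metric name (illustrative) to a count. A metric
    missing from one system appears with None on that side.
    """
    mismatches = {}
    for metric in set(system_a) | set(system_b):
        a, b = system_a.get(metric), system_b.get(metric)
        if a != b:
            mismatches[metric] = (a, b)
    return mismatches
```

An empty result means the systems agree; anything else is a discrepancy to resolve before it compounds into a PIR problem months later.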
Reporting Calendar Summary
| Report | Frequency | Typical Due Date |
|---|---|---|
| Program Information Report (PIR) | Annual | August (date set annually by OHS) |
| SF-425 Financial Report | Quarterly | 30 days after quarter end |
| Single Audit | Annual | 9 months after fiscal year end |
| Self-assessment | Annual | Before end of program year |
| Community assessment update | Annual | Before start of new program year |
| School readiness data | 3x/year (fall, winter, spring) | Internal checkpoints; annual report to governance |