SNAP-Ed Reporting Overview
SNAP-Ed reporting serves two purposes: demonstrating accountability for federal funds and building the evidence base for nutrition education effectiveness. Implementing agencies report through multiple channels — to their state SNAP agency on schedules defined by the subgrant agreement, and indirectly to FNS through the state's annual reporting submission. Understanding the reporting landscape helps implementing agencies build efficient data collection systems rather than treating reporting as an afterthought.
The primary reporting mechanism for SNAP-Ed is the Education and Administrative Reporting System (EARS), which collects standardized data from all state SNAP-Ed programs annually. Additionally, implementing agencies must report to their state SNAP agencies on financial expenditures, program activities, and outcome measures as specified in their subgrant agreements.
EARS: Education and Administrative Reporting System
EARS is the federal reporting system through which FNS collects standardized data on SNAP-Ed activities nationwide. State SNAP agencies are responsible for compiling and submitting EARS data to FNS, but implementing agencies are responsible for collecting and providing the underlying data. EARS data is submitted annually, typically within 120 days after the end of the federal fiscal year (by late January or early February for the preceding October-September fiscal year).
EARS Data Elements
EARS collects data across several categories. Implementing agencies must build data collection systems that capture all required elements throughout the program year:
| Data Category | Required Elements |
|---|---|
| Reach | Number of unique individuals reached through direct education and estimated number reached through indirect activities and PSE changes |
| Demographics | Age group, race/ethnicity, sex, SNAP participation status for direct education participants |
| Delivery Channels | Setting type (school, community center, food bank, etc.), delivery modality (in-person, virtual), target audience (youth, adult, senior) |
| Frequency | Number of sessions per education series, total sessions delivered, single session vs. multi-session programming |
| PSE Activities | Type of PSE change (policy, systems, environmental), setting, stage of adoption, estimated population reach |
| Fiscal | Total SNAP-Ed expenditures by category (personnel, contracts, supplies, travel, indirect, other) |
EARS Reporting Timeline
While EARS data is submitted annually to FNS, the data collection process must be ongoing throughout the fiscal year. Implementing agencies that wait until year-end to compile EARS data typically produce inaccurate reports and discover data quality issues too late to fix them. Best practice is to:
- Collect participant demographics and attendance data at every education session
- Document PSE change activities within one week of each activity
- Compile and review data quarterly to identify gaps and quality issues
- Submit data to the state SNAP agency according to state-specified deadlines (often quarterly or semi-annually)
- Conduct year-end data quality review before final EARS submission
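The quarterly compile-and-review step above can be sketched as a simple completeness check over session-level records. The field names (`participant_id`, `race_ethnicity`, `snap_status`, etc.) are illustrative, not EARS-mandated; adapt them to your own tracking system's schema.

```python
# Quarterly completeness check for session-level records (field names
# are illustrative assumptions, not EARS-prescribed).
REQUIRED_FIELDS = ["participant_id", "session_date", "age_group",
                   "race_ethnicity", "sex", "snap_status"]

def completeness_report(records):
    """Count missing values per required field across a quarter's records."""
    gaps = {f: 0 for f in REQUIRED_FIELDS}
    for rec in records:
        for f in REQUIRED_FIELDS:
            if not rec.get(f):          # None, empty string, or absent
                gaps[f] += 1
    return gaps

q1 = [
    {"participant_id": "P001", "session_date": "2024-11-04",
     "age_group": "adult", "race_ethnicity": "White", "sex": "F",
     "snap_status": "current"},
    {"participant_id": "P002", "session_date": "2024-11-04",
     "age_group": "adult", "race_ethnicity": "", "sex": "M",
     "snap_status": "unknown"},
]
print(completeness_report(q1))   # race_ethnicity has 1 gap to resolve
```

Running this each quarter turns year-end compilation into a confirmation step rather than a reconstruction effort.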
Demographic Data Collection
Accurate demographic data collection is essential for EARS reporting and for demonstrating that SNAP-Ed is reaching its intended target populations. Implementing agencies must collect demographic data from direct education participants while balancing data quality with participant burden and privacy concerns.
Required Demographic Categories
EARS requires data in the following demographic categories for direct education participants:
- Age group: Youth (under 18), adult (18-59), and older adult (60+), with further breakdowns within youth categories as required by some states
- Race/ethnicity: Using OMB-standard categories, with ethnicity (Hispanic/Latino or not) collected separately from race (American Indian/Alaska Native, Asian, Black/African American, Native Hawaiian/Pacific Islander, White, or more than one race)
- Sex: Male, female, and (in some state systems) non-binary or prefer not to answer
- SNAP participation status: Whether the participant is currently receiving SNAP benefits, eligible but not participating, or unknown
Collection Methods and Privacy
Demographic data is typically collected through participant registration forms completed at the first session of an education series. Self-reported data is acceptable and preferred over visual estimation. Key considerations include:
- Participation in SNAP-Ed is voluntary, and participants cannot be required to provide demographic data as a condition of receiving services
- Forms should include an explanation of why data is being collected and how it will be used
- For school-based programs, demographic data may be obtained from school records with appropriate consent and data sharing agreements
- Unknown or missing demographic data should be reported as such rather than estimated by staff
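The last point above — report missing data as unknown rather than estimating it — can be enforced directly in tabulation code. This is a minimal sketch; the field names and the `"Unknown"` label are assumptions, not EARS-defined values.

```python
from collections import Counter

def tabulate(participants, field):
    """Tabulate one demographic field, keeping missing or declined values
    as 'Unknown' rather than imputing them by staff judgment."""
    return Counter(p.get(field) or "Unknown" for p in participants)

roster = [
    {"age_group": "adult", "sex": "F"},
    {"age_group": "older adult", "sex": None},   # declined to answer
    {"age_group": "adult"},                       # field never collected
]
print(tabulate(roster, "sex"))   # Counter({'Unknown': 2, 'F': 1})
```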
Reach and Frequency Metrics
Two of the most important EARS metrics are reach (how many people were served) and frequency (how intensively they were served). These metrics are fundamental to demonstrating program impact and value.
Direct Education Reach
Direct education reach counts the number of unique individuals who participated in SNAP-Ed nutrition education sessions during the fiscal year. This is an unduplicated count — a person who attends multiple sessions of the same education series or participates in more than one program during the year is counted once. Accurate reach counting requires implementing agencies to track individual participants across sessions, which can be challenging in settings like food banks or community events where attendance fluctuates.
Frequency and Dosage
Frequency measures the intensity of education delivered, typically expressed as the number of sessions per education series. SNAP-Ed distinguishes between:
- Single-session contacts: One-time education touchpoints such as nutrition demonstrations at community events or single workshops
- Multi-session series: Structured education programs delivered over multiple sessions (the approach most strongly supported in the behavior-change literature, typically 4-12 sessions)
FNS increasingly emphasizes multi-session programming because the evidence base for behavior change is stronger when participants receive sustained, reinforced education. Implementing agencies should track both the number of series completed and individual session attendance within each series.
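Tracking both series completion and per-participant attendance can be sketched as below. The 75% completion threshold is an illustrative assumption, not an FNS-defined cutoff; use your state's definition of a "completer."

```python
def series_dosage(attendance, total_sessions, completion_threshold=0.75):
    """Summarize dosage for one multi-session series.

    attendance maps participant_id -> sessions attended. The 75%
    completion threshold is a hypothetical assumption, not FNS policy.
    """
    completers = [p for p, n in attendance.items()
                  if n / total_sessions >= completion_threshold]
    avg = sum(attendance.values()) / len(attendance) if attendance else 0
    return {"enrolled": len(attendance),
            "completers": len(completers),
            "avg_sessions_attended": round(avg, 1)}

# A 6-session series: one completer, one partial, one early dropout.
series = {"P001": 6, "P002": 4, "P003": 2}
print(series_dosage(series, total_sessions=6))
```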
Indirect Estimation Methodology
Not all SNAP-Ed reach can be directly counted. Indirect activities — social marketing campaigns, website content, printed materials distributed through third parties, and PSE changes that affect populations rather than individual participants — require estimation methodologies to quantify their reach.
FNS allows implementing agencies to use reasonable estimation methods for indirect reach, but requires documentation of the methodology used. Acceptable approaches include:
- Social marketing: Website analytics (unique visitors), social media impressions, print material distribution counts, media market coverage data
- PSE changes: Population of the institution or community affected by the change (e.g., total school enrollment for a cafeteria layout change, number of residents in a housing complex where a community garden was established)
- Material distribution: Number of materials distributed through partner organizations, adjusted for estimated readership rates
Overestimating indirect reach is a common compliance issue. Use conservative estimation methods and document all assumptions. Auditors and FNS reviewers will evaluate whether estimation methodologies are reasonable and defensible.
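A conservative, documented estimate can be produced mechanically, as in this sketch for distributed materials. The 50% readership rate is a deliberately conservative, hypothetical assumption — document whatever rate you actually use and its source.

```python
def estimate_indirect_reach(copies, readership_rate=0.5, source="assumed"):
    """Return a reach estimate plus the methodology note reviewers expect.

    The 50% default readership rate is an illustrative assumption,
    not an FNS-prescribed figure.
    """
    return {
        "estimated_reach": int(copies * readership_rate),
        "methodology": (f"{copies} copies distributed x "
                        f"{readership_rate:.0%} assumed readership "
                        f"(rate source: {source})"),
    }

# 2,000 flyers distributed through partner food pantries:
print(estimate_indirect_reach(2000))   # estimated_reach: 1000
```

Storing the methodology string alongside the number means the documented assumption travels with the figure into the EARS submission.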
PSE Change Reporting
PSE change activities require their own reporting track within EARS. For each PSE initiative, implementing agencies must report:
- Type of change: Whether the initiative is a policy change, systems change, or environmental change, categorized using EARS-defined PSE types
- Setting: Where the PSE change occurred (school, worksite, community organization, food retail, health care, etc.)
- Stage of adoption: Where the initiative falls on the adoption continuum — readiness, adoption, implementation, or maintenance
- Estimated reach: The number of people potentially affected by the PSE change
- Supporting activities: Direct education or social marketing activities that complement the PSE change
For detailed documentation requirements for PSE activities, see the Compliance & Evidence-Based Requirements section.
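The five reportable fields above map naturally onto a simple record per initiative. This is a sketch only: the value lists shown mirror the categories named in the text, not the exact EARS code sets.

```python
from dataclasses import dataclass, field

STAGES = ("readiness", "adoption", "implementation", "maintenance")

@dataclass
class PSEChange:
    """One PSE initiative's EARS-reportable fields (illustrative; the
    exact EARS value lists differ from the examples shown here)."""
    change_type: str                # "policy", "systems", or "environmental"
    setting: str                    # e.g. "school", "food retail"
    stage: str                      # one of STAGES
    estimated_reach: int            # population potentially affected
    supporting_activities: list = field(default_factory=list)

    def __post_init__(self):
        if self.stage not in STAGES:
            raise ValueError(f"stage must be one of {STAGES}")

garden = PSEChange("environmental", "community organization",
                   "implementation", 350,
                   ["container gardening workshop series"])
print(garden.stage, garden.estimated_reach)
```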
SNAP-Ed Evaluation Framework
The SNAP-Ed Evaluation Framework provides the national structure for measuring SNAP-Ed outcomes. Developed with input from researchers, practitioners, and FNS, the framework defines indicators across five priority outcome areas. While not all implementing agencies are expected to measure every indicator, the framework guides evaluation planning and helps ensure that SNAP-Ed evaluation efforts contribute to the national evidence base.
Priority Outcome Areas
| Priority Area | Example Indicators |
|---|---|
| Dietary Quality | Fruit and vegetable consumption, whole grain intake, sugar-sweetened beverage reduction, compliance with Dietary Guidelines |
| Physical Activity | Minutes of physical activity per day, sedentary behavior reduction, screen time reduction |
| Food Resource Management | Meal planning behaviors, food budgeting skills, comparison shopping, food waste reduction |
| Food Safety | Handwashing frequency, proper food storage, temperature control practices, cross-contamination prevention |
| Food Security | Household food security status, ability to access affordable healthy food, use of food assistance programs |
Evaluation Levels
The Evaluation Framework recognizes four levels of evaluation, from basic output tracking to rigorous outcome measurement:
- Level 1 — Reach and Demographics: Basic output data (how many people were reached, who they were). All implementing agencies must report at this level through EARS.
- Level 2 — Changes in Knowledge, Attitudes, Skills: Pre/post measurement of participant knowledge, attitudes, and skills related to nutrition and physical activity.
- Level 3 — Behavior Changes: Measurement of actual dietary and physical activity behavior changes among participants.
- Level 4 — Health Outcomes and Environmental Changes: Long-term health outcomes (BMI, disease risk factors) and documented PSE changes at the institutional or community level.
State Reporting Requirements
Beyond EARS, implementing agencies must comply with state-specific reporting requirements established in their subgrant agreements. These vary significantly by state but commonly include:
- Quarterly or monthly financial reports: Expenditure reports by budget category, comparing actual spending to budgeted amounts
- Quarterly programmatic reports: Activities completed, reach data, sessions delivered, sites served, and progress against plan objectives
- Annual narrative reports: Comprehensive summaries of program accomplishments, challenges, lessons learned, and success stories
- Evaluation reports: Results of pre/post surveys, outcome data analysis, and PSE change documentation
Meeting both state and federal reporting requirements simultaneously requires well-designed data collection systems. Implementing agencies should map state reporting requirements against EARS data elements at the beginning of the fiscal year to identify all data collection needs and build integrated tracking systems. For related guidance on federal reporting standards under 2 CFR 200, see the compliance section.
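The mapping exercise described above can be as simple as a lookup table built at the start of the fiscal year; anything a state report requires that has no EARS counterpart is a data element still needing a collection plan. All field and element names here are illustrative.

```python
# Map each state-required report field to the EARS element it feeds, so
# one data-entry process serves both (names are illustrative assumptions).
STATE_TO_EARS = {
    "quarterly_participants_served": "Reach: unduplicated direct education",
    "sessions_delivered":            "Frequency: total sessions delivered",
    "expenditures_by_category":      "Fiscal: expenditures by category",
}

def unmapped_fields(state_fields):
    """Flag state-required fields with no EARS mapping -- each one needs
    its own collection plan."""
    return [f for f in state_fields if f not in STATE_TO_EARS]

print(unmapped_fields(["sessions_delivered", "success_story_narrative"]))
# -> ['success_story_narrative']
```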
Building Effective Data Collection Systems
The most common reporting problems stem from inadequate data collection systems rather than from the reporting process itself. Implementing agencies should invest in systems that:
- Track individuals across sessions: Use unique participant identifiers to enable unduplicated reach counting
- Capture data at the point of service: Collect attendance and demographic data at each session rather than reconstructing records later
- Align with EARS categories: Map data fields to EARS reporting categories so that year-end compilation is straightforward
- Include quality assurance: Build in data validation checks, completeness reviews, and supervisor approval processes
- Support multiple reporting needs: Design systems that serve both EARS and state-specific reporting requirements through a single data entry process
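The first system property above — unduplicated reach counting via unique participant identifiers — reduces to a set operation once session logs carry a stable ID. A minimal sketch, assuming logs are `(program, date, participant_id)` tuples:

```python
def unduplicated_reach(session_logs):
    """Count unique individuals across all sessions and programs.

    A person attending multiple sessions, or more than one program,
    is counted once -- matching the EARS unduplicated-reach rule.
    """
    return len({pid for _, _, pid in session_logs})

logs = [
    ("Cooking Matters", "2024-10-02", "P001"),
    ("Cooking Matters", "2024-10-09", "P001"),   # same person, week 2
    ("Eat Smart",       "2024-11-05", "P001"),   # same person, new program
    ("Eat Smart",       "2024-11-05", "P002"),
]
print(unduplicated_reach(logs))   # 2
```

The hard part in practice is assigning the identifier consistently at drop-in settings like food banks, not the counting itself.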