Decoding NAEP: Behind and Beyond the Headlines

Details

Date: February 4
Time: 3:00 pm - 4:30 pm
Series: Decoding NAEP
Resources: Panelist Bios, Webinar Slide Deck, Additional Resources

This GLR Learning Tuesdays webinar featured the voices of journalists, data experts and community leaders in an engaging conversation as the Campaign for Grade-Level Reading launched its new mini-series on what we need to know about NAEP to accelerate learning recovery and close achievement gaps.

Moderated by John Gomperts of CGLR, the conversation began with Morgan Scott Polikoff, Ph.D., of the University of Southern California providing an overview of NAEP, its administration and what the various performance levels mean:

  • NAEP is a national test required by federal law for the purpose of monitoring performance over time.
  • Randomly selected, representative samples of schools and fourth- and eighth-grade students from across the U.S. participate.
  • The National Assessment Governing Board determines the content and format of each test, including the performance levels.
  • Students take two 25-minute tests in one subject and answer survey questions about their learning habits and motivations.
  • NAEP Proficient is a relatively high benchmark, above the proficiency levels of nearly all states, while NAEP Basic sits a little below state proficiency standards.


The release of the NAEP scores on January 29 unleashed a slew of headlines, including stories by our panelists, who shared highlights and reflections on the issues they explored in their reporting.

Kalyn Belsha of Chalkbeat drew attention to the growing gap between the lowest- and highest-performing students, saying, “Kids who are reading at the lowest levels are doing so at the lowest levels in 30 years.”

Kevin Mahnken of The 74 looked at the gaps for Asian and Hispanic students, noting, “Asian fourth graders saw a pronounced dip in fourth-grade reading, but the figures for Hispanic students really jumped out because they were fairly consistent across both subjects,” with significant drops in each.

Sara Randazzo of the Wall Street Journal pointed out the slide in reading that began pre-pandemic (early 2019), explaining, “You look at these numbers and you want to tell a narrative that says why this happened, but nobody really knows.”

Reflecting on the headlines following the NAEP release, Karyn Lewis, Ph.D., of NWEA said, “I was most surprised by how surprised everyone else was because this is exactly what [we] have been seeing in interim assessment and reporting over the last four years….The reading backslide is really alarming and the reading gaps are continuing to grow.” Meanwhile, although there are still significant gaps in math, that subject is “where we rang the alarm bells first” in response to the 2022 NAEP scores, and many states and districts responded by directing resources and attention to it. Polikoff noted that the relative progress in math could also be attributed to the fact that “math is more sensitive to educational intervention.”

In reflecting on data from her company’s assessments over the past four years, Kristen Huff, Ed.D., of Curriculum Associates stressed the importance of paying attention to the ages of students when pandemic disruptions occurred. The fourth graders tested on the 2024 NAEP were in kindergarten in the spring of 2020. Curriculum Associates released research in 2024 showing that students who were 3 and 4 years old in 2020 are starting school behind pre-pandemic levels and are not learning at the same rate as their pre-pandemic peers. Huff noted that “what we’ve learned is that the impact of the pandemic on our nation’s youngest students and preK students is lasting….We’re going to continue to see this if we don’t put the right supports and interventions in place.”

With all this data and the variety of points it draws attention to, it is important to use the data responsibly — failure to do so is what Polikoff calls “misNAEPery.” Munro Richardson, Ph.D., of Read Charlotte encouraged users to “compare your state assessment with the NAEP, look at what the NAEP says, and if you’re lucky enough to be in [one of the Trial Urban District Assessment areas] triangulate both of these.” He encouraged attendees to look for patterns in the data in order to ask better questions that can inform action. The data won’t necessarily tell you why, but it can offer insights into what to look for. Karyn Lewis echoed this, calling for self-reflection.

As the panel shared their reflections on the data and what communities can be thinking and asking about as they seek to apply NAEP data wisely, Belsha called on education stakeholders to drill down into the data, noting that several large districts saw significant drops in eighth-grade math, suggesting those students will need targeted interventions. Polikoff agreed with that call to action and pointed to the importance of looking at what is happening in places like Massachusetts, Louisiana and Mississippi that are outperforming their peers. “[Find] one place where it’s working…one district, one school even, where something they’ve implemented is having immediate effects,” said Randazzo.

        If you were able to attend the session, we would love to hear your feedback! We appreciate your help in filling out the following form as we seek to learn and understand the perspectives, ideas, critiques and recommendations that better inform our key audiences.