During a Feb. 5 Congressional hearing, Rep. Kevin Kiley, R-Calif., used a graph on an easel behind his chair to make a dramatic point about the last decade of American education.
“We’ve just received the latest test scores for students across this country, and they are absolutely alarming,” said Kiley, the newly minted House subcommittee chair for K-12 and early-childhood education. He pointed to a chart with upward-sloping curves showing increased education funding and downward-sloping curves showing declining test scores.
“This is a steady growth in spending in real dollars which is proceeding in tandem with a steady decline in student achievement,” Kiley said, just before previewing broad plans to “reimagine federal funding” for schools and expand public subsidies for private education nationwide.
With his comments, Kiley joined a chorus of politicians, pundits, and researchers who have questioned why U.S. students haven’t gained more academic ground after Congress doled out close to $200 billion in pandemic relief funds for K-12 schools in 2020 and 2021.
But the story Kiley told about the graph, published Jan. 29 by the Georgetown University-based research group the Edunomics Lab, doesn’t precisely line up with what the graph actually shows—or what its authors say they want to get across.
In the two weeks following the release of the latest NAEP scores, the Edunomics Lab charts have helped spark the latest round in a perennial debate over how much money really matters in education. The discussion is nothing new for the field, but it raises a deeper question about method: Does plotting spending figures against test scores create more confusion than clarity?
Funding matters, but what about how it’s used?
Researchers reached consensus roughly a decade ago that funding increases make a tangible difference in student outcomes. But debate has continued since then over which students benefit most from increases, and what else states, schools, and educators need in order to use money efficiently.
These discussions are especially consequential now, as many states gear up to revise decades-old school funding formulas, and the Trump administration plows forward with an unprecedented and sweeping crackdown on federal spending that has already hit education research hard and tested centuries-old constitutional principles.
Graphics like the Edunomics analysis come at an inopportune time, said Morgan Polikoff, a professor of education at the University of Southern California Rossier School of Education.
“The message it perpetuates is, school spending is unrelated to student achievement. And there is very ample research that shows that to be false,” Polikoff said.
This message is spreading as the Trump administration and Republican majorities in both houses of Congress have threatened to slash federal education funding and eliminate the U.S. Department of Education altogether.
“They could point to this figure and say, well, school spending doesn’t matter,” said Polikoff. “But that isn’t true.”
Marguerite Roza, director of the Edunomics Lab and a research professor at Georgetown University, said she didn’t intend to suggest that money doesn’t matter for education. Instead, she hoped her group’s analysis would start a conversation about why funding increases have contributed to academic improvements in some states more than in others.
“Of course money matters,” Roza told Education Week. “We’re looking at where it matters more, when it matters more, what role do people play in making sure it matters more. It’s a how much, not a yes or no.”
Despite the confusion, Roza sees the vigorous dialogue about the “ROI” analysis as a sign that it is prompting essential, if uncomfortable, debate and introspection. Lawmakers, state education departments, school district leaders, and labor unions that represent educators all have a role to play in setting America’s students on a more positive academic trajectory, she said.
“Thoughtful, smart people can disagree on stuff and still have a conversation where we all end up smarter, regardless of whether we changed our view,” she said.
Comparison of scores and funding demands context, experts say
The Edunomics chart and its accompanying state-by-state graphics were published Jan. 29. Earlier that day the federal government released the latest round of scores for the NAEP exam, administered to a sample of the nation’s 4th and 8th graders in reading and math every few years. The results generally showed declining or stagnant performance and widening gaps between struggling students and their higher-performing peers.
The Edunomics team mapped NAEP scores from the last decade onto a chart for each state that also includes federal data on K-12 education spending, and a curve that illustrates the inflation rate as published by the federal Bureau of Labor Statistics. The charts have since prompted a wide range of reactions.
Some conservative pundits and right-leaning policy groups have included the charts in posts arguing for decreasing or rethinking school funding.
“Teachers’ unions and their apologists continue to say the problem with public schools is funding … More money correlates with more problems rather than solving them,” wrote Andrew Clark, president of the yes. every kid. foundation, an arm of the Koch Network that advocates for school privatization.
Some education researchers, in interviews and on social media, said the charts don’t fully contextualize how school spending and academic outcomes are linked.
For instance: It’s impossible to know whether NAEP scores might have been even lower without spending increases over the last decade. State-level figures paper over the substantially different spending strategies school districts took with the resources at their disposal. There’s no evidence of a directly proportional connection between increased spending and higher test scores because so many other factors are at play.
“You can’t compare [test scores] that go from 0 to 500 on a plane with spending that goes from $0 to $30,000 or $0 to $100,000,” said Jess Gartner, the founder of Allovue, a tech company now owned by PowerSchool that helps school districts manage spending. “If you want to share that information side by side, it needs to come with a lot of context.” (Gartner serves on the board of trustees for Editorial Projects in Education, the nonprofit publisher of Education Week.)
Other research suggests that COVID relief funding has made a difference. Analyses published last fall and earlier this month by professors at Harvard and Stanford universities show that, in low-income districts, the additional federal dollars had about as much influence on student achievement as a general revenue increase—and did, in fact, “prevent larger losses,” the researchers wrote. Districts that used more of their money to fund academic recovery saw bigger gains in student test scores.
But that nuance is lost in the Edunomics charts, some say.
“If you went to any state legislature in the country and presented this, the idea that they would get is not, we should figure out how to make dollars more effective,” said Josh McGee, associate professor of education policy at the University of Arkansas. “The idea they would get is, we don’t have to invest anymore because the dollars right now are being wasted.”
The pandemic brought unpredictable expenses
That exact scenario has already begun to play out on Capitol Hill. And state legislatures are taking notice.
In late January, lawmakers in the Oregon House of Representatives examined the Edunomics analysis as they tried to grapple with their state’s academic performance. Some lawmakers were skeptical that test scores alone fully illustrate the state of Oregon’s schools, Willamette Week reported.
The Oregon Parent Teacher Association advanced the debate a few days later, when the group sent a 10-page letter to state house lawmakers detailing numerous complaints about the Edunomics analysis and warnings about how to interpret it.
Among several concerns, the group echoed experts who say the graphs highlight overly simplistic spending numbers.
On the Massachusetts chart, for instance, a curve shows that school spending increased by 71 percent between 2013 and 2024. Another curve below it shows that inflation rose 35 percent over the same period.
The accompanying text spotlights only the percentage increase in unadjusted dollars, leaving out the smaller increase that remains after accounting for inflation.
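A rough back-of-the-envelope calculation shows the size of that gap: if nominal spending rose 71 percent while prices rose 35 percent, the inflation-adjusted increase works out to about 1.71 ÷ 1.35 ≈ 1.27, or roughly 27 percent in real terms, well below the headline figure.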
News reports in Alabama, New Jersey, and Oklahoma have cited their states’ respective spending increase figures as characterized by Edunomics, without mentioning that the numbers haven’t been adjusted for inflation. The Oregon parents group told lawmakers the unadjusted numbers leave out that 2013 was a low point for post-recession education funding in the state.
Roza said her group’s choices were deliberate, noting that in focus groups, “people don’t understand what adjusted dollars are. Laying it out separately like that allows people to see what’s going on.”
As for Oregon, she said, “it is hard to ignore that other states have been more successful in leveraging incremental dollars to drive progress for students.”
Numbers don’t tell the full story, researchers say. Some districts had far less funding than others even before unevenly distributed pandemic aid; the price of goods and the makeup of the labor pool vary widely from place to place; and growing fixed costs for staff salaries and benefits, utility bills, and pension debts can’t be avoided.
“What we really care about is the extent to which additional dollars allow us to purchase more educational goods and services,” McGee said. “This overstates how much more we could purchase.”
All of this context complicates arguments about how school funding is or isn’t working. But the nuance is essential, said Katie Roy, partner and general counsel at Education Resource Strategies, a firm that helps large school districts with budget planning.
Roy helped found the nonprofit Connecticut School + State Finance Project, and used to spend much of her time explaining the particulars of school finance to state senators and representatives who didn’t know the first thing about funding formulas or district expenses. Her team used a 100-page PowerPoint presentation and applied one main rule to avoid confusion: Each chart should only communicate one piece of data.
“People who do not have a lot of experience looking at data, charts, and graphs can very easily, just by mistake, misunderstand,” Roy said.
Student achievement may be slow to rebound, experts say
Roza and many of her critics share the view that reversing downward trends in student achievement will require a wide range of thoughtful new approaches.
The effects of the pandemic, a once-in-a-lifetime disruption to students’ lives, weren’t likely to be fully offset on NAEP in a few years, said Martin West, the vice chair of the National Assessment Governing Board, which sets policy for NAEP. Lost learning tends to persist.
“That is, there’s nothing that would make us expect that students would become faster learners after a disruption. In fact, if anything, you might think the opposite: They would have lost some knowledge or skills or habits of mind that would slow their progress going forward,” he said.
The tutoring and summer school programs that many schools stood up to make up for lost time varied widely in scale and intensity. The call to “accelerate” students’ learning by simultaneously teaching new content and refreshing previous years’ skills felt practically impossible for many educators.
“If we could improve schools hugely, rapidly, we probably would have done it beforehand,” Susanna Loeb, the executive director of the National Student Support Accelerator, told Education Week in January.
Despite these steep challenges, research from the Education Recovery Scorecard, a project from researchers at Harvard and Stanford that tracks pandemic-related learning loss, found that some districts have bucked the trend of general decline and raised student scores above pre-pandemic levels. But the data can’t pinpoint what combination of policy decisions and spending choices caused these results in every case, the researchers said.
It is notoriously difficult to draw cause-and-effect conclusions from NAEP, and generally speaking, improvements on the exam are incremental.
“NAEP trends don’t move very much in terms of points and percentages. They move slow, and even in good times they move up by a small sliver of a standard deviation, whereas staffing and spending go up in percentage terms quite a bit more,” said Chad Aldeman, a policy analyst and columnist for The 74 who has published reports on school funding and educator pensions. “You can’t just assume that because one trend goes up at one rate, the other trend goes up at the same rate.”
And, at least in reading, the declines go much further back than the pandemic. This slide comes after an upward trend in NAEP scores during the 1990s and 2000s.
This fuller timeline of scores on the nation’s report card complicates the connection between dollars spent and student outcomes, said Polikoff. School spending has risen for much of the past three decades, though it dipped in the years following the Great Recession. Achievement has gone up and down over the same period.
Roza said she hoped the Edunomics analysis would spur policymakers and experts across the ideological spectrum to embrace the iterative work of improving students’ test scores, rather than ignoring the nationwide trend and the states defying it.
“I actually think the work of getting more value for the dollars is kind of a slog,” Roza said. “I think it requires prioritizing what we’re trying to do in schools, and I think it means using data constantly to see what’s working and what’s not.”
Some districts cited labor shortages and hiring challenges to explain why their ambitious plans to scale up tutoring and summer school for the last few years fell short of expectations. Roza isn’t convinced those hurdles were insurmountable.
“We have to be willing to be eyes wide open and pivot and check if the investments are delivering the value we hoped,” Roza said. “If not, try something else, or try to fix the thing that’s not working.”