Harvard’s recent announcement that it would significantly scale back its doctoral programs came as a shock to the rest of the academic world. If Harvard can’t keep graduate programs afloat, what hope is there for the rest of us? Although the spat between Trump and Harvard played a role, the move was also a direct result of President Trump’s general cuts to federal research spending.
These cuts should prompt a long overdue critical look at research funding. Where does the money go anyway? After the Trump administration announced its cutback plans, every major university protested that this would cancel work on cures for cancer, diabetes, and other deadly diseases. But if pressed, they would have to acknowledge that the bulk of STEM funding is not of this nature. That doesn’t mean the work it supports lacks value. But a sensible prioritization is imperative.
“The bulk of STEM funding is not of this nature.”
In what follows, I will consider a medical example: research in compressed sensing, which, among other applications, allows MRI scans to complete much more quickly. Shortening MRI time is quite valuable: little children, for instance, simply cannot hold still for long periods, and many seriously ill adults cannot hold their breath that long. Roughly speaking, compressed sensing reduces scan time by focusing only on the aspects of an image of major importance, as determined by sophisticated mathematical analysis. I will focus primarily on funding for the mathematical sciences, and on the compressed sensing work of three world-class mathematical researchers: Terence Tao of UCLA; Emmanuel Candès of Stanford, but at Caltech during his work with Tao; and David Donoho of Stanford.
My focus here on compressed sensing is motivated by its close relation to general research funding issues, in several senses. First of all, compressed sensing is a “best-case scenario” for funded math research, a genuine applied success story with major medical usage. Moreover, Donoho, considered the father of the modern field, has been extremely active in promoting it, for instance presenting a much-heralded briefing to members of Congress aimed at increasing funding for the field. UCLA’s Institute for Pure and Applied Mathematics (IPAM) has also been active in promoting compressed sensing, for example by organizing a series of invited speaker talks. In 2025, the Trump administration suspended National Science Foundation funding for UCLA, including a $25 million grant to IPAM, citing the university’s failure to deal with anti-Semitism on campus.
A big issue here will be the role of pure theoretical research, what I call theory for theory’s sake (TTS): very abstract mathematical work, theorem-proving, that contributes little or nothing to the actual technology. Donoho’s mathematical work, which developed the compressed sensing technology, was mostly not TTS, while Candès and Tao’s collaborative research, and later Candès’s individual work, was. A major question will be: should the federal government be funding TTS? But first, let’s follow the money.
Universities’ abuse of the overhead portion of research grants—ostensibly to “keep the lights on” in the labs, in reality often to cover a range of unrelated activities—has been a matter of bipartisan concern for years. Trump has imposed a 15 percent cap on overhead costs, well below prior rates, which were often 50 or 60 percent or more.
But what about the dollars that go directly to research? First, the grants fund graduate students through graduate student research assistantships (GSRs), typically 20 hours per week during the academic year, and possibly on a full-time basis in the summer. This is what the grad students live on, but it wasn’t always that way. Long ago, when I was a doctoral student in math at UCLA, most students were supported by working as teaching assistants (TAs). Though TA appointments are nominally set at 20 hours per week, in the mathematical sciences the main work occurs during midterm and final exam grading periods, so students otherwise have plenty of time for research.
In my day, federal research grants were viewed by departments as supplements, expanding somewhat the number of students a department might support. But as government funding grew over the succeeding decades, departments grew highly dependent on grants. Today, few STEM doctoral students at major research universities work as TAs once they start their research. At UCLA, an estimated 70-90 percent of STEM students work as GSRs. The situation is similar at my institution, UC Davis.
Those students’ research advisers benefit financially from government grants as well. Professors generally work on nine-month contracts, and grants typically add a summer research salary to that. But the term “summer research salary” is misleading. A professor at a research university is judged, both in terms of promotions and professional standing, mainly on research output. Due to the career-critical nature of that activity, they will conduct summer research regardless of additional funding.
“No one viewed this as working for free.”
Back in my days as a graduate student and then an assistant professor, most faculty conducted research in the summers without extra compensation. No one viewed this as working for free, just as no one considered their work during evenings and weekends during the academic year as working without overtime pay. Hence, in the old days doctoral students and professors in the mathematical sciences managed to do good research without the levels of federal support that later became typical. But grants sweeten the deal. Many larger grants even include “buyout” funds, which compensate the university so that a professor can take a semester or two off from teaching.
Now we can return to our case study of funding for research on compressed sensing. In August, in response to a discussion on social media about the role of math theory in compressed sensing work, UCLA’s Tao defended theory’s inclusion in research grants:
it has been argued that the rigorous theorems that guaranteed with mathematical certainty that compressed sensing algorithms actually worked…was not as necessary as advertised, since researchers in medical imaging (as well as in other fields, such as seismology, astronomy, and statistics) were empirically discovering very similar algorithms at various times (including some that predate my own work by decades)...[But what the theory] brought to the field was a clarity, insight, generality, and level of trust that was not being produced just from the empirically derived results alone.
Tao also noted that the theoretical work helped convince manufacturers such as Siemens to incorporate compressed sensing algorithms into their imaging machinery, and that it suggested conditions under which the methods would work.
This seems like a plausible argument, until it becomes clear that Tao’s use of the term theory seems not to have primarily referred to theory for theory’s sake (TTS), but rather to work like his colleague Donoho’s, which, though mathematical, consisted of derivations that developed the actual technology. He notes that the key ingredient in convincing imaging engineers was the theory’s use of linear algebra, a very basic branch of math taken by freshman or sophomore undergrads, often as part of the calculus series. That is certainly not TTS, which draws on deep material in analysis from doctoral curricula. Donoho, in making the same point regarding the need to persuade engineers of the correctness of the methods, specifically refers to the “theory” work as “derivations” (though he does say that Tao’s name lent some credence).
At the core of the compressed sensing work is sparseness. Imaging math typically expresses pixel intensities as weighted sums of two-dimensional waves; this is how JPEG compression works in your camera, for example. Compressed sensing methods assume that only a few of those weights are nonzero, with the rest being exactly zero. This condition is never satisfied in practice, yet the Candès–Tao theorems rest on that idealization, and subsequent work by Candès was similarly theoretical.
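To make the sparseness idea concrete, here is a minimal numpy sketch of my own (not the researchers’ code): a signal built from just three cosine waves has only three nonzero coefficients in a cosine basis, so keeping the three largest coefficients reconstructs it essentially exactly. Real images are only approximately sparse, which is exactly the gap between the idealized assumption and practice.

```python
import numpy as np

# Build an orthonormal DCT-II basis by hand, so only numpy is needed.
# Column k of `basis` is a sampled cosine wave of frequency k.
n = 64
k = np.arange(n)
basis = np.cos(np.pi * (k[:, None] + 0.5) * k[None, :] / n)
basis[:, 0] *= np.sqrt(1.0 / n)   # normalization making basis.T @ basis = I
basis[:, 1:] *= np.sqrt(2.0 / n)

# A "sparse" signal: a weighted sum of just 3 of the 64 cosine waves.
true_coeffs = np.zeros(n)
true_coeffs[[2, 7, 19]] = [5.0, -3.0, 1.5]
signal = basis @ true_coeffs

# Analysis step: compute all 64 coefficients, keep only the 3 largest
# in magnitude, discard the rest (this is the JPEG-style compression idea).
coeffs = basis.T @ signal
top3 = np.argsort(np.abs(coeffs))[-3:]
kept = np.zeros(n)
kept[top3] = coeffs[top3]
reconstruction = basis @ kept

print(np.max(np.abs(reconstruction - signal)))  # essentially zero
```

A real image is not exactly sparse: its small coefficients are near zero rather than zero, so discarding them introduces a small error instead of none, and the theorems’ exact-sparsity hypothesis is an idealization of this situation.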
Again, Candès and Tao’s work is quite impressive in terms of pure mathematical prowess. But Tao’s claim that his “own research at IPAM … helped lead to the algorithms that now cut MRI scan times by a factor of up to 10” is a stretch. Tellingly, the patents for the methods were awarded to Donoho (and Stanford), not Candès or Tao.
“An environment of reduced research funding is here to stay.”
No matter which political party holds power in the coming years, an environment of reduced research funding is here to stay. There is bipartisan agreement that a major reduction in overhead costs is warranted, and continuing budget tightening will depress total research funding. Indeed, historically, science research funding has tended to fare better under Republicans than under Democrats.
With these conditions in mind, much TTS funding is hard to justify as an allocation of scarce research dollars. Nor is valuable TTS research threatened by these cutbacks: before universities became so dependent on federal dollars, TTS fared just fine under the older paradigm described above, in which students worked as TAs and faculty conducted unfunded summer research. Tao’s statement that “I myself have been fortunate to be supported by [grants] for almost the entirety of my professional career, allowing me to conduct research in the summer” flies in the face of the custom that prevailed before federal largesse. To his credit, he has temporarily deferred his summer pay to prioritize his students, but can he not scrape by on his nine-month base salary of $533,730?
The intellectual depth of TTS-type research has often contributed to academic snobbery, an unwarranted sense of entitlement, and institutional insularity. The top computer science conferences will only accept people as (unpaid) manuscript reviewers if they have a first-authored paper in one of those conferences. All this results in poor use of funds, ignoring work that may be of greater economic and societal value.
Take software. There has been an attitude that “anyone can write code,” but not everyone can write good, useful, generalizable code. One major analog of wet-lab work in the math sciences is large-scale open source software development, such as the OpenICS system of Zhao et al. at Arizona State University, developed with funding from the NSF and NVIDIA. This type of work should be recognized as worthy of funding. There is a similar bias against simulation software: empirical experiments to evaluate concepts, theories, and methods. Doing this well is far from a trivial undertaking, and it can have great value. It too should be recognized as worthy.
What is more, while it is wonderful that grants pay really bright people to romp in worlds of abstract math theory at IPAM, just a few miles away many kids in LA schools can’t even add fractions. I have long called for a “Manhattan Project” to address the crisis in math education. Tragically, some supposed “solutions” popular with progressives involve watering down the curriculum. Instead, we need strong, coordinated efforts, not just on instructional methods, but on motivating kids and helping parents provide support. Charismatic figures from not only academia but also the sports and entertainment worlds should be recruited in a sustained effort. The alphabet soup of federal science funding agencies should band together and lead this effort.
As someone who has undergone his share of medical imaging procedures, my hat is off to compressed sensing researchers, especially Donoho. But we can recognize their contributions while still seeing the need for broad-based reform of STEM education and research at all levels.