…you know, regulated, ethical, human-centered AI, by responsible companies who we can trust, does have enormous potential. That is not where we are. We are at the opposite of that. We have shadowy companies owned by people we cannot trust, because we can look at their past behavior. We can see there is no ethical thought about what the possible consequences are. Us, as humans, are not at the center of it. We’re not being asked to consent.
(Carole Cadwalladr, on Democracy Now, June 5, 2025)
One missed opportunity among many
The most cited of Freeman Dyson’s non-technical papers is his 1972 American Mathematical Society Gibbs lecture, published in the AMS Bulletin under the title “Missed opportunities.” Dyson’s lecture lists a series of questions that were being considered at the same time — in a few cases even in the same place — by two different scientists or groups of scientists, usually a physicist and a mathematician, and argues that if they had been communicating with each other at the time they would have developed the unified perspective that was only established many years later. In his first example, which concerns identities involving Ramanujan’s τ function and related functions, Dyson himself was both the physicist and the mathematician, but Dyson the physicist only recognized the relevance of formulas discovered by Dyson the mathematician when they were rediscovered, with a conceptual explanation, by Ian Macdonald.
I was reminded of Dyson’s article the other day when I realized that two meetings on AI took place this spring at the Institute for Advanced Study — Dyson’s former professional home, and the place where he and Macdonald, unbeknownst to each other, were both working on the τ function identities at exactly the same time — one a little more than a month after the other, with no coordination, no common themes, and apparently no communication of any kind. I’ve seen no evidence that even a single person attended both events.
The earlier of the two meetings was the week-long DeepMind Math+AI Workshop hosted in the IAS School of Mathematics, and organized by IAS Professor Akshay Venkatesh and Alex Davies of Google DeepMind. The second meeting was a public panel discussion, entitled
RARE/EARTH
The Geopolitics of Critical Minerals and the AI Supply Chain
hosted by the ST & SV [Science, Technology, and Social Values] Lab directed by IAS Social Science Professor Alondra Nelson. RARE/EARTH’s website tells you what the panel discussed:
The global AI ecosystem is deeply dependent on critical minerals—primarily sourced from countries in the Global South—that are essential to hardware production, energy infrastructure, and the deployment of AI systems. As the tech economy rapidly expands, so too does its appetite for natural resources: critical minerals, energy, land, and water.
The emergence of generative AI has not only introduced new geopolitical dynamics but also intensified long-standing tensions over access to the natural resources that power its extensive supply. A growing contest over critical minerals—described by some as the next “critical mineral supercycle”—is reshaping global power relations and heightening geopolitical tensions, particularly between China and the United States.
I urge you to watch the panel discussion video on YouTube (though you have to know where to look: here is the link, which is not indicated on the website). Thanks to Google’s speech recognition algorithms,1 I can tell you that Alondra Nelson said, among other things,
…when we turn to critical minerals we encounter a kind of definitional challenge governments regularly revise their critical mineral lists based on changing geopolitical circumstances supply vulnerabilities and technological needs… what we do know is this the materials essential for AI infrastructure including copper lithium cobalt gallium and a host of rare earth uh rare earth elements are increasingly the subject of what political scientists Henry Ferrell and Abraham Newman call weaponized interdependence that is countries are discovering that their control over critical resources can be wielded as instruments of geopolitical influence turning economic dependencies into tools of statecraft levers in trade wars and weapons for national security…
… a single large-scale data center requires over a thousand tons of copper for its power networks circuit boards and cooling systems the semiconductors that power machine learning demand gallium germanium and indium the batteries that store energy for these operations need lithium and cobalt each of these minerals has a geography a politics a set of communities whose lives are reorganized around their extraction when we map these dependencies honestly the AI stack begins to look less like a technical diagram and more like what Kate Crawford calls and her revvelatory [sic] book Atlas of AI a novel form of global empire one that requires the subordination of landscapes and communities across the world to serve the computational ambitions of a relative few and the algorithmic aspirations of perhaps many…
Nelson mentioned the subordination of regions in the Global South, notably the Atacama Desert in Chile (lithium) and the Democratic Republic of Congo (practically everything), which has been well documented, but also
…communities within the United States as well in Boxtown a southwest Memphis neighborhood founded by formerly enslaved people residents are witnessing the world's wealthiest man consume their air and water to power AI systems the XAI facility operates gas turbines without required pollution controls likely making it the largest industrial polluter in Tennessee and a community where cancer rates already exceed the national average by a factor of four…
The four panelists continued in this materialist vein for an hour or so, saying things like
…everywhere I went there were political troubles almost everywhere I went where there was violence or chaos or unrest which is sort of the superficial label that we journalists slapped onto these things if you scratched a little bit you could find uh resource problems at their root…
or
…that deep lived experience of the material realities of these systems it blew me away particularly because artificial intelligence is so commonly misunderstood as being immaterial as abstract as you know code in the cloud when in fact you know the opposite is the case it is one of the most resource inensive [sic] architectures on this planet in fact I saw that you know just last week Sam Altman was saying that AI is currently the largest infrastructure project in the world and I think that's about right…
or
…how quickly can we build AI how fast can we get it out there and how can we participate in what is now framed as a geopolitical race [i.e., the race with geopolitical rivals, especially China: MH] … this is such a damaging framework because it is a sort of a permission note to really do anything that you want in terms of extraction at scale exploitation of populations and of course the mass harvesting of data…
What is the cost?
The Q&A ended with Howard W. French getting the last word, reiterating in the strongest terms that AI is in the first place the extraction and transportation and organization of material stuff:
figuring out you know physics equations I think is well within the the the capabilities if not present in the very near future of AI I think the problem is the problem that has been raised here most recently in this conversation what's the cost of these things what's the cost and the cost isn't just the dollar cost or the cost in water the cost those are real costs the costs are also the human costs like what are we doing to ourselves as human beings when we insist that everything must pass through AI
and
the ocean you know Zaire or Congo New Guinea wherever you wherever the stuff we've decided we must have it because we can't afford to lose this race is going to get plowed under if nothing is done uh and that's what this conversation needs to be about what is the cost yeah uh well and I I think the question of the the national security line right now being used as an excuse to further concentrate resources and power…
To deduce what was discussed at the earlier meeting, and in particular whether the “conversation” there was about what it “needs to be about,” you have to look at the online list of talks, with titles like The Future of AI-Mathematician Interaction, or else ask someone who was there. The organizers were hoping for an honest discussion, and it seems that discussions involving the tech industry are only honest when they are off the record. But my informant who was on the spot tells me that mining, water, and energy were not mentioned in any of the talks.
The opportunity for the mathematicians and engineers to talk about, or to be forced to think about, the material preconditions for the development of AI might not have been missed had the ST & SV Lab not been limited by its own conceptual blinders. There’s no sign that Nelson and her associates reached out to the participants in the earlier workshop, even though they had every reason to know that it had taken place a month earlier, that many of the participants were still on hand, and moreover that they were sitting no more than five minutes away from where 80 years earlier John von Neumann began to design “a project that departed from the realm of the purely theoretical.”
If machines will indeed change mathematics, answering in the affirmative the title question posed by the article introducing two special issues of the AMS Bulletin, which Venkatesh and I co-signed along with four other editors, the most consequential change may be that mathematicians will no longer be able to ignore the material conditions that allow us to keep pretending that our activities are “purely theoretical.” Along with everyone else whose daily routine involves interaction with machines that are distant descendants of von Neumann’s design, we are dependent on the mining of scarce resources taking place in what for most of us are locations far from view.2 That won’t change, but if AI, or for that matter formalization, becomes indispensable to the future practice of mathematicians, then practicing mathematics will be no less contingent on the extraction and transportation and organization of material stuff than the laboratory sciences, even if the stuff in question is located far away in a massive data center rather than on campus.
More missed opportunities on the horizon?
In retrospect I see it as a failure on my part to ensure that at least some of the articles in the AMS Bulletin special issues (here and here) — even if only my own article — reflected on what it would mean for mathematics to be irrevocably entwined with “extraction at scale exploitation of populations and of course the mass harvesting of data.” But in the meantime I’m about to leave for Cambridge, England, to confer with a collaborator about a project mentioned in passing in an earlier post. I’m not linking to that post because it was unexpectedly requested for reprinting as an article in the Mitteilungen der Deutschen Mathematiker-Vereinigung, and has thus been promoted to the status of science (or at least Wissenschaft).
The project was mentioned in passing in order to advertise my newfound willingness to participate in the formalization of mathematics — and thus to do my part in the long-range project of rendering the planet uninhabitable, fully aware of this inevitable cost — but on condition that the formalization in question be devoted specifically and exclusively to the stabilization of the twisted trace formula for function fields, as a precondition to solving part of the Langlands program.
So far no one has offered to buy what I announced I was willing to sell. It has not escaped my attention, however, that Ursula Martin, Professor of Computer Science at the University of Edinburgh and Commander of the Order of the British Empire, last seen on Silicon Reckoner as the best-informed participant in the National Academies Workshop on AI to Assist Mathematical Reasoning, will be in Cambridge at the same time, attending this conference:
(Found at this Facebook page.)
Commander Martin’s presence at such a conference may come as a surprise to those familiar with her earlier thoughts on formalization. “[F]ull formal development of large bodies of mathematics,” she had written in 1999,
is at odds with mathematical practice: many mathematicians view formal proof development unsympathetically and it is hard to see it being incorporated except in expensive reworkings of well-understood material.
She adds that the mathematical community
might prefer to spend the enormous cost of formalising the endeavour on creating some new mathematics: Atiyah's "excitement and action".
Does she no longer see formalization as entailing an “enormous cost” (apart from “the human costs like what are we doing to ourselves as human beings,” of course)? Or is formalizing no longer a distinct endeavor from “creating some new mathematics”? Will she be able to inspire the conference participants to devote their “formalizing at scale” to “creating new mathematics” rather than to the dreary and predictable prospect of forming yet another tribute band?
If “excitement and action” are what you’re looking for, you can hardly go wrong with the Langlands program. But the formalization cabal has grown so Big that, like Obama’s ocean liner of state, it’s hard for it to change course. So another missed opportunity is probably all we can expect next week.
1. … which is now terrifyingly accurate for those of us who would rather not experience automated eavesdropping. So far, though, Google’s transcription is agnostic about punctuation.
2. As Kate Crawford pointed out at the panel,
The rare earth family of minerals are actually much more common in the earth's crust than most people know but what makes them so difficult is the extraction process which is you know again it relies on a sort of you know specialized technologies but it's also enormously environmentally destructive. When you go into the practice of refining rare earths you produce an enormous amount of waste tailings but also radioactive material and you can actually go to some truly horrifying sites on this planet including a black lake in Mongolia that I write about in Atlas of AI where you can see just as all the way to the horizon just this black radioactive sludge that is the result of rare earth mining. So part of the reason that rare rare earth is seen as you know 90% in China is is not because it's the only place you can find it but because China was the only country willing to take on this horrifying amount of environmental destruction to own this particular market.
Michael --
Glad to see that you're pushing on the issue of the environmental damage associated with AI.
It does seem unfair to me that you should lay the blame for the missed opportunity on the ST & SV folks. From their point of view, and of course from the point of view of the governments and corporations that they are criticizing, AI for Math is a small niche activity. If the AI community completely lost interest in AI for Math, it would only have a tiny impact on the demands for material resources. It's mostly the job of the math community to be aware of the issue and to say that they don't want to be any part of that.
It's not even obvious how much larger the environmental impact of "AI for Math" would be than the "ordinary" use of high tech by the mathematicians themselves. If you take all the desktops, laptops, smart phones, cloud usage, emails, Googling, online material and so on that can be associated directly with the people who are involved in one way or another with AI for Math, and compare that to the environmental impact of AI for Math activities specifically, how do they compare in environmental impact? My completely uninformed guess is that, so far, the AI for Math impact would be a fairly small fraction. That could change, of course; or I could be dead wrong already.
Enjoy your trip to Cambridge! Best to Ursula Martin.
-- Ernie
The link in your post to the RARE/EARTH panel discussion video does not seem to work (I get a "This video is private" message), but it is now available on the IAS YouTube channel:
https://www.youtube.com/watch?v=GxVM3cAxHfg&t=7s