Debates about technology and progress are often framed in terms of “optimism” vs. “pessimism.” For instance, Steven Pinker, Matt Ridley, Johan Norberg, Max Roser, and the late Hans Rosling have been called the “New Optimists” for their focus on the economic, scientific, and social progress of the last two centuries. Their opponents, such as David Runciman and Jason Hickel, accuse them of being blind to real problems in the world, such as poverty, and to risks of catastrophe, such as nuclear war.
Economic historian Robert Gordon calls himself “the prophet of pessimism.” His book The Rise and Fall of American Growth warned that the days of high economic growth are over for the United States and will not return. Gordon’s opponents include a group he calls the “techno-optimists,” such as Andrew McAfee and Erik Brynjolfsson, who have predicted a growth spurt in productivity from information technology.
It’s tempting to choose sides. But while it can be rational to be optimistic or pessimistic on any specific question, these terms are too imprecise to be adopted as a general intellectual identity. Those who identify as optimists can be too quick to dismiss or downplay the problems of technology, while self-styled technology pessimists or progress skeptics can be too reluctant to believe in solutions.
As we look forward to the post-pandemic recovery, once again we’re being tugged between the optimists, who highlight all the diseases that may soon be beaten through new vaccines, and the pessimists, who warn that humanity will never win the evolutionary arms race against microbes. But this represents a false choice. History provides us with powerful examples of people who were brutally honest in identifying a crisis but were equally active in seeking solutions.
At the end of the 19th century, William Crookes—physicist, chemist, and inventor of the Crookes tube (an early type of vacuum tube)—was the president of the British Association for the Advancement of Science. On September 7, 1898, he used the traditional annual address to the association to issue a dire warning.
The British Isles, he said, were at grave risk of running out of food. His reasoning was simple: the population was growing exponentially, but the amount of land under cultivation could not keep pace. The only way to continue to increase production was to improve crop yields. But the limiting factor on yields was the availability of nitrogen fertilizer, and the sources of nitrogen, such as the nitrate deposits of the Chilean desert and the guano deposits of the Peruvian islands, were running out. His argument was detailed and comprehensive, based on figures for wheat production and land availability from every major European country and colony; he apologized in advance for boring his audience with statistics.
He criticized the “culpably extravagant” waste of nonrenewable nitrogen resources. To those who pointed to recent harvests, which had been quite sufficient, he replied that those years had been unusually fruitful, masking the underlying problem. The bounty of the recent past was no guarantee of prosperity in the future.
In a sense, Crookes was an “alarmist.” His purpose was to draw attention to a problem caused by progress and growth. He sought to open the eyes of the complacent. He began by saying that “England and all civilized nations stand in deadly peril,” variously referring to “a colossal problem” of “urgent importance,” an “impending catastrophe,” and “a life-and-death question for generations to come.” To those who would call him alarmist, he insisted that his message was “founded on stubborn facts.”
Crookes caused a sensation, and many critics spoke against his message. They pointed out that wheat wasn’t the only food, that people would moderate consumption of it if necessary, and that land for wheat could be taken from what was used for meat and dairy production, especially as prices rose. They said that he underestimated the opportunities for American farmers to supply food to other nations, by better adapting their methods to the soil and climate so as to increase production.
Writing in Nature in 1899, one R. Giffen compared Crookes to Thomas Malthus, and to others who had predicted shortages of various natural resources—such as Eduard Suess, who had said that gold would run out, and William Stanley Jevons, who warned about Peak Coal. Giffen’s tone is weary as he notes that “there has been much experience of these discussions since the time of Malthus.” Every time, he explains, we’ve been unable to make precise forecasts because the anticipated limits to growth are too far in the future, or we know too little about their causes.
But Crookes had always intended his remarks to take “the form of a warning rather than of a prophecy.” In the speech, he said:
“It is the chemist who must come to the rescue … Before we are in the grip of actual dearth the chemist will step in and postpone the day of famine to so distant a period that we and our sons and grandsons may legitimately live without undue solicitude for the future.”
Crookes’s plan was to tap a virtually unlimited source of nitrogen: the atmosphere. Plants can’t use atmospheric nitrogen directly; instead, they use other nitrogen-containing compounds, which in nature are manufactured from atmospheric nitrogen by certain bacteria, a process called fixation. Crookes said that the artificial fixation of atmospheric nitrogen was “one of the great discoveries awaiting the ingenuity of chemists,” and he was optimistic that it could happen soon, calling it “a question of the not-far-distant future.”
He devoted a significant part of his speech to exploring this solution. He pointed out that atmospheric nitrogen can be burned at sufficiently high temperatures to create nitrate compounds, and that the necessary heat can be supplied by electricity. He even estimated practical details, such as the cost of nitrates produced this way, which he found competitive with market prices, and whether the process could be scaled up to industrial levels: the new hydroelectric plant at Niagara Falls, he concluded, would alone provide all the electricity needed to close the gap he had forecast.
Crookes knew that synthetic fertilizer wasn’t a permanent solution, but he was satisfied that when the problem reappeared in the distant future, his successors would be able to deal with it. His alarmism was not a philosophical position, but a contingent one. Once the facts of the situation were changed by the invention of suitable technology, he was happy to call off the alarm.
Was Crookes correct? By 1931, the year he had said we could run out of food, it was clear that his predictions had not been perfect. The harvest had increased, but not because crop yields had greatly improved. Instead, acreage had increased, to a degree Crookes had thought impossible. This happened in part because of improvements in mechanization, including the gasoline tractor. Mechanization drove down labor costs, which made it profitable to farm marginal, low-yield land. As often happens, a solution came from an unexpected direction, invalidating the assumptions of forecasters both optimistic and pessimistic.
But if Crookes was not correct in his detailed predictions, he was correct in essence. His two key points were accurate: one, that food in general and yields in particular were problems that would have to be reckoned with in the next generation or so; two, that synthetic fertilizer from the fixation of atmospheric nitrogen would be a key aspect of the solution.
Less than two decades after his speech, the German chemist Fritz Haber and industrialist Carl Bosch developed a process to synthesize ammonia out of atmospheric nitrogen and hydrogen gas. Ammonia is a chemical precursor of synthetic fertilizers, and the Haber-Bosch process is still one of the most important industrial processes today, providing fertilizer for almost half the world’s food production.
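For reference, the overall reaction of the Haber-Bosch process combines one molecule of nitrogen with three of hydrogen, over an iron catalyst at high temperature and pressure, to yield two molecules of ammonia:

$$\mathrm{N_2 + 3\,H_2 \rightleftharpoons 2\,NH_3}$$

The reaction is reversible, which is why the industrial process runs at high pressure to push the equilibrium toward ammonia.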
The chemist, ultimately, did come to the rescue.
So was Crookes an optimist or a pessimist? He was pessimistic about the problem—he was not complacent. But he was optimistic about finding a solution—he was no defeatist, either.
In the 20th century, fears of overpopulation and food supply once again reared their heads. In 1965, the world population growth rate reached an all-time high of 2% per year, enough to double every 35 years; and as late as 1970, it is estimated, over a third of people in developing countries were undernourished.
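That doubling-time figure follows from simple compound-growth arithmetic: at an annual growth rate r, a population doubles in ln 2 / ln(1 + r) years. A quick sketch (the function name here is just for illustration):

```python
import math

def doubling_time(annual_rate: float) -> float:
    """Years for a quantity growing at `annual_rate` per year to double."""
    return math.log(2) / math.log(1 + annual_rate)

# At 2% per year, the population doubles in about 35 years.
print(round(doubling_time(0.02)))  # -> 35
```

This is the same arithmetic behind the informal “rule of 70”: 70 divided by the percentage growth rate gives a close approximation of the doubling time.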
The 1968 book The Population Bomb, by Paul and Anne Ehrlich, opened with a call for surrender: “The battle to feed all of humanity is over. In the 1970s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now. At this late date nothing can prevent a substantial increase in the world death rate.” In 1970, Paul Ehrlich reinforced the defeatism, saying that in a few years “further efforts will be futile” and “you may as well look after yourself and your friends and enjoy what little time you have left.” Because they saw the situation as hopeless, the Ehrlichs supported a proposal to cut off aid to countries such as India that were seen as not doing enough to limit population growth.
Fortunately for India and the rest of the world, others were not ready to give up. Norman Borlaug, working in Mexico in a program funded by the Rockefeller Foundation, developed high-yield varieties of wheat that resisted fungal disease, used fertilizer more efficiently, and could grow at any latitude. In the 1960s, thanks in part to the new grains, Mexico transformed itself from an importer to an exporter of wheat, and India and Pakistan nearly doubled their yields, averting the famine that the Ehrlichs saw as inevitable.
Yet even after winning the Nobel Peace Prize for his accomplishments, Borlaug never lost sight of the challenge involved in making agriculture keep up with population, and never considered it solved for good. In his 1970 Nobel lecture, he called the increases in food production “still modest in terms of total needs” and, pointing out that half the world was undernourished, said “no room is left for complacency.” He warned that “most people still fail to comprehend the magnitude and menace of the ‘Population Monster.’” “And yet,” he continued, “I am optimistic for the future of mankind.” Borlaug was confident that human reason would eventually bring population under control (and indeed, the global birth rate has been declining ever since).
The risk of adopting an “optimistic” or “pessimistic” mindset is the temptation to take sides on an issue depending on a general mood, rather than forming an opinion based on the facts of the case. “Don’t worry,” says the optimist; “accept hardship,” counters the pessimist.
We can see this play out in debates over covid and lockdowns, over climate change and energy usage, over the promise and peril of nuclear power, and in general over economic growth and resource consumption. As the debates escalate, each side digs in: the “optimists” question whether a threat is even real; the “pessimists” deride any proposed technological solution as a false “quick fix” that merely allows us to rationalize postponing the difficult but inevitable cutbacks. (For an example of the latter, see the “moral hazard” arguments against geoengineering as a strategy to address climate change.)
To embrace both the reality of problems and the possibility of overcoming them, we should be fundamentally neither optimists nor pessimists, but solutionists.
The term “solutionism,” usually in the form of “technocratic solutionism,” has been used since the 1960s to mean the belief that every problem can be fixed with technology. This is wrong, and so “solutionism” has been a term of derision. But if we discard any assumptions about the form that solutions must take, we can reclaim it to mean simply the belief that problems are real, but solvable.
Solutionists may seem like optimists because solutionism is fundamentally positive. It advocates vigorously advancing against problems, neither retreating nor surrendering. But it is as far from a Panglossian, “all is for the best” optimism as it is from a fatalistic, doomsday pessimism. It is a third way that avoids both complacency and defeatism, and we should wear the term with pride.