
8

 

THE STEAM ENGINE AND THE POTATO

 
 

It is the fashion to extol potatoes, and to eat potatoes. Every one joins in extolling potatoes, and all the world like potatoes, or pretend to like them, which is the same thing in effect.

 

—WILLIAM COBBETT, ENGLISH FARMER AND PAMPHLETEER, 1818

 

“The Offspring of Agriculture”

 

From the dawn of prehistory to the beginning of the nineteenth century, almost all of the necessities of life had been provided by things that grew on the land. The land supplied food crops of various kinds; wood for fuel and construction; fibers with which to make clothing; and fodder for animals, which in turn provided more food, along with other useful materials such as wool and leather. Butchers, bakers, shoemakers, weavers, carpenters, and shipbuilders depended on animal or vegetable raw materials, all of which were the products, directly or indirectly, of photosynthesis—the capture of the sun’s energy by growing plants. Since all these things came from the land, and since the supply of land was limited, Thomas Malthus concluded that there was an ecological limit that growing populations and economies would eventually run into. He first made this prediction on the eve of the nineteenth century, and he refined his argument in the following years.

   Yet Britain did not hit the ecological wall that Malthus anticipated. Instead, it vaulted over it and broke free of the constraints of the “biological old regime” in which everything was derived from the produce of the land. Rather than growing most of its own food, Britain concentrated on manufacturing industrial goods, notably cotton textiles, which could then be traded for food from overseas. During the nineteenth century the population more than tripled, but the economy grew faster still, so that the average standard of living increased—an outcome that would have astonished Malthus. Britain had dealt with the looming shortage of food by reorganizing its economy. By switching from agriculture to manufacturing, Britain became the first industrialized nation in the world.

   To be fair, Malthus could hardly have been expected to see this coming, since nothing like it had ever happened before. And none of it was planned: It was the accidental result of the convergence of several independent trends. Three of the most important related to changes in food production: greater specialization in handicrafts, prompted by rising agricultural productivity; the growing use of fossil fuels, initially as a land-saving measure; and an increasing emphasis on importing rather than growing food.

   The first step along the road from a farm-based to a factory-based economy was the growth of rural industry, in the form of home-based manufacturing and handicrafts. This happened throughout Europe, but it was particularly notable in England because of the unusually rapid growth in English agricultural productivity. By 1800 only 40 percent of the male labor force worked on the land, compared with 65 to 80 percent in continental Europe. The number of men working in agriculture in 1800 was about the same as it had been two hundred years earlier, but the introduction of new crops and improved farming techniques meant that each one was producing twice as much food. This high productivity liberated ever more workers from the land and prompted people to move into rural manufacturing, as Adam Smith explained:

 

An inland country naturally fertile and easily cultivated produces a great surplus of provisions beyond what is necessary for maintaining the cultivators . . . Abundance, therefore, renders provisions cheap, and encourages a great number of workmen to settle in the neighbourhood, who find that their industry there can procure them more of the necessities and conveniences of life than in other places. They work up the material of manufacture which the land produces, and exchange their finished work, or what is the same thing the price of it, for more materials and provisions. They give a new value to the surplus part of the rude produce . . . and they furnish the cultivators with something in exchange for it that is either useful or agreeable to them. The cultivators get a better price for their surplus produce, and can purchase cheaper other conveniences which they have occasion for . . . The manufacturers first supply the neighbourhood, and afterwards, as their work improves and refines, more distant markets . . . In this manner have grown up naturally the manufactures of Leeds, Halifax, Sheffield, Birmingham and Wolverhampton. Such manufactures are the offspring of agriculture.

 

   Once rural manufacturing had established itself in England, it intensified in the northern half of the country during the eighteenth century in response to the adoption of new agricultural techniques in the south. The use of clover and turnips in rotation with wheat and barley to increase cereal yields was less efficient on the heavy clay soils of the north and west of England, so people in those regions concentrated instead on livestock farming and manufacturing, and used the proceeds to buy grain from the south of the country. The result, by chance, was a concentration of manufacturing in just the regions of England where there were rich deposits of coal.

 

The Fuels of Industry

 

The shift to using coal rather than wood as a fuel was a second trend that contributed to Britain’s industrialization. People much preferred burning wood to coal in their homes, but as land became more sought after for agricultural use, areas that had previously provided firewood were cleared to make way for farming. The price of firewood shot up—it increased threefold in western European cities between 1700 and 1800—and people turned to coal as a cheaper fuel. (It was cheap in England, at least, since there were plentiful deposits near the surface.) One ton of coal provides the same amount of heat as the wood that can be sustainably harvested each year from one acre of land. In England and Wales, some seven million acres of land that had previously provided wood, or around one fifth of the total surface area, were brought under cultivation between 1700 and 1800. This ensured that the growth of the food supply could continue to keep pace with the population—but required everybody to switch to burning coal.

   And switch they did: By 1800 Britain’s consumption of coal was about ten million tons a year, providing as much energy as would otherwise have required ten million acres to be set aside for fuel production. At this point Britain accounted for 90 percent of world coal output, by some estimates. When it came to fuel, at least, Britain had already escaped from the constraints of the biological old regime. Rather than relying on living plants to trap sunlight to produce fuel, coal provided a way to tap vast reserves of past sunlight, accumulated millions of years ago and stored underground in the form of dead plants.
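
   The arithmetic behind that acreage figure is a rough check, using only the one-ton-per-acre-per-year equivalence quoted above:

\[ 10{,}000{,}000\ \text{tons of coal per year} \times \frac{1\ \text{acre-year of woodland}}{1\ \text{ton of coal}} = 10{,}000{,}000\ \text{acres spared} \]

—and on the same reckoning by which seven million acres is one fifth of England and Wales, ten million acres amounts to well over a quarter of the country’s entire surface area.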

   Although it was originally exploited as an alternative to wood for domestic heating, the abundance of coal meant that it was soon being put to other uses. Arthur Young, an English agricultural writer and social observer, was struck by the relative scarcity of glass in windows while traveling in France in the 1780s; it was far more widespread in England by this time because coal provided cheap energy for glass-making. (French glassmakers, meanwhile, were so desperate for fuel that they had resorted to burning olive pits.) Coal was also heavily used by the textile industry, to warm the liquids used in bleaching, dyeing, and printing and to heat drying rooms and presses. Coal enabled a rapid expansion in the production of iron and steel, which had previously been smelted using wood. And, of course, coal was used to power steam engines, a technology that emerged from the coal industry itself.

   Once England’s outcropping surface deposits of coal had been depleted, it was necessary to sink mine shafts, and to ever greater depths—but the deeper they went, the more likely they were to flood with water. The steam engine invented by Thomas Newcomen in 1712, building on the work of previous experimenters, was built specifically to pump water out of flooded mines. Early steam engines were very inefficient, but this did not matter very much since they were powered by coal—and in a coal mine the fuel was, in effect, free. Hundreds of Newcomen engines had been installed in mines around England by 1800. The next step was taken by James Watt, a Scottish inventor who was asked to repair a Newcomen engine in 1763 and quickly realized how its wasteful design could be improved upon. His design, completed in 1775, was much more efficient and was also better suited to driving machinery.

   This meant steam power could be applied to the various labor-saving devices that had been devised in the textile industry, providing an enormous increase in productivity. In 1790 the first steam-powered version of Samuel Crompton’s “mule,” a machine that spun cotton into yarn, increased the output of thread per worker 100-fold over a manual spinning wheel, for example. So much thread could be produced that looms also had to be automated to make use of it. By putting these various machines together in a single factory, so that the product of one stage of processing could be passed on to the next stage, as on a sugar plantation, it was possible to achieve further improvements in productivity. By the end of the eighteenth century Britain could produce textiles so cheaply and in such abundance that it began exporting them to India, devastating that country’s traditional weaving trade in the process.

   The third shift that underpinned Britain’s Industrial Revolution was a far greater reliance on food imports. Just as it used coal from underground to power its new steam engines, Britain used food from overseas to provide energy for its workers. From its possessions in the West Indies, it brought in vast quantities of sugar, which provided an astonishing proportion of Britain’s caloric intake during the nineteenth century, increasing from 4 percent of all calories consumed in 1800 to 22 percent by 1900. Sugar flowed eastward across the Atlantic, paying for manufactured goods that traveled in the opposite direction. Since an acre of sugar produces as many calories as nine to twelve acres of wheat, imported sugar provided the caloric equivalent of the produce of 1.3 million “ghost acres” of wheat-farming land in 1800, rising to 2.5 million acres in 1830 and around 20 million acres by 1900. Britain had clearly escaped the constraints of its limited land area by producing industrial goods, which did not require much land to manufacture, and trading them for food, which did.
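
   The “ghost acres” figure is a simple back-of-the-envelope conversion, using only the nine-to-twelve ratio quoted above (the actual Caribbean cane acreage is implied rather than stated):

\[ \text{ghost acres of wheat} = \text{acres of cane} \times (9\ \text{to}\ 12) \]

so the 1.3 million ghost acres of 1800 correspond to only about 110,000 to 145,000 acres of real cane fields in the West Indies.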

   Sugar was of course used to sweeten tea, the favored drink of industrial workers, which helpfully delivered energy (from the sugar) and kept them alert during long shifts (since tea contains caffeine). Sugar was also consumed as a foodstuff, to enliven an otherwise monotonous diet: It could be added to porridge in the form of treacle or molasses, and eaten as jam (containing 50 to 65 percent sugar) in sandwiches. Treacle or jam spread on bread was favored by working families in the industrial cities because it was a cheap source of calories and could be prepared quickly without the need to cook anything. Many women were now working in factories, and they no longer had time to prepare soup. The price of sugar fell and the availability of jam shot up after 1874, when Britain abolished its tariffs on sugar imports, which dated all the way back to Charles II and his pineapple in 1661.

   It was not just the sugar in the jam that was imported; so too, increasingly, was the wheat used to make the bread. As the prospect of food shortages loomed in the late eighteenth century, Britain began to import more food from Ireland. Following the Act of Union of 1801, Ireland was technically part of the United Kingdom, but in practice it was treated as an agricultural colony by the English. Laws which had forbidden the importing of Irish animal products into England had been repealed in 1766, and by the end of the eighteenth century imports of Irish beef had gone up threefold, butter sixfold, and pork sevenfold. By the early 1840s, imports from Ireland were supplying one sixth of England’s food. This food was produced by men who worked on the best, most easily cultivated land and were typically given small patches of inferior land on which they grew potatoes to support themselves and their families. The English could only keep eating bread, in short, because the Irish were eating potatoes. By sustaining Irish farm workers, the potato helped to fuel the first few decades of British industrialization.

 

The Potato Famine and Its Consequences

 

Britain’s example appeared to have proved Malthus wrong, but in at least one respect he was ominously prescient. At the beginning of the nineteenth century Malthus had disagreed with the idea that potatoes provided the answer to the food problem, as they seemed to have done in Ireland. In The Question of Scarcity Plainly Stated and Remedies Considered, published in 1800, Arthur Young had suggested that the British government ought to give every country laborer with three or more children half an acre of land on which to grow potatoes and keep one or two cows. “If each had his ample potato-ground and a cow, the price of wheat would be of little more consequence to them than it is to their brethren in Ireland,” he wrote. But Ireland’s reliance on the potato was not something that other countries should seek to emulate, Malthus declared. For if people became dependent on potatoes, a failure of the potato crop would be a catastrophe. “Is it not possible,” he wrote in response to Young’s proposal, “that one day the potato crop itself may fail?”

   Just such a catastrophe struck Ireland in the autumn of 1845. In retrospect it was a disaster waiting to happen. The potato crop had failed in previous years, at least in some parts of Ireland, and there had been a run of bad years in the 1830s. But the crop failure of 1845, caused by a previously unknown disease, was on an entirely different scale, and affected the whole country. The potato plants started to wither, while underground the tubers began to rot; fields full of apparently healthy plants were reduced to black, devastated foliage within days. This was the potato blight, caused by Phytophthora infestans, a funguslike organism from the New World that crossed the Atlantic for the first time in 1845. Even potatoes that had been dug up before the blight manifested itself rotted within a month. What was expected to be a bumper crop—2.5 million acres of potatoes had been planted, 6 percent more than the previous year—was instead a total loss.

   In some parts of Europe the scale of the devastation was unlike anything seen since the Black Death. The potato crop failed again in 1846, and the famine continued because farmers gave up planting potatoes in subsequent years. The people faced not just starvation, but disease. William Forster, a Quaker who visited Ireland in January 1847, recalled the scene in one village:

 

The distress was far beyond my powers of description. I was quickly surrounded by a mob of men and women, more like famished dogs than fellow creatures, whose figures, looks and cries, all showed that they were suffering the ravening agony of hunger . . . in one [cabin] there were two emaciated men, lying at full length, on the damp floor . . . too weak to move, actually worn down to skin and bone. In another a young man was dying of dysentery; his mother had pawned everything . . . to keep him alive; and I never shall forget the resigned, uncomplaining tone in which he told me that all the medicine he wanted was food.

 

   In Ireland around one million people starved to death as a result of the famine or were carried off by the diseases that spread in its wake. Another million emigrated to escape the famine, many of them to the United States. The potato blight also spread across Europe, and for two years there were no potatoes to be had anywhere. But Ireland’s unrivaled dependence on the potato meant that it suffered the most.

   As the magnitude of the disaster became apparent in late 1845, the British prime minister, Sir Robert Peel, found himself in a difficult situation. The obvious response to the famine was to import grain from abroad to relieve the situation in Ireland. The problem was that such imports were at the time subject by law to a heavy import duty to ensure that homegrown grain would always cost less, thus protecting domestic producers from cheap imports. The Corn Laws, as they were known, were at the heart of a long-running debate that had pitted the aristocratic landowners, who wanted the laws to stay in place, against an alliance of opponents led by industrialists, who demanded their abolition.

   The landowners argued that it was better to rely on homegrown wheat than unreliable foreign imports, and warned that farmers would lose their jobs; they left unspoken their real concern, which was that competition from cheap imports would force them to reduce the rents they charged the farmers who worked their land. The industrialists said it was unfair to keep the price of wheat (and hence bread) artificially high, given that most people now bought food rather than growing their own; but they also knew that abolition would reduce demands for higher wages, since food prices would fall. Industrialists also hoped that cheaper food would leave people with more money to spend on manufactured goods. And they favored abolition of the Corn Laws because it would advance the cause of “free trade” in general, ensuring easy access to imported raw materials on one hand, and export markets for manufactured goods on the other. The debate over the Corn Laws was, in short, a microcosm of the much larger fights between agriculture and industry, protectionism and free trade. Was Britain a nation of farmers or industrialists? Since the landowners controlled Parliament, the argument had raged throughout the 1820s and 1830s to little effect.

   The outcome was determined by the potato, as the famine in Ireland brought matters to a head. Peel, who had vigorously opposed the abolition of the Corn Laws in a Parliamentary debate in June 1845, realized that suspending the tariff on imports to Ireland in order to relieve the famine, but keeping it in place elsewhere, would cause massive unrest in England, where people would still have to pay artificially high prices. He became convinced that there was no alternative but to abolish the Corn Laws altogether, a reversal of his government’s policy. At first he was unable to persuade his political colleagues, but some of them changed their minds as the news from Ireland worsened and it became apparent that the survival of the government itself was at stake. Finally, with a vote in May 1846, the Corn Laws were repealed. The support of the Duke of Wellington, an aristocratic war hero who had long been a strong supporter of the Corn Laws, was crucial. He persuaded the landowners who sat in the House of Lords to back the repeal on the grounds that the survival of the government was more important. But he privately conceded that “those damned rotten potatoes” were to blame for the demise of the Corn Laws.

   The lifting of the tariff on imported grain opened the way for imports of maize from America, though in the event the government mishandled the aid effort and it made little difference to the situation in Ireland. The removal of the tariff also meant that wheat could be imported from continental Europe to replace the much diminished Irish supply. In the second half of the nineteenth century British wheat imports soared, particularly once the construction of railways in the United States made it easy to transport wheat from the Great Plains to the ports of the East Coast. Within Britain, meanwhile, the shift from agriculture to industry accelerated. The area of land under cultivation and the size of the agricultural workforce both went into decline in the 1870s. By 1900, 80 percent of Britain’s main staple, wheat, was being imported, and the proportion of the labor force involved in agriculture had fallen to less than 10 percent.

   Coal was not the only fuel that had driven this industrial revolution. The growth in agricultural productivity that had started two centuries earlier (supplemented by sugar from the Caribbean) and the supply of wheat from Ireland (made possible by the potato) had also played their part in carrying England over the threshold into the new industrial age. And by clearing away the obstacle to a greater reliance on food imports, the tragedy of the potato famine helped to complete the transformation.

 

Food and Energy Revisited

 

It is no exaggeration to suggest that the Industrial Revolution marked the beginning of a new phase in human existence, just as the Neolithic revolution associated with the adoption of farming had done some ten thousand years earlier. Both were energy revolutions: Deliberate farming of domesticated crops made a greater proportion of the solar radiation that reaches Earth available to mankind, and the Industrial Revolution went a step farther, exploiting solar radiation from the past, too. Both caused massive social changes: a switch from hunting and gathering to farming in the former case, and from agriculture to industry in the latter. Both took a long time to play out: It was thousands of years before farmers outnumbered hunter-gatherers globally, and industrialization has only been under way for 250 years, so only a minority of the world’s population lives in industrialized countries so far—though the rapid development of China and India will soon tip the balance. And both are controversial: Just as it is possible to argue that hunter-gatherers were better off than farmers and that the adoption of agriculture was a big mistake, a case can also be made that industrialization has caused more problems than it has solved (though this argument is most often advanced by disillusioned people in rich, industrialized countries). There have been dramatic environmental consequences in both cases, too: Agriculture led to widespread deforestation, and industrialization has produced vast quantities of carbon dioxide and other greenhouse gases that have started to affect the world’s climate.

   In this sense the industrialized countries have not escaped Malthus’s trap after all, but have merely exchanged one crisis, in which the limiting factor was agricultural land, for another, in which the limiting factor is the atmosphere’s ability to absorb carbon dioxide. The possibility that the switch to fossil fuels might provide only a temporary respite from Malthusian pressures occurred even to nineteenth-century writers, notably William Stanley Jevons, an English economist and author of The Coal Question, published in 1865. “For the present,” he wrote, “our cheap supplies of coal and our skill in its employment, and the freedom of our commerce with other wider lands, render us independent of the limited agricultural area of these islands, and apparently take us out of the scope of Malthus’s doctrine.” The word apparently did not appear in the first edition of the book, but Jevons added it to a later edition shortly before his death in 1882.

   He was right to worry. In the early twenty-first century, renewed concerns about the connection between energy supplies and the availability of sufficient land for food production have been raised once again by the growing enthusiasm for biofuels, such as ethanol made from maize and biodiesel made from palm oil. Making fuel from such crops is appealing because it is a renewable source of energy (you can grow more next year) and over its life cycle it can produce fewer carbon emissions than fossil fuels. As plants grow, they absorb carbon dioxide from the air; they are then processed into biofuel, and the carbon dioxide goes back into the atmosphere when the fuel is burned. The whole process would be carbon neutral, were it not for the emissions associated with growing the crops in the first place (fertilizer, fuel for tractors, and so on) and then processing them into biofuels (something that usually requires a lot of heat). But exactly how much energy is required to produce various biofuels, and the level of associated carbon emissions, varies from crop to crop. So some biofuels make more sense than others.

   The type that makes least sense is ethanol made from maize (corn), which is, unfortunately, the predominant form of biofuel, accounting for 40 percent of world production in 2007, most of it in the United States. The best-guess figures suggest that burning a gallon of corn ethanol produces only about 30 percent more energy than was needed to produce it, and reduces greenhouse-gas emissions by about 13 percent compared with conventional fossil fuel. That may sound impressive, but the corresponding figures for Brazilian sugarcane ethanol are about 700 percent and 85 percent respectively; for biodiesel made in Germany they are 150 percent and 50 percent. Put another way, making a gallon of corn ethanol requires four fifths of a gallon of fossil fuel (not to mention hundreds of gallons of water), and does not reduce greenhouse-gas emissions by very much. America’s corn-ethanol drive makes even less sense on economic grounds: To achieve these meager reductions in emissions, the United States government subsidizes corn-ethanol production to the tune of some seven billion dollars a year, and also imposes a tariff on sugarcane ethanol from Brazil to discourage imports. Corn ethanol seems to be an elaborate scheme to justify farming subsidies, rather than a serious effort to reduce greenhouse-gas emissions. England abolished its farmer-friendly Corn Laws in 1846, but America has just introduced new ones.
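
   The “four fifths of a gallon” figure follows directly from the 30 percent energy surplus; as a sketch of the arithmetic, counting the fossil input in gallon-of-ethanol energy equivalents:

\[ \frac{\text{energy in}}{\text{energy out}} = \frac{1}{1.3} \approx 0.77 \approx \frac{4}{5} \]

By the same token, sugarcane ethanol’s 700 percent surplus implies a fossil input of only \( 1/8 \) of a gallon-equivalent for every gallon produced.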

   Enthusiasm for corn ethanol and other biofuels is one of the factors that has helped to drive up food prices as crops are diverted to make into fuel, so that they are in effect fed to cars, not people. Opponents of biofuels like to point out that the maize needed to fill a vehicle’s twenty-five-gallon tank with ethanol would be enough to feed one person for a year. Since maize is also used as an animal feed, its higher price makes meat and milk more expensive, too. And as farmers switch their land from growing other crops to growing corn instead, those other crops (such as soy) become scarcer, and their prices also rise. Food and fuel are, it seems, once again competing for agricultural land. Cheap coal meant that English landowners in the eighteenth century realized their land was more valuable for growing food than fuel; concern about expensive oil today means American farmers are making the opposite choice, and growing crops for fuel rather than for food.

   Biofuels need not always compete with food production, however. In some cases, it may be possible to grow biofuel feedstocks on marginal land that is unsuitable for other forms of agriculture. And those feedstocks need not be food crops. One potentially promising approach is that of cellulosic ethanol, in which ethanol is made from fast-growing, woody shrubs, or even from trees. In theory, this would be several times more energy efficient even than sugarcane ethanol, could reduce greenhouse-gas emissions by almost as much (a reduction of around 70 percent compared with fossil fuels), and would not encroach upon agricultural land. The problem is that the field is still immature, and expensive enzymes are needed to break down the cellulose into a form that can be made into ethanol. Another approach involves making biofuel from algae, but again the technology is still in its early days.

   What is clear is that the use of food crops for fuel is a step backward. The next logical step forward, after the Neolithic and Industrial revolutions, is surely to find new ways to harness solar energy beyond growing crops or digging up fossil fuels. Solar panels and wind turbines are the most obvious examples, but it may also be possible to tinker with the biological mechanism of photosynthesis to produce more efficient solar cells, or to create genetically engineered microbes capable of churning out biofuels. The trade-off between food and fuel has resurfaced in the present, but it belongs in the past.