
The Word Made Wok

Michael Giffin

Aug 26 2011


Several years ago I became a vegan, for ethical reasons; a personal choice—part of my tiny contribution to natural selection and religious belief—which only becomes a public issue when social eating occurs. Meat eaters make more of an issue of my diet than I do. Those who do usually fall back on generalisations; we need to eat meat, poultry, fish and dairy because we’ve evolved to or because we’ve been given biblical warrant to. These generalisations are mindless; neither contributes to the argument for evolution or intelligent design, or to the more challenging argument that they might be the same thing. Then there are those who—knowing little about nutrition; insensible to how their food is produced—assume vegans are missing out on essential nutrients since human health supposedly depends on an animal-based diet. I try to change the subject quickly but sometimes not before more mindlessness occurs; men are fond of showing me their incisors, as proof we’re carnivores; women are fond of reminding me that cows experience pain if they aren’t milked, and want us to drink their milk; apparently hens want us to eat their eggs too.

As we need to keep an open mind about our food, here are two thought-provoking books which argue that we are what we eat, and that our claim to be civilised depends on how we produce and distribute food. They make a nice pair as, when read together, they offer a window onto the whole span of human development. The first, written by a professor of biological anthropology at Harvard, contributes to the theory of evolution. The second, written by a business editor at the Economist, makes accessible what is already known about food and the history of civilisation, and contributes to the debate over food and the future.

Richard Wrangham begins Catching Fire by arguing that answering the question of our origins involves understanding the forces that sprang Homo erectus from its australopithecine and habiline past. According to the most popular view since the 1950s, there was a single impetus, the Man-the-Hunter or meat-eating hypothesis. The hypothesis is incomplete, though, as male hunters often returned empty-handed and the foods gathered (mainly by women) often provided most of the calories brought to camp. Also, the hypothesis implies a single leap from ape to human but, actually, it only explains the transition from australopithecine to habiline, not habiline to H. erectus; and, while australopithecines and habilines had adapted to eating raw meat, H. erectus was poorly suited to it. Something else must have been happening.

Wrangham proposes a possible answer: the transformative moment in human evolution involved the control of fire and the advent of cooked meals. Cooking increased the value of our food. It changed our bodies, brains, use of time, and social lives. It made us into consumers of external energy and thereby created an organism with a new relationship with nature. Animals need food, water and shelter. We need all those things but we need fire too. How long have we needed it? Few have thought about this question. Darwin didn’t pursue it. Lévi-Strauss acknowledged that cooking marked the transition from nature to culture but he believed its significance was psychological rather than biological. No one challenged his thesis.

Chapter One, “Quest for Raw-Foodists”, notes that animals thrive on raw food and asks whether humans can also. Conventional wisdom says yes—we are animals, so we should fare as well on raw food as they do—but this is questionable. Stories about our human (rather than our pre-human) ancestors living purely on raw food have all proven illusory. Human studies, conducted in controlled modern environments for short periods, demonstrate that raw food diets provide positive health outcomes; cholesterol levels fall, blood pressure returns to normal, and volunteers lose weight even without exercise. In uncontrolled environments, however, long-term raw foodists are significantly more likely to suffer from energy depletion, loss of libido, and infertility; female raw foodists are more likely to have irregular periods or cease to menstruate. Hypothetically, a purely raw food diet would have created evolutionary disadvantages for humans.

The evolutionary question isn’t eating a plant-based diet, or an animal-based diet, or some combination of both; it’s what cooking does to food and what cooked food does to humans. Among people who eat cooked food, there’s no significant difference in body weight between vegetarians and meat eaters; when our food is cooked we get as many calories from plants as animals. It’s only on a purely raw diet that we suffer poor weight gain. Raw foodists can only thrive in rich modern environments where they depend on eating exceptionally high-quality foods. Wild animals don’t have the same option and flourish on raw foods. The implication is clear: there’s something odd about us. We aren’t like other animals. We need a proportion of cooked food in our diet.

Chapter Two, “The Cook’s Body”, acknowledges that our pre-human ancestors utilised a raw food diet as efficiently as apes; however, for reasons that remain unclear, they lost this ability at some evolutionary point, and, as they adapted to cooked food, the immediate advantages were complemented by evolutionary benefits. Natural selection favoured humans with small guts because they could digest their food well at a lower cost than before; the result was increased energy efficiency. This is evident when we compare the human digestive system with those of chimpanzees and other apes; our small lips, smaller mouths, smaller teeth and weaker jaws fit well with the softness, high caloric density, low fibre content and high digestibility of cooked food.

The way food moves through our body also challenges the Man-the-Hunter hypothesis; we lack the carnivore system of retaining food in our stomachs; also, the hypothesis can’t explain why we digest plant foods so efficiently. Plant foods are vital because humans need large amounts of their carbohydrates and fats. Without these, we depend on protein for our energy, and excessive protein induces a form of poisoning. Also, cooking changed the chemistry of our food, which must have affected our digestive systems, creating some toxins and reducing others; for example, humans have become less able to tolerate some of the toxins other apes tolerate. Finally, if we were adapted to a raw-meat diet we would have developed resistance to the toxins produced by bacteria on raw meat.

Chapter Three, “The Energy Theory of Cooking”, suggests that the idea of cooked food yielding more energy—allowing humans to gain weight and reproduce better—is poorly understood. Standard references, through which we obtain authoritative nutritional information, tell us cooking changes water content, and reduces the concentration of vitamins, but the density of calories remains unchanged whether food is eaten raw or is roasted, grilled or boiled. Yet nutritional scientists reach contrary conclusions: some argue cooking accelerates digestion; others argue cooking has a negative effect on energy value. The solution is to understand how, overall, cooking creates a positive energy gain in food, through gelatinising starch, denaturing protein, and softening.

Research on digestion in the small intestine (ileum) demonstrates that cooked starch is significantly more digestible than raw starch; also, cooking consistently increases the glycaemic index of starch. The effects of cooking on animal protein are widely debated and were long uninvestigated—some argue cooking reduces protein digestibility; others argue cooking increases it—but recent studies indicate cooked protein is digested much more completely than raw protein. Also, the widespread myth that eggs should be eaten raw has been disproven, as has the myth that indigenous people didn’t cook eggs (most did). When our ancestors first obtained extra calories by cooking their food, they passed on more genes than others of their species who ate raw. The result was a new evolutionary opportunity.

Chapter Four, “When Cooking Began”, describes recent tests in which no primate, when offered the choice, preferred its food raw; perhaps our ancestors were similar; perhaps their pre-existing sensory and brain mechanisms also preferred cooked foods. Archaeologists are divided over when humans began to control fire and what they originally used controlled fire for; some argue only for warmth and light, believing there was a long gap before fire was used for cooking; some say humans were cooking as late as 40,000 years ago; some say as early as 500,000 years ago; a common compromise is in between. Wrangham believes important evidence comes from the fossil record, which shows when our ancestors’ anatomy changed to accommodate a cooked diet, and hence when cooking became a regular occurrence, not merely an occasional activity.

Two kinds of evidence independently link regular cooking with H. erectus. First, H. erectus was vulnerable to predators, lost the traits that allowed efficient climbing, and slept on the ground, all of which are hard to explain without the protection offered by controlled fire at night. Second, anatomical changes relating to diet, including the reduction in tooth size and in the flaring of the ribcage, were larger then than at any other time in human evolution, which fits the theory that the nutritional quality of the diet had improved and the food consumed was softer. Regular cooking began with H. erectus and launched the journey that has led, without any major changes, to the anatomy of modern humans.

Chapter Five, “Brain Foods”, discusses the ways in which cooking and a high-quality diet were important to the growth of the brain and therefore to intelligence. Explanations for the evolution of intelligence sometimes appeal to specific adaptive advantages—the tendency towards warfare; the occupation of large home ranges—but these have been disproven. Also, while the social brain hypothesis is important in explaining the benefits of intelligence, it doesn’t explain why some social species have smaller brains than others. In 1995 Aiello and Wheeler identified an inverse relationship between the size of a primate’s brain and the size of its gut; primates that spend less energy fuelling their intestines can afford to power more brain tissue. The idea became known as the expensive-tissue hypothesis.

Wrangham believes Aiello and Wheeler were right in principle but wrong in specifics. They assumed there was a single increase in brain size from australopithecines to H. erectus. Actually, that phase occurred in two steps: first, the appearance of the habilines; second, the appearance of H. erectus. Meat eating and cooking account for these two distinct steps and therefore for their accompanying increases in brain size; that said, the expensive-tissue hypothesis provides a useful explanation not only for the substantial increase in brain size, which occurred around the time of human origins, but also for other increases in brain size before and after.

Chapter Six, “How Cooking Frees Men”, opens the large and contentious subject of the sexual division of labour; women and men making different and complementary contributions to the household economy; a division that only applies to humans. Hints of this division have been noticed in non-human species but they aren’t significant. Traditionally, the Man-the-Hunter hypothesis explains this uniquely human bond between males and females, so much so that many researchers feel no other explanation is necessary, but the hypothesis misses much as it rarely looks beyond meat eating. At different times of the year, the relative importance of male hunting and female gathering changes, and each sex’s foods can be just as critical as the other’s to survival and the maintenance of health.

The influence of cooking on the sexual division of labour is often missed. Before our ancestors cooked they had much less free time. Their options for subsistence activities would have been severely constrained. For example, males couldn’t afford to spend all day hunting because, if they failed to get any prey, they would have to fill their bellies with raw plant foods, which would take them a long time to consume. If we lived on the same raw diet the great apes do, chewing would conservatively take up 42 per cent of the day; just over five hours in a twelve-hour day.

Chapter Seven, “The Married Cook”, argues that, within the domestic setting, it’s a universal given that women cook for their husbands, even in societies where domestic sexual roles overlap, or in societies where women are capable of becoming prominent outside the home. The classic reasons given are mutual convenience, and the dynamics of co-operation required to tend the fire, eat a meal, and share food, but these reasons are superficial. What actually happened was a protection racket. As cooking took time, lone cooks were vulnerable and couldn’t easily guard their wares from thieves. Pair bonding solved the problem. Husbands used their social relationships with other men to protect their wives from being robbed; women returned the favour by preparing their husbands’ meals.

The proposal that the human household originated in competition over food presents a challenge to conventional thinking, because the proposal holds economics as primary and sexual relations as secondary. We often see marriage as an exchange, in which women get resources and men get a guarantee of paternity. In that view, sex is the basis of our mating system and economic considerations are an add-on. In animal species, however, the mating system is adapted to the feeding system rather than the other way around. So cooking ended individual self-sufficiency and created human society. But men were the greater beneficiaries. Cooking freed women’s time, and fed their children, but it also trapped them in a newly subservient role enforced by male-dominated culture, which created and perpetuated a system of male cultural superiority.

Chapter Eight, “The Cook’s Journey”, is speculative storytelling, since we don’t know how deeply the effects of controlling fire and cooking food are burnt into our DNA. So the story goes, we live decades longer than the great apes, and reach sexual maturity more slowly, suggesting our ancestors were good at using fire to deter predators. Controlled fire and cooked food allowed early weaning, which allowed mothers to recover from birth faster, allowed children to grow faster, and promoted larger families. They reduced the seasonal dependence on fresh food, challenging the thrifty-gene hypothesis that we are adapted to periods of feast and famine. They led to a higher caloric intake, an easily digested protein intake, and a higher ingestion of toxic compounds, hence the adaptive changes to our digestive physiology. They allowed the regulation of our body temperature and control of our social temperament, which led to our family dynamics and group behaviour.

These changes all depend on a mysterious initial moment upon which we can only speculate. Desertification appears to be the climate change that stressed australopithecines. Two lines survived. One of these led to humans. Habilines ate a raw diet of both animals and plants, the proportions changing dramatically depending on the season, and their ability to process food was probably limited to softening by pounding. As their brain size was double that of the great apes, they were mentally capable of keeping a fire alive; so the big question isn’t whether they had fire but whether they obtained it regularly. Humanity began once H. erectus appeared and began to cook food regularly.

The Epilogue, “The Well-Informed Cook”, outlines the challenges ahead. Human evolution has reached a stage where we’re dying from too much food rather than too little. Our evolutionary commitment to cooking drives an industry that makes foods that are bad for us, but these are foods we’ve evolved to want. We need to become more aware of the calorie-raising consequences of our current diet. To do this we need to better understand nutritional biophysics. That’s not easy, as our vast nutritional science is so focused on chemistry that physical realities are forgotten. Researchers treat food entering the stomach as if it were purely a solution of nutrients ready for biochemical reactions.

Take meat for example; nutritionists forget our digestive enzymes interact not with protein but with a bolus. Structural complexity matters because it affects how the bolus is converted to digestible nutrients, and therefore how many calories we get from it. The physics of food matters because we are currently unable to assess the real caloric value of our diet. The conventions through which nutritionists estimate energy values need to be revised, beginning with a reassessment of the Atwater system; flawed because it doesn’t measure the true cost of digestion; flawed because it assumes the proportion of food digested is always the same, regardless of whether it’s liquid or solid, raw or cooked, or part of a high-fibre or low-fibre diet. It’s time to modify Atwater’s system to include the effects of the physical structure of foods in estimating their nutritional value. 
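To make Wrangham’s target concrete, here is a minimal sketch of the standard Atwater calculation; the general factors of 4, 4 and 9 kilocalories per gram for protein, carbohydrate and fat are the textbook convention, and the notation is mine, not figures or symbols taken from the book:

$$E_{\text{kcal}} \approx 4p + 4c + 9f$$

where $p$, $c$ and $f$ are the grams of protein, carbohydrate and fat in a portion. A serving containing 10 g protein, 20 g carbohydrate and 5 g fat is therefore scored at $4(10) + 4(20) + 9(5) = 165$ kcal, the same figure whether the food is raw or cooked, liquid or solid; that constancy is precisely the assumption Wrangham wants revised.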

Tom Standage begins An Edible History of Humanity where Wrangham leaves off, by reminding us that there are many ways of looking at the past: as a list of important dates; a conveyor belt of kings and queens; a series of rising and falling empires; a narrative of political, philosophical or technological progress. His book looks at the past another way: as a series of transformations caused, enabled or influenced by food. Throughout human history, food has done more than provide sustenance. It has acted as a catalyst of social transformation, societal organisation, geopolitical competition, industrial development, military conflict and economic expansion. The stories of individual foodstuffs, food-related customs and traditions, and the development of national cuisines, have already been told. Less attention has been paid to the overarching narrative of how food has driven and shaped human history.

In The Wealth of Nations, Adam Smith famously likened the unseen influence of market forces to an invisible hand, acting on participants all looking out for their own best interests. Food’s influence on history can be likened to an invisible fork that has, at several crucial points in history, prodded humanity and altered its destiny, even though people were generally unaware of its influence at the time. Many food choices made in the past have had far-reaching consequences, and have shaped our present world in unexpected ways. To the discerning eye, food’s historical influence can be seen all around us, not just in the kitchen, at the dining table, or in the supermarket. That food has been such an important ingredient in human affairs might seem strange, but it would be far more surprising if it hadn’t; after all, everything every person has ever done, throughout history, has literally been fuelled by food.

Part One, “The Edible Foundations of Civilisation”, forces us to reconsider the romantic idea that our food is a gift from nature, as the overwhelming majority of the plants and animals we eat are as man-made as a microchip, a magazine or a missile. Much as we like to think of farming as natural, it’s profoundly unnatural; 10,000 years ago it was a new and alien development; indeed, farmed land is (and has always been) as much a technological landscape as a biological one. Also, our relationship with the food we farm is interdependent and symbiotic; we depend on domesticated plants and animals; they depend on us; and, although the point is lost on many, in domesticating them we’ve ensured they can’t exist in the wild. Standage illustrates this with the detailed histories of three staples—wheat, rice and maize—each of which was genetically engineered by proto-farmers thousands of years ago.

The mechanism by which plants and animals were domesticated may be understood but the motives behind the switch from hunting and gathering to farming are harder to understand, especially as the switch made humans worse off from a nutritional perspective and in many other ways. Did proto-farmers not realise what was happening until it was too late? Did farming begin as an option but gradually become a necessity? Was it driven by climate change, sedentism, population growth, or a combination of these? By 2000 BC the majority of humanity had taken up farming and today the distribution of human languages and genes continues to reflect the advent of farming. During domestication, plants and animals were genetically reconfigured by humans; however, as agriculture was adopted, humans were genetically reconfigured by plants and animals. Farming has done more to change the world, and has had a greater positive and negative impact on the environment, than any other human activity. For all its faults, it’s the basis of civilisation as we know it.

Part Two, “Food and Social Structure”, reminds us that hunters and gatherers were governed by a food supply which necessitated they be nomadic and non-hierarchical. We shouldn’t romanticise their egalitarian tendencies, though, as they practised savageries such as infanticide and cannibalism. There was a gradual transition from hunting and gathering to small egalitarian farming villages; then a gradual transition to socially-stratified cities, made possible when part of the population started producing more food than was needed for its subsistence. A nexus emerged between food, wealth and power; “big men” evolved when early farming societies required individuals to manage this nexus; then chiefdoms evolved; then civilisations with explanatory religions. Why did this happen? One theory, which the archaeological evidence supports, is that more complex societies—those with a strong leadership and a clear social hierarchy—are more productive, more resilient, better able to survive hardship, and better at defending themselves.

Food evolved as a tracer for power once the social order saw itself as imitating a natural order analogous with a divine order, and once humans saw farming as a proxy for triumphing over nature, analogous with the struggle of the gods in a larger cosmological battle. In early civilisations, farming defined and reinforced the privilege of the elite. Food was currency, was used to pay taxes, and was extracted as tribute. In the modern world, you follow money to determine where power lies; in the ancient world you followed food. Nowadays, food is only equivalent to money—and hence to wealth and power—where it’s scarce or expensive; as it was for most of recorded history; as it still is in parts of the undeveloped world. The definition of poverty isn’t only the inability to accumulate wealth; it’s still a lack of access to food. In that sense, what we mean by social inequity and economic inequality is a consequence of civilisation, which is a consequence of farming.

Part Three, “Global Highways of Food”, describes the extraordinary influence of the spice trade on the social, economic, political and global development of Western civilisation. The trade was a curious phenomenon, as there’s nothing inherently valuable about spices—they are nutritionally superfluous; the myth that they were once used to mask rotten meat is untrue—but they were prized for their unusual scents and tastes. They were durable, lightweight, hard to obtain, and only found in specific places. These factors made them ideal for long-distance trade; the farther they were carried, the more sought after, exotic and expensive they became. There were two stages to the development: first, Europe’s long struggle to circumvent Arabia’s (and later Islam’s) hegemony over the trade, once Alexandria fell in 641 AD and the “Muslim Curtain” blocked European access to the East; second, once Arabian–Islamic hegemony was circumvented, how the European spice trade sowed the seeds of Western empire.

Spices helped to lure Columbus westward, where none were to be found, and da Gama eastward, where they could be found in abundance, and they also inspired the first circumnavigation of the earth, by Magellan’s expedition. For the first few decades, Spain and Portugal dominated the seas, but within a century of the Treaty of Tordesillas (1494), which divided the world into Spanish and Portuguese hemispheres of influence, the English and Dutch arrived on the scene to compete. During the seventeenth century, however, spices lost their allure and profitability, being eclipsed by markets for textiles and new products such as tobacco, coffee and tea. Also, not everyone championed the bringing of inessential ingredients over such long distances: “For this we go to India!” Pliny the Elder grumbled about pepper in the first century AD. Today similar arguments are advanced by proponents of “local food”, although locally-grown food can have a more negative environmental impact than shipping foods around the globe.

Part Four, “Food, Energy, and Industrialisation”, weaves together several interdependent stories which remind us how colonialism, commerce and science went hand in hand. First, the story of botany—the “big science” of its day—and the divided loyalties of botanists who served two masters: as participants in a new international research community dedicated to furthering a revolution in which direct observation triumphed over received wisdom; as servants expected to ensure their own country would benefit most from their work. Second, the far-reaching consequences of the “Columbian Exchange” of food crops between the Old and New Worlds, in which—to name a few of the most important—wheat, sugar and rice moved west and maize, potatoes and chocolate moved east. Crops had always migrated from one place to another but never with such speed, on such a scale, over such distances, and with such long-term consequences.

Third, the ways in which sugar plantations depended on slave labour and provided a prototype for industrialisation. Fourth, the struggle rulers had convincing their European populations to embrace the potato and the consequences of this embrace: for example, on human growth, on the population explosion, and on the Irish Potato Famine. Fifth, the fears of economists such as Malthus who argued that population would outstrip food supply. Sixth, the story of why Britain never hit the wall Malthus anticipated, by vaulting over the wall instead; by switching from agriculture to manufacturing; by becoming an importer of food and an exporter of goods that could be traded for food, thus beginning a revolution that marked a new phase in human existence. Seventh, the effect this revolution had on energy consumption, and why we need to invest in biofuels that don’t come from food crops: the use of food crops for fuel is a step backwards; the trade-off between food and fuel has resurfaced in the present but belongs in the past.

Part Five, “Food as a Weapon”, demonstrates how control of the food supply has been decisive in war and ideological conflicts. Until relatively recently, military logistics depended on food; armies travelled slowly and could only operate close to rivers and coasts. Requisitioning and foraging underpinned the French army’s agility and mobility, which led to Napoleon’s greatest victory in 1805 and greatest blunder in 1812. The invention of canned food and mechanised transport meant armies could be resupplied easily; the trenches of the First World War depended on both. During the Second World War, Rommel’s attempt to defeat the Allies in North Africa failed because of an inadequate stock of weapons, petrol and ammunition—not because of a lack of food or fodder—which demonstrates that food’s central role in military planning had come to an end.

Food continued to be an ideological weapon. In the late 1920s and early 1930s, during its attempt to introduce collectivisation, the Soviet regime waged ideological war against the peasant farmers; it introduced internal passports to control population movements; it created a famine in which seven to eight million people died. In the late 1950s, Mao followed the Soviet model closely, with similar results; during the Great Leap Forward, which created the worst famine in human history, between thirty and forty million people died. Meanwhile, back in the USSR, it was noticed that agricultural output had been shrinking since 1940, while the population was growing; the Soviet Union eventually collapsed because it couldn’t feed itself. On the positive side, after the Second World War, when the Soviets blockaded supplies in an attempt to force the Allies out of Berlin, the Americans airlifted 2.3 million tons of supplies on 275,000 flights over fifteen months. Finally, in the words of Amartya Sen, winner of the Nobel Prize for Economics in 1998: “no substantial famine has ever occurred in any independent and democratic country with a free press”.

Part Six, “Food, Population, and Development”, looks at the successes as well as the failures of the Green Revolution, beginning before most do, with the “machine that changed the world”. In Germany in 1909, this unprepossessing machine led to the Haber-Bosch process, which synthesised ammonia from its constituent elements. The link between ammonia and nutrition is nitrogen; a vital building block of all plant and animal tissue; the nutrient responsible for vegetative growth and the protein content of cereals: our staple crops. In the nineteenth century, the discovery of nitrogen’s role in plant nutrition coincided with the realisation that crop yields needed to improve to feed the world’s burgeoning population. The Haber-Bosch process wasn’t only used for making fertiliser, though; it was also used for making explosives and conducting chemical warfare; clearly, chemicals can be used to both sustain and destroy life.
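For reference, the reaction at the heart of the Haber-Bosch process, as given in any chemistry textbook rather than taken from Standage’s account, is:

$$N_2 + 3H_2 \rightleftharpoons 2NH_3$$

run at high temperature and pressure over an iron catalyst; the nitrogen comes straight from the air, which is why the process has been described as making bread out of air.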

While fertiliser increased yields, plants became too tall and toppled over; therefore, a sustainable increase in food production depended on discovering new dwarf varieties. When it came to the developing world, one man did more than anyone else to adapt and promote dwarf seeds: Norman Borlaug, an American agronomist who won the Nobel Peace Prize in 1970, when the impact of the Green Revolution was already apparent. Plenty brings its own paradoxes, however: increased agricultural yields rely on increased use of pesticides as well as fertilisers and dwarf seeds; new technologies have had unforeseen consequences; growth in the demand for food has slowed in some parts of the world, as wealth is a powerful contraceptive. What does this mean? Governments and aid agencies, including the World Bank, don’t have the same interest in promoting agriculture they once had, assuming the global food problem has been solved, at least for the time being; some developing countries that originally benefited from the Green Revolution are now taking their eye off the agricultural ball. While the West hasn’t noticed it, a new food crisis is emerging in some parts of the developing world.

The Epilogue, “Ingredients of the Future”, begins with a Chinese proverb: “There is no feast which does not come to an end.” Someday a global calamity might make it necessary to rebuild human civilisation from scratch, starting with its deepest foundation: agriculture. How will this happen? There are 1400 seed banks in the world but many of these are vulnerable to wars, natural disasters or lack of funding. Taliban fighters wiped out a seed bank in Afghanistan. During the American invasion of Iraq, a seed bank in Abu Ghraib was destroyed by looters. Much of the national collection of the Philippines was lost during a typhoon. A bank in Latin America almost lost its collection when its refrigerators broke down. Malawi’s bank is a freezer in the corner of a wooden shack. Kenya’s bank was almost lost because its administrators couldn’t pay its electricity bill.

There is a bright side. On a remote island in the Arctic Circle, 700 miles from the North Pole, is the Svalbard Global Seed Vault: the world’s largest and safest seed-storage facility. The purpose of this extraordinarily secure vault is to provide an insurance policy against a short-term threat such as global warming and possible wars over food—as global shifts in agricultural production lead to widespread droughts and food shortages and provoke conflict over access to agricultural land and water supplies—or a long-term threat such as a nuclear war or an asteroid hitting the earth. Despite this seed bank’s futuristic design and high-tech features, there’s an echo of the Neolithic in its purpose; it was the ability to store seeds, as an insurance policy against future food shortages, which first led humanity to take a particular interest in cereal crops. This interest started humanity down the path to domestication, farming, and all the other consequences described in this book. From the dawn of agriculture to the Green Revolution, food has been an essential ingredient in human history, and food is certain to be a vital ingredient in humanity’s future. 

There’s nothing in these two books to dissuade me from remaining a vegan; there’s plenty to suggest a plant-based diet should be balanced, should contain raw and cooked foods, and is a sustainable and healthy choice. Also, as in many other areas of knowledge, it’s obvious that what we know about evolution is provisional and contentious. Some of the earlier hypotheses need to be approached with caution, since eating meat isn’t as important as once thought.

So what is important in human evolution? Regular readers of Quadrant will recall that Derek Bickerton believes it’s language (Bastard Tongues was reviewed in October 2009; Adam’s Tongue in November 2009). I suspect Bickerton would lament Wrangham’s primate-centric and homo-centric attitude towards evolution, and his failure to embrace the bigger picture afforded by niche-construction theory. However, if both Bickerton and Wrangham have something provisional to offer—and if, depending on our understanding of time, evolution and intelligent design might be the same thing—we could play around with the opening lines of the Fourth Gospel, which recast the opening lines of Genesis: “In the beginning was the Word, and the Word made Wok; animals became human, through Word and Wok. After several million years, give or take a few, God took a break—within time or without time—to gaze at the humans who had evolved, and were still evolving, through Word and Wok. And behold, humans were very good, for the time being.”

Michael Giffin discussed the Catholic novels of Graham Greene in the June issue. 
