Question
ONE FROM FOODS TO NUTRIENTS
If you spent any time at all in a supermarket in the 1980s, you might have noticed something peculiar going on. The food was gradually disappearing from the shelves. Not literally vanishing – I'm not talking about Soviet-style shortages. No, the shelves and refrigerated cases still groaned with packages and boxes and bags of various edibles, more of them landing every year in fact, but a great many of the traditional supermarket foods were steadily being replaced by "nutrients," which are not the same thing. Where once the familiar names of recognizable comestibles – things like eggs or breakfast cereals or snack foods – claimed pride of place on the brightly colored packages crowding the aisles, now new, scientific-sounding terms like "cholesterol" and "fiber" and "saturated fat" began rising to large-type prominence. More important than mere foods, the presence or absence of these invisible substances was now generally believed to confer health benefits on their eaters. The implicit message was that foods, by comparison, were coarse, old-fashioned, and decidedly unscientific things – who could say what was in them, really? But nutrients – those chemical compounds and minerals in foods that scientists have identified as important to our health – gleamed with the promise of scientific certainty. Eat more of the right ones, fewer of the wrong, and you would live longer, avoid chronic diseases, and lose weight.
Nutrients themselves had been around as a concept and a set of words since early in the nineteenth century. That was when William Prout, an English doctor and chemist, identified the three principal constituents of food – protein, fat, and carbohydrates – that would come to be known as macronutrients. Building on Prout's discovery, Justus von Liebig, the great German scientist credited as one of the founders of organic chemistry, added a couple of minerals to the big three and declared that the mystery of animal nutrition – how food turns into flesh and energy – had been solved. This is the very same Liebig who identified the macronutrients in soil – nitrogen, phosphorus, and potassium (known to farmers and gardeners by their periodic table initials, N, P, and K). Liebig claimed that all that plants need to live and grow are these three chemicals, period. As with the plant, so with the person: In 1842, Liebig proposed a theory of metabolism that explained life strictly in terms of a small handful of chemical nutrients, without recourse to metaphysical forces such as "vitalism."
Having cracked the mystery of human nutrition, Liebig went on to develop a meat extract – Liebig's Extractum Carnis – that has come down to us as bouillon, and concocted the first baby formula, consisting of cow's milk, wheat flour, malted flour, and potassium bicarbonate.
Liebig, the father of modern nutritional science, had driven food into a corner and forced it to yield its chemical secrets. But the post-Liebig consensus that science now pretty much knew what was going on in food didn't last long. Doctors began to notice that many of the babies fed exclusively on Liebig's formula failed to thrive. (Not surprising, given that his preparation lacked any vitamins or several essential fats and amino acids.) That Liebig might have overlooked a few little things in food also began to occur to doctors who observed that sailors on long ocean voyages often got sick, even when they had adequate supplies of protein, carbohydrates, and fat. Clearly the chemists were missing something – some essential ingredients present in the fresh plant foods (like oranges and potatoes) that miraculously cured the sailors. This observation led to the discovery early in the twentieth century of the first set of micronutrients, which the Polish biochemist Casimir Funk, harkening back to older vitalist ideas of food, christened "vitamines" in 1912 ("vita-" for life and "-amines" for organic compounds organized around nitrogen).
Vitamins did a lot for the prestige of nutritional science. These special molecules, which at first were isolated from foods and then later synthesized in a laboratory, could cure people of nutritional deficiencies such as scurvy or beriberi almost overnight in a convincing demonstration of reductive chemistry's power. Beginning in the 1920s, vitamins enjoyed a vogue among the middle class, a group not notably afflicted by beriberi or scurvy. But the belief took hold that these magic molecules also promoted growth in children, long life in adults, and, in a phrase of the time, "positive health" in everyone. (And what would "negative health" be exactly?) Vitamins had brought a kind of glamour to the science of nutrition, and though certain elite segments of the population now began to eat by its expert lights, it really wasn't until late in the twentieth century that nutrients began to push food aside in the popular imagination of what it means to eat.
No single event marked the shift from eating food to eating nutrients, although in retrospect a little-noticed political dustup in Washington in 1977 seems to have helped propel American culture down this unfortunate and dimly lighted path. Responding to reports of an alarming increase in chronic diseases linked to diet – including heart disease, cancer, obesity, and diabetes – the Senate Select Committee on Nutrition and Human Needs, chaired by South Dakota Senator George McGovern, held hearings on the problem. The committee had been formed in 1968 with a mandate to eliminate malnutrition, and its work had led to the establishment of several important food-assistance programs. Endeavoring now to resolve the question of diet and chronic disease in the general population represented a certain amount of mission creep, but all in a good cause to which no one could possibly object.
After taking two days of testimony on diet and killer diseases, the committee's staff – comprised not of scientists or doctors but of lawyers and (ahem) journalists – set to work preparing what it had every reason to assume would be an uncontroversial document called Dietary Goals for the United States. The committee learned that while rates of coronary heart disease had soared in America since World War II, certain other cultures that consumed traditional diets based mostly on plants had strikingly low rates of chronic diseases. Epidemiologists had also observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease had temporarily plummeted, only to leap upward once the war was over.
Beginning in the 1950s, a growing body of scientific opinion held that the consumption of fat and dietary cholesterol, much of which came from meat and dairy products, was responsible for rising rates of heart disease during the twentieth century. The "lipid hypothesis," as it was called, had already been embraced by the American Heart Association, which in 1961 had begun recommending a "prudent diet" low in saturated fat and cholesterol from animal products. True, actual proof for the lipid hypothesis was remarkably thin in 1977 – it was still very much a hypothesis, but one well on its way to general acceptance.
In January 1977, the committee issued a fairly straightforward set of dietary guidelines, calling on Americans to cut down on their consumption of red meat and dairy products. Within weeks a firestorm of criticism, emanating chiefly from the red meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee's recommendations were hastily rewritten. Plain talk about actual foodstuffs – the committee had advised Americans to "reduce consumption of meat" – was replaced by artful compromise: "choose meats, poultry, and fish that will reduce saturated fat intake."
Leave aside for now the virtues, if any, of a low-meat and/or low-fat diet, questions to which I will return, and focus for a moment on language. For with these subtle changes in wording a whole way of thinking about food and health underwent a momentous shift. First, notice that the stark message to "eat less" of a particular food – in this case meat – had been deep-sixed; don't look for it ever again in any official U.S. government dietary pronouncement. Say what you will about this or that food, you are not allowed officially to tell people to eat less of it or the industry in question will have you for lunch. But there is a path around this immovable obstacle, and it was McGovern's staffers who blazed it: Speak no more of foods, only nutrients. Notice how in the revised guidelines, distinctions between entities as different as beef and chicken and fish have collapsed. These three venerable foods, each representing not just a different species but an entirely different taxonomic class, are now lumped together as mere delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves. Now the culprit is an obscure, invisible, tasteless – and politically unconnected – substance that may or may not lurk in them called saturated fat.
The linguistic capitulation did nothing to rescue McGovern from his blunder. In the very next election, in 1980, the beef lobby succeeded in rusticating the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein squatting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, but would instead arrive dressed in scientific euphemism and speaking of nutrients, entities that few Americans (including, as we would find out, American nutrition scientists) really understood but that, with the notable exception of sucrose, lack powerful lobbies in Washington.*
The lesson of the McGovern fiasco was quickly absorbed by all who would pronounce on the American diet. When a few years later the National Academy of Sciences looked into the question of diet and cancer, it was careful to frame its recommendations nutrient by nutrient rather than food by food, to avoid offending any powerful interests. We now know the academy's panel of thirteen scientists adopted this approach over the objections of at least two of its members who argued that most of the available science pointed toward conclusions about foods, not nutrients. According to T. Colin Campbell, a Cornell nutritional biochemist who served on the panel, all of the human population studies linking dietary fat to cancer actually showed that the groups with higher cancer rates consumed not just more fats, but also more animal foods and fewer plant foods as well. "This meant that these cancers could just as easily be caused by animal protein, dietary cholesterol, something else exclusively found in animal-based foods, or a lack of plant-based foods," Campbell wrote years later. The argument fell on deaf ears.
In the case of the "good foods" too, nutrients also carried the day: The language of the final report highlighted the benefits of the antioxidants in vegetables rather than the vegetables themselves. Joan Gussow, a Columbia University nutritionist who served on the panel, argued against the focus on nutrients rather than whole foods. "The really important message in the epidemiology, which is all we had to go on, was that some vegetables and citrus fruits seemed to be protective against cancer. But those sections of the report were written as though it was the vitamin C in the citrus or the beta-carotene in the vegetables that was responsible for the effect. I kept changing the language to talk about 'foods that contain vitamin C' and 'foods that contain carotenes.' Because how do you know it's not one of the other things in the carrots or the broccoli? There are hundreds of carotenes. But the biochemists had their answer: 'You can't do a trial on broccoli.'"
So the nutrients won out over the foods. The panel's resort to scientific reductionism had the considerable virtue of being both politically expedient (in the case of meat and dairy) and, to these scientific heirs of Justus von Liebig, intellectually sympathetic. With each of its chapters focused on a single nutrient, the final draft of the National Academy of Sciences report, Diet, Nutrition and Cancer, framed its recommendations in terms of saturated fats and antioxidants rather than beef and broccoli.
In doing so, the 1982 National Academy of Sciences report helped codify the official new dietary language, the one we all still speak. Industry and media soon followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids, flavonols, carotenoids, antioxidants, probiotics, and phytochemicals soon colonized much of the cultural space previously occupied by the tangible material formerly known as food.
The Age of Nutritionism had arrived.
TWO NUTRITIONISM DEFINED
The term isn't mine. It was coined by an Australian sociologist of science by the name of Gyorgy Scrinis, and as near as I can determine first appeared in a 2002 essay titled "Sorry Marge" published in an Australian quarterly called Meanjin. "Sorry Marge" looked at margarine as the ultimate nutritionist product, able to shift its identity (no cholesterol! one year, no trans fats! the next) depending on the prevailing winds of dietary opinion. But Scrinis had bigger game in his sights than spreadable vegetable oil. He suggested that we look past the various nutritional claims swirling around margarine and butter and consider the underlying message of the debate itself: "namely, that we should understand and engage with food and our bodies in terms of their nutritional and chemical constituents and requirements – the assumption being that this is all we need to understand." This reductionist way of thinking about food had been pointed out and criticized before (notably by the Canadian historian Harvey Levenstein, the British nutritionist Geoffrey Cannon, and the American nutritionists Joan Gussow and Marion Nestle), but it had never before been given a proper name: "nutritionism." Proper names have a way of making visible things we don't easily see or simply take for granted.
The first thing to understand about nutritionism is that it is not the same thing as nutrition. As the "-ism" suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it's still exerting its hold on your culture. A reigning ideology is a little like the weather – all-pervasive and so virtually impossible to escape. Still, we can try.
In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. Put another way: Foods are essentially the sum of their nutrient parts. From this basic premise flow several others.
Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists reach the public) to explain the hidden reality of foods to us. In form this is a quasireligious idea, suggesting the visible world is not the one that really matters, which implies the need for a priesthood. For to enter a world where your dietary salvation depends on unseen nutrients, you need plenty of expert help.
But expert help to do what exactly? This brings us to another unexamined assumption of nutritionism: that the whole point of eating is to maintain and promote bodily health. Hippocrates' famous injunction to "let food be thy medicine" is ritually invoked to support this notion. I'll leave the premise alone for now, except to point out that it is not shared by all cultures and, further, that the experience of these other cultures suggests that, paradoxically, regarding food as being about things other than bodily health – like pleasure, say, or sociality or identity – makes people no less healthy; indeed, there's some reason to believe it may make them more healthy. This is what we usually have in mind when we speak of the French paradox. So there is at least a question as to whether the ideology of nutritionism is actually any good for you.
It follows from the premise that food is foremost about promoting physical health that the nutrients in food should be divided into the healthy ones and the unhealthy ones – good nutrients and bad. This has been a hallmark of nutritionist thinking from the days of Liebig, for whom it wasn't enough to identify the nutrients; he also had to pick favorites, and nutritionists have been doing so ever since. Liebig claimed that protein was the "master nutrient" in animal nutrition, because he believed it drove growth. Indeed, he likened the role of protein in animals to that of nitrogen in plants: Protein (which contains nitrogen) comprised the essential human fertilizer. Liebig's elevation of protein dominated nutritionist thinking for decades as public health authorities worked to expand access to and production of the master nutrient (especially in the form of animal protein), with the goal of growing bigger, and therefore (it was assumed) healthier, people. (A high priority for Western governments fighting imperial wars.) To a considerable extent we still have a food system organized around the promotion of protein as the master nutrient. It has given us, among other things, vast amounts of cheap meat and milk, which have in turn given us much, much bigger people. Whether they are healthier too is another question.
It seems to be a rule of nutritionism that for every good nutrient, there must be a bad nutrient to serve as its foil, the latter a focus for our food fears and the former for our enthusiasms. A backlash against protein arose in America at the turn of the last century as diet gurus like John Harvey Kellogg and Horace Fletcher (about whom more later) railed against the deleterious effects of protein on digestion (it supposedly led to the proliferation of toxic bacteria in the gut) and promoted the cleaner, more wholesome carbohydrate in its place. The legacy of that revaluation is the breakfast cereal, the strategic objective of which was to dethrone animal protein at the morning meal.
Ever since, the history of modern nutritionism has been a history of macronutrients at war: protein against carbs; carbs against proteins, and then fats; fats against carbs. Beginning with Liebig, in each age nutritionism has organized most of its energies around an imperial nutrient: protein in the nineteenth century, fat in the twentieth, and, it stands to reason, carbohydrates will occupy our attention in the twenty-first. Meanwhile, in the shadow of these titanic struggles, smaller civil wars have raged within the sprawling empires of the big three: refined carbohydrates versus fiber; animal protein versus plant protein; saturated fats versus polyunsaturated fats; and then, deep down within the province of the polyunsaturates, omega-3 fatty acids versus omega-6s. Like so many ideologies, nutritionism at bottom hinges on a form of dualism, so that at all times there must be an evil nutrient for adherents to excoriate and a savior nutrient for them to sanctify. At the moment, trans fats are performing admirably in the former role, omega-3 fatty acids in the latter. It goes without saying that such a Manichaean view of nutrition is bound to promote food fads and phobias and large abrupt swings of the nutritional pendulum.
Another potentially serious weakness of nutritionist ideology is that, focused so relentlessly as it is on the nutrients it can measure, it has trouble discerning qualitative distinctions among foods. So fish, beef, and chicken through the nutritionist's lens become mere delivery systems for varying quantities of different fats and proteins and whatever other nutrients happen to be on their scope. Milk through this lens is reduced to a suspension of protein, lactose, fats, and calcium in water, when it is entirely possible that the benefits, or for that matter the hazards, of drinking milk owe to entirely other factors (growth hormones?) or relationships between factors (fat-soluble vitamins and saturated fat?) that have been overlooked. Milk remains a food of humbling complexity, to judge by the long, sorry saga of efforts to simulate it. The entire history of baby formula has been the history of one overlooked nutrient after another: Liebig missed the vitamins and amino acids, and his successors missed the omega-3s, and still to this day babies fed on the most "nutritionally complete" formula fail to do as well as babies fed human milk. Even more than margarine, infant formula stands as the ultimate test product of nutritionism and a fair index of its hubris.
This brings us to one of the most troubling features of nutritionism, though it is a feature certainly not troubling to all. When the emphasis is on quantifying the nutrients contained in foods (or, to be precise, the recognized nutrients in foods), any qualitative distinction between whole foods and processed foods is apt to disappear. "[If] foods are understood only in terms of the various quantities of nutrients they contain," Gyorgy Scrinis wrote, then "even processed foods may be considered to be 'healthier' for you than whole foods if they contain the appropriate quantities of some nutrients."
How convenient.
THREE NUTRITIONISM COMES TO MARKET
No idea could be more sympathetic to manufacturers of processed foods, which surely explains why they have been so happy to jump on the nutritionism bandwagon. Indeed, nutritionism supplies the ultimate justification for processing food by implying that with a judicious application of food science, fake foods can be made even more nutritious than the real thing. This of course is the story of margarine, the first important synthetic food to slip into our diet. Margarine started out in the nineteenth century as a cheap and inferior substitute for butter, but with the emergence of the lipid hypothesis in the 1950s, manufacturers quickly figured out that their product, with some tinkering, could be marketed as better – smarter! – than butter: butter with the bad nutrients removed (cholesterol and saturated fats) and replaced with good nutrients (polyunsaturated fats and then vitamins). Every time margarine was found wanting, the wanted nutrient could simply be added (Vitamin D? Got it now. Vitamin A? Sure, no problem). But of course margarine, being the product not of nature but of human ingenuity, could never be any smarter than the nutritionists dictating its recipe, and the nutritionists turned out to be not nearly as smart as they thought. The food scientists' ingenious method for making healthy vegetable oil solid at room temperature – by blasting it with hydrogen – turned out to produce unhealthy trans fats, fats that we now know are more dangerous than the saturated fats they were designed to replace. Yet the beauty of a processed food like margarine is that it can be endlessly reengineered to overcome even the most embarrassing about-face in nutritional thinking – including the real wincer that its main ingredient might cause heart attacks and cancer. So now the trans fats are gone, and margarine marches on, unfazed and apparently unkillable. Too bad the same cannot be said of an unknown number of margarine eaters.
By now we have become so inured to fake foods that we forget what a difficult trail margarine had to blaze before it and other synthetic food products could win government and consumer acceptance. At least since the 1906 publication of Upton Sinclair's The Jungle, the "adulteration" of common foods has been a serious concern of the eating public and the target of numerous federal laws and Food and Drug Administration regulations. Many consumers regarded "oleomargarine" as just such an adulteration, and in the late 1800s five states passed laws requiring that all butter imitations be dyed pink so no one would be fooled. The Supreme Court struck down the laws in 1898. In retrospect, had the practice survived, it might have saved some lives.
The 1938 Food, Drug and Cosmetic Act imposed strict rules requiring that the word "imitation" appear on any food product that was, well, an imitation. Read today, the official rationale behind the imitation rule seems at once commonsensical and quaint: "...there are certain traditional foods that everyone knows, such as bread, milk and cheese, and that when consumers buy these foods, they should get the foods they are expecting...[and] if a food resembles a standardized food but does not comply with the standard, that food must be labeled as an 'imitation.'"
Hard to argue with that...but the food industry did, strenuously, for decades, and in 1973 it finally succeeded in getting the imitation rule tossed out, a little-noticed but momentous step that helped speed America down the path to nutritionism.
Industry hated the imitation rule. There had been such a tawdry history of adulterated foods and related forms of snake oil in American commerce that slapping the word "imitation" on a food product was the kiss of death – an admission of adulteration and inferiority. By the 1960s and 1970s, the requirement that such a pejorative term appear on fake food packages stood in the way of innovation, indeed of the wholesale reformulation of the American food supply – a project that, in the wake of rising concerns about dietary fat and cholesterol, was coming to be seen as a good thing. What had been regarded as hucksterism and fraud in 1906 had begun to look like sound public health policy by 1973. The American Heart Association, eager to get Americans off saturated fats and onto vegetable oils (including hydrogenated vegetable oils), was actively encouraging the food industry to "modify" various foods to get the saturated fats and cholesterol out of them, and in the early seventies the association urged that "any existing and regulatory barriers to the marketing of such foods be removed."
And so they were when, in 1973, the FDA (not, note, the Congress that wrote the law) simply repealed the 1938 rule concerning imitation foods. It buried the change in a set of new, seemingly consumer-friendly rules about nutrient labeling, so that news of the imitation rule's repeal did not appear until the twenty-seventh paragraph of The New York Times' account, published under the headline F.D.A. PROPOSES SWEEPING CHANGE IN FOOD LABELING: NEW RULES DESIGNED TO GIVE CONSUMERS A BETTER IDEA OF NUTRITIONAL VALUE. (The second deck of the headline gave away the game: PROCESSORS BACK MOVE.) The revised imitation rule held that as long as an imitation product was not "nutritionally inferior" to the natural food it sought to impersonate – as long as it had the same quantities of recognized nutrients – the imitation could be marketed without using the dreaded "i" word.
With that, the regulatory door was thrown open to all manner of faked low-fat products: Fats in things like sour cream and yogurt could now be replaced with hydrogenated oils or guar gum or carrageenan, bacon bits could be replaced with soy protein, the cream in "whipped cream" and "coffee creamer" could be replaced with corn starch, and the yolks of liquefied eggs could be replaced with, well, whatever the food scientists could dream up, because the sky was now the limit. As long as the new fake foods were engineered to be nutritionally equivalent to the real article, they could no longer be considered fake. Of course the operative nutritionist assumption here is that we know enough to determine nutritional equivalence – something that the checkered history of baby formula suggests has never been the case.
Nutritionism had become the official ideology of the Food and Drug Administration; for all practical purposes the government had redefined foods as nothing more than the sum of their recognized nutrients. Adulteration had been repositioned as food science. All it would take now was a push from McGovern's Dietary Goals for hundreds of "traditional foods that everyone knows" to begin their long retreat from the supermarket shelves and for our eating to become more "scientific."
FOUR FOOD SCIENCE'S GOLDEN AGE
In the years following the 1977 Dietary Goals and the 1982 National Academy of Sciences report on diet and cancer, the food industry, armed with its regulatory absolution, set about reengineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and fewer of the bad. A golden age for food science dawned. Hyphens sprouted like dandelions in the supermarket aisles: low-fat, no-cholesterol, high-fiber. Ingredients labels on formerly two- or three-ingredient foods such as mayonnaise and bread and yogurt ballooned with lengthy lists of new additives – what in a more benighted age would have been called adulterants. The Year of Eating Oat Bran – also known as 1988 – served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran's moment on the dietary stage didn't last long, but the pattern now was set, and every few years since then, a new oat bran has taken its star turn under the marketing lights. (Here come omega-3s!)
You would not think that common food animals could themselves be rejiggered to fit nutritionist fashion, but in fact some of them could be, and were, in response to the 1977 and 1982 dietary guidelines as animal scientists figured out how to breed leaner pigs and select for leaner beef. With widespread lipophobia taking hold of the human population, countless cattle lost their marbling and lean pork was repositioned as "the new white meat" – tasteless and tough as running shoes, perhaps, but now even a pork chop could compete with chicken as a way for eaters to "reduce saturated fat intake." In the years since then, egg producers figured out a clever way to redeem even the disreputable egg: By feeding flaxseed to hens, they could elevate levels of omega-3 fatty acids in the yolks. Aiming to do the same thing for pork and beef fat, the animal scientists are now at work genetically engineering omega-3 fatty acids into pigs and persuading cattle to lunch on flaxseed in the hope of introducing the blessed fish fat where it had never gone before: into hot dogs and hamburgers.
But these whole foods are the exceptions. The typical whole food has much more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can't quite as readily change its nutritional stripes. (Though rest assured the genetic engineers are hard at work on the problem.) To date, at least, they can't put oat bran in a banana or omega-3s in a peach. So depending on the reigning nutritional orthodoxy, the avocado might either be a high-fat food to be assiduously avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate and supermarket sales of each whole food rise and fall with every change in the nutritional weather, while the processed foods simply get reformulated and differently supplemented. That's why when the Atkins diet storm hit the food industry in 2003, bread and pasta got a quick redesign (dialing back the carbs; boosting the proteins) while poor unreconstructed potatoes and carrots were left out in the carbohydrate cold. (The low-carb indignities visited on bread and pasta, two formerly "traditional foods that everyone knows," would never have been possible had the imitation rule not been tossed out in 1973. Who would ever buy imitation spaghetti? But of course that is precisely what low-carb pasta is.)
A handful of lucky whole foods have recently gotten the "good nutrient" marketing treatment: The antioxidants in the pomegranate (a fruit formerly more trouble to eat than it was worth) now protect against cancer and erectile dysfunction, apparently, and the omega-3 fatty acids in the (formerly just fattening) walnut ward off heart disease. A whole subcategory of nutritional science – funded by industry and, according to one recent analysis,* remarkably reliable in its ability to find a health benefit in whatever food it has been commissioned to study – has sprung up to give a nutritionist sheen (and FDA-approved health claim) to all sorts of foods, including some not ordinarily thought of as healthy. The Mars Corporation recently endowed a chair in chocolate science at the University of California at Davis, where research on the antioxidant properties of cacao is making breakthroughs, so it shouldn't be long before we see chocolate bars bearing FDA-approved health claims. (When we do, nutritionism will surely have entered its baroque phase.) Fortunately for everyone playing this game, scientists can find an antioxidant in just about any plant-based food they choose to study.
Yet as a general rule it's a whole lot easier to slap a health claim on a box of sugary cereal than on a raw potato or a carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over in Cereal the Cocoa Puffs and Lucky Charms are screaming their newfound "whole-grain goodness" to the rafters.
Watch out for those health claims.
The above is the reading for this question.
How we all look at food today is strongly shaped by the reigning ideology of nutritionism and its key feature, scientific reductionism. In one paragraph, explain the assumptions of nutritionism/reductionism and why Pollan is critical of their limitations. In another paragraph, explain why this ideology has been embraced by the Big Food companies and how they use it to justify heavily processed foods. A paragraph is at least 3-5 sentences.