What was food like before the FDA?
A milkman having his float inspected, circa 1935. CREDIT: Photo by General Photographic Agency/Hulton Archive/Getty Images.

We have a tendency to romanticize the past. Think about the food your great-grandparents (or even their parents) ate in childhood and you might imagine farm-fresh produce, pure milled grains, and pristine meat and dairy. But if they were living in the United States during the mid-to-late 19th century, that vision of food utopia wasn't likely reality.

Before 1906, there were no federal food safety regulations in the US. Local grocers were a wild west of unlabeled additives, untested chemicals, and inedible fillers. In the gap between the industrialization of the food system in the mid-1800s and the first laws dictating what could be sold as food, working-class Americans spent decades eating "mostly crap," says Deborah Blum, a Pulitzer Prize-winning science journalist. In her 2019 book, The Poison Squad, Blum details the origin story of the landmark Food and Drug Act. As more people left farm life behind and came to rely on manufactured food, "an enormous amount of food fraud" emerged, Blum tells Popular Science.

Nowadays, the overwhelming majority of people still purchase their food from grocery aisles, but the food we buy there is much less liable to make us sick. So, how did we get from that past to the present? And, with regulatory agencies including the FDA facing enormous cuts, what might the future hold?

Ground Shells, Brick Dust, and Bones

European countries, including Britain, Germany, and France, passed food safety regulations about 50 years before the US did. In classic American style, we eschewed top-down restrictions and gave the free market free rein. In lieu of federal regulation, a haphazard patchwork of state and local laws governed certain foods before 1906. Massachusetts, for instance, passed "An Act Against Selling Unwholesome Provisions" in 1785. But unsafe practices consistently fell through the cracks and into consumers' stomachs, says Blum. In some cases, food wasn't food at all.

Pre-pasteurization, milk spoilage and bacterial growth were major problems. Away from the farm, dairy had to travel farther and keep longer if people in cities were going to buy it. So the dairy section became a hotbed of questionable additives. Borax, which you may recognize as a general-purpose pesticide, was used as a milk and butter preservative. Formaldehyde (AKA embalming fluid) was also a common milk additive and antibacterial agent. Beyond preserving the milk, formaldehyde also reportedly had a slightly sweet flavor, which helped mask the taste of rot, Blum explains.

A cartoon titled 'Poisoning by Food Adulteration'. In November 1858, a confectioner bought what he believed was Plaster of Paris from a druggist to add to lozenges. He was accidentally sold arsenic instead, and 20 of the roughly 200 people poisoned died. The case gave ammunition to those trying to get legislation against food adulteration through Parliament (Scholfield Act of 1859). Illustrated by John Leech (1817-1864), an English caricaturist and illustrator, and dated to the 19th century. CREDIT: Universal History Archive/Universal Images Group via Getty Images.

Arsenic was also used to dye food. In cheese, lead compounds were added to boost its golden color.
Plaster of Paris, gypsum, and other white, powdery fillers made their way into milk and flour for color and texture. Flour was often portioned out by the grocer in-store, with mixed results. "If you went to a very honest grocer, you might get real flour. If you didn't, you might get a mix," she says.

Coffee and spices were particularly terrible offenders. Ground coffee was often about 80 to 90 percent adulterated in the mid-19th century, says Blum. It might be made up of ground bone, blackened with lead, or charred seeds and plant matter. Spices were frequently 100 percent adulterated, or, in other words, entirely made up of something other than what they were sold as. Cinnamon was frequently brick dust. Ground pepper could have been ground shells or charred rope. "Probably a good half of these products had some adulteration, depending on what you were looking at and how much you were willing to pay," she says. The wealthy were generally able to afford higher-quality, authentic, uncontaminated products.

But for everyone else, the problem of food fraud was so prevalent that people developed a suspicion of ground coffee, Blum says. Consumers started opting for whole beans instead, wherever possible. Suddenly, there was a market for counterfeit whole coffee beans, made of pigeon beans and peas, or even wax and clay. "You can find flyers that went to grocers that said, 'you can multiply your profits with our super cheap dirt beans.'" In her research, Blum found a record of a congressional hearing where a food manufacturer described producing and selling a "strawberry jam" that was entirely red dye, corn syrup, and grass seed. His defense for the practice: "we have to be competitive in the market and other people are doing it too," paraphrases Blum.

As all of this was going on, people had little idea what they were consuming. "There was no labeling," she notes. But there were many cases of people falling ill. In an Indiana orphanage, multiple children died from formaldehyde poisoning. In New York state, an estimated 8,000 infants died from adulterated "swill milk" in a single year.

Pushing for Purity

Calls for change came from multiple fronts, including women's groups and the growing "pure food" movement of the late 1800s, says Blum. But one chemist and physician, Harvey Washington Wiley, proved particularly dedicated and ultimately influential.

Wiley began noting and publishing reports on food contaminants during his work at the USDA in the 1880s and '90s. His primary job was to develop alternatives to sugar cane, but he started studying and cataloging adulteration in butter, milk, and honey, and later in spices and alcoholic beverages. That's where much of our data on food adulteration at the time comes from, notes Blum. Soon, Wiley was releasing regular bulletins on food adulterants and advocating for national laws.

Many of his early attempts ended in failure. Congressional representatives received a lot of money from the food industry and weren't receptive to Wiley's science-backed pleas for labels, transparency, and contamination regulations, says Blum. "He keeps pushing for it. The industry keeps shooting it down, and the political dog fight continues," she says.

Hubert E. Mills of the Department of Pasteurized Dairy Products and Dora Morris, technician for the Maryland and Virginia Milk Products Association, checking samples taken from a farm for purity, circa 1955. CREDIT: Photo by Evans/Three Lions/Getty Images.

But then, Wiley shifted tactics.
He began conducting a series of experiments that he called the "hygienic table trials" with a group of USDA employees, later dubbed "the poison squad." All of the dozen or so participants willingly and knowingly signed up to receive three freshly prepared meals, seven days a week, for six months from the newly created USDA test kitchen. Yet along with their nourishing meals, a subset of the participants were also fed additives commonly found in adulterated food. "You could never have gotten this sort of study approved today," says Blum. "He poisoned his co-workers."

The group worked their way through borax, boric acid, salicylic acid, benzoic acid, sulfur dioxide, formaldehyde, copper sulfate, and saltpeter, among other things. Unsurprisingly, the squad was frequently sick, and the experiment garnered a ton of publicity. "If you go to newspapers of the time, every single one had a story: 'Americans are eating poison,'" Blum says.

The fervor, paired with the public outcry in response to Upton Sinclair's book The Jungle, about Chicago's meatpacking plants, led politicians to change their tune. In 1906, Congress passed both the Meat Inspection Act and the Food and Drug Act (colloquially known as "Wiley's Law"). The Food and Drug Act was later replaced by the Food, Drug, and Cosmetic Act of 1938, which has been extensively revised and updated since. From these laws emerged the modern USDA, responsible for regulating meat and poultry products, and the FDA, responsible for all other foods and for pharmaceuticals.

The Future of Food

Since the start of federal food regulation, states have beefed up their policies and the food industry has adopted its own standards. Many companies have even signed on to efforts like the Global Food Safety Initiative, which involves third-party testing beyond what's legally required. Plus, the mere existence of federal law means that people can sue when things go wrong. Litigation is a big driver of compliance and caution at the corporate level, says Blum.

Yet the FDA still plays a key role in oversight, research, and responding to emerging threats like bird flu in milk, says Brian Schaneberg, a chemist and director of the Institute for Food Safety and Health (IFSH) at Illinois Tech. At IFSH, academic researchers collaborate directly with industry and FDA scientists, and the institute hosts multiple federal projects and labs. Research there includes work on improving infant formula safety, food contamination from packaging, pathogen prevention in food manufacturing and produce, investigating the causes of illness outbreaks, and Grade A milk validation. "We really touch a lot of areas," Schaneberg tells Popular Science.

Recently, the Trump Administration slashed more than 3,500 FDA jobs amid broader slapdash federal cuts. Despite claims to the contrary, these layoffs included dozens of scientists who conduct quality control and proficiency testing on everything from infant formula to dairy products and pet foods. The cuts have temporarily left the Center for Processing Innovation at IFSH almost entirely unstaffed, Schaneberg notes: from 15 staff, they're down to four. Other labs across the country were also impacted.

After public pushback, FDA leadership promised last week to reinstate scientists in key roles and reopen a handful of the shuttered labs. At Schaneberg's institute, federal scientists have been told they'll be reinstated, though, he notes, they haven't received formal notices confirming their rehiring.
The long-term fate of FDA research and testing labs also remains uncertain as proposed major budget cuts and a massive reorganization loom. The currently proposed Trump Administration plan would shift most food testing to the states.

"I'm definitely concerned," says Schaneberg. He doesn't see any clear, immediate threat to consumers, but in the long term, he is worried about the FDA's ability to ensure food safety if the agency is equipped with fewer staff and resources. "I still think all the big companies are going to do the best thing they can because they don't want to hurt their brands and they don't want to impact people." And many states might have the ability to fill gaps. Yet there are always bad actors, new brands, new additives, and unknowns, he notes.

It may be much rarer than it once was, but the FDA still detects unsettling instances of food contamination. In 2023 and 2024, the agency investigated high lead and chromium levels in cinnamon applesauce pouches marketed to children, notes Martin Bucknavage, a senior food safety extension specialist in the Department of Food Science at Penn State University. "There's those types of things that pop up, and it's like, 'who, who else is going to go through and do that?'" he says. The FDA has expertise in the science and the supply chains that few other institutions do, Bucknavage says, along with the ability and authority to respond quickly.

With rapid changes and reorganization on the horizon, it's hard to predict what the effect will be, he adds. "I think immediate-term, our food supply is going to be safe," Bucknavage says. After all, FDA inspections are far less frequent than companies' own safety tests and measures. But without that final layer of oversight, it's possible something could be lost down the line, he says.

Blum, with all her knowledge of the treacherous food landscape of decades past, agrees. "I'm not sitting here saying catastrophe, because we don't actually know," she says. "But there's nothing in what the [Trump Administration] is doing that you would look at and say, 'oh this makes us safer.'"