They Swore by the Diet I Created — but I Completely Made It Up

How exposure to misinformation inoculation sometimes makes things worse

While on vacation, Marcial Conte, the Brazilian publisher of my first book, met a woman who asked about his work. Upon learning he was responsible for A Mentira do Glúten: E Outros Mitos Sobre o Que Você Come (The Gluten Lie: And Other Myths About What You Eat), she lit up.

Her husband, she said, had followed my revolutionary diet protocol and changed his life. Pounds melted away. Myriad health problems resolved themselves.

“She told me to thank you for saving her husband’s life with the ‘UNpacked Diet,’” Conte grinned at me. “Incredible, no? The only exception they made was keeping aluminum foil.”

Incredible, indeed. The diet was satire, invented by me, and it came at the end of a book dedicated to exposing pseudoscientific nutrition claims. For the centerpiece of the faux diet, I used just such a claim: that the cause of all modern ailments was food packaging. By “unpacking” your food — that is, by refusing to eat food that had come in contact with plastic, styrofoam, or aluminum foil — I pretended to promise readers a magical panacea for everything from autism to Alzheimer’s, as well as effortless weight loss.

The satire should have been clear. Every chapter was packed with warnings about precisely the kinds of claims made in the diet, such as:

  • Beware of panaceas… like a diet that promises miraculous weight loss and a solution to every chronic illness.
  • Distrust the promise of secret knowledge hidden by conspiracies… like the diet that “they” don’t want you to know about.
  • Don’t trust individual anecdotes… like the glowing testimonials I included at the end of the invented diet. (I took them from other pseudoscientific diet books.)
  • Stay alert for myths and fallacies such as the “appeal to antiquity”… the idea that our ancestors lived in a dietary paradise and that modern technology is uniquely evil and dangerous.
  • Watch out for grains of scientific truth turned into alarmist falsehoods… like the cherry-picked scientific studies that filled the UNpacked Diet’s footnotes.

Each deceptive tactic in the UNpacked Diet had been scrupulously debunked in the chapters that preceded it. Not only that, but after the diet there was another section called the “UNpacked Diet, UNpacked,” in which I went through each of the deceptive tactics and explained why I chose it.

How could this couple have taken it seriously, much less followed it? Even if they had missed the final section, their reaction to the UNpacked Diet should have been skepticism and disbelief, not enthusiasm.

I would have been more shocked at Conte’s story if I hadn’t already heard from others who had likewise tried the diet. Readers have emailed asking where they can buy the “UNpacked Diet-approved unbleached coffee filters” that I dreamed up as part of the satire, or with follow-up questions about what’s permissible within the framework of the “diet.” In just a few pages, these powerful rhetorical techniques overcame chapter after chapter of carefully crafted guidance on how to resist them.

My experiences as a college professor have only reinforced for me how powerful deceptive rhetoric can be, even when it is explicitly labeled as such. As part of my religion and science course at James Madison University, my students watch a YouTube video purporting to show restoration of vision to a blind girl through faith healing. It’s a textbook object lesson in emotional manipulation and the power of suggestion. Viewers have no clear evidence that the child’s vision was restored. There is no before-and-after comparison of her ability to see. Dramatic music plays in the background. Combined with the pressure of being on stage and the expectations of the adults doing the “healing,” these factors together might lead to a wishful testimonial about a change in the girl’s sight.

I explain exactly how these “healers” deceive their audience with the help of a Netflix special by illusionist Derren Brown, who “faith heals” audience members while telling them that he has no healing powers. He does so by exploiting a combination of his subjects’ adrenaline rush, the power of suggestion, and savvy observation. Yet most students remain open to believing the YouTube video. In fact, informal class surveys indicate that by the end of the course, the students are more open to the possibility that faith healers can cure blindness, more inclined to believe in near-death visions of the afterlife (as opposed to neurological explanations), and more receptive to pseudoscientific medical treatments such as homeopathy and reiki.

As I emphasize in The Gluten Lie, anecdotes should not be generalized, and ample evidence counters this experience with my students. A 2017 study of climate change misinformation, for example, suggests that inoculating people against flawed arguments may help neutralize the adverse effects of misinformation. Like actual inoculation with a weakened form of a virus, information inoculation protects people against misinformation. These researchers exposed participants to a warning about a common misinformation technique (the use of a large group of fake “experts”) before exposing them to the real thing. Similar work shows that this approach can build resistance to conspiracy theories and industry propaganda.

Nevertheless, studying the prevention of misinformation is difficult, and this work is far from the last word. Many beliefs about “what works” in science communication are in flux. The so-called backfire effect — where people double down on their misconceptions after being presented with a fact-check — was once a truism of science communication but may well prove to be mythical. (Hopefully that fact-check won’t reinforce your prior beliefs!)

Although inoculation is promising, my own experiences make me skeptical. Even if inoculation works in certain cases such as climate change and conspiracies, it may not do so in all cases. Panaceas for misinformation are no more plausible than dietary panaceas. Not every kind of misinformation is the same, and it’s unlikely that we can develop a one-size-fits-all solution.

For instance, numerous studies have shown that narratives and anecdotes command significant persuasive power. Inoculation may work against climate change misinformation, which depends on creating a false impression of expert consensus, rather than on individual testimonials. But inoculation might fail against misinformation that depends on personal narratives — such as a video that seems to show a blind girl being healed, or a testimonial about a child’s purported vaccine injury.

Given the persuasive power of stories, science communicators face a paradox. Should they engage in the journalistic practice of building information around a strong central narrative, leading with a protagonist or an anecdote like the one that began this piece? Or should those hoping to accurately communicate science stick with a drier approach to persuasion, one that’s less likely to use the same rhetorical tricks as misinformation campaigns?

Educators face a similar paradox. As I revise my syllabi, I confront the same difficult decisions I’ve had to make in the past. Should I keep the YouTube video of the healing? The testimonials about reiki and homeopathy? Or should I dispense with them and instead present material solely in the context of rebuttals and debunkings?

The quandary is that I want my students to be able to identify pseudoscience for themselves by experiencing it as it’s often presented: with anecdotes and testimonials. Withholding that opportunity smacks of indoctrination, not education, and feeds into the “they don’t want you to know” conspiracism beloved by charlatans. And yet, it’s gravely irresponsible to give them that opportunity if the net result is increased belief in misinformation.

My current approach is to present my students with what you’ve just read: transparency about my own thought process. Misinformation exploits the (reasonable!) suspicion that authority figures are hiding something, coming up with secret ways to “nudge” us in certain directions or manipulating us with… well, with science communication techniques. Transparency about how we approach the communication of science — or the communication of a lot of things — creates trust, which is essential to effective persuasion. I’ve found that students report increased trust and a sense that I’m an honest broker of information when I take the transparency approach.

At the same time, knowing that some people believe in the healing power of my satirical diet immediately after reading almost 200 pages on why they shouldn’t has left me deeply shaken. Changing how we communicate science can help, but it’s a Band-Aid solution. A real solution means changing education so books like mine are obsolete.

By the time children finish high school, they should be intimately familiar with manipulative rhetorical techniques, common fallacies, and their own susceptibility to persuasive anecdotes. Alongside hours of studying the Krebs cycle and mitochondria, there should be hours allotted to how to distinguish scientific reasoning from pseudoscientific nonsense. From vaccines to climate change, misinformation poses an existential threat when it inhibits our collective decision-making ability. The time has come to start treating it that way.

Professor at JMU . Religion, science, 道. Opinions mine. Latest book: NATURAL — how to love nature without worshipping it, April 2020.
