A new book tackles AI hype – and how to spot it

AI Snake Oil
Arvind Narayanan and Sayash Kapoor
Princeton Univ., $24.95

A few months ago, I was working on an article about oceans across the solar system. Having read my fill about oceans of water, I turned to Google for a quick refresher on oceans made of other stuff, such as liquid hydrocarbons. For better or worse, I searched “oceans in the solar system not water.” I was hoping for a reliable link, maybe from NASA. Instead, Google’s AI Overviews feature served up Enceladus as one suggestion. This Saturn moon is famous for its subsurface sea — of saltwater. I shut my laptop in frustration.

That’s one small example of how AI fails. Arvind Narayanan and Sayash Kapoor collect dozens of others in their new book, AI Snake Oil — many with consequences far more concerning than irking one science journalist. They write about AI tools that purport to predict academic success, the likelihood someone will commit a crime, disease risk, civil wars and welfare fraud (SN: 2/20/18). Along the way, the authors weave in many other issues with AI, covering misinformation, a lack of consent for images and other training data, false copyright claims, deepfakes, privacy and the reinforcement of social inequities (SN: 10/24/19). They address whether we should be afraid of AI, concluding: “We should be far more concerned about what people will do with AI than with what AI will do on its own.”

The authors acknowledge that the technology is advancing quickly. Some of the details may be out of date — or at least old news — by the time the book makes it into your hands. And clear discussions about AI must contend with a lack of consensus over how to define key terms, including the meaning of AI itself. Still, Narayanan and Kapoor squarely achieve their stated goal: to empower people to distinguish AI that works well from AI snake oil, which they define as “AI that does not and cannot work as advertised.”

Narayanan is a computer scientist at Princeton University, and Kapoor is a Ph.D. student there. The idea for the book was born when slides from a 2019 talk Narayanan gave, titled “How to recognize AI snake oil,” went viral. He teamed up with Kapoor, who was taking a course on the limits of prediction in social settings that Narayanan was co-teaching with another professor.

The authors take direct aim at AI that can allegedly predict future events. “It is in this arena that most AI snake oil is concentrated,” they write. “Predictive AI not only does not work today, but will likely never work, because of the inherent difficulties in predicting human behavior.” They also devote a long chapter to the reasons AI cannot solve social media’s content moderation woes. (Kapoor had worked at Facebook helping to create AI for content moderation.) One challenge is that AI struggles with context and nuance. Social media also tends to encourage hateful and dangerous content.

The authors are a bit more generous with generative AI, recognizing its value if used smartly. But in a section titled “Automating bullshit,” they note: “ChatGPT is shockingly good at sounding convincing on any conceivable topic. But there is no source of truth during training.” It’s not just that the training data can contain falsehoods — the data are mostly internet text, after all — but also that the program is optimized to sound natural, not necessarily to possess or verify knowledge. (That explains Enceladus.)

I’d add that an overreliance on generative AI can discourage critical thinking, the human quality at the very heart of this book.

When it comes to why these problems exist and how to change them, Narayanan and Kapoor bring a clear point of view: Society has been too deferential to the tech industry. Better regulation is essential. “We are not okay with leaving the future of AI up to the people currently in charge,” they write.

This book is a worthwhile read whether you make policy decisions, use AI in the workplace or just spend time searching online. It’s a powerful reminder of how AI has already infiltrated our lives — and a convincing plea to take care in how we interact with it.


Buy AI Snake Oil from Bookshop.org. Science News is a Bookshop.org affiliate and will earn a commission on purchases made from links in this article.
