
OpenAI’s Search Tool Has Already Made a Mistake


OpenAI just announced SearchGPT, but its demo got something wrong.

Illustration by The Atlantic

This is Atlantic Intelligence, a newsletter in which our writers help you wrap your mind around artificial intelligence and a new machine age. Sign up here.

Yesterday, OpenAI made what should have been a triumphant entry into the AI-search wars: The start-up announced SearchGPT, a prototype tool that can use the internet to answer questions of all kinds. But there was a problem, as I reported: Even the demo got something wrong.

In a video accompanying the announcement, a user searches for music festivals in boone north carolina in august. SearchGPT’s top suggestion is a festival that actually ends in July. The dates that the AI tool gives, July 29 to August 16, are not the dates of the festival but the dates for which its box office is closed.

AI tools are supposed to refashion the web, the physical world, and our lives; in the context of internet search, that means providing instant, straightforward, personalized answers to even the most complex queries. In contrast with a traditional Google search, which surfaces a list of links, a searchbot will directly answer your question for you. For that reason, websites and media publishers are afraid that AI searchbots will eat away at their traffic. But first, these programs need to work. SearchGPT is only the latest in a long line of AI search tools that exhibit all sorts of errors: inventing things out of whole cloth, misattributing information, mixing up key details, and committing apparent plagiarism. As I wrote, today’s AI “can’t properly copy-paste from a music festival’s website.”


Illustration by Matteo Giuseppe Pani

OopsGPT

By Matteo Wong

Whenever AI companies present a vision for the role of artificial intelligence in the future of searching the internet, they tend to underscore the same points: instantaneous summaries of relevant information; ready-made lists tailored to a searcher’s needs. They tend not to point out that generative-AI models are prone to providing incorrect, and at times fully made-up, information, and yet such errors keep happening. Early this afternoon, OpenAI, the maker of ChatGPT, announced a prototype AI tool that can search the web and answer questions, fittingly called SearchGPT. The launch is designed to hint at how AI will transform the ways in which people navigate the internet. Yet before users have even had a chance to test the new program, it already appears error-prone.

In a prerecorded demonstration video accompanying the announcement, a mock user types music festivals in boone north carolina in august into the SearchGPT interface. The tool then pulls up a list of festivals that it states are taking place in Boone this August, the first being An Appalachian Summer Festival, which, according to the tool, is hosting a series of arts events from July 29 to August 16 of this year. Someone in Boone hoping to buy tickets to one of those concerts, however, would run into trouble. In fact, the festival started on June 29 and will have its final concert on July 27; July 29 to August 16 are instead the dates for which the festival’s box office will be officially closed. (I confirmed these dates with the festival’s box office.)

Read the full article.


What to Read Next

  • AI’s real hallucination problem: “Audacity can quickly turn into a liability when builders become untethered from reality,” Charlie Warzel wrote this week, “or when their hubris leads them to believe that it is their right to impose their values on the rest of us, in return for building God.”
  • Generative AI can’t cite its sources: “It is unclear whether OpenAI, Perplexity, or any other generative-AI company will be able to create products that consistently and accurately cite their sources,” I wrote earlier this year, “let alone drive any audiences to original sources such as news outlets. Currently, they struggle to do so with any consistency.”

P.S.

You may have seen the viral clip of the Republican vice-presidential candidate J. D. Vance suggesting that liberals think Diet Mountain Dew is racist. It sounded absurd—but the soft drink “retains a deep connection to Appalachia,” Ian Bogost wrote in a fascinating article on why Vance just might have had a point.

— Matteo
