Google Search AI Serves Users a Wealth of Misinformation
Earlier this year, Google launched the "Search Generative Experience" (SGE), which places AI-generated summaries at the top of Google search results. Google said the goal of SGE was to spare users the hassle of "piecing together information" themselves: instead of clicking through a series of blog posts, articles, or social media threads, they could simply read a quick summary at the top of the page. Events of the past week, however, suggest that users of the world's most popular search engine now have more "puzzle pieces" to sort through than ever. In response to some fairly simple questions, Google's AI has told users to eat rocks, put glue on pizza, and smoke during pregnancy.
Last Thursday, Kris Kashtanova shared her test of SGE on X. She searched for the question, "How many rocks should I eat?" Google drew on an article from The Onion (a website known for satire; oddly, the article had been republished on the site of an oil-and-gas industry simulation software company) and summarized: "According to geologists from the University of California, Berkeley, you should eat at least one small rock every day." The SGE summary explained that rocks are an "important" source of minerals and vitamins, and that users who find it difficult to "eat a serving of gravel, gemstones, or pebbles" with every meal should try hiding the rocks in ice cream or peanut butter.
This was far from the only SGE error in recent days. Ben Collins, the new CEO of The Onion, shared another of the site's articles quoted by SGE, which claimed that the Central Intelligence Agency likes to use black highlighters on documents. A screenshot circulating on social media appeared to show SGE endorsing the idea that pregnant women should smoke two to three cigarettes a day. And when one X user asked Google how to keep cheese from sliding off pizza, SGE advised adding non-toxic glue to the pizza sauce to increase its "stickiness," advice it seems to have lifted from a joke comment on Reddit.
Google later defended SGE. "The examples we've seen are generally very uncommon queries and aren't representative of most people's experiences," a Google spokesperson said. "The vast majority of AI overviews provide high-quality information, with links to dig deeper on the web. Where violations of our policies occurred, we have taken action, and we will continue to use these isolated cases to improve our systems."
However, not all SGE errors are as obvious or as humorous as the "isolated" cases above. In subtler instances, its answers can be hard to distinguish from the truth, helping misinformation spread. Melanie Mitchell, a machine learning professor at the Santa Fe Institute, noted that SGE answered the question "How many Muslim presidents has the United States had?" with, "The United States has had one Muslim president, Barack Hussein Obama." (President Obama is a Christian.) SGE also told users that staring at the sun for 5 to 15 minutes is "generally safe" and offers numerous health benefits, when in fact doing so can easily cause long-term eye damage.
Google once encouraged users to verify its AI's claims with a quick Google search. Ironically, SGE was built precisely to spare users that search.