This isn't surprising at all, but it does serve as a reminder of the limits of the current generative AI tools that social apps keep pushing you to use.
A new study by the Tow Center for Digital Journalism shows that most major AI search tools fail to provide correct source citations for news articles. Instead, they often fabricate reference links, or simply don't respond when asked about a particular article.
As the study's chart shows, most AI chatbots are poor at providing relevant citations, and Elon Musk's Grok, which he has hailed as “the most truthful AI”, is among the least accurate and reliable of the tools tested.
According to the report:
“The chatbots answered more than 60% of queries incorrectly. The level of accuracy varied across platforms: Perplexity answered 37% of queries incorrectly, while Grok 3 had an error rate of 94%.”
The report also found that these tools would often surface content from sources that had explicitly restricted AI scraping.
On some occasions, the chatbots answered incorrectly, or refused to answer, queries about publishers that had allowed them access to their content, while correctly answering queries about publishers whose content they shouldn't have been able to access.
This suggests that certain AI providers are not honoring the robots.txt directives that publishers use to block crawlers from accessing copyright-protected works.
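For context, robots.txt is a plain-text file at the root of a website that tells crawlers what they may and may not fetch. A minimal sketch of a publisher blocking an AI crawler might look like the example below (the bot name is illustrative, as each AI provider documents its own user-agent string):

```
# robots.txt – served at https://example-publisher.com/robots.txt
# Block a hypothetical AI crawler from the entire site
User-agent: ExampleAIBot
Disallow: /

# Allow all other crawlers
User-agent: *
Allow: /
```

The catch is that compliance is voluntary: the file only works if the crawler chooses to respect it, which is the crux of the complaint here.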
The main concern is the reliability of AI search tools, which web users are increasingly treating as search engines. Many younger people now use ChatGPT for their research, which is fine in principle, but findings like these show that you can't trust AI tools to provide accurate information or education on important topics.
In one sense, that's not a big deal; anyone who has used an AI chatbot knows that its responses aren't always useful or usable. The real concern is that these tools are being promoted as a substitute for research and knowledge. For younger users, that could produce a generation of people who lack the skills and foundational knowledge needed to use these programs effectively.
Mark Cuban, a businessman from Texas, summed up this issue pretty well in an SXSW session this week.
AI is not the solution. AI is a tool. You can use AI to enhance whatever skills you already have.
Cuban's point is that AI can be a great tool for improving performance, but it is not a solution in itself.
AI will create the video, but it won't create a compelling story. AI can produce code that will help you build an app, but it can't build the app itself.
While AI can be a great help, it is not the answer.
In this case, the concern is that telling kids AI can answer their questions for them may be misleading, when research like this shows its actual capabilities fall well short of that promise.
To get the best out of these systems, you still need strong analytical and research skills, as well as expertise in the relevant field.