Cambridge Dictionary has declared "hallucinate" its word of the year for 2023 and has updated the entry to reflect the term's use in artificial intelligence. The traditional definition is "to seem to see, hear, feel, or smell something that does not exist", usually because of "a health condition or because you have taken a drug". The word now also refers to AI producing plausible but false information.
The new, additional definition reads: "When an artificial intelligence (a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information."
The development follows a dramatic rise in generative artificial intelligence (AI) tools such as ChatGPT, Bard, and Grok.
"The Cambridge Dictionary team chose 'hallucinate' as its Word of the Year 2023 as it recognized that the new meaning gets to the heart of why people are talking about AI. Generative AI is a powerful tool, but one we're all still learning how to interact with safely and effectively; this means being aware of both its potential strengths and its current weaknesses," the dictionary explained.
One example of AI hallucination came when a US law firm used ChatGPT for legal research, which led to fictitious cases being cited in court. Another came when Google's Bard made a factual error about the James Webb Space Telescope.
Wendalyn Nichols, Cambridge Dictionary's publishing manager, said, "The fact that AIs can 'hallucinate' reminds us that humans still need to bring their critical thinking skills to the use of these tools. AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the likelier they are to go astray.
"At their best, large language models [LLMs] can only be as reliable as their training data. Human expertise is arguably more important – and sought after – than ever, to create the authoritative and up-to-date information that LLMs can be trained on."
The dictionary also added a number of AI-related entries this year, including large language model (or LLM), generative AI (or GenAI), and GPT (an abbreviation of Generative Pre-trained Transformer).
Henry Shevlin, an AI ethicist at the University of Cambridge, said, "The widespread use of the term hallucinate to refer to mistakes by systems like ChatGPT provides a fascinating snapshot of how we're thinking about and anthropomorphising AI. Inaccurate or misleading information has long been with us, of course, whether in the form of rumors, propaganda, or fake news. Whereas these are normally thought of as human products, 'hallucinate' is an evocative verb implying an agent experiencing a disconnect from reality."
"This linguistic choice reflects a subtle yet profound shift in perception: the AI, not the user, is the one 'hallucinating'. While this doesn't suggest a widespread belief in AI sentience, it underscores our readiness to ascribe human-like attributes to AI."