AI Hallucinations – How AI Thinks in Shades of Gray
Dr. Bart Kosko is a bestselling author and professor of electrical and computer engineering, and law, at the University of Southern California. Kosko reported on AI hallucinations, the phenomenon in which an AI-powered language model, such as ChatGPT, generates false information. According to Kosko, who has tested and interrogated many large language models (LLMs), these systems are rife with errors and cannot determine whether they have answered a question accurately. “It does not tell you the confidence it has when it answers a question and it really can’t,” Kosko said, noting that these models lack this ability because the people who trained them do not know how to teach them to recognize hallucinatory errors. “Relying on this, deferring to it… is very dangerous,” he warned.
To learn more about featured guest speakers on this show, please visit https://www.coasttocoastam.com/show/2023-07-22-show/
COAST TO COAST AM is produced and owned by Premiere Networks Inc./iHeartMedia
© 2023 All Rights Reserved