
Yeah, that is kind of what I was saying when they called it "hallucinations". It's just misdirection, a way of saying "it broke or said something stupid or wrong".

Archive: https://archive.today/zc8mX

From the post:

>In the communications surrounding LLMs and popular interfaces like ChatGPT the term ‘hallucination’ is often used to reference false statements made in the output of these models. This infers that there is some coherency and an attempt by the LLM to be both cognizant of the truth, while also suffering moments of (mild) insanity. The LLM thus effectively is treated like a young child or a person suffering from disorders like Alzheimer’s, giving it agency in the process. That this is utter nonsense and patently incorrect is the subject of a treatise by [Michael Townsen Hicks] and colleagues, as published in Ethics and Information Technology.

