Sam is taller than Emy, who is taller than Che.
This is one of the simplest logic questions you can ask, and this AI is claiming you need more information to solve it? You're right, the AI is an idiot.
Hey, don't insult idiots by lumping them in with this thing.
It's not an idiot, it just isn't "reasoning" at all. There is no logic happening anywhere inside a large language model. These models are fancy statistical predictors of the next word, or rather the next token (tokens are how the model actually operates on text; a longer word like "fantastic" may get split into several sub-word pieces rather than kept whole), with "attention" over your prompt words used to steer the prediction toward something better than a naive next-word guess.
So whatever "logic" capability these models show comes about almost by accident, through the attention heads picking up on patterns in your prompt.
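To make "attention" a little more concrete, here is a minimal numpy sketch of a single scaled-dot-product attention head. The names Q, K, V and the tiny shapes are made up for illustration and are not any real model's configuration.

```python
import numpy as np

def attention(Q, K, V):
    """One attention head: weight each value vector by how well the
    query matches the corresponding key (softmax of scaled dot products)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query/key similarity
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted mix of value vectors

# Toy example: 3 prompt tokens with 4-dimensional embeddings (made-up numbers).
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(attention(Q, K, V).shape)  # (3, 4)
```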
They literally work like this (rough code sketch after the list):
- Given some input, which is your prompt plus the tokens generated so far, score every token in the vocabulary (tens of thousands of tokens; the 175bn figure people quote is the number of parameters, i.e. the model's weights, not tokens) and take the one with the highest score.
- Add that token to the output.
- Repeat until the special end-of-sequence token (think "end of file" marker) is produced.
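A minimal sketch of that greedy loop, assuming a hypothetical `model` function that returns one score per vocabulary token (this is not any particular library's API):

```python
def generate(model, prompt_tokens, eos_token, max_len=100):
    """Greedy decoding: repeatedly pick the single highest-scoring next token.
    `model` is a stand-in for 'prompt + output so far -> one score per vocab token'."""
    output = []
    while len(output) < max_len:
        scores = model(prompt_tokens + output)        # score every vocabulary token
        next_token = max(range(len(scores)), key=scores.__getitem__)
        if next_token == eos_token:                   # end-of-sequence reached
            break
        output.append(next_token)
    return output
```

Real systems usually sample from the score distribution instead of always taking the single best token, but the loop is the same shape.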
There ARE setups that add chain-of-thought reasoning, where the model (sometimes with a separate helper model checking the steps) writes out intermediate reasoning before committing to a final answer, and that vastly improves logic performance; a toy example of the prompt side of this is below.
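For what it's worth, the prompting half of that idea is just a different input string. A toy illustration, where the exact wording is my own assumption rather than any specific system's behavior:

```python
question = "Sam is taller than Emy. Emy is taller than Che. Who is the tallest?"

# Plain prompt: the model jumps straight to an answer token.
direct_prompt = f"Q: {question}\nA:"

# Chain-of-thought prompt: ask for intermediate steps first, so the
# "reasoning" becomes generated text the model can condition on.
cot_prompt = f"Q: {question}\nA: Let's think step by step."
```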