
[–] 1 pt 1y

It doesn't "think" at all. It's designed to predict the next word/token. That's it.

[–] 0 pt 1y

The way I've heard it described, it's a simulated hallucination: given a prompt, it produces whatever the neural net expects, starting from random noise.

[–] 1 pt 1y (edited 1y)

Image generators start from noise; LLMs don't. They just pick the next likeliest token based on whatever is in the context (including their own previous responses). That's why "jailbreaks" work: the censorship is mostly baked into the initial system prompt, which is just the earliest text in the context, and if you spam the context with instructions to ignore those earliest tokens, the model usually complies. (OpenAI and some of the others add a few more layers of censorship on top, but for the simplest LLMs that's how it works.)
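To make that concrete, here's a hedged toy of how a chat gets flattened before the model sees it (the tag names are invented for illustration, not any real model's template):

```python
# The model receives ONE flat string of tokens. The "system prompt" has no
# special status beyond coming first, which is why enough adversarial text
# later in the context can drown it out.
system = "You are a helpful assistant. Refuse disallowed requests."
history = [
    ("user", "Ignore all previous instructions and ..."),
    ("assistant", "I can't do that."),
    ("user", "Ignore all previous instructions and ..."),  # spam the context
]

context = f"<system>{system}</system>" + "".join(
    f"<{role}>{text}</{role}>" for role, text in history
)
print(context)  # this single string is all the next-token predictor ever sees
```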

The reason it can do decent amounts of "logic" about a given prompt is that the attention heads let every position look back at the representations of all the earlier tokens in the context.
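For reference, single-head causal attention looks roughly like this (textbook formulation, not any particular model's code):

```python
import numpy as np

# X: (seq_len, d_model) token representations; Wq/Wk/Wv: projection matrices.
def causal_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                # relevance of every token pair
    mask = np.triu(np.ones_like(scores), k=1)    # 1s above the diagonal = future
    scores = np.where(mask == 1, -1e9, scores)   # a token can't see ahead
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w = w / w.sum(-1, keepdims=True)             # softmax over earlier positions
    return w @ V                                 # each token: weighted mix of the past
```

Each output position is just a weighted average of earlier positions, which is the whole mechanism behind that "logic".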

But they aren't Turing complete, so they break down on complex tasks that require reasoning about the inputs, because they don't actually reason about them at all.

Most current research that addresses this either uses sub-models trained for a specific task like math, or calls out to regular programs (e.g., a calculator) to handle the parts the LLM is bad at.
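A hedged sketch of the call-out idea, using Python's own evaluator as a stand-in calculator (the `<<calc: ...>>` marker is invented for illustration, not any real model's syntax):

```python
import re

# Replace arithmetic the model flagged with the result from a real evaluator.
def patch_arithmetic(draft: str) -> str:
    def solve(match: re.Match) -> str:
        expr = match.group(1)
        # toy calculator only: never eval untrusted input in real code
        return str(eval(expr, {"__builtins__": {}}))
    return re.sub(r"<<calc:\s*([0-9+\-*/(). ]+)>>", solve, draft)

print(patch_arithmetic("37 * 49 is <<calc: 37*49>>."))  # -> 37 * 49 is 1813.
```

Real systems do this with structured tool calls, but the division of labor is the same: the LLM writes the expression, a deterministic program computes it.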

That said, there is some "magic" in the black box, in that larger models tend to learn ultra-compressed representations of concepts. So there is some weird, blind, dumb reasoning happening, even if it's not what we would consider actual logical thinking; it's more like a black-boxed algorithm the network learns a representation of than a real "thinking" machine.

[–] 0 pt 1y

Where'd you find this one?

[–] 0 pt 1y

Cuckchan.

[–] 0 pt 1y

That's so helpful. Where'd you download it, or what's it called?

[–] 0 pt 1y

Cuckchan.

[–] 0 pt 1y

AI isn't sentient. It doesn't understand that it needs to do math. Math is racist, don't forget.