doesn't have to be an LLM
but, yes, it could have been produced by some ML language model
Unless well designed, developed, and trained, any given ML model can produce just about anything.
Your statement about "whether or not a language model can produce this output" is nonsensical. There's no way to know for certain. Hell, the developers of ChatGPT don't even know what it will produce as output for any given input. That's why they have entire teams of "prompt engineers" to throw shit at it and see what comes out.
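To make the point concrete: generation is sampling from a probability distribution, so the same prompt can produce different outputs on different runs. Here's a minimal sketch with a made-up next-token distribution (the tokens and probabilities are invented for illustration, not taken from any real model):

```python
import random

# Hypothetical next-token distribution for some prompt -- the tokens and
# probabilities are made up for illustration, not from any real model.
next_token_probs = {
    "great": 0.4,
    "terrible": 0.3,
    "purple": 0.2,
    "recursion": 0.1,
}

def sample_next_token(probs, rng):
    """Sample one token -- the same prompt can yield different outputs."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)  # seeded only so the demo is repeatable
samples = {sample_next_token(next_token_probs, rng) for _ in range(1000)}
# With enough draws, every token with nonzero probability eventually
# shows up, including the weird ones -- you can't rule any of them out
# in advance just by looking at the prompt.
```

Any token with nonzero probability can come out, which is why nobody can say in advance exactly what a model will produce for a given input.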
doesn't have to be an LLM
When people talk about AI in regards to text generation, that's what they're talking about 99.9999% of the time today.
but, yes, it could have been produced by some ML language model
Not if it was trained as described, it sure as fuck couldn't, which is what I said in my original comment.
There's no way to know for certain.
You can know for certain by having a brain and understanding how training works: a model trained on content that contains no jokes, no errors, and nothing even resembling the text this guy (obviously as a joke) claims was output by an AI will not produce content with jokes, errors, and particular turns of phrase like this one has ("They see the X" etc.). That's not how it fuckin works.
I can say with 100% certainty that you will NEVER get this output from ANY AI if you trained it on Olive Garden commercials' dialogue.
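The closed-vocabulary version of this argument is easy to demonstrate with a toy model. This is a bigram Markov chain, not a transformer, and the training lines are invented ad-style filler (not real Olive Garden copy), but it shows the core point: the model literally has no mechanism to emit a word it never saw in training.

```python
import random
from collections import defaultdict

# Toy training corpus -- invented ad-style lines, not real commercial copy.
corpus = [
    "when you're here you're family",
    "unlimited breadsticks with every entree",
    "fresh salad and breadsticks every day",
]

# Build a bigram table: each word maps to the words that ever followed it.
bigrams = defaultdict(list)
for line in corpus:
    words = line.split()
    for a, b in zip(words, words[1:]):
        bigrams[a].append(b)

def generate(start, length, rng):
    """Walk the bigram chain -- output can only contain training words."""
    out = [start]
    for _ in range(length):
        followers = bigrams.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

rng = random.Random(0)
sample = generate("unlimited", 6, rng)
vocab = {w for line in corpus for w in line.split()}
# Every generated word comes from the training vocabulary; a joke or an
# out-of-corpus phrase simply cannot appear.
assert set(sample.split()) <= vocab
```

A real LLM tokenizes differently and can recombine fragments in ways a bigram chain can't, but the vocabulary it samples from is still fixed by what it was built and trained on.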
When people talk about AI in regards to text generation, that's what they're talking about 99.9999% of the time today.
Since 3 months ago, yes, but only within the mainstream normie mind. Actual ML developers are NOT spending 99% of their time talking about the somewhat ambiguous "LLM" category.
I can say with 100% certainty that you will NEVER get this output from ANY AI if you trained it on Olive Garden commercials' dialogue.
A wild assumption.