
©2024 Poal.co


Google. You have gone and fucked up yet again.

Archive: https://archive.today/ikx3f

From the post:

>While I was driving, I received a text message from my father with 3-4 pictures and some of his typical dad jokes. Android Auto prompted me with a new feature "Summarize Messages". Usually I'm strongly against these features but curiosity got the better of me so I clicked "Turn on". Then I clicked "Summarize" on my father's message. It started giving me a detailed summary of names that are not in any of my contacts or my actual social circle, e.g. "Zach and Melanie are planning to visit NYC this summer". I'm not sure if it was truly another person's summary or some canned dev message. But now I'm a bit worried Google is giving other users my SMS history.


(post is archived)

[–] 1 pt

Probably the LLM hallucinating. Which is the central flaw of conflating AI and LLMs. AI doesn't exist yet, and isn't close to existing. An LLM is just an NPC word guesser. It can easily guess the content of a plausible NPC text about visiting a place or ordering pizza. It can't create new, novel content because it's not AI. Merely a fast imitator.
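To make the "word guesser" point concrete, here's a minimal sketch of next-word guessing, assuming only a toy bigram model over an invented corpus (real LLMs use vastly larger neural models, but the principle of predicting a plausible next token is the same):

```python
import random
from collections import defaultdict

# Invented toy corpus for illustration only.
corpus = "we plan to visit the city this summer and order pizza this summer".split()

# Record which words are observed to follow each word.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def guess(start: str, n: int, rng: random.Random) -> str:
    """Guess up to n next words, each chosen only from observed followers."""
    words = [start]
    for _ in range(n):
        options = follows.get(words[-1])
        if not options:
            break
        words.append(rng.choice(options))
    return " ".join(words)

print(guess("we", 7, random.Random(1)))  # → we plan to visit the city this summer
```

The guesser produces fluent-sounding text about visiting a place purely because such phrases were statistically likely in its training data, with no understanding behind it.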

[–] 0 pt

Don't you like how they call it "hallucinating" rather than "being a steaming pile of shit and lying to you"?

[–] 1 pt (edited )

I think hallucinating is a more accurate term than lying. It's like using a random number generator to tell me how many birds are in my yard. If it pops out 5 quintillion... it's not lying. It doesn't even have a concept of truth. It just creates grammatically correct English sentences with numbers in them. "There are 5 quintillion birds in your yard" isn't a lie in this context, it's a grammatically correct sentence with a random number in it.

The liars are the ones who claim a LLM is AI. :)
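The bird-count analogy above can be sketched in a few lines, assuming a made-up template and range (both invented here for illustration):

```python
import random

def bird_report(rng: random.Random) -> str:
    """Emit a grammatically correct sentence with a random count in it.

    There is no notion of truth anywhere in this function: the number
    is sampled, not observed, up to roughly 5 quintillion.
    """
    count = rng.randint(0, 5 * 10**18)
    return f"There are {count} birds in your yard."

print(bird_report(random.Random(0)))
```

Whatever number comes out, the sentence is well-formed English; calling it a "lie" would require the generator to have a model of reality to deviate from, which it doesn't.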

[–] 2 pts

Instead of hallucinating, maybe just use the word "wrong".

[–] 1 pt

I'll at least partially agree with that. The reason I say it is lying is because a lot of people see what is generated and take it as an authoritative answer and just "trust it". Then again, people are kind of stupid, and when told something is right they tend to trust it.

[–] 1 pt

I spent the morning gutting a Pixel 6 and removing as much of Google as possible. Nice multimedia device.