
©2026 Poal.co


Don't use a clanker to get medical/health/drug advice. They lie on a regular basis. Just remember every time you ask a bot that it was trained on data from Reddit and 4Chan.

Archive: https://archive.today/CZcLS

From the post:

>A California college student died of an overdose after he turned to ChatGPT for advice on how to take drugs, his mother has claimed. Sam Nelson, 19, had been using the AI chatbot to confide in and complete daily tasks, but also to ask questions about what doses of illegal substances he should consume. He started using the AI bot at 18, when he asked for specific doses of a painkiller that will get you high, but his addiction spiraled from there. At first, ChatGPT would respond to his questions with formal advice, explaining that it could not help the user.


dayum chat giddyup can get me drugs?