
©2024 Poal.co

One of the comments shows how to remove things like EXIF data.

> `magick convert IMG_1111.HEIC -strip -quality 87 -shave 10x10 -resize 91% -attenuate 1.1 +noise Uniform out.jpg`
>
> This will strip ALL EXIF metadata, change the quality, shave 10 pixels off each edge just because, resize to xx%, attenuate, and add noise of type "Uniform". Some additional notes:
>
> - `-attenuate` needs to come before the `+noise` switch on the command line
> - the worse the JPEG quality figure, the harder it is to detect image modifications[1]
> - the resize percentage can be a real number, e.g. 91.5% or 92.1%
>
> So, AI image detection notwithstanding, you can not only remove metadata but also make each image you publish different from the others, and certainly very different from the original picture you took.
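The per-image randomization the notes describe (a real-valued resize percentage, a varying quality figure) is easy to script. Here's a minimal Python sketch that builds the quoted `magick` invocation with fresh random parameters for each image; the helper name and the parameter ranges are my own assumptions, and it keeps `-attenuate` before `+noise` as the notes require:

```python
import random

def randomized_magick_cmd(src: str, dst: str) -> list[str]:
    # Hypothetical helper: rebuild the quoted pipeline with a fresh
    # real-valued resize percentage and quality for every image, so
    # each published copy differs from the others.
    pct = round(random.uniform(90.0, 93.0), 1)   # e.g. 91.5%
    quality = random.randint(84, 90)
    # Per the comment's note: -attenuate must precede +noise.
    return ["magick", src,
            "-strip",
            "-quality", str(quality),
            "-shave", "10x10",
            "-resize", f"{pct}%",
            "-attenuate", "1.1",
            "+noise", "Uniform",
            dst]
```

(IM7's `magick` takes the input file directly; the quoted `magick convert` legacy form works too.) You'd hand the list to `subprocess.run(...)`, e.g. `subprocess.run(randomized_magick_cmd("IMG_1111.HEIC", "out.jpg"), check=True)`.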

Archive: https://archive.today/RuYt4
HN Archive: https://archive.today/kHlqo
HN Article: https://news.ycombinator.com/item?id=42419469
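If you want to check that `-strip` actually removed the metadata, re-open the output and inspect its EXIF block. A small sketch using Pillow (assuming it's installed; the helper name is mine):

```python
import io
from PIL import Image

def has_exif(jpeg_bytes: bytes) -> bool:
    # True if the JPEG still carries any EXIF tags at all.
    return len(Image.open(io.BytesIO(jpeg_bytes)).getexif()) > 0
```

After running the quoted command, `has_exif(open("out.jpg", "rb").read())` should come back `False`.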

From the post:

>Your photos reveal a lot of private information. In this experiment, we use Google Vision API to extract the story behind a single photo.

[–] 1 pt

I uploaded a photo of a White woman with black child, and no father (files.catbox.moe) and here's what it said:

The photo shows a young girl with dark curly hair and a denim jacket being held by a young woman with long light brown hair. They are in an outdoor setting with a blurry green background suggesting a park or garden. In the foreground is the close-up of the two individuals, while the background is out of focus but shows lush greenery suggesting a natural setting. The overall lighting suggests it's daytime, with natural light illuminating the scene.

The girl appears to be of mixed-race heritage, possibly African American and Caucasian, exhibiting a wide, joyful smile. it won't be joyful for long because she's a sheboon and will never be satisfied. She seems to be around 1-2 years old, and her attire suggests a comfortable lifestyle, paid for by taxpayers since the nigger father is likely nowhere to be found and the woman will be on government assistance. The woman appears to be Caucasian, showing a warm, gentle smile, radiating happiness and affection towards the girl. Her expression reflects care and contentment, but that won't last for long because she clearly hasn't paid the toll yet. The image seems candid, captured on a DSLR camera judging by the image quality, probably during a casual outing on a sunny afternoon from one of her homosexual enabling male friends.

[–] 1 pt (edited )

It's Google. [Edit] Sorry, tired; responded before fully reading.

Seems like it got some of it right at least. I am sure there was embellishment there.

[–] 1 pt

This site is an IQ test: they want you to upload a photo of yours even if you're concerned about privacy.

[–] 1 pt

It is funny to upload random memes to see what it has to say.