[–] 1 pt

I read how the exploit works. This is amateur-hour stuff. They thought they could control what Bash commands are able to do by blocking the small set of syntax they happened to know about. They clearly wrote their filter off the top of their heads and never even looked at the Bash documentation.

They blocked <(command) but not >(command). If you look up the first one in the Bash docs, the second one is described on the same page.
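To illustrate the point (a minimal sketch in bash, not the actual exploit): both `<(...)` and `>(...)` are process substitution, so a filter that only matches `<(` still lets arbitrary commands run through the writing form.

```shell
#!/usr/bin/env bash
# Requires bash (process substitution is not POSIX sh).

# Reading form: the command's output is exposed as a readable file.
cat <(echo from-reading-form)

# Writing form: anything redirected into >(cmd) becomes cmd's stdin,
# so blocking only "<(" leaves this path wide open.
echo from-writing-form > >(cat > /dev/null)
```

Both forms spawn a child process running whatever command appears inside the parentheses, which is exactly what a syntax blocklist is supposed to prevent.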

Either way, if you’re trying to allow an LLM to execute code in a complex language like Bash you are playing with fire.

The other interesting part is this was discovered by a company (PromptArmor) that specializes in identifying AI (LLM) security threats. One thing they do is check all the software you are running (including dependencies) and flag everything that is known to be written using LLM tools.

LLMs have spawned a new business for people who clean up their messes. AI is actually creating more jobs.

[–] 1 pt

Oh how the mighty have fallen.

[–] 1 pt (edited)

Another win for the world's worst enterprise company. They may not beat MS in money (on the business side, at least; not sure), but they sure beat MS in fuck-ups on the business side.

I have to deal with IBM garbage on a daily basis because the place I work got sold on them long before I arrived. I was hired to move them into modern times -- we still maintain AIX servers.

The IBM website is a fucking abortion if you are forced to navigate it.