
©2024 Poal.co

[–] 1 pt

AI is only faux-intelligent. It's enough to make you think you got the right answer, because you're asking something you believe is smarter than you about something you don't know. But it always makes weird mistakes you can't catch unless you know more than the AI, at which point you wouldn't be using AI in the first place.

[–] 0 pt

Typing code is not the bottleneck, and if I have to think about what I want and then read the output carefully to check that it really is what I wanted, I am slower.

There’s another hard truth related to this. Supervising a junior programmer on a task often takes more of a senior developer’s time than doing the task themselves. I have experienced this personally: I sometimes give up trying to explain something and just rewrite the code myself, because I have my own work to do.

I liked the part where he says not having his C++ language server set up for autocompletion slows him down a lot. He couldn’t rely on Copilot to do the same thing, because it would make up things that were not actually part of the API or object he was using. He tried to imagine how you could make Copilot as reliable as a language server without making it use the language server.

So, after giving it a fair try, I have concluded that it is both a net decrease in productivity and probably an increase in legal liability.

The legal liability is something that might come to a head. Eventually some GPL projects are going to start suing any project developed with AI tools that has code similar to theirs. You can demonstrate how to get these tools to spit out line-for-line copies of text in their training data. Do that in court and you could have a case against anyone who uses them.

The one place Copilot was vaguely useful was hinting at missing abstractions (if it can autocomplete big chunks then my APIs required too much boilerplate and needed better abstractions).

In other words, if the AI assistant is able to predict your code, your code has useless repetition that needs to be intelligently factored out, which is something the AI tool cannot do.

AI programming tools won’t get any better unless there’s a major breakthrough in how they’re developed. They have already trained these things on every publicly available line of source code they could find. There is not much more they can cram into them. If they were going to magically become intelligent it would have happened by now.

To prove my point, the newest version of GPT is worse at coding tasks than the previous version (the-decoder.com). Their plans for what to do next to improve these things sound like desperate shots in the dark.

[–] 0 pt

AI is unable to provide you with more than a "Hello World" working example.

Anything else reads like sloppy outsourced code.

[–] 3 pts

Sad but true. AI isn’t “intelligent” and I do not fear it.