Egotism is a huge problem. Both NIH syndrome and the attitude of "I'm special, so I'm exempt from my own rules and best practices" are big problems.
Over-engineering went mainstream with Java. Now everyone thinks they need ten levels of abstraction because they might need to use this code slightly differently ten years after it's no longer in use.
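To make the pattern concrete, here's a hypothetical sketch of what that looks like in practice: an interface, an implementation, a factory, and a service wrapper, all invented for this example, to do what a single line already does.

```java
// Hypothetical over-abstraction: four layers of indirection for one string concat.

interface GreetingProvider {
    String greet(String name);
}

class DefaultGreetingProvider implements GreetingProvider {
    public String greet(String name) { return "Hello, " + name; }
}

class GreetingProviderFactory {
    static GreetingProvider create() { return new DefaultGreetingProvider(); }
}

class GreetingService {
    private final GreetingProvider provider;
    GreetingService(GreetingProvider provider) { this.provider = provider; }
    String greet(String name) { return provider.greet(name); }
}

public class Main {
    public static void main(String[] args) {
        // The "flexible" version, ready for a second provider that never comes:
        String a = new GreetingService(GreetingProviderFactory.create()).greet("world");
        // What the code actually needed:
        String b = "Hello, " + "world";
        System.out.println(a.equals(b)); // same result, three extra classes
    }
}
```

None of the extra layers buy anything until a second `GreetingProvider` actually exists, which is exactly the speculative-reuse trap being described.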
Making it worse, too many people completely lose sight of context. Best practices at Google don't apply to 99% of the coding world. What makes sense for them is stupidity for the majority.
You see this divide a lot between the backend and frontend worlds. I have worked on backend systems, and much of the time there was heavy focus on what is right vs. what works. In the frontend world, it's a roller coaster of just saying fuck it and doing whatever you want because the framework will take care of it. At my current company, they used an alpha-stage technology from Microsoft. As it stands, there is no live reload, no error parser, no real linter, no interaction between forms, bugs plaguing everything, and limitations of the framework causing blockers. But hey, apparently this was the best solution available ...
I feel it.
"New and shiny is always better."
Reusable software is almost never reused, and the extra effort to write, review, test, and debug it is a bad tradeoff.
This is the correct answer. I have rarely seen any of these conventions lead to any sort of productivity increase. I was talking with people at my new job, and we both agreed that over-abstraction is dangerous, since it can obscure too much of the architecture. The same is true of the current microservices trend, where you have an endpoint for everything.