You refer nonchalantly to “best practices.” I have not yet read your reference, but where you have three years as a developer, I have over three decades, and I have seen very little in the way of even good practices. I have watched the better practices of my early days disappear in favor of fads and gimmicks.
And the worst practices have not only persisted but worsened. For example:
- Illegible code. I was shocked when I started getting paid to code. Most of the code I have seen from others has been deliberately jagged and cluttered. Not only was there no visible effort to make it easy to read, but in many cases the developer had cultivated a bizarre, visually obfuscated layout that was clearly a kind of scent-marking: a “personal style” that nobody else could read.
- Poor structure. Code is not organized into blocks or sections; returns are scattered mid-function; unrelated operations are interleaved.
- Compulsive optimization. When performance is critical we need to bend the rules, but most code runs at user-interface timings, so bending the rules to make it run a microsecond or two faster serves only to destabilize it. Real optimization happens at the design level, not at the level of individual lines of code. This probably comes from whiteboard interviewing, which is where most of us have our first contact with the industry, and where we are challenged to write screaming-fast code instead of solid code.
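To make the structural complaint concrete, here is a minimal sketch with made-up functions. The first version interleaves validation, computation, and formatting and returns from the middle; the second organizes the same logic into labeled blocks with a single exit:

```python
# Hypothetical example of the tangled style: validation, computation,
# and formatting interleaved, with returns scattered mid-function.
def summarize_tangled(values):
    if not values:
        return "no data"
    total = 0
    for v in values:
        if v < 0:
            return "invalid"
        total += v
    return f"avg={total / len(values):.2f}"

# The same logic organized into sections with one exit point:
# validate first, then compute, then format.
def summarize_structured(values):
    # --- validation ---
    if not values:
        result = "no data"
    elif any(v < 0 for v in values):
        result = "invalid"
    else:
        # --- computation ---
        average = sum(values) / len(values)
        # --- formatting ---
        result = f"avg={average:.2f}"
    return result
```

Both behave identically; the difference is purely that a reader can see the shape of the second one at a glance.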
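And a sketch of design-level versus line-level optimization, again with hypothetical functions. The first caches a method lookup, a classic micro-trick, while leaving the underlying quadratic scan untouched; the second changes the data structure, which is where the real win lives:

```python
# Line-level "optimization": micro-tweaks on a fundamentally O(n*m) scan.
def common_items_micro(a, b):
    found = []
    append = found.append          # cached method lookup; saves nanoseconds
    for x in a:
        if x in b:                 # still a linear scan of b for every x
            append(x)
    return found

# Design-level optimization: change the structure, not the lines.
# A set makes each membership test O(1) on average, so the whole
# thing drops from O(n*m) to O(n + m).
def common_items_design(a, b):
    b_set = set(b)
    return [x for x in a if x in b_set]
```

The micro version will win a whiteboard round; the design version wins in production.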
I could go on. But the point is that these practices and others were already problems thirty years ago and they have gotten worse, not better.
And now “best practices” routinely include silliness like agile methodology and test-driven development, useless practices that waste tons of time and add no value.
Meanwhile, concentration and focus have all but disappeared, when at one time protecting them was the most important part of managing developers.