In my experience, software in general, not just games, has continually become buggier over time.
When I started programming in the early '70s, the prevalent attitude among developers was zero tolerance for bugs. Any bug in a system was a top priority to address. Bugs in vendor software were kept at a low enough level that fast paths to the core developers for reporting and resolving them were feasible, and those paths actually existed.
Sure, software was generally simpler in those days. But against that, much vendor code was written in assembler; I even wrote a bit of it directly in machine code.
Over time, lower- and lower-quality code has become acceptable. There are still exceptions, such as large in-house commercial systems, mainstream database servers, automobile computers, many console games, and GPS devices. But most software is bug-riddled these days. Even my PVR has a handful of trivially observable and very annoying bugs in its firmware.
I think Microsoft is the single player most responsible for the downward trend. They were the first vendor to consistently prioritize new features over solid code. Their apologists convinced many people of the now generally accepted but incorrect belief that all complex programs have bugs, i.e. that bugs are simply the nature of the beast. They definitely are not.
Another factor that has increased bug counts is the now-common practice of separating developers from testers from maintainers. Developers should be more responsible for producing relatively bug-free code in the first place. And the original developers of a piece of code should be its maintainers: handing it off to a separate maintenance team denies developers the painful learning experiences they'd get from having to clean up their own messes.