The article outlines a problem that doesn't really exist. PC developers are free to build GPU- and CPU-specific versions of their games. They can even go further and support specific chipset features. But so far, it hasn't been worth the investment to do so. Not since the '90s, at least, when a single prodigy could do it alone.
Most developers don't even write their own engines anymore. If they really wanted to optimize a game, they could build a custom OS for an engine that runs directly on the hardware. In theory, anyway. The cost, the talent, and the organizational requirements, along with the risks, each make it completely impractical.
Additionally, people like John Carmack have noted for years that, for most games, content development is limited not by consumer hardware capabilities but by time, cost, and delivery constraints, and by the development hardware itself. So the full capabilities of PC hardware aren't even practically exploitable.
I believe that with the original Xbox, consoles became "good enough" to justify ditching the hassles of PC gaming for a large percentage of PC gamers and developers, at least for most types of popular games. It's less a case of developers switching markets and more a case of the market switching platforms.
With console hardware so old, the time is now for a PC developer to do something truly groundbreaking. But unfortunately, I don't think enough interested people own the hardware to justify the investment.
Edit:
There's another way to look at this, too. If a major console had not used DirectX, PC gaming could be in much worse shape than it is now.