From the very beginning of video game development, games have been subject to two main constraints: the limited capacity of the hardware of the day, and the developers’ ability to understand that capacity well enough to make the most of it. These constraints have produced games so skillfully crafted that, on a technical level, they seemed far more advanced than the hardware of the time should have allowed. They have also driven the sustained, steady evolution of the medium to the point where, today, truly photorealistic graphics are within our grasp.
In this sense, gaming consoles have played a key role in the evolution of video games. Though many of us prefer PC gaming, it’s undeniable that consoles have been the driving force of the industry, to the point of monopolizing its development cycles. Gone are the days of PC-exclusive games built to exploit that platform’s hardware to the fullest; today, the spotlight is trained on each generation’s new console, and the effect this has had on the gaming industry is clear.
While consoles have largely had a positive impact on game development, the negative consequences can’t be ignored. Hardware life cycles have been drawn out considerably, and combined with the focus on console-exclusive titles, this has caused the development of games that push the latest PC hardware to its limits to atrophy, slowing the once-rapid evolution of video games to a crawl.