Originally Posted by Rennadrel
And? It's already been proven that GPU power does matter. PS3 and 360 games look like ass compared to their PC counterparts, so GPUs matter greatly. The Wii U port of Mass Effect 3 looks a million times better than the PS3 and 360 versions do; despite being coded poorly and having some minor frame rate issues, the better textures and actually running in HD make it the superior version. These welfare GPUs in the PS3 and 360 can't handle new games without overloading the CPU to run effectively, and they still look like crap. If the next generation offers nothing more than mid-range graphics, those consoles won't be able to compete with PC graphics, nor will they be able to run newer, more advanced game engines like Unreal Engine 4. CPU speed is only a portion of gaming performance, and part of the problem is that development became so reliant on CPU speed that the GPU didn't matter as much; moving forward, the GPU will have to matter in order for games to run well. CPU speed wouldn't matter if the GPU could handle advanced textures and rendering effectively, and that's why CPU speed won't matter as much next gen: games will be putting very little strain on the new GPUs for the first couple of years.
Also, larger worlds equate to a shitload more bugs. I don't want another Skyrim-type game where you can fall through the world at random, or run up the side of a nearly 90-degree cliff before hitting a wall and falling to your death. And odds are, if you want all those advanced features of a more open world with more unique characters and things, you won't see them on consoles, because consoles have to be affordable and they'll never have the hardware to render those kinds of features effectively.