First off, I'm a PC gamer; I haven't had a console in a good while now. This isn't a console vs PC debate, so please just leave that shit at the door.
The big launches this winter (DA:I, AC:Unity, FC4, CODAW) have all been publishers' first forays into what most people would classify as "next gen" (in my opinion just normal PC gaming, but w/e). But all four have had insanely high minimum specs, to the point where at launch they either flat out won't run on dual-core CPUs, or are crippled on them.
Now, this isn't because they actually need those 4+ cores; it's because they've been designed not to work without them. In some cases they're hard-coded to use core #3, meaning a quad core is a must (cores are numbered from 0, so core #3 is the fourth core).
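To show why that breaks dual cores, here's a minimal sketch (the constant and function names are mine, not from any actual game) of what happens when a fixed, zero-indexed core number is baked in:

```python
import os

# Hypothetical hard-coded core index, as described above.
REQUIRED_CORE = 3

def can_pin_to_core(core_index, total_cores):
    """A worker thread pinned to a fixed core index only works
    if that index exists: with zero-based numbering, core #3
    requires at least 4 logical cores."""
    return core_index < total_cores

# A dual core exposes indices 0-1, so a hard-coded core #3 fails:
print(can_pin_to_core(REQUIRED_CORE, 2))  # False (dual core)
print(can_pin_to_core(REQUIRED_CORE, 4))  # True  (quad core)
print(os.cpu_count())  # what this machine actually has
```

The game itself may never run such a check; it just pins the thread and crashes or stalls when the core isn't there, which matches the "won't run on dual core" behaviour at launch.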
It's kind of obvious that publishers are now doing this on a level we've not really seen before. You used to get a game that would at least run, then get bug fixes, even if it was never as "optimized" as the console version. Now you get a version that's broken on purpose for some users, and not optimized at all for the rest. Then of course there's WatchDogs, where the engine was deliberately nerfed on PC.
I'm coming up on my upgrade time, and quite frankly I'm not sure I can be bothered with PC this time round. If publishers are going to make me resort to .dll injection just to get their games running, why bother?
What do you guys think: is this just laziness on their part, or an active effort to push budget PC gamers back to consoles?