I don't disagree, but when Crysis was released, the "But can it run Crysis?" meme was born. It shows that the average computer in late 2007 wasn't ready to deal with this game. That's not the fault of the game, but the fault of the hardware industry. Going by the Steam hardware survey from November 2008, a year after Crysis was released, the average PC still wasn't really ready for it. The game only requires DirectX 9.0a, which damn near everyone should have had by 2007. But this was right after Vista, which slowed down PCs (especially graphics performance), and after the GeForce FX series had been a horrible mess at running true DX9.0 games. On top of that, both ATI and Nvidia had problematic Vista drivers, and the Xbox 360 and PS3 were more popular alternatives, so a lot of PC gamers simply didn't have the hardware needed to run Crysis properly. Around that time I think I had an ATI Radeon X1950, and I had no problem playing that game.
It's not like this hadn't happened before in PC gaming history, but after the Xbox 360 and PS3, gamers' expectations were much higher. Remember, Quake required a math coprocessor, which not many people had at the time, and Quake 3 was extremely demanding for its day. So much so that the ATI Rage 3D chip was thrown into desktop PCs just so they could claim to play that game, albeit very poorly. Can't play Quake on your 486 SX PC? Just use one of the plethora of upgrade chips that gave you Pentium-like functionality and speeds.

In 1999 you had a number of graphics cards to choose from: ATI, Nvidia, 3dfx, Matrox, S3, and PowerVR are the ones I remember. Hell, Intel actually made graphics cards back then, and they were pretty good too. In 2007 we had ATI and Nvidia. Today we basically have AMD and Nvidia, and for most people it's just Nvidia. The only saving grace today is that DX11 is all you need to run games, and we've had DX11 since 2009. Is it any wonder we have $500+ graphics cards today? Nvidia basically has a monopoly.
https://web.archive.org/web/20081214...vey/videocard/