I'm using a 6870 black edition, and in WoW, no matter how much I fiddle with the settings, the max I get is always 60, even though there is no limit on that. It's no real problem, just wondering why that is.
That's because your monitor's refresh rate is capped at 60 Hz.
Your monitor can only display so many frames per second, regardless of what your video card can do. So to fix it, up the refresh rate if you can.
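To put the point above in code terms, here's a tiny Python sketch (illustrative only, nothing from WoW or any driver API) of why the visible frame rate is the minimum of what the card renders and what the monitor refreshes:

```python
def displayed_fps(render_fps: float, refresh_hz: float) -> float:
    """The monitor can only show as many frames per second as it
    refreshes, so the visible frame rate is capped at the refresh rate."""
    return min(render_fps, refresh_hz)

# A card rendering 130 FPS on a 60 Hz monitor still shows only
# 60 distinct frames per second; below the refresh rate, you see
# exactly what the card produces.
print(displayed_fps(130, 60))  # -> 60
print(displayed_fps(45, 60))   # -> 45
```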
Not to mention, with V-Sync off you're making your video card work way harder than it needs to, which in turn will eventually burn up the card. Your eyes can only process 32 FPS or something like that, so 60 FPS is like having 100 FPS or more, really. I know my card runs at over 100 FPS with V-Sync off, but you don't notice it, and again it's overworking your card.
Hmmster, because all sorts of fellow "gamers" keep repeating it. Those people who actually pay attention and read the documents "proving" this realize that it's a total pack of bull, since the documents don't actually say anything about it. :>
Yeah, V-Sync is the issue.
But since your monitor only has a 60Hz refresh rate, you're not getting more than 60 FPS anyway, no matter how much you push it. Sure, you can turn off V-Sync and it will process more FPS but you won't actually see them. So it's effectively a waste.
Movies are generally shot at 24 FPS, and that's roughly the minimum FPS we need to perceive a "fluent" picture. That doesn't mean, however, that we humans don't notice the difference between 30 and 60 FPS. It might be subtle, but we certainly notice a difference. Once we get over 60 FPS, though, it gets hard to tell. Very few people will notice a difference between a 60 Hz and a 120 Hz monitor.
Crowe, about movies being shot at 24 FPS - that looks fluid because you get motion blur, since you're actually filming live action. CGI often uses motion blur filters to make it look more realistic and fluid.
To compensate for a lack of motion blur in real-time video rendering (video games, etc), they need to render more images in the same amount of time - video games often don't look properly fluid at under 30 fps, and 50-60 is generally considered the point where they become as fluid as live-action film.
However, on your other points, you're accurate. FPS rates above the refresh rate of your monitor are basically an exercise in epeen. :> Turning V-Sync on, with a high-end video card, will often mean that your video card lasts longer, and stays cooler.
I'm not sure it's true that having VSync off is necessarily working your card harder than with it on. It doesn't explicitly tell the game/graphics card to render more of anything. It just tells it to wait for the display to finish drawing the current frame before sending out the next.
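That "wait for the display before sending the next frame" behavior can be sketched as a toy render loop in Python. This is purely illustrative (real V-Sync is done by the driver/swap chain, not a sleep; `REFRESH_HZ` and `render_frame` are made-up stand-ins):

```python
import time

REFRESH_HZ = 60                      # assumed monitor refresh rate
FRAME_INTERVAL = 1.0 / REFRESH_HZ    # ~16.7 ms per refresh at 60 Hz

def render_frame():
    """Stand-in for the game's actual rendering work."""
    pass

def run(vsync: bool, frames: int = 10) -> int:
    """Render a number of frames, optionally pacing them to the refresh rate."""
    rendered = 0
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        rendered += 1
        if vsync:
            # V-Sync (approximated): wait out the rest of the refresh
            # interval before presenting the next frame, instead of
            # immediately rendering another one.
            elapsed = time.perf_counter() - start
            if elapsed < FRAME_INTERVAL:
                time.sleep(FRAME_INTERVAL - elapsed)
    return rendered
```

With `vsync=False` the loop spins as fast as the hardware allows (which is where the extra heat comes from); with `vsync=True` it idles between frames instead.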
Personally, I'd think that if your game normally ran at 100 FPS and fell to 75-80 FPS in raids with lots of AoE effects, you'd never notice a thing, but if it was capped at 60 FPS and fell to 45-50, you'd notice it feeling a bit laggier.
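The intuition above can be backed with a bit of frame-time arithmetic (the specific FPS figures are just the examples from the post): the same-sized FPS drop costs more milliseconds per frame at the low end than at the high end.

```python
def frame_time_ms(fps: float) -> float:
    """Time budget for one frame, in milliseconds."""
    return 1000.0 / fps

# Dropping from 100 to 75 FPS adds ~3.3 ms per frame,
# while dropping from 60 to 45 FPS adds ~5.6 ms per frame,
# so the lower-end drop is more noticeable.
print(round(frame_time_ms(75) - frame_time_ms(100), 1))  # -> 3.3
print(round(frame_time_ms(45) - frame_time_ms(60), 1))   # -> 5.6
```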
Gershuun @ Borean Tundra US - Interface & Macros Moderator
I can get over 130 FPS on my computer (max around 90 in raids), but I have it capped at 60/70 to prevent my computer from getting ridiculously hot, 60 is plenty to raid with imo
The gaming community has always been split on this issue, I guess... as much as the Intel/AMD opinions.
Good thing there's almost always an option to turn it on/off.
Gershuun @ Borean Tundra US - Interface & Macros Moderator
Also, if your card is capable of 90 fps, say 75 while in a 25-man raid.... running with VSync on means you'll probably never drop below that 60, as your card has capacity that it's not using. When a big nasty AOE-fest starts, you might dip 1-3 fps, but then it'll pick right back up after a second or two.