That's honestly the first time I've heard of a GPU having its own splash screen on boot.
Imagine if every component had that. It'd feel like starting up a game for the first time and frantically spamming esc/enter to skip past all the splash intros only to find out THEY'RE UNSKIPPABLE GOD FUCKING DAMMIT :O
My 9800GTX used to do that.
Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32GB Kingston HyperX Fury 2666MHz | Gigabyte Windforcex3 HD 7950 | Samsung 951PRO nVME 512GB | Crucial M550 256GB | Crucial MX200 1TB | Western Digital Caviar Black 2000GB | Samson SR 850 | Zalman ZM-Mic1 | Noctua NH-D15 | Fractal Define R5 | Seasonic X850
Former author of the TankSpot.com Protection Paladin guide
- - - Updated - - -
Speaking of annoying traits though, Tetris: how about nvidia cards still using 3D clocks in multi-display setups? People have only been complaining about it for... 3 years?
I go from 168W idle to 138W idle (that is with screens and speakers running) when switching on Inspector's multi-display saver, and literally nothing changes performance-wise. That's a free 30W power saving with one click. I still can't believe they haven't properly fixed it by now :/
What do you have that draws that much at idle? Now that you've fixed it, I mean.
My entire rig pulls ~77W from the wall and my PSU isn't even bronze, so there's still room to spare. Even so, AMD cards (the HD6970 at least) also run a lot warmer with dual monitors plugged in. Not as much as nVidia does in my experience though.
Yeah, that idle figure is the power consumption of my whole system combined: monitors, speakers, the lot. The PC itself is about 59W idle.
But basically, forcing the 2D clocks saves 30W, and I get no glitches or stuttering on screen. I really wonder why nvidia doesn't fix its multi-display idle behaviour.
Idle with nvidia inspector display saver:
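For anyone curious what that 30W actually adds up to, here's a quick back-of-the-envelope calc. The idle hours and electricity rate are my own assumptions, not numbers from this thread:

```python
# Back-of-the-envelope: what a 30 W idle saving adds up to per year.
# Assumed (hypothetical) values: 8 idle hours/day, 0.25 currency units per kWh.
SAVED_WATTS = 30
IDLE_HOURS_PER_DAY = 8
PRICE_PER_KWH = 0.25  # hypothetical electricity rate

kwh_per_year = SAVED_WATTS / 1000 * IDLE_HOURS_PER_DAY * 365
cost_per_year = kwh_per_year * PRICE_PER_KWH

print(f"{kwh_per_year:.1f} kWh/year saved, ~{cost_per_year:.2f}/year")
# → 87.6 kWh/year saved, ~21.90/year
```

Not huge money, but free money, and it's less heat dumped into the case too.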
- - - Updated - - -
I like the irony that this is reversed these days. :P
These days nVidia drivers may have the occasional hiccup but besides that they're more or less on an equal footing. E.g. BF4 started giving AMD owners a red screen of death before AMD came along and fixed it :P
Well, this isn't too promising. The ol' 2500K @ 4.7GHz is constantly at 90%+ usage in BF4 while the GPU (780 @ 1228/7000) hovers around 50-70% depending on the map. It should be the other way around. Looks like even a decently overclocked 2500K is finally starting to hit its architecture's limits in gaming. Benchmarks also show a pretty big leap in framerates with i7 CPUs when paired with a single high-end GPU.
No idea what more I can do besides upgrading to a 4770K + Z87 board, and no I don't have the money to make that kind of an upgrade right now.
It wouldn't be a very "happy" upgrade either given Haswell's heat issues. I want a chip that runs as cool and is as overclock-friendly as Sandy.
Actually scratch that, I'm a retard. Turns out Shadowplay was "shadow-recording" the whole time, which would explain the usage: the GPU was frantically encoding to H.264 in the background. Once I switched it to manual, BAM, 95%+ GPU usage, 90%+ CPU usage.
Yeah I'm a retard -_-
No, Win8 isn't needed
Wondering if nVidia/AMD have another driver in the works to boost performance further.