Intel i5-3570K @ 4.7GHz | MSI Z77 Mpower | Noctua NH-D14 | Corsair Vengeance LP White 1.35V 8GB 1600MHz
Gigabyte GTX 670 OC Windforce 3X @ 1372/7604MHz | Corsair Force GT 120GB | Silverstone Fortress FT02 | Corsair VX450
I was going to say the pixel count isn't that high, and that we thus wouldn't be that far behind, but then I did a double-take when I saw that my 1440p isn't even 4 million pixels, while the 4K resolutions apparently start above 8 million.
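The arithmetic behind that double-take, as a quick sketch (taking "4K" to mean the common 3840x2160 UHD variant):

```python
# Pixel counts for the resolutions being compared.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K UHD (3840x2160)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 1440p lands at 3,686,400 pixels; 4K UHD at 8,294,400 --
# exactly 2.25x the pixels to push.
```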
:O
That thing is actually incredibly compact for the power it yields. I doubt it will ever make it to stores though; it would just be outdated by the time it got released.
---------- Post added 2013-01-07 at 11:23 PM ----------
Meh, I'm not sure. We'll probably be on the 8xx or 9xx line before we get 4K monitors for decent prices.
I have a reply for this, too. Their 7970 Matrix Platinum wasn't even good compared to other cards.
Watch the whole video and you'll understand; he goes seriously in-depth.
---------- Post added 2013-01-08 at 12:51 AM ----------
Pretty sure they were never happy with the narrowed buses on the 600 series. 'Twas not originally intended. The 780, from what we can tell from leaked specs, would be back to a 384-bit bus if it has a GK110 chip.
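To show why the bus width matters: peak memory bandwidth scales linearly with it. A rough sketch, using the GTX 680's real memory clock (6008 MT/s effective on a 256-bit bus) and, as an assumption, a 384-bit GK110 card running the same clock:

```python
# Peak bandwidth (GB/s) = (bus width in bits / 8 bits per byte)
#                         * effective memory rate in GT/s
def bandwidth_gbps(bus_bits: int, mem_rate_gtps: float) -> float:
    return bus_bits / 8 * mem_rate_gtps

print(bandwidth_gbps(256, 6.008))  # GTX 680 (256-bit): ~192.3 GB/s
print(bandwidth_gbps(384, 6.008))  # hypothetical 384-bit at same clock: ~288.4 GB/s
```

Same memory chips, 50% more bandwidth just from the wider bus.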
---------- Post added 2013-01-08 at 12:52 AM ----------
Perhaps not, but people who are in the market for x80/x90-level GPUs are usually also able to afford the more expensive monitors, like a 4K. Still, if the 780 is what we understand it to be, it could drive 2560x1440, 5760x1080, etc., very damned well.
Perhaps Asus will be able to solve what both AMD and PowerColor could not.
AMD was working on the 7990 and never could get it to work, so they handed the whole thing over to PowerColor.
PowerColor's 7990 is honestly a piece of junk. Most people who have them are not happy with them; they are temperamental at best and require custom drivers that often crash.
As it stands, there does need to be a competitor to the 690. Nvidia has artificially limited the 600 series across the board because they don't really have competition from AMD.
No, AMD just didn't want to waste the resources. They could do it, but at what cost? The card would have been power-hungry, thermally inefficient, and pricey.
The dual-GPU cards are prestige only. Even if it trounced the GTX 690 in performance, it would require a nuclear reactor to run and be difficult to cool. Not to mention that dual-GPU cards are produced at a loss (R&D-to-manufacturing cost is much larger for these cards relative to the small scale of release), so given the financial situation they are in, they just saw no point spending the money for questionable PR gains.
If they had a plan to make a card with two GPUs, reach a certain level of performance, use a certain amount of power, and do it by a deadline (this is a business they are running), and they never could, and the project was scrapped, that's not the same as choosing not to waste resources.
Dual-GPU cards make the R&D money back in the server space. Companies dump them into servers as compute cards, and they buy tons of them; every time EVGA was sold out of 690s, it was because a company called up and ordered 30 of them for a datacenter. This has even led to the older 590 appreciating in value: most used 590s I see for sale go for more than I paid for mine new. A 590 or 690, even at a market price of $1k, is still cheaper, more powerful, and more usable than a Tesla, and takes up the same amount of slot space in a box.
If AMD could get wider OpenCL adoption in professional environments, they would not have to worry about recouping R&D costs on a dual-GPU card.
Right now, 4K resolution is for a special enthusiast crowd, same as the 690 and even the 680. Most realize that very little separates the 670 and 680 and, like you, choose to go for the 670; but then you have the folks who go for the 680, or, in Cyanotical's case... 2 690s. lol
I am enthusiastic. But not THAT enthusiastic :< Besides, my simple minimum-wage job doesn't pay for stuff like that, even though I still live with my parents :/
People need to understand the concepts of gaming, enthusiast, and professional a bit better. Gaming is of a nature where you want maximum performance without it becoming insanely expensive; otherwise, who is playing the games? Enthusiast is basically bench-and-burn: no care or concern for bugs and inefficiency, just raw power. Professional-grade gear is a different beast completely; now you are talking about people's careers and livelihoods. Say a Pixar 3D artist is trying to make a new movie in a timely fashion, before the technology is completely outdated. He or she would need something with an assload of raw power, but being an artist, it must also be refined on every input and output. Seeing a full range of colors can spark an entirely new idea in an artist's mind, and being able to put it on the screen according to your idea is the goal.