  1. #21
    Quote Originally Posted by tetrisGOAT View Post
    What's stopping you from getting one now? That you want 120Hz? Those are likely a wee bit off, still.

    Either way, we need a renewal in the resolution department. 1920x1080 and the FullHD concept is one of the worst things that has happened to it, since once it was reached, everything stagnated. Do we need to push onwards post-4K? It's easy for me to say "no" at this time, but at the same time, ten years ago I would've laughed at the idea of ever needing more than 2GiB of RAM or 1GiB of VRAM. Resolution isn't directly comparable here, but we will be needing improvements. 2560x1600 in a 5" phone? Perhaps not. In a 15.6"-17" notebook? Yes. 4K in 24-30"? Yes. After that, I personally am of the opinion that we would just eat a performance hit for no noticeable benefit. Resolutions of 1440p and above severely diminish the need for AA, however, which is a plus.
    Just because you can buy a 1440p monitor doesn't mean it's the norm. We need YouTube and other media to fully support 1440p for people to switch. Also, when someone asks for monitor advice, you won't just recommend a 1440p one unless they specifically state they want one.
    System Specs -
    CPU - i5 2500k @ 5GHz | CPU Cooler - RASA RX240 | Motherboard - Asus Maximus IV Gene-Z/Gen3 | GPU - Nvidia GTX 590 |
    SSD - Crucial M4 128GB | RAM - 8GB Corsair Vengeance | PSU - Corsair TX850M | Case - CoolerMaster HAF 922
    | Monitor - Crossover 27Q (2560x1440) | Sidewinder X4 Keyboard |
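
    As a rough check on the pixel densities the quoted post is weighing up, here's a minimal PPI sketch (the diagonal sizes plugged in are assumptions taken from the sizes mentioned above):
    [CODE]
    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixels per inch for a display of the given resolution and diagonal size."""
        diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
        return diagonal_px / diagonal_in

    # Sizes mentioned in the post (diagonals assumed, for illustration only)
    displays = [
        ('2560x1600 in a 5" phone', 2560, 1600, 5.0),
        ('2560x1600 in a 15.6" notebook', 2560, 1600, 15.6),
        ('1920x1080 in a 24" monitor', 1920, 1080, 24.0),
        ('3840x2160 (4K) in a 27" monitor', 3840, 2160, 27.0),
    ]
    for label, w, h, d in displays:
        print(f"{label}: {ppi(w, h, d):.0f} PPI")
    [/CODE]
    Under these assumptions the 5" phone lands around 600 PPI (far beyond what the eye resolves at phone distance), while the notebook and the 4K desktop land near 160-200 PPI, which fits the post's "perhaps not" versus "yes" split.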


  2. #22
    Quote Originally Posted by Nab View Post
    Just because you can buy a 1440p monitor doesn't mean it's the norm. We need YouTube and other media to fully support 1440p for people to switch. Also, when someone asks for monitor advice, you won't just recommend a 1440p one unless they specifically state they want one.
    It's not the norm because we have 1080p. A standard means trying to settle on one. I imagine they're hoping to push 4K as the standard resolution down the road.

  3. #23
    Titan Synthaxx's Avatar
    Join Date
    Feb 2008
    Location
    Rotherham, England/UK
    Posts
    13,053
    Quote Originally Posted by Nab View Post
    Just because you can buy a 1440p monitor doesn't mean it's the norm. We need YouTube and other media to fully support 1440p for people to switch. Also, when someone asks for monitor advice, you won't just recommend a 1440p one unless they specifically state they want one.
    YouTube does support 1440p. It's labelled as "Original" on videos that support it (though that really just applies to anything >1080p). YouTube even supports 4K resolution, though as you can imagine, those files are massive, and even buffering 1 second of footage takes quite some time. The quality still seems lackluster for most videos, but that could be down to the uploader using a subpar encoder. Most importantly, searching for 1440p just gets you a fucking load of dubstep videos with still images... because clearly something like that warrants the extra file size.
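
    To put the buffering complaint in rough numbers, here's a back-of-the-envelope sketch; the bitrates and connection speed are illustrative assumptions, not YouTube's actual figures:
    [CODE]
    # Back-of-the-envelope buffering arithmetic.
    # All bitrates and the connection speed below are assumed values.
    streams_mbps = {
        "1080p": 8,
        "1440p": 16,
        "4K": 45,
    }
    connection_mbps = 20  # assumed download speed

    for name, bitrate in streams_mbps.items():
        # Wall-clock seconds of downloading needed per second of video:
        ratio = bitrate / connection_mbps
        print(f"{name}: {bitrate} Mbps -> {ratio:.2f}s download per 1s of video")
    [/CODE]
    Once a stream's bitrate exceeds the connection speed (the 4K row here), buffering can no longer keep ahead of playback, which matches the "takes quite some time" experience.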

    Gaming at 1440p is only a different experience in some games. This is good and bad. For example, in D3 the cursor does NOT scale up, so losing it in the heat of battle is very easy, and your face will get eaten by zombies unless you're on your toes about finding the cursor again. In BF3, though, you seem to get a slightly better FoV in jets and such. In SC2 you seem to get some extra clarity around the edges of units.

    Text, while "physically smaller", seems to be much clearer at 1440p. Its true advantage lies in multitasking and creative tasks. 1080p for gaming and movies until we get to 4K (where the clarity really steps up), and 1440p or 1600p for creativity and management.

    As for the topic at hand (GTX 780), I said just a few days ago that if they try to up the clock speeds but don't increase voltage, then I'd expect us to be in for yet another GTX 580 stability fiasco (not everyone experienced problems, but mine were so bad they almost put me off NVIDIA completely, until I settled on the 670 and everything was all sparkly and colourful again). If they do create a stable card with those specs, it'll truly be something else. Crazy texture performance (already a strong point), plenty of cores to handle particle effects (an area where NVIDIA traditionally struggled compared to ATI), shader performance like no other, and enough memory to easily handle 1440p, maybe even 1600p, gaming.

    Sure, the tech to create the "ultimate GPU" has existed for years (and no, I don't mean natural advancement in technology), but it makes sense that they want to exploit the market and its potential as much as possible until a silicon alternative becomes easy to work with. Pacing it this way and pitching cards to all levels of consumers is much better for everyone.

    I remember the 7800GTX from yesteryear. I never personally owned one (I had a 7600GT from that series of cards), but I recall it being the new hotness for BF2 at the time. It took what was then an extremely heavy game and made it go down smooth. Seeing a similar name and then seeing its specs brings back that sort of feeling, except now for the current top-end titles (Crysis 3 as an example, though not yet released, looks to be very heavy on the GPU, and it looks like they've avoided doing a direct console port so that they can build DX11 and HD textures in as base options).

    We'll see more details as we close in on a release, and no doubt NDA leaks will happen (as they always do if you look in the right places).
    Coder, Gamer - IOCube | #Error418MasterRace #ScottBrokeIt
    Knows: Node.js, JS + JQuery, HTML + CSS, Object Pascal, PHP, WQL/SQL

    PC: 750D / 16GB / 256GB + 750GB / GTX780 / 4670K / Z87X-UD4H | Laptop: 8GB / 120GB + 480GB / GTX765M / 4700MQ

  4. #24
    Quote Originally Posted by Zeara View Post
    Hmmm, doesn't Nvidia usually release cards with an xx4 in the mid-range area? 104 was mid-range if I'm not mistaken, and 110 is the new high end. Could be wrong, though.

    And it won't be launched this year, or even in the first quarter. I think the factories should already have some try-outs by then. Tape-out, or whatever it's called.
    GK110 was supposed to be the 670/680, meanwhile GK104 was the 660 and down, but blah blah they couldn't do it, so GK104 is the current 680.

  5. #25
    The Insane DeltrusDisc's Avatar
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    15,102
    Quote Originally Posted by Milkshake86 View Post
    GK110 was supposed to be the 670/680, meanwhile GK104 was the 660 and down, but blah blah they couldn't do it, so GK104 is the current 680.
    It wasn't "blah blah they couldn't do it". From what I recall, the GK110 would have royally manhandled the HD 7970 and thus would have just screwed up all comparisons and competition. While sure, it sucks they didn't release it, it was probably better for us all.

  6. #26
    Quote Originally Posted by DeltrusDisc View Post
    It wasn't "blah blah they couldn't do it". From what I recall, the GK110 would have royally manhandled the HD 7970 and thus would have just screwed up all comparisons and competition. While sure, it sucks they didn't release it, it was probably better for us all.
    Nah, they had issues making it. This isn't the first time Nvidia has had issues with the manufacturing process of its cards.

  7. #27
    The Insane DeltrusDisc's Avatar
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    15,102
    Quote Originally Posted by Zeara View Post
    Nah, they had issues making it. This isn't the first time Nvidia has had issues with the manufacturing process of its cards.
    I'm talking post-issues. They had issues, yes, and then they said they were going to make it. They were having issues with 28nm, which covers both GK104 and GK110. If they could make the 104, they could make the 110.

  8. #28
    Quote Originally Posted by DeltrusDisc View Post
    I'm talking post-issues. They had issues, yes, and then they said they were going to make it. They were having issues with 28nm, which covers both GK104 and GK110. If they could make the 104, they could make the 110.
    Making one does not mean you can make the other. The 110 was way bigger than the 104 (IIRC). The problem was more likely that they could make them, but the fallout of broken chips was just too big, making it too costly.

    Bottom line: manufacturing issues were the reason they didn't make/sell it. Or that's what I think happened.
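
    A minimal sketch of why a bigger die hurts yield so much, using the simple Poisson defect-density model; the defect density is an assumed value, and the die areas are the commonly cited rough figures for GK104 and GK110:
    [CODE]
    import math

    def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
        """Expected fraction of defect-free dies under a Poisson defect model."""
        return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

    defect_density = 0.4  # defects per cm^2 -- assumed, for illustration

    for name, area_mm2 in [("GK104 (~294 mm^2)", 294), ("GK110 (~561 mm^2)", 561)]:
        y = poisson_yield(area_mm2, defect_density)
        print(f"{name}: {y:.0%} of dies defect-free")
    [/CODE]
    Under these assumed numbers the larger die comes out around 11% defect-free versus roughly 31% for the smaller one, before even counting that fewer big dies fit on a wafer - which is exactly the "fallout of broken chips" argument above.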

  9. #29
    Herald of the Titans shroudster's Avatar
    Join Date
    Jul 2012
    Location
    netherlands
    Posts
    2,890
    IIRC, GK110 wasn't released as the GTX 680 because it was deemed too powerful for the market (so they used it in the new Quadro series, I believe).
    And with that power also come cost, production concerns, etc. (it just likely wouldn't have been worthwhile from a marketing perspective),
    but as gamers/enthusiasts alike, I think our reaction would have been: "moar power, gimme!"

  10. #30
    Could be a bit of both.
    Manufacturing issues, so fewer chips to sell. Consumers are used to a 500-600 dollar/euro max on high-end GPUs (or thereabouts for single cards), so to recover the costs they might have had to sell it for way more than that.
    They found out it would be way too powerful to compete with AMD (or something like that). Combine that with the high cost of making it, and making it a Quadro-only card is not that far-fetched. That side of the market makes up way more of their profit anyway; the consumer market for these cards is way smaller.

    But this is all speculation on my side of course :P

  11. #31
    Quote Originally Posted by shroudster View Post
    IIRC, GK110 wasn't released as the GTX 680 because it was deemed too powerful for the market (so they used it in the new Quadro series, I believe).
    And with that power also come cost, production concerns, etc. (it just likely wouldn't have been worthwhile from a marketing perspective),
    but as gamers/enthusiasts alike, I think our reaction would have been: "moar power, gimme!"
    Honestly, I think that's pretty much just marketing bullshit to sound better than their competitor. Coming from a company that was willing to appear at a presentation of one of their 400-series graphics cards with a fake card, I'm inclined to believe they're willing to bullshit as they please.

  12. #32
    Probably a Kepler refresh.

  13. #33
    Herald of the Titans shroudster's Avatar
    Join Date
    Jul 2012
    Location
    netherlands
    Posts
    2,890
    Quote Originally Posted by Drunkenvalley View Post
    Honestly, I think that's pretty much just marketing bullshit to sound better than their competitor. Coming from a company that was willing to appear at a presentation of one of their 400-series graphics cards with a fake card, I'm inclined to believe they're willing to bullshit as they please.
    There were some massively overperforming mystery cards in some leaked benchmarks before the Kepler release.
    Perhaps they can bullshit, but considering the performance of the Quadro card they made with it, the overperforming part is true to quite some extent.

  14. #34
    Quote Originally Posted by shroudster View Post
    There were some massively overperforming mystery cards in some leaked benchmarks before the Kepler release.
    Perhaps they can bullshit, but considering the performance of the Quadro card they made with it, the overperforming part is true to quite some extent.
    This would seem like the most obvious strategy to me - they have their 7xx line already in their back pocket. It could've been the 6xx line, but they've kept their cards firmly held to their chest. The 7970 isn't a concern for 95% of the gaming audience. People with multiple displays or x1600 ambitions will have bought it, but even it stumbles in places at those resolutions and in those setups.

    If those GK110s are essentially the 7xx series, they'll probably just fine-tune it and resolve any issues with mass-producing it in the next 6 months, then fire it out all over the place. Again, I'd basically expect it to be aimed at making x1080 doable in any game at 120fps, with max settings, AA topped out, etc., and they'll also push for x1440/1600 being playable at a constant 60fps, with a single card, with maxed-out settings, which at the moment (for most games) requires an SLI setup. No doubt it'll be able to run 5760x1080 solo, though at less than a constant 60fps. I'd suggest they'll aim to make tri-monitor setups a standard using two-card SLI along with paired monitors at x1440/1600, which is again where I'd expect a 790 to be aimed.
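
    As a rough sanity check on those targets, the raw pixel throughput each one demands is plain arithmetic (resolution times frame rate; this ignores that real GPU load doesn't scale perfectly linearly with pixel count):
    [CODE]
    targets = [
        ("1920x1080 @ 120 fps", 1920 * 1080, 120),
        ("2560x1440 @ 60 fps", 2560 * 1440, 60),
        ("2560x1600 @ 60 fps", 2560 * 1600, 60),
        ("5760x1080 @ 60 fps", 5760 * 1080, 60),
    ]

    baseline = 1920 * 1080 * 60  # 1080p @ 60 fps as the reference workload

    for name, pixels, fps in targets:
        throughput = pixels * fps
        print(f"{name}: {throughput / 1e6:.0f} Mpix/s ({throughput / baseline:.1f}x 1080p@60)")
    [/CODE]
    The surround setup works out to about 3x the 1080p@60 workload, which is consistent with expecting 5760x1080 to run solo but below a constant 60fps.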

  15. #35
    Quote Originally Posted by shroudster View Post
    There were some massively overperforming mystery cards in some leaked benchmarks before the Kepler release.
    Perhaps they can bullshit, but considering the performance of the Quadro card they made with it, the overperforming part is true to quite some extent.
    To be clearer, I don't disagree that the GK110 may have been very powerful. However, I'm inclined to disagree that their reason for not releasing the GK110 was simply that.

    Whether it be heat issues, voltage issues or the like, I would be more inclined to believe that the GK110 was simply not ready to be released on the market to begin with, hence the GK104 being released as the full line-up.

  16. #36
    Herald of the Titans shroudster's Avatar
    Join Date
    Jul 2012
    Location
    netherlands
    Posts
    2,890
    Quote Originally Posted by Drunkenvalley View Post
    To be clearer, I don't disagree that the GK110 may have been very powerful. However, I'm inclined to disagree that their reason for not releasing the GK110 was simply that.

    Whether it be heat issues, voltage issues or the like, I would be more inclined to believe that the GK110 was simply not ready to be released on the market to begin with, hence the GK104 being released as the full line-up.
    True, likely not the sole reason for it (still, a single-chip gaming GPU going for a few grand would not be an efficient market).
    I'm likely skipping the 7xx unless the price gets really interesting or my current 670 isn't keeping up with games (which I doubt will happen for a while ^^).

  17. #37
    Field Marshal
    Join Date
    Sep 2012
    Location
    Fort McMurray AB
    Posts
    80
    Here's hoping that big Kepler is worth an upgrade from the 680 Lightnings, performance- and feature-wise.

    At this point I'm starting to lean towards the 89xx series from AMD, or if next gen isn't worth it, I'll just wait for the 99xx or the 880.
    Last edited by EllishaPally; 2012-11-21 at 07:46 PM.

  18. #38
    The Insane DeltrusDisc's Avatar
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    15,102
    Meanwhile, during your debates over whether or not to upgrade from a 670 or 680...

    I'm sitting here with my 560 Ti. -.-

  19. #39
    Field Marshal
    Join Date
    Sep 2012
    Location
    Fort McMurray AB
    Posts
    80
    Quote Originally Posted by DeltrusDisc View Post
    Meanwhile, during your debates over whether or not to upgrade from a 670 or 680...

    I'm sitting here with my 560 Ti. -.-
    The 560 Ti is still a pretty good card, though. You'll notice more of a difference when you do upgrade.

  20. #40
    Should give the Ti an SLI twin. Bypass the 6xx family wholesale.
