Thread: GTX 1080

  1. #361
    Immortal Evolixe's Avatar
    10+ Year Old Account
    Join Date
    Nov 2009
    Location
    In the Shadows
    Posts
    7,364
    Quote Originally Posted by Lathais View Post
    Except with the way AMD consistently improves how games run on their cards over time while nVidia ignores theirs, and some even speculate they purposefully cripple performance on older cards, that's simply not true. Besides, if you were going for the highest performer, currently, the Fury X beats the 980ti. Plain and simple.
    I don't care for the 980ti; right now would be literally the worst time to get one.

    And all this BS about "crippling performance" yada yada: I'm running a 4-year-old card and I can still play absolutely anything on medium and not go below 60fps. I have never once noticed any performance degradation after a software update. Only improvements.

    Keep your stupid conspiracy theories where the sun doesn't shine.
    Last edited by Evolixe; 2016-05-10 at 08:01 PM.

  2. #362
    The Lightbringer Artorius's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    Natal, Brazil
    Posts
    3,781
    Quote Originally Posted by mascarpwn View Post
    That's an SLI/Crossfire benchmark though. Hey, note, I'm not saying you're wrong and hermagerd the 980ti is teh bezt! I'm just telling you what I've read in benchmarks.

    Here's another single gpu review:

    http://tweakers.net/productreview/11...gaming-6g.html
    You're linking old benchmarks Mascar, use the newer ones =)

  3. #363
    Quote Originally Posted by mascarpwn View Post
    That's an SLI/Crossfire benchmark though. Hey, note, I'm not saying you're wrong and hermagerd the 980ti is teh bezt! I'm just telling you what I've read in benchmarks.

    Here's another single gpu review:

    http://tweakers.net/productreview/11...gaming-6g.html

    HardOCP should start including frame times though. Those have an incredibly high impact on how fluid your experience is.
    and here's another single gpu review:
    http://www.techpowerup.com/reviews/G...Gaming/23.html

  4. #364
    Deleted
    Quote Originally Posted by Artorius View Post
    You're linking old benchmarks Mascar, use the newer ones =)
    Ah, sorry guys. I just googled and posted the first few links I found. Hardware.info's benchmarks also include frame times, but now I'm afraid those aren't done with Crimson drivers either.

    Anyway, the differences should be negligible now and bickering about who's best is stupid. Both cards are good so you can't go wrong.

    Quote Originally Posted by Lathais View Post
    and here's another single gpu review:
    http://www.techpowerup.com/reviews/G...Gaming/23.html
    These show the same results (at 1440p) as my links did, do they not?
    Last edited by mmoc47927e0cdb; 2016-05-10 at 08:09 PM.

  5. #365
    The Lightbringer Artorius's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    Natal, Brazil
    Posts
    3,781
    Quote Originally Posted by mascarpwn View Post
    Ah, sorry guys. I just googled and posted the first few links I found. Hardware.info's benchmarks also include frame times, but now I'm afraid those aren't done with Crimson drivers either.

    Anyway, the differences should be negligible now and bickering about who's best is stupid. Both cards are good so you can't go wrong.
    It's the natural thing to do. Just googling "Fury X benchmarks" makes a lot of sense, but in this case you'll simply end up with the articles from when the card launched, which don't tell the current story. It's really a pain in the ass; the shortest path is to simply go to TechPowerUp and pick whichever benchmark is the newest.

    The truth is that the 980Ti is a card with very conservative stock clocks for what it can actually reach, so you have a big OC headroom.
    The Fury X, however, isn't really supposed to be running at that clock. It trades away all of its efficiency to get every bit of extra performance possible, even using a watercooler.

    AMD had to release a card better than the 980Ti/Titan X, so their stock card already shipped at the highest clocks possible; that's why it can't OC.

    It is true that the stock Fury X beats the stock 980Ti or the stock Titan X, but the OCd 980Tis are better than all of them at DX11 titles.

    Look at this and see how ridiculously power efficient Fiji XT can be:



    How the actual fuck can 2 Fijis consume less power than a single damn Fiji? Clocks.
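
    For anyone wondering why clocks matter that much: dynamic power scales roughly with frequency times voltage squared, and lower clocks usually allow a lower voltage too, so power drops much faster than performance does. A minimal sketch of that rule of thumb in Python (the voltage/frequency pairs below are made up for illustration, not measured Fiji numbers):

    Code:
    # Rough dynamic-power rule of thumb: P ~ C * V^2 * f
    # The operating points below are hypothetical, NOT real Fiji measurements.
    def relative_power(volts, mhz, ref_volts, ref_mhz):
        """Dynamic power relative to a reference voltage/clock point."""
        return (volts / ref_volts) ** 2 * (mhz / ref_mhz)

    # Example: downclocking from 1050 MHz @ 1.20 V to 850 MHz @ 1.00 V
    print(round(relative_power(1.00, 850, 1.20, 1050), 2))  # 0.56 -> roughly half the power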

    Regardless, AMD really had the upper hand in every single category except the 370 vs 950 matchup and the OC'd 980Ti vs Fury X one. But that only came true after the Crimson drivers, which is why people tend to think Nvidia performed better last gen: they're looking at outdated benchmarks.

  6. #366
    Quote Originally Posted by Drunkenvalley View Post
    Experience has taught me otherwise. Buying at release is an awful idea. And v/p is literally the single most useful metric if you can't just splurge on the most extreme cards.
    Yeah, what if you were one of those guys who bought the original Titans? I remember thinking "yeah, maybe, it looks like a work of art", but now they are just old electronics lol.

    Quote Originally Posted by Artorius View Post
    AMD had to release a card better than the 980Ti/Titan X, so their stock card already shipped at the highest clocks possible; that's why it can't OC.
    I think there was a similar story with Fermi and its obscene TDP. I really don't need something next to me boiling away at 110°C, thanks!
    Last edited by Afrospinach; 2016-05-10 at 08:19 PM.
    The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts.

  7. #367
    Quote Originally Posted by mascarpwn View Post
    These show the same results (at 1440p) as my links did, do they not?
    At 1440p? Yeah, but the 4k results are quite different. Really though, who cares about 1440p? It's basically already a dead resolution.

    As Artorius pointed out, yeah, you can find benchmarks all over the place that show the 980ti beating the Fury X, because most benchmarks were done when the card first came out, before the Crimson drivers. The Crimson drivers changed things significantly; most people just did not notice at all. Basically though, what we have is this:

    950 > 370
    970 < 290/290x
    980 < 390x
    980ti = Fury X

    That means that overall, AMD was better last gen. Across the board, they offered better value at each price point, except on the very low end.

    Also, you have to take nearly all of those benchmarks with a grain of salt. They test things at max settings, which means Gameworks settings on. If you turn off the Gameworks settings, you get a very different picture, and nine times out of ten those settings have no real noticeable effect. nVidia has been doing this for years, and it's kind of sickening that they have to resort to that level of deception to make their cards look better than they actually are. Even with the Gameworks settings on, though, AMD holds its own and can come out ahead in most situations.

  8. #368
    Deleted
    Quote Originally Posted by Lathais View Post
    At 1440p? Yeah, but the 4k results are quite different. Really though, who cares about 1440p? It's basically already a dead resolution.
    Oh no, not another person that throws the word "dead" around without knowing actual facts.

    Quote Originally Posted by Artorius View Post
    It is true that the stock Fury X beats the stock 980Ti or the stock Titan X, but the OCd 980Tis are better than all of them at DX11 titles.

    Look at this and see how ridiculously power efficient Fiji XT can be:
    Thanks

    In that case, it would be fair to compare the Fury X (which is heavily OCed) to a similarly OCed 980ti like the Gigabyte model, right? I'm not trying to be headstrong or anything, I just don't think it's fair to crown a heavily OCed card king, and say "yeah, that gtx980ti is faster but it's OCed, so it doesn't count".
    Last edited by mmoc47927e0cdb; 2016-05-10 at 08:26 PM.

  9. #369
    Deleted
    Quote Originally Posted by Fascinate View Post
    Lathais is an odd character. He has a 960 in his PC but he comes off as a rabid AMD fanboy lol. We all know it depends on which game you are talking about; they trade wins, and it's been like this for many generations, but he seems convinced AMD sweeps the board from the 380x all the way to the Fury X. The thing with Lathais is he will never admit he is wrong; he will only twist his argument to try and defend his clearly incorrect original statements. I wish there was a forum ignore feature, he would have been off my radar some time ago, but I can't just let him get away with posting nonsense for the people who post here looking for good advice.
    No need to get personal, it would be a shame to get banned again. Perhaps we should stick to tech talk, like the mods reminded us a few pages back?

    Either way, it's the same debate every time new tech is released. We won't know anything real until the NDA is lifted, and until then it's fine to speculate as long as you're open to everyone's opinion.

  10. #370
    Quote Originally Posted by mascarpwn View Post
    Oh no, not another person that throws the word "dead" around without knowing actual facts.
    Apparently, you are the one that doesn't keep up with the facts. Hopefully Artorius will educate you, as he is much more knowledgeable in that area than I am.

  11. #371
    Quote Originally Posted by mascarpwn View Post
    Oh no, not another person that throws the word "dead" around without knowing actual facts.
    Actual facts? 1440p is a redheaded stepchild of a resolution and it always has been. Look at the prices compared to 4K. I am not finding good deals on 1440p. I got mine 3 years ago for $300 and they are still about the same if you are scraping the barrel, yet 4K is what now? $450 for a Dell rather than my Korean B-grade panel?

    Steam:

    2560 x 1440: 1.51% (+0.05%)

    When we have readily accessible hardware to drive 4K, 1440p will probably die rather quickly. Hell, even laptops are getting 4K, which is a PITA if you are looking for a gaming machine because it is becoming so damned common compared with 1080p. 1440p has just never been a widely supported resolution.
    The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts.

  12. #372
    The Lightbringer Artorius's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    Natal, Brazil
    Posts
    3,781
    Quote Originally Posted by mascarpwn View Post
    Oh no, not another person that throws the word "dead" around without knowing actual facts.
    It's really dead content-wise, so I'd expect the tiny PC community using it to disappear as well (the Steam survey tells me 1.5% of their users use 2560x1440 monitors).

    Quote Originally Posted by mascarpwn View Post
    Thanks

    In that case, it would be fair to compare the Fury X (which is heavily OCed) to a similarly OCed 980ti like the Gigabyte model, right? I'm not trying to be headstrong or anything, I just don't think it's fair to crown a heavily OCed card king, and say "yeah, that gtx980ti is faster but it's OCed, so it doesn't count".
    That's the correct statement. The OC'd 980Ti is the best card. The Fury X might be better at DX12, but we don't really have enough titles yet.

    - - - Updated - - -

    Quote Originally Posted by Vegas82 View Post
    Funny, considering 4K is being seen as a transitional resolution to 8K (which is why many movies haven't been upconverted to 4K; they're just going to do 8K versions).
    http://www.uhdalliance.org
    That's wishful thinking considering that the 4K standard was only finished not that long ago, and this time they actually put in way higher specs than what we can even achieve yet.

    The color space is BT.2020, ffs; no display can get close to it. It's awesome future-proofing.

  13. #373
    Deleted
    Quote Originally Posted by Artorius View Post
    It's really dead content-wise, so I'd expect the tiny PC community using it to disappear as well (the Steam survey tells me 1.5% of their users use 2560x1440 monitors).

    That's the correct statement. The OC'd 980Ti is the best card. The Fury X might be better at DX12, but we don't really have enough titles yet.
    That's what I thought as well.

    As for 1440p, I wasn't referring to television - obviously. Why would I? We're discussing computer graphics cards dedicated to gaming. I was referring to computer monitors. I don't have actual statistical data, but I'd think most people with a 980ti, or a card of similar caliber, would have invested in a 1440p screen, it being the sweet spot between resolution and excellent performance, rather than 4K, where you'd have to deliver quite a chunk of performance in order to achieve acceptable frame rates.

    Quote Originally Posted by Lathais View Post
    Apparently, you are the one that doesn't keep up with the facts. Hopefully Artorius will educate you, as he is much more knowledgeable in that area than I am.
    Apparently, I was right about the previous facts as well. Show me some proof that 1440p is "dead" for pc gaming. I really don't mind saying "oh, wow, I didn't know that! Thanks for that useful information!" if you prove your gut-feeling is more than just that.
    Last edited by mmoc47927e0cdb; 2016-05-10 at 08:42 PM.

  14. #374
    Quote Originally Posted by Drunkenvalley View Post
    Experience has taught me otherwise. Buying at release is an awful idea. And v/p is literally the single most useful metric if you can't just splurge on the most extreme cards.
    I'm going to have to disagree with your first point. Buying at release is a far better value than buying at 20% off halfway through a card's product cycle. A GPU's value slowly depreciates over time and then crashes when the next generation becomes available. I purchased my 970 near its release for $360 and sold it today for $230.

    I had it for 100% of its product cycle for about 36% of its original price. That is a better value than buying a card 50% of the way through its product cycle for 80% of the original price.
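
    As a quick sanity check on that math, here's a sketch in Python using the exact figures above:

    Code:
    # Net cost of a card bought near launch and sold right before the next generation.
    purchase_price = 360   # USD, 970 bought near release (figure from this post)
    resale_price = 230     # USD, sold today

    net_cost = purchase_price - resale_price
    print(net_cost)                                # 130
    print(round(100 * net_cost / purchase_price))  # 36 -> about 36% of the original price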

    Now to bring the post back on topic: I'm leaning more towards the 1070. The 1080, squeezed between it and a future 1080Ti, gets its price/performance and resale value wrecked, like the 970 and 980Ti did to the 980.

  15. #375
    Quote Originally Posted by mascarpwn View Post
    That's what I thought as well.

    As for 1440p, I wasn't referring to television - obviously. Why would I? We're discussing computer graphics cards dedicated to gaming. I was referring to computer monitors. I don't have actual statistical data, but I'd think most people with a 980ti, or a card of similar caliber, would have invested in a 1440p screen, it being the sweet spot between resolution and excellent performance, rather than 4K, where you'd have to deliver quite a chunk of performance in order to achieve acceptable frame rates.



    Apparently, I was right about the previous facts as well. Show me some proof that 1440p is "dead" for pc gaming. I really don't mind saying "oh, wow, I didn't know that! Thanks for that useful information!" if you prove your gut-feeling is more than just that.

    Because PC gaming pretty much always follows what television and movies do. Just like 1440x900 was basically a dead resolution the moment it came out because media adopted 720p instead, 1440p is doing pretty much the exact same thing 1440x900 did: it came out, got used a bit, did not get adopted by media, and then died.

  16. #376
    Deleted
    Quote Originally Posted by Lathais View Post
    Because PC gaming pretty much always follows what television and movies do. Just like 1440x900 was basically a dead resolution the moment it came out because media adopted 720p instead, 1440p is doing pretty much the exact same thing 1440x900 did: it came out, got used a bit, did not get adopted by media, and then died.
    That's an interesting theory. But do you have any statistical data to back up your claim that 1440p is "dead" when it comes to gaming? I think most owners of high-end cards own 1080p or 1440p screens and only a very small percentage of those people actually invest in 4K setups. I, however, have nothing solid to back this up; it's what I think based on my own choices and I'm not afraid to admit that. What about you?

    "Apparently, I'm the one that doesn't keep up with facts", right?
    Last edited by mmoc47927e0cdb; 2016-05-10 at 09:04 PM.

  17. #377
    The Lightbringer Artorius's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    Natal, Brazil
    Posts
    3,781
    Quote Originally Posted by mascarpwn View Post
    As for 1440p, I wasn't referring to television - obviously. Why would I? We're discussing computer graphics cards dedicated to gaming. I was referring to computer monitors. I don't have actual statistical data, but I'd think most people with a 980ti, or a card of similar caliber, would have invested in a 1440p screen, it being the sweet spot between resolution and excellent performance, rather than 4K, where you'd have to deliver quite a chunk of performance in order to achieve acceptable frame rates.
    Apparently, I was right about the previous facts as well. Show me some proof that 1440p is "dead" for pc gaming. I really don't mind saying "oh, wow, I didn't know that! Thanks for that useful information!" if you prove your gut-feeling is more than just that.
    Okay, let's start with the most widely used resolution on computer monitors: 1920x1080.

    Where did this resolution come from? Why do we use it? Does it make sense to use it instead of any other resolution?

    It came from the film industry. We use it because the people making monitors are the same ones making televisions, and the panels are mass-produced for both. No, it doesn't make any sense for computers.

    The entire 16:9 aspect ratio doesn't make any sense on computers, especially laptops with tiny screens. Most "computer content" is text, which is obviously better when you can display more of it without having to scroll. Most websites use text as their main form of communication, and the general work done on computers involves text.

    The 4:3 aspect ratio was good for websites and text documents; then we started to get more variations to make better use of tiny laptop displays, like 3:2 or even sqrt(2):1.

    Why did the industry start to make tiny, almost unusable 16:9 displays even for laptops? 16:10 can display 2 A4 documents side by side on a 19" display when you'd need 23" to do the same at 16:9 (I really don't remember the numbers well, but it's something like this; I can look for the exact values later).
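
    For anyone who wants to check those numbers, here's a rough back-of-the-envelope script (a sketch assuming full-size A4 sheets in portrait and that screen height is the limiting dimension, since both aspect ratios are wider than two A4 pages side by side; under those assumptions it works out to roughly 22" for 16:10 versus 24" for 16:9):

    Code:
    import math

    # Smallest diagonal (in inches) whose screen fits two full-size A4 pages side by side.
    # A4 portrait is 210 x 297 mm, so the pair is 420 mm wide by 297 mm tall.
    A4_W_MM, A4_H_MM = 210, 297
    MM_PER_INCH = 25.4

    def min_diagonal_inches(aspect_w, aspect_h, need_w=2 * A4_W_MM, need_h=A4_H_MM):
        diag = math.hypot(aspect_w, aspect_h)
        # Diagonal big enough to satisfy both the width and the height requirement.
        return max(need_w * diag / aspect_w, need_h * diag / aspect_h) / MM_PER_INCH

    print(round(min_diagonal_inches(16, 10), 1))  # ~22.1
    print(round(min_diagonal_inches(16, 9), 1))   # ~23.9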

    Why did we go from the good 16:10 to 16:9? For computer tasks 16:10 is better, and for playing games and that kind of thing I'd argue that 21:9 is better than both. So why?

    Quote Originally Posted by Wikipedia
    Around 2008–2010, there was a rapid shift by computer display manufacturers to the 16:9 aspect ratio and by 2011 16:10 had almost disappeared from new mass market products. According to Net Applications, by October 2012 the market share of 16:10 displays had dropped to less than 23 percent.[6]

    The primary reason for this move was considered to be production efficiency[3][7] - since display panels for TVs use the 16:9 aspect ratio, it became more efficient for display manufacturers to produce computer display panels in the same aspect ratio as well.[8] A 2008 report by DisplaySearch also cited a number of other reasons, including the ability for PC and monitor manufacturers to expand their product ranges by offering products with wider screens and higher resolutions, helping consumers to adopt such products more easily and "stimulating the growth of the notebook PC and LCD monitor market".[2]

    The shift from 16:10 to 16:9 was met with a mixed response. The lower cost of 16:9 computer displays, along with their suitability for gaming and movies and the convenience of having the same aspect ratio in different devices, was seen as a positive.[3][9] On the other hand, there was criticism towards the lack of vertical screen real estate when compared to 16:10 displays of the same screen diagonal.[9][10] For this reason, some considered 16:9 displays less suitable for productivity-oriented tasks, such as editing documents or spreadsheets and using design or engineering applications, which are mostly designed for taller, rather than wider screens.[9][11][12]
    Simply because the TVs were 16:9. Yes, that's the reason. The monitor market is bound to follow its bigger parent market, which is the TV market.

    All the things that "pc gamers" think of as "gaming monitor" things were in fact designed for, and appeared first in, the film industry.

    Let's take the 120Hz refresh rate as an example.

    You know film is usually done at 24fps, right? (It's actually closer to 23.976, but whatever.)

    Now how do we display this 24fps content on a commonly used 60Hz display? How can we fit the frames to the refreshes if 60 isn't a direct multiple of 24? There isn't a clean way to. What we do is display half the frames 2 times and the other half 3 times in a row.

    Like this: 2-3-2-3-2-3-2-3-2-3-2-3-2-3-2-3

    (2+3)/2=2.5 and 2.5*24=60.

    Yeah... but this makes half the frames stay on the screen 50% longer than the other half, and movement looks slightly wrong. That's called judder, and the technique to display 24fps content at 60Hz is called 3:2 pulldown.

    So now what's the best way to eliminate this judder problem? Simple: increase the refresh rate to a multiple of 24Hz. That's how 120Hz appeared. Now you can simply display each frame 5 times and the motion is perfectly even.
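
    Here's a tiny sketch of the repeat pattern being described, counting how many refresh cycles each 24fps frame is held for at a given refresh rate:

    Code:
    # How long each 24 fps film frame is held, in refresh cycles, at a given refresh rate.
    def repeat_pattern(refresh_hz, fps=24, frames=8):
        totals = [(i * refresh_hz) // fps for i in range(frames + 1)]
        return [totals[i + 1] - totals[i] for i in range(frames)]

    print(repeat_pattern(60))   # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven hold times = judder
    print(repeat_pattern(120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> every frame held equally long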

    Okay so how about 144Hz?!

    144Hz is a 3D refresh rate. How do you do 3D? You send different frames to each of your eyes, half of them to each, to be precise. So with 144Hz you end up sending 72 frames to each eye every second, correct? Yes. 72 is a multiple of 24. At 144Hz each film frame is displayed 3 times for each eye: 3*24=72.
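
    The same arithmetic for the 3D case, where the refreshes are split between the two eyes:

    Code:
    # 144 Hz split across two eyes in active 3D:
    per_eye_hz = 144 // 2     # 72 refreshes per eye per second
    print(per_eye_hz / 24)    # 3.0 -> each 24 fps film frame is shown exactly 3 times per eye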

    Sure, there are some odd monitors with 165Hz or 200Hz refresh rates. But those are the minority.



    It'll ~disappear as soon as 4K monitors become the new standard; of course, it'll take a loooooooooong time to truly disappear. It's been ~10 years since they were making 1440x900 monitors and there are still 5% of people using them.
    Last edited by Artorius; 2016-05-10 at 09:09 PM.

  18. #378
    Quote Originally Posted by Aberrict View Post
    I'm going to have to disagree with your first point. Buying at release is a far better value than buying at 20% off halfway through a card's product cycle. A GPU's value slowly depreciates over time and then crashes when the next generation becomes available. I purchased my 970 near its release for $360 and sold it today for $230.

    I had it for 100% of its product cycle for about 36% of its original price. That is a better value than buying a card 50% of the way through its product cycle for 80% of the original price.

    Now to bring the post back on topic: I'm leaning more towards the 1070. The 1080, squeezed between it and a future 1080Ti, gets its price/performance and resale value wrecked, like the 970 and 980Ti did to the 980.
    You're welcome to disagree. Personally, I think a single bad purchase invalidates any "value" you'll have from the rest of your computers combined. Additionally, I don't base my buying decisions on guessing the right card completely blindly; I'd rather wait for benchmarks and data and pick what's at my preferred price point.

  19. #379
    Quote Originally Posted by Drunkenvalley View Post
    You're welcome to disagree. Personally, I think a single bad purchase invalidates any "value" you'll have from the rest of your computers combined. Additionally, I don't base my buying decisions on guessing the right card completely blindly; I'd rather wait for benchmarks and data and pick what's at my preferred price point.
    Why do you think I'm going to choose my next card purchase without benchmarks to determine the best price/performance ratio?

  20. #380
    Immortal Evolixe's Avatar
    10+ Year Old Account
    Join Date
    Nov 2009
    Location
    In the Shadows
    Posts
    7,364
    Quote Originally Posted by Aberrict View Post
    I'm going to have to disagree with your first point. Buying at release is a far better value than buying at 20% off halfway through a card's product cycle. A GPU's value slowly depreciates over time and then crashes when the next generation becomes available. I purchased my 970 near its release for $360 and sold it today for $230.

    I had it for 100% of its product cycle for about 36% of its original price. That is a better value than buying a card 50% of the way through its product cycle for 80% of the original price.

    Now to bring the post back on topic: I'm leaning more towards the 1070. The 1080, squeezed between it and a future 1080Ti, gets its price/performance and resale value wrecked, like the 970 and 980Ti did to the 980.
    I don't really care about resale price either, tbh. I use cards until they are practically worth squat, or I pass them on to my little brother.

    But I definitely can't be arsed to upgrade every new release. Or even two.
