  1. #481
    Quote Originally Posted by Lathais View Post
    That's just not the way it works though. Devs need to optimize for the hardware and the hardware manufacturer's do help by working with them to optimize the code and by optimizing drivers for specific games. Doesn't matter how you think it should work, it matters how it does work.
    I know that's how it works right now; that's why we have AMD APIs and Nvidia APIs. Devs optimize for whoever pays better.
    i7-6700K @ 4.6GHz | ASRock Fatal1ty Z170 Gaming K6+ | 16GB Corsair Vengeance LPX DDR4-3000/CL15 @ 3200/CL14 | MSI GTX 1070 Gaming X | 256GB Samsung EVO 850 PRO | 2TB WD2003FZEX | 2TB WD20EFRX | Creative Sound Blaster Z | Thermalright Silver Arrow IB-E Extreme | Corsair RM650x | Cooler Master HAF X | Logitech G400s | Tt eSPORTS POSSEIDON | Kingston HyperX Cloud | BenQ XL2411T

  2. #482
    Quote Originally Posted by VooDsXo View Post
    Everything, synthetic and otherwise, is actually showing the Titan XP ahead by a decent amount in terms of raw numbers. Whether it's worth the price increase is subjective, but I'm only finding results like you're claiming with a Titan X.
    Yeah, I explained earlier the Titan X was originally referred to as the Titan XP in the press, later became the Titan X (Pascal) and the new one is specifically the Titan XP or Xp.

    The newer Titan XP is significantly faster than the 1080ti, despite sharing the same clocks, which is why I've re-stated over and over that it's the changes they've made on the board and to how the bandwidth is manipulated that makes the real difference.

  3. #483
    Quote Originally Posted by Shinzai View Post
    Yeah, I explained earlier the Titan X was originally referred to as the Titan XP in the press, later became the Titan X (Pascal) and the new one is specifically the Titan XP or Xp.

    The newer Titan XP is significantly faster than the 1080ti, despite sharing the same clocks, which is why I've re-stated over and over that it's the changes they've made on the board and to how the bandwidth is manipulated that makes the real difference.
    The 1080 Ti uses a slightly cut-down version of the Titan X chip; the Titan XP uses the full-spec GP102 chip, the same one used in the P6000. The P6000 has been outperforming the Titan X in gaming, it simply has more processing power.

  4. #484
    Quote Originally Posted by Fascinate View Post
    It will be around 1080 performance, is my guess, yeah. Some benchmarks said it was 1070-level, but there is no way that is true; AMD would get laughed outta the market.
    That's how they've been doing for ages and they survived.

  5. #485
    Quote Originally Posted by Kuntantee View Post
    That's how they've been doing for ages and they survived.
    If AMD doesn't want to best Intel / nVidia they can just fuck off for all I care.
    "Every country has the government it deserves."
    Joseph de Maistre (1753 – 1821)

  6. #486
    Quote Originally Posted by Amalaric View Post
    If AMD doesn't want to best Intel / nVidia they can just fuck off for all I care.
    They kinda did, tbh. In terms of data crunching (aka multi-threaded performance, encoding/decoding etc.), the Ryzen 1700 is miles ahead of the i7. Not sure about the Ryzen 1700's performance against Intel Xeon processors though.

    This is the primary reason why I prefer the Ryzen 1700 over the i7. I will be doing stuff other than gaming.

  7. #487
    Review of the water cooled version:

    https://www.pcper.com/reviews/Graphi...-Cooled-Review

    Decent gains, roughly equals a stock 1080 now, albeit with a 50% larger die and double the power usage.

    Assuming AMD does its post-launch driver optimization, we are looking at a card that can be around 10% over the 1080 in best-case scenarios.

    Hope they price the RX version for $449 at most.

  8. #488
    Quote Originally Posted by Zenny View Post

    Hope they price the RX version for $449 at most.
    That watercooler design alone is so expensive that it will not be near $449 if it has the watercooler. Then again, the RX version might launch with a different design.

  9. #489
    Quote Originally Posted by mrgreenthump View Post
    That watercooler design alone is so expensive that it will not be near $449 if it has the watercooler. Then again, the RX version might launch with a different design.
    I assume a triple 8-pin design with a really hefty air cooler would get the job done in terms of cooling and power. The problem is that RX Vega needs to retail for lower than a 1080 in price due to it being very similar in performance whilst having some rather large drawbacks. The 1080 can be found on specials all the time now because it has been on the market for so long.

    So, under $500 is basically a requirement if they want to ship any significant numbers.
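
    A rough back-of-the-envelope on the power-delivery side of that, as a minimal sketch in Python: the 75W slot and 150W-per-8-pin figures are the standard PCIe spec numbers, while the board power used is only an illustrative assumption, not a confirmed RX Vega rating.

    Code:
# Power-budget sketch for a hypothetical triple 8-pin air-cooled card.
# Connector limits are the standard PCIe spec figures; the board power
# used below is an illustrative assumption, not a confirmed RX Vega number.

PCIE_SLOT_W = 75    # watts available from the PCIe slot
EIGHT_PIN_W = 150   # watts available per 8-pin PEG connector

def power_headroom(num_8pin: int, board_power_w: float) -> float:
    """Spec headroom (in watts) left after the card's board power draw."""
    available = PCIE_SLOT_W + num_8pin * EIGHT_PIN_W
    return available - board_power_w

# Triple 8-pin: 75 + 3 * 150 = 525W available at spec.
print(power_headroom(num_8pin=3, board_power_w=375.0))  # -> 150.0W of headroom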

  10. #490
    Quote Originally Posted by Zenny View Post
    I assume a triple 8-pin design with a really hefty air cooler would get the job done in terms of cooling and power. The problem is that RX Vega needs to retail for lower than a 1080 in price due to it being very similar in performance whilst having some rather large drawbacks. The 1080 can be found on specials all the time now because it has been on the market for so long.

    So, under $500 is basically a requirement if they want to ship any significant numbers.
    The underlined part is the one that is really important.

    RX Vega looks to be the top end of that design. That's all they're going to get out of that chip.

    And in about 3 months from now (2 1/2-ish from the RX Vega launch), professional Volta (V100) is going to start shipping (early Q3 this year).

    Expect consumer Volta right after that. (Another 2-ish to 3-ish months - late Q4 or early Q1 2018 at the outside).

    So, within just a few months of launch, at best, AMD is going to be back to competing for the mid-range, with nothing to show at the top end at all, as all reports coming from people who have engineering samples of Volta say that it is as much of a leap over Pascal as Pascal was over Maxwell.

    So, the RX Vega will end up being about as powerful as a 2060/1160 (whatever we call them) that is going to end up retailing for ~$260.

    AMD needs to get the lead out.

    They showed with Ryzen that they can still innovate. Let's see some of that in the GPU space. I'd like some options in the future when it is time to rebuild.

  11. #491
    Quote Originally Posted by Kagthul View Post

    Expect consumer Volta right after that. (Another 2-ish to 3-ish months - late Q4 or early Q1 2018 at the outside).
    If Vega doesn't outperform the 1080 by a fair margin, we won't see consumer Volta (outside of a Titan) until spring 2018.

  12. #492
    Except that Volta for the consumer market will not come out in 2017, since GDDR6 won't enter mass production until "early 2018", which could mean the entirety of Q1 2018.

    Volta will use GDDR6 from every PoV we've heard so far, so unless you have sources that tell me that Volta will continue on GDDR5X and that the V100 is a completely different animal from the consumer variants, I'm going to keep my estimate at Volta very likely in Q2 2018.

    Since Micron is the only one to produce GDDR6 for now ... it won't be earlier than Q2 2018.

    Also, it is unwise to even remotely assume the performance of an all-new architecture.

  13. #493
    Hmm, so pretty much liquid cooling and/or bumping the power target up to 125% (thus going over 400W) was needed for the FE silicon to hold a solid 1600 MHz boost

    and at 1600 MHz it's equal to a stock GTX 1080


    From GamersNexus' hybrid liquid-cooling mod of the FE they got a max OC of 1700 MHz, I believe - that could be on par with a 1900-2000 MHz 1080



    Question is - will air-cooled aftermarket Vega cards reach any of these (1600 stable, much less 1700), or will that be purely AIO territory?
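
    A minimal sketch of the power-target arithmetic above: the 125% figure is from the post, while the base board power is purely a hypothetical number used for illustration.

    Code:
# Power-target arithmetic. The 125% target comes from the post above;
# the base board power is a hypothetical figure chosen only for illustration.

def power_limit_w(base_board_power_w: float, power_target_pct: float) -> float:
    """Board power limit in watts after raising the power target."""
    return base_board_power_w * power_target_pct / 100.0

# With an assumed ~350W base board power, a 125% target allows ~437W,
# which lines up with "going over 400W".
print(power_limit_w(350.0, 125.0))  # -> 437.5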

  14. #494
    I was going to post something in response to Evildufus' idiocy, but thought better of it.

    Why is it so hard for people to actually keep up on things that have changed since they were first talked about nine months ago?

    /picardfacepalm


  15. #495
    It needs to significantly undercut the 1080 in terms of pricing to be remotely a competitive option. It was being benched in the article against Nvidia cards at Founders Edition clock rates, so it's probably already 5-10% slower than AIB 1080s. Plus, if AMD already had to put on a water cooling solution to reach those performance levels, it's a pretty good sign that there isn't nearly as much headroom for AIB manufacturers to improve the performance (certainly not as much as there is for 1080 AIB cards to move up from the Founders Edition cooler). In other words, if it's only matching reference 1080 performance, it's already basically behind the 1080.

    It's also consuming a full 200 watts more power than a 1080 (and 100 watts more than a 1080 Ti) based on those benchmarks. That is pretty brutal, and is going to add another $50+ to the effective price of the GPU, because while a 1080 would be fine on a quality ~550W EVGA or Seasonic PSU that can be had for $40-50, this thing is probably going to need a 750+W PSU for $100+. The price has to reflect the extra PSU requirements. On top of that, even if everything else were equal, by default I wouldn't go with an AMD GPU over an Nvidia GPU because of the inferior drivers and frequency of driver updates. That factor also needs to be compensated for.

    Therefore, I'd make the following conclusions from that article.
    - It's probably 5-10% slower than a decent AIB 1080 if it's only matching a Founders Edition.
    - It's going to require that you spend $50-$75 more on your PSU.
    - If you're using it for 6 hrs a day of heavy gaming, it's going to cost you ~$50 extra a year in electricity (assuming 12c/kWh); rough math in the sketch below.
    - It has a weaker driver infrastructure/stability/update rate than Nvidia cards.

    Assuming the baseline price of the 1080 stabilizes at ~$500, I really can't see this thing being competitive at anything more than $350.
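
    As a minimal sketch of the running-cost math referenced above: the ~200W delta, 6 hours/day of use and 12c/kWh rate are the assumptions already stated in this post, not measured data.

    Code:
# Back-of-the-envelope running-cost estimate for the extra power draw.
# Inputs mirror the assumptions stated above: ~200W extra vs. a 1080,
# 6 hours of heavy gaming per day, and 12 cents per kWh.

def yearly_energy_cost_usd(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Extra electricity cost per year, in USD."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(round(yearly_energy_cost_usd(200, 6, 0.12), 2))  # -> 52.56, i.e. roughly $50/year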

    - - - Updated - - -

    Quote Originally Posted by Kagthul View Post
    The underlined part is the one that is really important.

    RX Vega looks to be the top end of that design. That's all they're going to get out of that chip.

    And in about 3 months from now (2 1/2-ish from the RX Vega launch), professional Volta (V100) is going to start shipping (early Q3 this year).

    Expect consumer Volta right after that. (Another 2-ish to 3-ish months - late Q4 or early Q1 2018 at the outside).

    So, within just a few months of launch, at best, AMD is going to be back to competing for the mid-range, with nothing to show at the top end at all, as all reports coming from people who have engineering samples of Volta say that it is as much of a leap over Pascal as Pascal was over Maxwell.

    So, the RX Vega will end up being about as powerful as a 2060/1160 (whatever we call them) that is going to end up retailing for ~$260.

    AMD needs to get the lead out.

    They showed with Ryzen that they can still innovate. Let's see some of that in the GPU space. I'd like some options in the future when it is time to rebuild.
    And this GPU is not going to be mid-range viable unless they can work miracles with the power utilization. Mid-range buyers are not typically going to put up with huge PSU requirements, exotic water cooling setups, and/or the fan noise/heat/power consumption this design is going to bring. The RX 400 series was successful in the mid-range in part because it had reasonable power consumption.

  16. #496
    The 1080 shipped months before the date Micron stated GDDR5X would be in mass production. Heck, Nvidia confirmed the 1080 was using GDDR5X even before Micron made the announcement.

    For the last 5 generations Nvidia has been releasing xx80-class products on a roughly 18-month schedule, give or take a couple of months, the longest gap being 20 months. Assuming we are looking at another 20-month gap gives us a launch date of Jan 2018, but it could be sooner. GDDR5X has just hit speeds of 16Gbps per pin, so Volta could launch with that if needed; that would give a 256-bit card 512GB/s of bandwidth, which is plenty.
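
    The bandwidth figure is straightforward arithmetic; a minimal sketch (the 16Gbps per-pin rate and 256-bit bus are the numbers quoted above):

    Code:
# Memory-bandwidth arithmetic for the GDDR5X point above:
# bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte

def memory_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits * gbps_per_pin / 8.0

print(memory_bandwidth_gb_s(256, 16.0))  # -> 512.0 GB/s, matching the figure above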

  17. #497
    Quote Originally Posted by Kagthul View Post
    I was going to post something in response to Evildufus' idiocy, but thought better of it.
    Look at that ... you're still an asshole, good thing that hasn't changed I see.

    Quote Originally Posted by Kagthul View Post
    Why is it so hard for people to actually keep up on things that have changed since they were first talked about nine months ago?

    /picardfacepalm
    http://wccftech.com/sk-hynix-gddr6-n...-gpu-gtc-2017/

    9 months ago... hmm, cool, I didn't realize Computex was held in October 2016 last year... could've sworn we had one 2 months ago.

    Or are you referring to that singular rumour, from Fudzilla and Fudzilla only, that Volta will use GDDR5X instead?
    Hmmm ... is the announcement from SK Hynix to be believed, or Fudzilla's rumour mill, which has been wrong more times than I care to count?...

    If you're going to act all smug and superior you may want to back up your claims instead of responding like that.

    So either respond like a normal person or don't respond at all.

    - - - Updated - - -

    Quote Originally Posted by Zenny View Post
    The 1080 shipped months before the date Micron stated GDDR5X would be in mass production. Heck, Nvidia confirmed the 1080 was using GDDR5X even before Micron made the announcement.

    For the last 5 generations Nvidia has been releasing xx80-class products on a roughly 18-month schedule, give or take a couple of months, the longest gap being 20 months. Assuming we are looking at another 20-month gap gives us a launch date of Jan 2018, but it could be sooner. GDDR5X has just hit speeds of 16Gbps per pin, so Volta could launch with that if needed; that would give a 256-bit card 512GB/s of bandwidth, which is plenty.
    Quite possibly, but I remember the mass production dates of GDDR5X being after the availability launch, not during its paper launch.

    Also prototyping != Mass Production.

    Clarification:
    https://www.digitaltrends.com/comput...5x-production/ <-- Mass production started in early May 2016.
    http://wccftech.com/micron-gddr5x-me...ss-production/ <-- Confirmation of mass production well before that.

    GTX 1080 purchase capability was officially March 10th 2016... well, how many could you really buy until about July?
    Availability in the NL was only 40 cards of the 1080/1070 in total from March until July hit.
    You do the math as to why.


  18. #498
    Quote Originally Posted by Evildeffy View Post
    9 months ago... hmm, cool, I didn't realize Computex was held in October 2016 last year... could've sworn we had one 2 months ago.

    Or are you referring to that singular rumour, from Fudzilla and Fudzilla only, that Volta will use GDDR5X instead?
    Hmmm ... is the announcement from SK Hynix to be believed, or Fudzilla's rumour mill, which has been wrong more times than I care to count?...

    If you're going to act all smug and superior you may want to back up your claims instead of responding like that.

    So either respond like a normal person or don't respond at all.
    They will use whatever is available. It's not like there is much of a difference between the two in terms of performance (at least initially).

    Quote Originally Posted by Evildeffy View Post
    Quite possibly, but I remember the mass production dates of GDDR5X being after the availability launch, not during its paper launch.

    Also prototyping != Mass Production.

    Clarification:
    https://www.digitaltrends.com/comput...5x-production/ <-- Mass production started in early May 2016.
    http://wccftech.com/micron-gddr5x-me...ss-production/ <-- Confirmation of mass production well before that.

    GTX 1080 purchase capability was officially March 10th 2016... well, how many could you really buy until about July?
    Availability in the NL was only 40 cards of the 1080/1070 in total from March until July hit.
    You do the math as to why.
    The GTX 1080 launched at the end of May 2016 (and that was the FE), what are you talking about? You could definitely buy a partner card at the end of June in the US; in the EU they mostly started shipping at the same time as the GTX 1070 (which was the beginning of July, when I got mine). You could preorder an FE and get it on the day of launch (I know a handful of people who did), but buying an overpriced FE where you also had to pay for shipment from the US is pretty stupid.

  19. #499
    I received my Gigabyte 1080 G1 in the first half of June 2016 in the EU,

    although I live in a smaller EU country where not everything gets sold out or pre-ordered (you can still buy cards even at mining-craze peaks, and we had aftermarket Pascal cards available shortly after the official launch),

    not like, say, Germany or the UK.

  20. #500
    Quote Originally Posted by Thunderball View Post
    They will use whatever is available. It's not like there is much of a difference between the two in terms of performance (at least initially).
    It is required in the controller design in the silicon; you can't switch between controllers on a die.
    The difference is about more than performance, it's also in power use: GDDR5X uses 1.8V whereas GDDR6 uses 1.35V, as an example.
    So it's either designed with GDDR6 or GDDR5X, it can't be both, and seeing as SK Hynix has pretty much confirmed it being used in the next generation of graphics cards ... well, you get the point.

    Quote Originally Posted by Thunderball View Post
    The GTX 1080 launched at the end of May 2016 (and that was the FE), what are you talking about? You could definitely buy a partner card at the end of June in the US; in the EU they mostly started shipping at the same time as the GTX 1070 (which was the beginning of July, when I got mine). You could preorder an FE and get it on the day of launch (I know a handful of people who did), but buying an overpriced FE where you also had to pay for shipment from the US is pretty stupid.
    I think you're misunderstanding something here.

    Mass production of GDDR5X was already under way and had been announced by Micron when the cards went up for sale; official sales started on the 10th of May, but even then availability was scarce, came from nVidia itself, and retail didn't receive anything till the 27th of May, though you could buy them (the first batch) directly.
    The FE edition you could get, but again that was only ~40 cards in total for the entirety of the Netherlands for almost 2 months, until in early July the partner cards finally started to be introduced into the market.
    (Pre-ordering an nVidia GTX 1080 FE to the NL was not possible btw, so we had to make do with what nVidia shipped us.)

    I did specifically state "availability in the NL", NL being the Netherlands, a country in the EU and also one with the highest tech throughput in Europe.

    My point with this was that mass production was announced and confirmed to start before the GTX 1080 was being sold.
    Since I highly doubt AMD's Navi will be the sole user of GDDR6 in 2018, and since AMD has bet its chips on HBM... nVidia is the only other graphics manufacturer present, which brings me yet again to my earlier point:
    If nVidia's Volta cards will use GDDR6, then that means they will not launch until Q1 - Q2 2018 at the earliest.
    I will state again that you cannot design a silicon base with both a GDDR5(X) and a GDDR6 controller; it is not feasible.

    GDDR5X was an exception to this rule, as it was specifically designed to work with the same controller as GDDR5, but GDDR5(X) and GDDR6 are different generations and mixing and matching isn't going to work.

    To expand on that with the "GV100" card ... that will use HBM2 and is an entirely different beast, much like "GP100" was; they are not comparable in the least with the consumer line of cards.

    Assuming Volta will launch within this year (very early Q4 according to some people's estimates here) while using GDDR6, when mass production won't start till "early 2018", is presumptuous and stupid.
    If all the memory manufacturers (Micron, SK Hynix and Samsung) came forward tomorrow saying that mass production of GDDR6 has started right now... then it'd be a possibility, but do you see that happening when 2 months ago those very same memory manufacturers stated "early 2018" for that mass production date?
    Nope? I don't either... that's my point.

    - - - Updated - - -

    Quote Originally Posted by Life-Binder View Post
    I received my Gigabyte 1080 G1 in the first half of June 2016 in the EU,

    although I live in a smaller EU country where not everything gets sold out or pre-ordered (you can still buy cards even at mining-craze peaks, and we had aftermarket Pascal cards available shortly after the official launch),

    not like, say, Germany or the UK.
    You were lucky; trying to get a GTX 1080 was almost impossible here, even though, like I said ... the NL is a high-throughput country.
    The fact we only got 40 cards for the entire damned country was stupid as well (per contacts at suppliers).
