  1. #601
    And that's showing prices around ~1000€ or higher across Europe for the RX Vega 64...

    That's not a good sign. No way this thing is price-competitive if that's even close to true.

  2. #602
    Quote Originally Posted by Kagthul View Post
    And that's showing prices around ~1000€ or higher across Europe for the RX Vega 64...

    That's not a good sign. No way this thing is price-competitive if that's even close to true.
    Assuming they don't pull a rabbit out of their hat, or are misleading everyone to avoid another Fury vs. 980 Ti situation (which is doubtful).

  3. #603
    Warchief Zenny (joined Oct 2011, South Africa, 2,171 posts)
    Pricing I have gotten so far indicates it's more expensive than a 1080 Ti. What the hell, AMD?

  4. #604
    Deleted
    Quote Originally Posted by Zenny View Post
    Pricing I have gotten so far indicates it's more expensive than a 1080 Ti. What the hell, AMD?
    I'm sure the card has a high price tag, but in my opinion AMD does have a lot of hardware under the hood; they always have, just look at the 7970. That's the issue AMD has: the hardware is powerful, it just needs to be utilised, and the Fury X under Vulkan is a fantastic showcase of this. The card might simply be too complex to sell at a low price, and it's likely to end up under-utilised.

    On another note, though, the new AMD drivers are actually decent.

  5. #605
    Quote Originally Posted by Zenny View Post
    Pricing I have gotten so far indicates it's more expensive than a 1080 Ti. What the hell, AMD?
    RX Vega is DOA (unless miners want it even at that price and power draw).

    FE Vega, though, could sell.

    - - - Updated - - -

    kek

    http://www.guru3d.com/news-story/sap...s-surface.html


    Sapphire Radeon RX Vega 64 Card SKU names surface


    The website elchapuzasinformatico has been pretty spot-on in the past with information that should not be out there. This round they apparently stumbled onto three Sapphire Radeon RX Vega cards.

    From what we can derive from the information, there will be two cooling solutions for Vega cards: liquid (LCS) and, obviously, air-cooled ones. Sapphire seems to be releasing three models:

    SAPPHIRE RADEON RX VEGA 64 8G HBM2 HDMI + TRIPLE DP, LIQUID COOLING 2048-bits - Water Cooler
    SAPPHIRE RADEON RX VEGA 64 8G HBM2 HDMI + TRIPLE DP LIMITED EDITION 2048-bit - 2 slot active
    SAPPHIRE RADEON RX VEGA 64 8G HBM2 HDMI + TRIPLE DP 2048-bit - 2 slot active

    The Limited Edition is probably a higher-clocked model. We cannot really verify this information whatsoever, but it would make no sense to make up naming like that: Radeon RX Vega 64. If you think that "64" suffix is a little weird, it isn't really when you think about it. A full Vega chip has 64 shader processor clusters with 64 shader processors each, i.e. 64 x 64 = 4096 shader processors.

    There is more info, though: El Chapuzas mentions 699 Euro and 899 Euro pricing in between the cheapest and the liquid-cooled edition. That would be ex VAT here in the EU, with prices roughly similar in USD. El Chapuzas states that their source is reliable. As always, take news like this with a grain of salt, but looking at the naming scheme, it would not surprise me if it is correct.

    AMD will make formal announcements next week.

  6. #606
    Warchief Zenny (joined Oct 2011, South Africa, 2,171 posts)
    We must be missing something here: 1080 performance for $700-$900? Unless RX Vega is two Vega chips connected with Infinity Fabric or something equally insane, which seems doubtful. Another possibility is that suppliers were given incorrect pricing.

  7. #607
    Deleted
    Quote Originally Posted by Zenny View Post
    We must be missing something here: 1080 performance for $700-$900? Unless RX Vega is two Vega chips connected with Infinity Fabric or something equally insane, which seems doubtful. Another possibility is that suppliers were given incorrect pricing.
    700-900€ would be $820 to $1,060...

    1€ = $1.17 today.

    With the GTX 1080 being around 500€ upwards and the 1080 Ti 700€ upwards, that card would need to beat the 1080 Ti by some margin...

    There is also a rumor that the liquid-cooled version will be above 1300€.
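    For reference, a minimal sketch of the conversion being done here; the 1.17 exchange rate and all euro price points are simply the rumored/street figures quoted in this thread, nothing confirmed:

[CODE]
# Rough sketch: convert the rumored euro prices quoted in this thread
# to US dollars at the 1 EUR = 1.17 USD rate mentioned above.
# All price points are rumors/approximations, not confirmed MSRPs.
EUR_TO_USD = 1.17

rumored_prices_eur = {
    "RX Vega 64 (air, rumored)": 699,
    "RX Vega 64 Limited Edition (rumored)": 899,
    "RX Vega 64 liquid-cooled (rumored)": 1300,
    "GTX 1080 (street price, approx.)": 500,
    "GTX 1080 Ti (street price, approx.)": 700,
}

for card, eur in rumored_prices_eur.items():
    print(f"{card}: {eur} EUR ~= {eur * EUR_TO_USD:.0f} USD")
[/CODE]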
    Last edited by mmoc4ec7d51a68; 2017-07-28 at 06:51 AM.

  8. #608
    The Unstoppable Force Gaidax (joined Sep 2013, Israel, 20,867 posts)
    I don't think it will cost more than a 1080 Ti; it just makes no sense, as it is obviously going to be inferior going by the existing information.

    If Vega were better than, or at least on par with, the 1080 Ti, there would not be these acts of desperation with blind tests and bullshitting even in the best-case scenario game for AMD, Doom. If it were a viable alternative to the 1080 Ti, they'd just show FPS and make sure to buzz everyone's ears about how they beat the 1080 Ti there with hard numbers. Instead we get blind tests, no hard numbers, and the other card with the same chip being pretty shit.

    Yup, set sail for fail alright.

  9. #609
    Deleted
    Four weeks ago there was an offer of about 650€ for a 1080 Ti. Today it costs 740€.

    If Vega is in the 700€ range it NEEDS to compete with a 1080 Ti. Not to mention the rumored 900€ or 1300€ liquid-cooled versions...

    But I don't know if AMD can afford that. HBM2 is expensive; they might need to charge 700€ upwards to make any profit, despite the performance being on the same level as a 500€ card.

  10. #610
    Deleted
    Quote Originally Posted by Kagthul View Post
    And that's showing prices around ~1000€ or higher across Europe for the RX Vega 64...

    That's not a good sign. No way this thing is price-competitive if that's even close to true.
    Not gonna happen; it has to be some sort of mistake for sure.

    I'm pretty sure they are going to undercut the 1080 by 50-75€.

    Like I previously said, I'm stuck with a 1080 and G-Sync, so I won't change my rig for AMD any time soon. However, I'm building my GF's PC and I'm waiting to see what AMD offers in terms of Vega, since we got her a FreeSync display.

  11. #611
    Quote Originally Posted by Atirador View Post
    Not gonna happen; it has to be some sort of mistake for sure.

    I'm pretty sure they are going to undercut the 1080 by 50-75€.

    Like I previously said, I'm stuck with a 1080 and G-Sync, so I won't change my rig for AMD any time soon. However, I'm building my GF's PC and I'm waiting to see what AMD offers in terms of Vega, since we got her a FreeSync display.
    There is no way they are going to undercut the 1080. They can undercut the 1080 Ti, by about $100 at most.

  12. #612
    Deleted
    Quote Originally Posted by Thunderball View Post
    There is no way they are going to undercut the 1080. They can undercut the 1080 Ti, by about $100 at most.
    What's the point of undercutting a 1080 Ti with the power of a GTX 1080? If you are not tied to the FreeSync ecosystem, there is no reason whatsoever for high-end gamers to use Vega.

    Unless all the theories are wrong and Vega has the same power as a Ti, or close to it.

  13. #613
    Deleted
    For me, it needs to be within about $50-75 of the 1080; I'll cope with that and the extra power consumption in order not to bind myself to G-Sync. RX Vega isn't looking anywhere close to a 1080 Ti, presuming the FE card's drivers aren't being held back in every workload.

  14. #614
    I think it needs to come in below the 1080 in pricing (assuming equivalent performance, which is what everyone is expecting).

    - The FreeSync vs. G-Sync thing is only really relevant if you are buying a completely new setup from scratch. Most people don't upgrade a monitor as often as a video card, so unless they are planning to upgrade both, and/or are already sitting on existing FreeSync monitors, it's an irrelevant point. The vast majority of people probably either don't care about adaptive sync, already have a G-Sync monitor if they do (because AMD cards haven't been competitive in ages), or have no intention of upgrading their monitor.
    - Any cost savings from the adaptive sync side are going to be eaten up in a new build by increased PSU requirements. Simply put, a 1080 recommends a minimum of a 550 W PSU, while a Vega will recommend an 850 W one.
    - You're still stuck with AMD drivers if you buy a Vega, which are historically less robust, updated for new games less frequently, and receive fewer game-developer optimizations (due to AMD holding the minority of GPU market share for years now).

    The adaptive sync argument feels like desperation on AMD's part to justify releasing a gaming card that is just not viable (if the leaked pricing is accurate). Why even release it at all if you can't at least compete from a price-to-performance perspective (let alone on performance leadership)?

  15. #615
    Someone lay out the adaptive sync argument. Here's what I think I know, just from being me:

    1. G-Sync monitors cost about $200 more than FreeSync ones because of the Nvidia tax.
    2. FreeSync is the inevitable winner of the adaptive sync war because Intel is going with it.
    3. Currently, each brand of card only works with its respective adaptive sync hardware.
    4. Nvidia has given no direct signs of supporting FreeSync in the future, but it is speculated that they will (pure guessing). AMD will never and cannot support G-Sync.

    Is this true?

    I mean, the adaptive sync argument is pretty freakin' weak when considering a graphics card, imho. A monitor, perhaps, but not a graphics card. I'm currently on a 1440p G-Sync monitor at 144 Hz, and won't be upgrading until we get uncompressed 4K @ 144 Hz IPS for under 700 bucks, which IIRC the latest DisplayPort release doesn't even support (including the one on the 1080 Ti, which I think tops out at 4K @ 120 Hz).

    So is that right? Check my guesswork/casual knowledge, lol.

    PS: Another reason the argument is weak is that Nvidia has shown no sign of dropping support for G-Sync on future cards, correct?
    Last edited by Zenfoldor; 2017-07-28 at 04:26 PM.

  16. #616
    Deleted
    Quote Originally Posted by Zenfoldor View Post
    Someone lay out the adaptive sync argument. Here's what I think I know, just from being me:

    1. G-Sync monitors cost about $200 more than FreeSync ones because of the Nvidia tax.
    2. FreeSync is the inevitable winner of the adaptive sync war because Intel is going with it.
    3. Currently, each brand of card only works with its respective adaptive sync hardware.
    4. Nvidia has given no direct signs of supporting FreeSync in the future, but it is speculated that they will (pure guessing). AMD will never and cannot support G-Sync.

    Is this true?

    I mean, the adaptive sync argument is pretty freakin' weak when considering a graphics card, imho. A monitor, perhaps, but not a graphics card. I'm currently on a 1440p G-Sync monitor at 144 Hz, and won't be upgrading until we get uncompressed 4K @ 144 Hz IPS for under 700 bucks, which IIRC the latest DisplayPort release doesn't even support (including the one on the 1080 Ti, which I think tops out at 4K @ 120 Hz).

    So is that right? Check my guesswork/casual knowledge, lol.

    PS: Another reason the argument is weak is that Nvidia has shown no sign of dropping support for G-Sync on future cards, correct?
    I disagree with the argument being weak. I think it's stupid for people to get a high-end card, or even a decent mid-range card, and not spend decent money on a monitor; they are equally important, as you need to get the most out of the fidelity and smoothness bump that a decent monitor's feature set provides.

  17. #617
    G-Sync also works better and more consistently than FreeSync, thanks to the dedicated chip doing the work rather than depending much on drivers/implementation.

    Practically every AMD driver release's known-issues list mentions something or other with FreeSync, usually either it not working in a specific game or flickering.

    But you pay less for it, yeah.


    Either way, I don't think Intel supporting one or the other will matter much (do Intel iGPU users really care about VRR?). Also, when HDMI 2.1 takes over en masse, both G-Sync and FreeSync will be moot, since HDMI 2.1 has VRR baked in: any HDMI 2.1 device (TV or monitor) will have VRR with any HDMI 2.1 GPU.

  18. #618
    The Lightbringer Evildeffy (joined Jan 2009, Nieuwegein, Netherlands, 3,772 posts)
    Quote Originally Posted by Life-Binder View Post
    G-Sync also works better and more consistently than FreeSync, thanks to the dedicated chip doing the work rather than depending much on drivers/implementation.

    Practically every AMD driver release's known-issues list mentions something or other with FreeSync, usually either it not working in a specific game or flickering.

    But you pay less for it, yeah.
    G-Sync has as many issues as FreeSync, don't kid yourself.
    FreeSync 2, the follow-up, also seems to have better specs than G-Sync.

    Quote Originally Posted by Life-Binder View Post
    Either way, I don't think Intel supporting one or the other will matter much (do Intel iGPU users really care about VRR?). Also, when HDMI 2.1 takes over en masse, both G-Sync and FreeSync will be moot, since HDMI 2.1 has VRR baked in: any HDMI 2.1 device (TV or monitor) will have VRR with any HDMI 2.1 GPU.
    Intel iGPU users WANT VRR more than anyone, as it has a bigger effect on them.
    Also, FreeSync and G-Sync will not be moot, as what you're referring to is an optional part of the HDMI 2.1 spec. The support is there, as it has been for years with Adaptive-Sync, but that doesn't mean nVidia will support it, and technically FreeSync already does... so G-Sync is already at a disadvantage here.

    Both technologies have their merits, and each works better than the other in certain situations.

    Ridiculing either when you have no knowledge of the specs is not a good idea.

  19. #619
    HDMI 2.1 will be a game changer.

    Even I will ditch G-Sync when it lands.





  20. #620
    The Lightbringer Evildeffy (joined Jan 2009, Nieuwegein, Netherlands, 3,772 posts)
    Quote Originally Posted by Life-Binder View Post
    HDMI 2.1 will be a game changer.

    Even I will ditch G-Sync when it lands.
    Keep telling yourself that... reality will not agree with you.

    nVidia has never put any effort into supporting adaptive sync over DisplayPort, instead promoting their own technology.
    What makes you think they will put the effort into an optional HDMI spec, away from their own technology, this time around?

    That's wishful thinking without any basis; you can keep dreaming though... never a bad thing to do.
