  1. #881
    Quote Originally Posted by Remilia View Post
    Well, the question on "sold out" is: is it due to selling well or due to a supply issue? OCUK only shipped 1k units (they do preorders, but that's... pointless). I don't know what other vendors publicize, but 1k units for one of the biggest UK suppliers is pretty small.
    AMD is giving Nvidia a lot of time to sell their cards without competition, since all we have on Vega 10 right now are rumored dates, October at the soonest. Nvidia is going to have fun eating that portion of the market till then.

  2. #882
    Implying nVidia hadn't already eaten up that portion of the market in the first place. It's not like the rhetoric going around for years hasn't been "omg too little too late AMD" regardless of what AMD have done.

  3. #883
    Fluffy Kitten Remilia | 10+ Year Old Account | Join Date: Apr 2011 | Location: Avatar: Momoco | Posts: 15,160
    Quote Originally Posted by Bigvizz View Post
    AMD is giving Nvidia a lot of time to sell their cards without competition, since all we have on Vega 10 right now are rumored dates, October at the soonest. Nvidia is going to have fun eating that portion of the market till then.
    There's not a lot of that market to eat, though. It's like 14-16% or so of the entire market. It'd be better to leave that market alone and go for the market that actually matters.

  4. #884
    The Lightbringer Evildeffy | 15+ Year Old Account | Join Date: Jan 2009 | Location: Nieuwegein, Netherlands | Posts: 3,772
    Quote Originally Posted by dadev View Post
    It was proven numerous times by Umbra. Games use it, and there's even integration ready for it in UE4. It is a well-known fact. And only idiots like Evildeffy keep on arguing.
    @Evildeffy, you know what, idgaf. Do what you want. Believe what you want. But know this: your opinion is worth shit, while my opinion is not mine alone. You have zero idea how these things work and yet you argue and argue to no end.
    You can keep telling yourself that; maybe it'll work out for you at some point.
    Also, I never lashed out at you, so good job with that. Well done.

    Quote Originally Posted by dadev View Post
    Like WTF? First you say DX11 doesn't scale? Now this?!
    Please show me exactly where I said "DX11 does not scale".
    I stated, numerous times, that DX11's draw-call submission is limited to a primary CPU thread, which I linked you, and you acknowledged that post, saying "He's saying what I am!".

    Quote Originally Posted by dadev View Post
    You never saw a single line of driver code in your life! Bullshit!
    Then you quote the article writer, who fails at interpreting his own graphs!!
    Then you comment on game engines and how to design them (for OpenGL of all things), in which again you have 0 experience.
    Oxide is funded by AMD.
    Then you miserably fail at interpreting MSDN. Good job, so far so good. But this is why MSDN is a mess, people like you fail on it non-stop.
    GameWorks developers? Fuck off! You have no idea what you're talking about! Engine developers are not responsible for tessellation, it's an artist tool. Idiot!
    And then we have some good old nvidia bashing, something about interrupt requests, like wtf?
    I think it's funny how you go from "This article is saying what I'm saying" to "This article is failing to interpret his own data!" on the same article.
    I also think it's funny where you pulled that magic "I comment on how to design (for OpenGL of all things)" when I clearly didn't comment on HOW to design things, least of all in OpenGL, of which I even stated I don't know too much beyond some of its functions.
    "Oxide funded by AMD" is even more hilarious... does that mean that Eidos-Montreal and Cloud Imperium Games are also funded by AMD? The developer themselves also admitted that nVidia has spent far more time with them than AMD. Well done.
    Also hilarious is me supposedly misinterpreting MSDN, followed by you blaming MSDN itself, when it clearly states what the tool does. So whom am I to believe? A nobody, or the large conglomerate that made the tool?

    Then we get to the best part... GameWorks not responsible for tessellation. I think the entire controversy around, for example, Witcher 3 disagrees with you entirely (among other games with GameWorks controversies).
    nVidia bashing is warranted when it comes to such tactics, as it's a fact; I have, however, stated more times than I care to count that their cards are good.

    And the fact you don't know what an Interrupt Request in a CPU cycle is should be concerning for a developer.

    But you know what, you are right about one thing:
    I do not give a fuck about someone who's acting like a little child, so this is my last post towards you as well. Enjoy your peace and quiet, and think of things as you will.
    I certainly do not care what you think of me, not even for a moment; when I go to bed tonight, I shall sleep like a baby.
    However, I've treated you with respect, so I expected a response in kind, but that seems to be lost on you.

    That Remilia is trying another approach to explain the things you seem not to understand is more than I expected of him, given the walls of text he has clearly read. I can only commend him for trying, but it's clearly futile.

    So I shall close off with this for you:
    Enjoy your own superhuman feats that no other developer seems capable of.
    I shall enjoy seeing you lead the graphics development industry into a new age; just be sure to use "dadev" as your nickname so I know it's you.

  5. #885
    Quote Originally Posted by Remilia View Post
    There's not a lot of that market to eat, though. It's like 14-16% or so of the entire market. It'd be better to leave that market alone and go for the market that actually matters.
    HAHA! ok....
    That's still 14-16% of the market that AMD has no place in atm, well, until much later, likely 2017. Meanwhile Nvidia will likely push their 1060 release to early fall or sooner to match the RX 480. Guess we'll see; to be honest, I don't see AMD making up any ground over the next 6 months. Just my opinion.
    Last edited by Bigvizz; 2016-06-22 at 11:50 AM.

  6. #886
    Quote Originally Posted by Bigvizz View Post
    HAHA! ok....
    There are a lot more people buying <$300 video cards than >$300 video cards. So it's actually the other way around: AMD is the only one offering new products for the majority of people buying these kinds of cards.

  7. #887
    Quote Originally Posted by Bigvizz View Post
    HAHA! ok....
    Rem was generous. The GTX x70 has an unexpectedly high market share, but the x80 and x80 Ti cards have much lower representation. The lower end of the lineup, a la the GTX x60 and x50, makes up a much bigger percentage, and that is what AMD is targeting.

  8. #888
    Deleted
    Quote Originally Posted by Fascinate View Post
    Thing is, AMD has no competition for the 1070/1080 and those are selling like HOTCAKES. I've literally not seen a single 1070/1080 in stock at Amazon or Newegg for MSRP. I'll never spend that kind of money on a GPU, so it looks like AMD gets my cash this time round.
    Don't confuse them being out of stock with them selling a lot. It most likely just means supply is low (or is being kept artificially low to give the impression they sell a lot / to increase desirability).

    That AMD slide showing most people don't go over $300 for a GPU makes a lot of sense and matches other similar sources.

    - - - Updated - - -

    Quote Originally Posted by WskyDK View Post
    Kinda kicking myself.
    Went ahead and bought a 1080 already. Probably should have waited for the 480 to release first, but I was getting sick of the fucking coil whine on my Strix 970.
    Definitely going to do more research before buying another card (likely next generation).
    You can probably resell it for a profit, since they are still very scarce, especially if it's still unopened.

    Though I have to wonder why a person who can/is willing to spend $700-800 on a GPU is happy to buy a $200-300 one.
    Last edited by mmoc982b0e8df8; 2016-06-22 at 01:34 PM.

  9. #889
    I am Murloc! WskyDK's Avatar
    10+ Year Old Account
    Join Date
    Oct 2010
    Location
    20 Miles to Texas, 25 to Hell
    Posts
    5,802
    Quote Originally Posted by Him of Many Faces View Post
    Though I have to wonder why a person who can/is willing to spend $700-800 on a GPU is happy to buy a $200 one.
    From (extremely) early benchmarks, the 480 in CrossFire appears to be on par with the 1080 for almost half the price. I'll always prefer a single-card solution, but if roughly the same performance can be had for that steep a price difference, then I'm paying for my impatience.
    Quote Originally Posted by Vaerys View Post
    Gaze upon the field in which I grow my fucks, and see that it is barren.

  10. #890
    Old God Vash The Stampede | 10+ Year Old Account | Join Date: Sep 2010 | Location: Better part of NJ | Posts: 10,939
    Quote Originally Posted by Bigvizz View Post
    HAHA! ok....
    That's still 14-16% of the market that AMD has no place in atm, well, until much later, likely 2017. Meanwhile Nvidia will likely push their 1060 release to early fall or sooner to match the RX 480. Guess we'll see; to be honest, I don't see AMD making up any ground over the next 6 months. Just my opinion.
    The high-end market is nothing compared to the mid-to-low end. I doubt the 1060 can match the 480. I'm willing to bet that Nvidia is going to rebrand the GTX 980 as the 1060, because it's the only way they can match the 480 in performance.

  11. #891
    Quote Originally Posted by Evildeffy View Post
    You can keep telling yourself that; maybe it'll work out for you at some point.
    Also, I never lashed out at you, so good job with that. Well done.
    I'm short-tempered, especially with people who never listen. And really, for that (and only that) I am willing to say sorry.

    Please show me exactly where I said "DX11 does not scale".
    I stated, numerous times, that DX11's draw-call submission is limited to a primary CPU thread, which I linked you, and you acknowledged that post, saying "He's saying what I am!".
    Here:
    A draw call can only be utilized on a single CPU core
    If you read about the primary advantages of DX12 and what it can do, it is advertised that everything is now natively parallel-capable and that draw calls and other commands are no longer executed serially.
    I can find more. Are you done playing this game? Before you ask: serial essentially means no multithreading.
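
    To make the contrast concrete, here's a rough C++ sketch of the DX12 model being argued about. It assumes an already-initialized device and queue, skips error handling, and all function/variable names are mine, not from any shipping engine: each worker thread records its own command list in parallel, and only the final ExecuteCommandLists on the queue is a serialized step.
    Code:
        #include <d3d12.h>
        #include <wrl/client.h>
        #include <thread>
        #include <vector>

        using Microsoft::WRL::ComPtr;

        // Record draw commands on several threads at once, then submit once.
        void RecordInParallel(ID3D12Device* device, ID3D12CommandQueue* queue, int numThreads)
        {
            std::vector<ComPtr<ID3D12CommandAllocator>> allocators(numThreads);
            std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(numThreads);

            for (int i = 0; i < numThreads; ++i) {
                device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                               IID_PPV_ARGS(&allocators[i]));
                // Command lists are created in the recording state.
                device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                          allocators[i].Get(), nullptr,
                                          IID_PPV_ARGS(&lists[i]));
            }

            std::vector<std::thread> workers;
            for (int i = 0; i < numThreads; ++i) {
                // Unlike a DX11 immediate context, each list can be recorded
                // concurrently with all the others.
                workers.emplace_back([&lists, i] {
                    // ... SetPipelineState / ResourceBarrier / DrawInstanced here ...
                    lists[i]->Close();
                });
            }
            for (auto& w : workers) w.join();

            // Submission is the one serialized step.
            std::vector<ID3D12CommandList*> raw;
            for (auto& l : lists) raw.push_back(l.Get());
            queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
        }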

    I think it's funny how you go from "This article is saying what I'm saying" to "This article is failing to interpret his own data!" on the same article.
    1. Please don't misquote. An article can't fail to interpret its data; its author can.
    2. I'll be more precise, since you can't read between the lines. You linked the article, I looked at the graph (because I really don't need someone to chew up the data for me), and it told the same story as I did. I pointed that out. Then you proceeded to quote that author on something tangential which I couldn't care less about, so I dismissed it. Now I have read it, and the answer depends on the support provided by the driver. He's absolutely wrong for NVidia's driver and right for AMD's and Intel's. This is because Microsoft said: "vendors, please implement this, but if you won't, we'll emulate". Blaming the seriality on runtime emulation is a low move. It's like saying DX12 is a crap API because of WARP. Do you agree with that?
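
    As a side note, whether a DX11 driver actually implements command lists natively or leaves it to the runtime emulation mentioned above can be queried directly. A minimal sketch, assuming an initialized ID3D11Device (the function name is mine):
    Code:
        #include <d3d11.h>
        #include <cstdio>

        void CheckDriverThreading(ID3D11Device* device)
        {
            D3D11_FEATURE_DATA_THREADING caps = {};
            if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                                      &caps, sizeof(caps)))) {
                // FALSE here means the DX11 runtime emulates deferred-context
                // command lists in software instead of the driver doing it.
                std::printf("Driver command lists:       %s\n",
                            caps.DriverCommandLists ? "native" : "runtime-emulated");
                std::printf("Concurrent resource create: %s\n",
                            caps.DriverConcurrentCreates ? "yes" : "no");
            }
        }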

    I also think it's funny where you pulled that magic "I comment on how to design (for OpenGL of all things)" when I clearly didn't comment on HOW to design things and least of all in OpenGL of which I even stated I don't know too much about except some of it's functions.
    OK, explain this then:
    Correct it is, but when it was implemented, WoW's engine was entirely overhauled; it wasn't just patched in, it was included with a new expansion. I don't remember if it was Cataclysm or MoP.
    I might have taken it a little too far, but I read it as a frown: "They overhauled the entire engine and didn't put in proper DX11 support? No way, man."
    But yes way, and your comment has everything to do with engine design. Very few engines were made for 2 or 3 vastly different APIs and turned out good in most respects on all intended targets; right now I can think of only one example. Making WoW's engine properly multithreaded for DX11 would require a tremendous effort to achieve the same for the DX9 and OpenGL paths. And no, you cannot just say "don't enable this for DX9/OpenGL"; this changes how the entire engine is built.
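
    To illustrate why it can't just be toggled off per API, here's a hypothetical sketch (all interface and type names invented) of the kind of backend contract a multithreaded DX11 path forces on the engine, and what that does to a DX9 path:
    Code:
        #include <vector>

        struct RecordedCommands { /* opaque per-backend command data */ };

        class RenderBackend {
        public:
            virtual ~RenderBackend() = default;
            // Intended to be called from N worker threads at once: trivial to
            // back with DX12 command lists or DX11 deferred contexts...
            virtual RecordedCommands RecordChunk(const std::vector<int>& drawIds) = 0;
            virtual void Submit(std::vector<RecordedCommands> chunks) = 0;
        };

        class Dx9Backend : public RenderBackend {
        public:
            // ...but DX9 has a single device and no deferred contexts, so
            // "recording" can only build a software command buffer that
            // Submit() replays serially. Job scheduling, resource lifetimes
            // and synchronization all have to account for this path: the
            // threading assumption leaks into the whole engine.
            RecordedCommands RecordChunk(const std::vector<int>&) override { return {}; }
            void Submit(std::vector<RecordedCommands>) override { /* replay in order */ }
        };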

    "Oxide funded by AMD" is even more hilarious... does that mean that Eidos-Montreal and Cloud Imperium Games are also funded by AMD? The developer themselves also admitted that nVidia has spent far more time with them than AMD. Well done.
    No idea about the other companies, but vendor time spent is irrelevant. In fact, I would expect NVidia to spend more time, so that they can cover for the performance lost on async.

    Also hilarious is me supposedly misinterpreting MSDN, followed by you blaming MSDN itself, when it clearly states what the tool does. So whom am I to believe? A nobody, or the large conglomerate that made the tool?
    See this:
    Because you're not even looking at the basics: GPUView still looks at the DirectX API, thus you're still a level above where you need to be.
    The logger traces DMA packets going to the kernel after they've been chewed up by virtually everyone. This is as close as you ever get to the hardware, save for a dedicated tool from NVidia/AMD. Then you proceed to quote MSDN and fail to interpret it (mainly because it wasn't saying anything of significance, but how could you know that?).
    The MSDN quote you're looking for is:
    Quote Originally Posted by MSDN
    It looks at performance with regard to direct memory access (DMA) buffer processing and all other video processing on the video hardware. GPUView is useful for developing display drivers that comply with the Windows Vista display driver model. GPUView is introduced with the release of the Windows 7 operating system.
    Does it require more explanation, or do you now understand that the logger doesn't look at the DX API?

    Then we get to the best part... GameWorks not responsible for tessellation. I think the entire controversy around, for example, Witcher 3 disagrees with you entirely (among other games with GameWorks controversies).
    Get this once and for all: artists decide on tessellation. Not engine developers, nor GameWorks developers. The latter groups are responsible for sensible tessellation factors, nothing more. If the art guys who made Witcher 3 decided to use more tessellation, it was their artistic decision.
    I'll walk you through it once again so that you can see the difference: tessellation is a tool for artists; async, in contrast, is a tool for engine developers.
    Now, what happened is that NVidia probably provides a full-blown solution for hair and such (I'm really not into GameWorks, except for using it for SLI in VR), and surprise surprise, there's a lot of tessellation in there. The artists, once again, the artists, liked it, and magic happened.
    But then again, if you have a problem with this, why not bring it up with AMD to make a better tessellator? Do you also blame AMD for not playing fair with async because Maxwell doesn't have it? Why should one vendor be excused for not supporting an artistically or technologically demanded feature while the other does?
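
    For the sake of illustration, here's roughly that split of responsibility as a sketch (function, struct and numbers are all invented, not from any actual engine): the artist authors a tessellation level on the material, and it's the engine-side code that clamps it to something sensible before it ever reaches the hull shader.
    Code:
        #include <algorithm>

        struct Material { float tessLevel; };   // authored by artists

        float EffectiveTessFactor(const Material& m, float distanceToCamera)
        {
            const float kMaxFactor = 16.0f;      // engine-chosen cap
            // Fade tessellation out between 10 and 100 units from the camera.
            float fade = std::clamp(1.0f - (distanceToCamera - 10.0f) / 90.0f,
                                    0.0f, 1.0f);
            // Never below 1 (no tessellation), never above the cap, no matter
            // what the asset asks for; this is the "sensible factors" part
            // that falls on developers rather than artists.
            return std::clamp(m.tessLevel * fade, 1.0f, kMaxFactor);
        }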

    And the fact you don't know what an Interrupt Request in a CPU cycle is should be concerning for a developer.
    How did you come to that conclusion? It's much more likely that I know more about it than you. But then again, I dismissed yet another of your attempts to derail the discussion into something entirely unrelated to the original topic, so that your false statements can be concealed by meaningless noise. Visibility culling doesn't require interrupt requests or anything of the sort (why on earth would it?). You point at a game that didn't use a good object-culling system to claim it can't be done well on the CPU? Your logic is flawless! The current facts are that per-object culling can be done better on the CPU; deal with it. Will this change in the future? Who knows, maybe a quantum computer will finally be ready in a month and then visibility culling will forever be better on the CPU. So some marketing guy said something, and you're ready to take his word just like that?
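
    For reference, CPU-side per-object culling really is just this kind of loop. A minimal sketch (standard plane-vs-AABB test; structure and function names are mine), with no interrupts or GPU round-trips involved, just arithmetic over a flat array:
    Code:
        #include <array>
        #include <cmath>
        #include <vector>

        struct Plane { float nx, ny, nz, d; };            // n.x*x + n.y*y + n.z*z + d >= 0 is "inside"
        struct Aabb  { float cx, cy, cz, ex, ey, ez; };   // center + half-extents

        bool Visible(const std::array<Plane, 6>& frustum, const Aabb& box)
        {
            for (const Plane& p : frustum) {
                // Projected radius of the box onto the plane normal.
                float r = box.ex * std::abs(p.nx) + box.ey * std::abs(p.ny)
                        + box.ez * std::abs(p.nz);
                float dist = p.nx * box.cx + p.ny * box.cy + p.nz * box.cz + p.d;
                if (dist + r < 0.0f) return false;        // fully outside this plane
            }
            return true;                                   // inside or intersecting
        }

        std::vector<int> CullVisible(const std::array<Plane, 6>& frustum,
                                     const std::vector<Aabb>& boxes)
        {
            std::vector<int> visible;
            for (int i = 0; i < static_cast<int>(boxes.size()); ++i)
                if (Visible(frustum, boxes[i])) visible.push_back(i);
            return visible;
        }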

    However, I've treated you with respect
    You certainly did not! Ignoring and arguing against what you've been told for the Nth time is pure disrespect. Not spending any time researching things you don't know about, and then arguing about them, is again disrespect. Posting random stuff to derail the discussion is again disrespect. You've given me nothing but disrespect, so you get paid back in exactly the same coin.
    Last edited by dadev; 2016-06-22 at 04:19 PM.

  12. #892
    The Lightbringer Evildeffy | 15+ Year Old Account | Join Date: Jan 2009 | Location: Nieuwegein, Netherlands | Posts: 3,772
    Quote Originally Posted by dadev View Post
    <A concrete wall of text>
    I had already stated that I will no longer go into this discussion, since it is beyond clear that you and I do not see things the same way.
    Take that as you will, I do not care; if you wish to think of it as your victory, then go right ahead.

    I will, however, state that, regardless of how incorrectly you define respect, I accept your apology.

  13. #893
    Warchief Zenny | 10+ Year Old Account | Join Date: Oct 2011 | Location: South Africa | Posts: 2,171
    Turns out those 1500-1600 MHz on air claims might have been a bit presumptuous:

    http://wccftech.com/radeon-rx-480-thermal-tests-leak/

    We can only know for certain once reviews are in.

  14. #894
    @Evildeffy, for me the only thing that's clear is that you refuse to accept facts. It is not a matter of seeing things differently, but of ignoring facts. Seeing things differently is liking a movie or not; when you say that 1+1=3 and insist on it, that's not "seeing things differently".

    For instance, I completely disagree with Remilia: I think the chance that per-object visibility culling on the GPU will be more efficient is quite slim, and even if it were, the effort spent on it could've gone into something far more useful. That is a matter of seeing things differently.

  15. #895
    Quote Originally Posted by Zenny View Post
    Turns out those 1500-1600 MHz on air claims might have been a bit presumptuous:

    http://wccftech.com/radeon-rx-480-thermal-tests-leak/

    We can only know for certain once reviews are in.
    Not just on air, though: on the reference design. Did you really expect the blower-style reference design, with the reference PCB, to be a great overclocker? Best to wait and see what the third-party cooling systems look like and how they do.

  16. #896
    So this 480 (4 GB version) is going to be better than a GTX 980 without OC'ing, or what? They say it's between the GTX 980 and 980 Ti in terms of performance. Is that correct?

  17. #897
    Quote Originally Posted by Kuntantee View Post
    So this 480 (4 GB version) is going to be better than a GTX 980 without OC'ing, or what? They say it's between the GTX 980 and 980 Ti in terms of performance. Is that correct?
    Also some AMD 470 scores on there.
    [attached benchmark screenshots not preserved]
    Last edited by Turaska; 2016-06-22 at 06:48 PM.

  18. #898
    What is the point of GTX 980 Ti?

    Edit: Okay, it's probably a bit higher than Titan X

    --

    Soooo, GTX 980 for 200 bucks. Sounds good to me.
    Last edited by Kuntantee; 2016-06-22 at 06:46 PM.

  19. #899
    Quote Originally Posted by Kuntantee View Post
    What is the point of GTX 980 Ti?

    Edit: Okay, it's probably a bit higher than Titan X

    --

    Soooo, GTX 980 for 200 bucks. Sounds good to me.
    Personally I'd splash out a little more for a 1070 and just overclock it; it'll probably last you longer than a 980 would.

  20. #900
    Quote Originally Posted by Turaska View Post
    Personally I'd splash out a little more for a 1070 and just overclock it; it'll probably last you longer than a 980 would.
    Doubling the price isn't exactly "a little more", just to play the same resolutions a little bit faster.
