  1. #541
    Quote Originally Posted by Tiberria View Post
    The FreeSync vs GSync thing is a terrible comparison and just rationalizing poor price/performance ratios. Bottom line is, most people do not replace their monitor whenever they replace their GPU or even do a full system build, and anyone that wants one of those technologies probably already has it. If anything, using FreeSync is actually a deterrent to adoption, because anyone that wanted performance above the mainstream GTX 1060 level has been forced into Nvidia cards for 2+ years now, meaning that they probably have GSync monitors that would lose functionality if they switched manufacturers.

    Not only that, but any perceived advantage in having the cheaper/open source sync technology is more than nullified by the PSU and power usage requirements. If they want to use the monitor pricing thing as their rationalization, surely they should also factor in the price of the better PSU that Vega will need AND the extra electricity costs over the lifespan of the card.
    There are people that aren't in the 1080+ price range, let alone the extra GSync pricing. There is definitely a market for FreeSync. The market for Vega will be determined by the price and performance.

  2. #542
    Evildeffy
    Quote Originally Posted by Thunderball View Post
    It was in the 8000 series somewhere, and was also in notebooks everywhere. It probably went into a lot of budget cards then. Nothing new, honestly; GF108 is another example. Extreme budget cards were always like that. AMD just rebranded most of their product line, including the chips, two times, including high-end cards.
    Very well, humour me: which flagship has been rebranded 2 times by AMD, according to you?
    Also, the G92 chip wasn't used in notebooks and was rebranded, I believe, 5 times in total; that's an nVidia chip.
    Please don't act like either company is a saint; they are not, and rebranding does make sense from a financial and stock PoV.

    Quote Originally Posted by Thunderball View Post
    Vega FE is out, and Apple released the specs for their Vega-based cards; all of it indicates that Vega FE is the top silicon they've got. The only thing they can improve at this point is clocks, and clock scaling is pretty easy to predict. We also have Fury X pricing, so there is no way to expect the top RX Vega to be significantly cheaper. We don't know the final level of performance, that's for sure, but we can tell for sure that it won't compete with the GTX 1080 Ti.

    You are the first person on this forum I'd accuse of AMD fanboyism; don't fool yourself.
    So because we know Vega FE's specs and abilities, we are 100% sure of what RX Vega will be?
    I just gave you a prime example of why that can change very quickly.

    We can assume things, but we can never know them until they are revealed, and because I prefer to go by the scientific way of cold, hard facts rather than assumptions, that makes me a fanboy? Get over yourself.

    Your method of life is: "Oh look, that guy's brother is a terrorist, therefore we can logically assume he and the rest of his family are too!"
    I am SUCH an AMD fanboy that I have an Intel Core i7-990X, an Intel Core i7-4960X, an Intel Core i7-3770K and a SuperMicro X9SPV-M4 with an Intel Core i7-3555LE (server), along with my self-built NAS system, which has the ASUS E45M1I-Deluxe motherboard with an integrated AMD Fusion E450 CPU/GPU. What a HUGE AMD fanboy I am.

    I also have a GTX 1080 (GigaByte G1 Gaming), an ASUS GTX 760 ITX, an MSI Radeon R9 390X Gaming and integrated Intel HD graphics.

    Yes, I am SUCH a huge AMD fanboy that I have a total of 2 products of theirs, 1 being a low-power NAS/HTPC mobo and 1 a graphics card, vs. the 2 nVidia cards and 4 other Intel-based systems.

    I correct people and get corrected when I'm wrong, which I've always admitted to in the past.
    I post what is rumour and what is fact; because you are unwilling to wait for facts and would rather work on assumptions, that makes me a fanboy?

    That is arrogance of the first degree of stupidity. You may want to work that way, and no one will stop you, but don't come in here presenting things as fact when we aren't 100% sure they are correct in the first place.

    I'll ask you in the simplest of terms for you to understand:
    Do you know for a fact, and not from a rumour, what the EXACT specifications and drivers are for RX Vega?
    If the answer to that question is "No" or "Yes, because I know Vega FE", then the answer is still "No".

    Conjecture/Assumption != Facts.

    Quote Originally Posted by Thunderball View Post
    I'll remind you.

    http://cdn.wccftech.com/wp-content/u...MD-Ryzen_2.png - This is Cinebench. AMD ended up using dual-channel memory and downclocking the i7-6900K to get those results. There were a bunch of other similarly conducted "tests" in that presentation.
    Funny how at the Horizon event it was still winning against the 6900K with its superior clocks and quad-channel memory, and it still does.
    Cinebench cares not for dual- vs. quad-channel memory; the results are still the same, so I don't see the point of you posting a correct score anyway.

    Quote Originally Posted by Thunderball View Post
    http://cdn.wccftech.com/wp-content/u...erformance.jpg - Those are results against a 7700K, both stock at 1440p, using a GTX 1070. They also had a 6800K with a similar handicap as before. Basically bottlenecking the system to hide the difference in performance.
    The funny thing is that was a press slide that was leaked; if you read the actual slide you'll see that they hid nothing and explicitly gave the system details to the press there.
    Also funny is how a 6900K was pitted against a Ryzen 7 1800X at the Horizon event with the same specs, the only difference being the CPU, and it still beat the 6900K without any hamstringing.
    You want to talk about leaks, yet you fail to include the launch event data as well, where there was no deception.
    Or were the results they showed (which you could test for yourself) entirely fake, even though the entire community corroborated them?

    Quote Originally Posted by Thunderball View Post
    If that's not lying, it's at least misleading. Only one slide focusing on price to performance (misleading as well, showing the 6900K on par with the R7 1700 in "composite performance").
    It's showing the 6900K above the R7 1700 and R7 1700X and below the R7 1800X in performance, but not in value; there's nothing wrong with that slide, and if anything it's gotten better with the price cuts.
    Or are you telling me right now, straight to everyone's face, that Intel has the better price/performance ratio?

    I'd like to know the answer to that question.

    Answer with logic and facts, no conjecture.
    And if you can't do that, then don't answer at all.

  3. #543
    Quote Originally Posted by Gray_Matter View Post
    There are people that aren't in the 1080+ price range, let alone the extra GSync pricing. There is definitely a market for FreeSync. The market for Vega will be determined by the price and performance.
    I'd argue that at the price/performance range AMD has been sitting at recently (with the RX 480/580 only playing at the 1060 level), there is no market for a premium monitor feature set like FreeSync. They would need to sit at the 1080+ performance level and stay there for it to be all that relevant, and I don't know that coming in at that performance level now, a full year after it's been available through Nvidia cards, is all that compelling. You would have to think that almost anyone that wants 1080-level performance/GSync has probably already bought a 1080 or 1080 Ti and has already bought or is saving for a GSync monitor.

  4. #544
    Quote Originally Posted by Tiberria View Post
    I'd argue that at the price/performance range AMD has been sitting at recently (with the RX 480/580 only playing at the 1060 level), there is no market for a premium monitor feature set like FreeSync. They would need to sit at the 1080+ performance level and stay there for it to be all that relevant, and I don't know that coming in at that performance level now, a full year after it's been available through Nvidia cards, is all that compelling. You would have to think that almost anyone that wants 1080-level performance/GSync has probably already bought a 1080 or 1080 Ti and has already bought or is saving for a GSync monitor.
    Actually, adaptive sync is better for lower-end cards. If you have a higher-end card and are pushing past your monitor's refresh rate, adaptive sync does nothing for you. As a replacement for V-Sync, what it does do is prevent your dips below 60 from showing you 30 FPS even when you are rendering at 55. So here's a PCGamer 16-game average:
    http://www.pcgamer.com/geforce-gtx-1060-review/

    With a 1080p 60Hz monitor with either G-Sync or FreeSync, the only cards that are even going to benefit from adaptive sync are the 380X, 380, 960 and 950. Adaptive sync would be doing absolutely nothing for the rest. If you step up to a 75Hz FreeSync panel, it benefits the 480 but not the 1070/1080. Even if you had a 120Hz monitor, it's not doing anything at all for the 1080. You'd need a 1080p 144Hz G-Sync panel before the G-Sync makes a difference on the 1080.


    For some reason, a whole lot of people seem to think adaptive sync is for top-end video cards. This is not true; top-end video cards blow past the FPS range where adaptive sync makes any sort of difference. The lower-end the card you have, the more adaptive sync helps you.
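    To put a rough number on that, here's a back-of-the-envelope sketch (my own simplified model, assuming plain double-buffered V-Sync with no triple buffering, not anything official) of what the panel actually ends up showing for a given render rate:

    import math

    def displayed_fps_vsync(render_fps, refresh_hz):
        # Double-buffered V-Sync waits whole refresh intervals, so the displayed
        # rate snaps down to refresh_hz / 1, / 2, / 3, ... (60 -> 30 -> 20 on 60Hz).
        if render_fps >= refresh_hz:
            return refresh_hz
        return refresh_hz / math.ceil(refresh_hz / render_fps)

    def displayed_fps_adaptive(render_fps, refresh_hz):
        # Adaptive sync (FreeSync/G-Sync) refreshes the panel when the frame is
        # ready, so inside the panel's range you simply see the render rate.
        return min(render_fps, refresh_hz)

    for fps in (55, 45, 90, 140):
        print(fps, displayed_fps_vsync(fps, 60), displayed_fps_adaptive(fps, 60))

    On a 60Hz panel, that 55 FPS dip shows as 30 FPS with V-Sync but as 55 FPS with adaptive sync, while anything already at or above 60 comes out the same either way, which is exactly why the benefit shrinks the faster the card is.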

  5. #545
    Quote Originally Posted by Lathais View Post
    Actually, adaptive sync is better for lower-end cards. If you have a higher-end card and are pushing past your monitor's refresh rate, adaptive sync does nothing for you. As a replacement for V-Sync, what it does do is prevent your dips below 60 from showing you 30 FPS even when you are rendering at 55. So here's a PCGamer 16-game average:
    http://www.pcgamer.com/geforce-gtx-1060-review/

    With a 1080p 60Hz monitor with either G-Sync or FreeSync, the only cards that are even going to benefit from adaptive sync are the 380X, 380, 960 and 950. Adaptive sync would be doing absolutely nothing for the rest. If you step up to a 75Hz FreeSync panel, it benefits the 480 but not the 1070/1080. Even if you had a 120Hz monitor, it's not doing anything at all for the 1080. You'd need a 1080p 144Hz G-Sync panel before the G-Sync makes a difference on the 1080.


    For some reason, a whole lot of people seem to think adaptive sync is for top-end video cards. This is not true; top-end video cards blow past the FPS range where adaptive sync makes any sort of difference. The lower-end the card you have, the more adaptive sync helps you.
    That's because at the budget range where you are considering lower-end cards, you probably don't have the extra budget to invest in a premium monitor upgrade. And if you do have that extra $100-$200, most people would rather just put it into a better GPU or CPU than into the monitor.

  6. #546
    Quote Originally Posted by Tiberria View Post
    That's because at the budget range where you are considering lower-end cards, you probably don't have the extra budget to invest in a premium monitor upgrade. And if you do have that extra $100-$200, most people would rather just put it into a better GPU or CPU than into the monitor.
    Yeah, which is why, IMO, adaptive sync really makes no sense at all yet. It only benefits those with lower-end graphics cards, but the money spent on it would be better spent just getting a better graphics card.

  7. #547
    There really isn't a premium on G-Sync if you are looking at high-end monitors (which people should be putting ahead of the GPU in their priority list of things to buy).
    https://pcpartpicker.com/products/mo...t=price&page=1

    The cheapest 1440p 144Hz panel has G-Sync. The prices are crazy on the 1080p G-Sync models, but who buys a 1080p monitor for a PC that costs north of a grand?

  8. #548
    Adaptive sync will still help prevent tearing with high-end GPUs; you just have to cap your frame rate slightly below what your panel can do.
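    For example, something like this (a minimal sketch of a software frame limiter; the "refresh minus a few FPS" cap is just the usual rule of thumb, and the numbers are only assumptions for illustration) keeps the render rate inside the adaptive sync window:

    import time

    REFRESH_HZ = 144          # panel refresh rate, assumed for this example
    CAP_FPS = REFRESH_HZ - 3  # cap a few FPS below refresh to stay in the sync range
    FRAME_BUDGET = 1.0 / CAP_FPS

    def run_capped(render_frame, seconds=5):
        # Simple limiter: sleep away whatever is left of each frame's time budget
        # so the GPU never outruns the panel's adaptive sync range.
        end = time.monotonic() + seconds
        while time.monotonic() < end:
            start = time.monotonic()
            render_frame()
            leftover = FRAME_BUDGET - (time.monotonic() - start)
            if leftover > 0:
                time.sleep(leftover)

    run_capped(lambda: None)  # stand-in for the real draw call

    In practice you'd set the same cap in the game or an external limiter rather than writing it yourself; the point is just that the cap sits a little below the refresh rate.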

  9. #549
    Well, yeah, adaptive sync is good for any strength of card, but the true benefit of the technology is that it allows you to spend less on your graphics card. Anything over ~80-90 FPS is the same to me as 165 FPS; there is no possible way I could differentiate those framerates from each other. If I bought a 1080 or Ti down the road, all that would allow me to do is increase graphics settings, which in most games are hardly distinguishable between high and medium, with ultra being a setting I will never use in any game because of its poor optimization.

    A 1060 and a G-Sync monitor is the best combo people can get into, a way better idea than buying a 1080 Ti for some crappy 1080p monitor.
