  1. #41
    Quote Originally Posted by munkeyinorbit View Post
    hahaha. Owned. My thought is that some fanboy wanted to start an Internet "discussion" when they don't even understand what is going on.
    How was that being owned? The article doesn't support what he said. He said the 56 crushes the 1070 and the 64 beats the 1080 handily, and both of those claims were wrong. The 56 does beat the 1070, but it doesn't crush it, and the 64 isn't on the 1080's level.

    There's no fanboyism here, I want AMD to succeed and push Nvidia, I want there to be actual competition in the GPU market because it will benefit us the consumer.

  2. #42
    https://www.youtube.com/watch?v=roNc1AAevMM

    Vega 64, even OCed to the gills, can't catch a mildly overclocked 1080.

  3. #43
    Quote Originally Posted by Xinkir View Post
    Looks like those 4k 120-165 hz panels coming out soon will have to wait to really be fully utilized.
    We don't even have many 1440p 144Hz IPS panels; let's start with those.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  4. #44
    Quote Originally Posted by Xinkir View Post
    Looks like those 4k 120-165 hz panels coming out soon will have to wait to really be fully utilized.
    Yeah, that's why I went with a 1440p monitor now. It's not IPS, but honestly I don't care; it's a great G-Sync monitor. Those 120-165Hz 4K monitors will be too expensive too, so waiting a few years for the tech as well as prices to drop on good 4K monitors will be good.

  5. #45
    I'd imagine with the inclusion of variable refresh rates in the HDMI 2.1 spec we're going to see G-Sync and FreeSync go the way of the dodo in the next few years.

  6. #46
    Well, this all makes me sad. My 980 Ti struggles a bit at 1440p in several games I enjoy.
    I wanted to skip the 1080, and Vega seems to just be a thirstier 1080 as well. I really hoped
    Volta would be out before Christmas; I wanted to toss it into a new Coffee Lake rig. Oh well, maybe next spring.

  7. #47
    Quote Originally Posted by kaelleria View Post
    I'd imagine with the inclusion of variable refresh rates in the HDMI 2.1 spec we're going to see G-Sync and FreeSync go the way of the dodo in the next few years.
    ^ This.

    Mass implementation of HDMI 2.1 will take a while, but it's inevitable, just like 2.0 replaced 1.4.

  8. #48
    Quote Originally Posted by kaelleria View Post
    I'd imagine with the inclusion of variable refresh rates in the HDMI 2.1 spec we're going to see G-Sync and FreeSync go the way of the dodo in the next few years.
    When/if that happens, I'll be happy. I love my monitor, but for a 4K monitor with G-Sync it was a tad ridiculous; I could've bought a 55-60" TV for the same price. I'd love to see monitor prices normalized again.

  9. #49
    Quote Originally Posted by RuneDK View Post
    When/if that happens, I'll be happy. I love my monitor, but for a 4K monitor with G-Sync it was a tad ridiculous; I could've bought a 55-60" TV for the same price. I'd love to see monitor prices normalized again.
    It'll take 2-3 years, more than likely. We'll start seeing super expensive HDMI 2.1 stuff at CES 2018, then at CES 2019 we'll see all the low-cost, off-brand panels with HDMI 2.1.

    Give or take a year, of course. All you have to do is look at how HDR has played out to get a feel for the timeline and pricing.

  10. #50
    Looks like I'm forced to buy some GPU now and then sell it in March to buy Volta, if it's really good.

  11. #51
    Quote Originally Posted by kaelleria View Post
    I'd imagine with the inclusion of variable refresh rates in the HDMI 2.1 spec we're going to see G-Sync and FreeSync go the way of the dodo in the next few years.
    Not going to happen with FreeSync, as the very technology you mention is what's actually being used in the HDMI 2.1 spec you're cheering for.

    There's also the fact that it's an OPTIONAL technology, not MANDATORY for remaining compliant with the spec.

    FreeSync works over DisplayPort and HDMI; why do you think HDMI absorbed it into their spec? FreeSync yay.

    AMD works with open standards, and "FreeSync" is nothing but a modified "Adaptive Sync", which was already in the DisplayPort standard long before FreeSync existed.
    And FreeSync is fully compliant with the "Adaptive Sync" standard as well as the HDMI one you're about to get... so the only one you'll be missing is nVidia.

    Have fun with that

  12. #52
    Quote Originally Posted by kaelleria View Post
    I'd imagine with the inclusion of variable refresh rates in the HDMI 2.1 spec we're going to see G-Sync and FreeSync go the way of the dodo in the next few years.
    Doesn't that basically just mean you'll be able to use the various forms of Adaptive Sync (FastSync/FreeSync/G-Sync) over HDMI 2.1 instead of being required to use DisplayPort, as is currently the case? While I could be wrong, this is not going to replace G-Sync and FreeSync; it's just going to allow them to be used over HDMI instead of requiring DisplayPort.

  13. #53
    Quote Originally Posted by Evildeffy View Post
    Not going to happen with FreeSync, as the very technology you mention is what's actually being used in the HDMI 2.1 spec you're cheering for.

    There's also the fact that it's an OPTIONAL technology, not MANDATORY for remaining compliant with the spec.

    FreeSync works over DisplayPort and HDMI; why do you think HDMI absorbed it into their spec? FreeSync yay.

    AMD works with open standards, and "FreeSync" is nothing but a modified "Adaptive Sync", which was already in the DisplayPort standard long before FreeSync existed.
    And FreeSync is fully compliant with the "Adaptive Sync" standard as well as the HDMI one you're about to get... so the only one you'll be missing is nVidia.
    with HDMI 2.1 you are not forced into the FreeSync ecosystem, forced into their *cough, subpar* high-end GPUs, or reliant on Radeon drivers to have FreeSync work consistently and bug-free

    yay!


    in theory, even Intel iGPUs, once they start shipping with HDMI 2.1, will be able to do VRR



    Doesn't that basically just mean you'll be able to use the various forms of Adaptive Sync (FastSync/FreeSync/G-Sync) over HDMI 2.1 instead of being required to use DisplayPort, as is currently the case? While I could be wrong, this is not going to replace G-Sync and FreeSync; it's just going to allow them to be used over HDMI instead of requiring DisplayPort.
    replace in this case means "made obsolete" (presumably)

    by having an HDMI 2.1 device you will have VRR

    - - - Updated - - -

    the biggest part here is that with HDMI 2.1 we will finally get VRR on TVs, at least

    FreeSync has already worked over HDMI for a year or more, IIRC

    and yet in all that time, as far as I know, there have been zero FreeSync (much less FreeSync 2) TVs announced, much less actually released... TV manufacturers just don't care about FreeSync over HDMI

    but they will care about HDMI 2.1
    Last edited by Life-Binder; 2017-08-18 at 03:42 PM.

  14. #54
    Quote Originally Posted by kaelleria View Post
    It'll take 2-3 years, more than likely. We'll start seeing super expensive HDMI 2.1 stuff at CES 2018, then at CES 2019 we'll see all the low-cost, off-brand panels with HDMI 2.1.

    Give or take a year, of course. All you have to do is look at how HDR has played out to get a feel for the timeline and pricing.
    I'm personally fine with the timeline. HDR is a nice addition, and HDMI 2.1 seems like it'll be a great addition as well. Normalizing the price of 4K monitors in the next five years is something I'm going to be looking forward to; the current prices being so ridiculous is really my problem. I love my PC, but $800 for a 4K monitor right now is absurd. Can't wait to see the price of 120-165Hz 4K monitors in a few months/a year... it will be even more ridiculous.

  15. #55
    I don't see how HDMI 2.1 will drop prices on 4K monitors, tbh

    if anything, if HDMI 2.1 costs them more to implement, prices may rise a little

  16. #56
    Quote Originally Posted by Life-Binder View Post
    I dont see how HDMI 2.1 will drop prices on 4K monitors tbh

    if anything if HDMI 2.1 costs them to implement then the prices may rise a little
    Like all technology, the price comes down over time. At first, yes, HDMI 2.1 will be expensive, but it should come down over time. G-Sync is a "premium" feature right now; HDMI 2.1 probably won't be a "premium" feature in five years.

  17. #57
    Quote Originally Posted by Life-Binder View Post
    with HDMI 2.1 you are not forced into the FreeSync ecosystem, forced into their *cough, subpar* high-end GPUs, or reliant on Radeon drivers to have FreeSync work consistently and bug-free

    yay!

    in theory, even Intel iGPUs, once they start shipping with HDMI 2.1, will be able to do VRR
    You don't seem to understand... "FreeSync" isn't really AMD's ecosystem; it's DisplayPort's ecosystem.
    And it doesn't matter, bugs will always remain... you think G-Sync works without issues? Drivers and VRR will always require updating.

    Intel already supports "Adaptive Sync", which is DisplayPort's own, and by extension supports AMD's FreeSync as well, as the two are fully compliant with each other.

    And to repeat: AMD's FreeSync over HDMI will be used, as it is, again, just "Adaptive Sync" for your TVs etc.
    Whether you like it or not, going with HDMI 2.1 you WILL be entering FreeSync territory, and it'll be free to use... whether you want to or not is another matter.

    nVidia will NOT follow this as they will remain in their own proprietary ecosystem.

    Quote Originally Posted by Life-Binder View Post
    replace in this case means "made obsolete" (presumably)

    by having a HDMI 2.1 device you will have VRR

    the biggest part here is that with HDMI 2.1 we will finally get VRR on TVs at least

    Freesync already works over HDMI since a year or more ago IIRC

    and yet in all that time, as far as I know, there have been zero Freesync (much less Freesync 2) TVs announced, much less actually released .. TV manufacturers just dont care about Freesync over HDMI

    but they will care about HDMI 2.1
    Because TVs generally have no use whatsoever for VRR technology.
    98% of TV use is movies, series, etc., which have zero use for VRR because they run at a fixed low frame rate.
    The other 1.9% is console games, which don't go beyond 60Hz regardless, are often lower, and are fixed at either 30 or 60, making VRR once again mostly useless.

    And TVs are generally not used as monitors, so there has been no need to introduce something like VRR into TVs, and you'll still mostly not see it even if the standard comes out, because unless consoles start becoming beastly PCs pushing 100+ FPS you'll still not see it.
    Why do you think there are barely any TVs out there using DisplayPort connectors and technology?

    Unfortunately .. both FreeSync and G-Sync technologies will remain to muck things up!
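    The fixed-cadence argument in post #57 can be sketched with a toy function. This is purely illustrative: the function name, the 40-120Hz VRR window, and the 60Hz fallback are made-up example values, not anything from the HDMI 2.1 spec or a real panel.

```python
# Toy model: a VRR display tracks the source frame rate when it
# falls inside the panel's supported range; otherwise it behaves
# like a fixed-refresh panel. All range numbers are invented.
def choose_refresh_hz(fps, vrr_min=40, vrr_max=120, fixed_hz=60):
    if vrr_min <= fps <= vrr_max:
        return fps       # refresh tracks the content frame rate
    return fixed_hz      # fall back to a fixed cadence

# 24 fps film and 30 fps console games fall below a typical VRR
# window, so VRR buys nothing over a fixed 60 Hz panel there;
# a 90 fps PC game is where it actually helps.
print(choose_refresh_hz(24))   # 60
print(choose_refresh_hz(30))   # 60
print(choose_refresh_hz(90))   # 90
```

    The point of the sketch: fixed low-frame-rate content never enters the range where the refresh rate would vary, which is the gist of the "TVs have no use for VRR" argument.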

  18. #58
    "FreeSync" isn't really AMD's ecosystem
    FreeSync certainly is AMD's

    the VESA Adaptive Sync itself isn't


    you think G-Sync works without issues?
    mine has worked 100% flawlessly for a year now

    though that's without SLI


    Whether you like it or not going with HDMI 2.1 you WILL be entering FreeSync territory
    I will be using HDMI 2.1's Game Mode VRR, not FreeSync


    nVidia will NOT follow this as they will remain in their own proprietary ecosystem.
    does your crystal ball tell you that they will not support certain HDMI 2.1 features?

    it's not like G-Sync or existing G-Sync monitors will disappear anywhere



    Because TVs generally have no use whatsoever for VRR technology.
    98% of TV use is movies, series, etc., which have zero use for VRR because they run at a fixed low frame rate.
    The other 1.9% is console games, which don't go beyond 60Hz regardless, are often lower, and are fixed at either 30 or 60, making VRR once again mostly useless.
    TVs can and do get used as displays for PC games

    whether on the desk as an actual monitor or just with a gamepad connected to the PC

    in fact, that's what a lot of OLED enthusiasts do at the moment, since that is the only real way to get your PC games on OLED right now (there is technically a Dell OLED monitor, but it only came out recently, costs ~$3,000-3,500 for a 30" screen, and its OLED implementation is worse than LG's 4K OLED TVs, from what I've read)

    and for these reasons a lot of them also eagerly await HDMI 2.1 LG/Sony OLEDs



    And TVs are generally not used as monitors, therefore there has been 0 need to introduce something like VRR into TVs and you'll still not see them for the most part even if the standard comes out
    again with the crystal ball

    LG/Sony etc. will fully support HDMI 2.1 on their TVs, and VRR support will be a good incentive for both PC gamers and future VRR-enabled consoles (Xbox One X, PS5, Xbox 2) to get those

    ... the only question is whether we get HDMI 2.1 in 2018 or in 2019
    Last edited by Life-Binder; 2017-08-18 at 04:19 PM.

  19. #59
    Quote Originally Posted by Life-Binder View Post
    FreeSync certainly is AMD's

    the VESA Adaptive Sync itself isn't
    Yes it is, but you misunderstood what I've been trying to get across...

    Quote Originally Posted by Life-Binder View Post
    mine has worked flawlessly for a year now

    though, thats without SLI
    And I can tell you from my clients and from simple Google searches that G-Sync still has issues, especially on multi-monitor systems with differing refresh rates, where the 2nd monitor has problems with blinking, flickering, etc.
    Bugs are everywhere...

    Quote Originally Posted by Life-Binder View Post
    I will be using HDMI 2.1s Game Mode VRR, not Freesync
    On what will you use that, exactly? Your console(s)? Because nVidia sure as hell won't support it.

    Quote Originally Posted by Life-Binder View Post
    your crystal ball tell you that they will not support certain HDMI 2.1 features ?

    not like Gsync or existing Gsync monitors will dissapear anywhere
    The HDMI 2.1 feature is optional to implement and would be free, vs. G-Sync, the premium ecosystem nVidia has built up.
    Adaptive Sync is "free" as well... nVidia never wanted to use that either.
    And I just said both FreeSync and G-Sync will remain as active as ever, damnit...

    Quote Originally Posted by Life-Binder View Post
    TVs can and do get used as displays for PC games

    whether on the desk as an actual monitor or just with a gamepad connected to the PC

    in fact thats what a lot of OLED enthusiasts do atm since that is the only real way to get your PC games on OLED right now (there is technically a Dell OLED monitor, but its only come out recently, costs ~$3000-3500 for a 30" screen and its OLED implementation is worse than LG 4K OLED Tvs from what Ive read)

    and for these reasons a lot of them also eagerly wait for HDMI 2.1 LG/Sony OLEDs
    And the number of people using TVs as monitors, compared to the rest of the market, is 0.01%... making VRR virtually useless for sales and more cost to implement, which is also why DisplayPort is rare on TVs.

    Also, "the only real way to get PC games on OLED"... what the hell?

    It still doesn't change the fact that it's such a small minority that even 0.01% is a massive overestimation of PC users using TVs as gaming screens.
    Mostly because TVs have anywhere from 20-80 milliseconds of input lag, and their panels generally aren't high-frequency at all but use tricks to make you think they are.

    Quote Originally Posted by Life-Binder View Post
    again with the crystal ball

    LG/Sony etc. will fully support HDMI 2.1 on their TVs, and VRR support will be a good incentive for both PC gamers and future VRR-enabled consoles to get those

    ... the only question is whether we get HDMI 2.1 in 2018 or in 2019
    You say I'm the one with the crystal ball predicting the future, but let me ask you:

    Has LG or Sony (since you bring them up all the time) told you that they will specifically support AND IMPLEMENT the Variable Refresh Rate part of the HDMI 2.1 spec?
    Since it's, once again, OPTIONAL to support, you can say you "fully support" HDMI 2.1 but omit VRR and you would still be right, as you'd fully support the mandatory HDMI 2.1 specs, just not the optional ones.

    You realize that VRR isn't something you can just "implement" and then all devices that use HDMI 2.1 ports can use it, right?
    Separate firmware and hardware have to be made and tailored to each other to work; it's not a matter of "plug & play" as you think it is.
