  1. #141 - Vash The Stampede
    Quote Originally Posted by meowfurion View Post
    Please show your source for your statement. Or is that an assumption?
    Look at the Steam Hardware Survey and show me how many people are using $500+ graphics cards. About 45% of the people on Steam are using GTX 960s, 750 Tis, 1050 Tis, and 1060s; just those cards alone make up that much of the user base. The 1070, 1080, 1080 Ti, 980, and 980 Ti together come to less than 6%. Everything else is a mixture of outdated, even less capable graphics cards or integrated graphics, though there are quite a few people with the 950, 1050, 970, and 750. And the 970 isn't really a high-end card compared to today's 1070 in terms of cost.

    If you're a 1070, 1080, or 1080 Ti owner, I would be concerned. I would throw in Vega owners, but I'm sure anyone with those cards is mining their brains out. If the majority of people own mid-range and lower GPUs, then what reason do developers have to make games that can use the power of these $500+ GPUs? Keep in mind that a 980 Ti is equivalent to a 1060, and that's about where developers will focus. If you have a 1080 Ti, then enjoy the Ultra settings, which hardly make a visual difference from High. Nvidia will soon release their ray-tracing GPUs, and the 1080 Ti, with all its might, won't do ray tracing. It probably could, but Nvidia won't let you.

    http://store.steampowered.com/hwsurvey/videocard/
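    For anyone who wants to check that kind of tally themselves, here's a minimal sketch of the arithmetic: it just groups per-card survey shares into rough price tiers and sums them. The card lists and the zeroed-out share values are placeholders, not actual survey figures; fill them in from the survey page linked above.

    Code:
# Minimal sketch: tally GPU survey shares by rough price tier.
# The share values below are PLACEHOLDERS (0.0), not real survey data;
# fill them in from the Steam Hardware Survey page linked above.

MID_RANGE = {"GTX 960": 0.0, "GTX 750 Ti": 0.0, "GTX 1050 Ti": 0.0, "GTX 1060": 0.0}
HIGH_END = {"GTX 980": 0.0, "GTX 980 Ti": 0.0, "GTX 1070": 0.0,
            "GTX 1080": 0.0, "GTX 1080 Ti": 0.0}

def tier_share(tier):
    """Sum the per-card share (percent of surveyed users) for one tier."""
    return sum(tier.values())

print(f"Mid-range share: {tier_share(MID_RANGE):.2f}%")
print(f"High-end share:  {tier_share(HIGH_END):.2f}%")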

  2. #142
    Quote Originally Posted by Dukenukemx View Post
    -snip-
    It's because everything between a 960 (and arguably a 750 Ti) and a 1060 can handle 1080p gaming just fine. If you have a higher budget, you'll go for a higher resolution and a better card. Resolution is going to be more of a burden on a card than settings. Also, why alienate a demographic, no matter how small, if you can help it? Especially when the end result is a better product both short and long term. Like, really... why make a shittier product intentionally? Hell, look at Kingdom Come: Deliverance. They're an indie company that put settings in their game for FUTURE video cards.

    This whole thread is just bait.

    Quote Originally Posted by Dukenukemx View Post
    I'm honestly shocked that nobody thought of using the PS4 Pro and the Xbox One X to mine crypto on.
    A bit off topic and old news but it immediately made me think of this

  3. #143 - Vash The Stampede
    Quote Originally Posted by Vertigo X View Post
    It's because everything between a 960 (and arguably a 750 Ti) and a 1060 can handle 1080p gaming just fine. If you have a higher budget, you'll go for a higher resolution and a better card. Resolution is going to be more of a burden on a card than settings.
    I don't like the idea of resolution being the benchmark for these cards. People buy them because they're within their budget, not because they can do 1080p. $250 seems to be the limit people are willing to spend on a graphics card, and in countries where GPU prices are even higher, they aim for even less capable cards.
    Also, why alienate a demographic, no matter how small, if you can help it? Especially when the end result is a better product both short and long term. Like, really... why make a shittier product intentionally? Hell, look at Kingdom Come: Deliverance. They're an indie company that put settings in their game for FUTURE video cards.
    Putting in features that only benefit 2-3% of players is a hard pill to swallow for game developers. This is why Nvidia created GameWorks: so they can work with developers to put in features that make use of the more powerful graphics cards, because the developers wouldn't otherwise. This is also why GameWorks pushes for more polygons than needed, and why HairWorks ends up on animals that aren't even visible. It's getting really hard to justify buying a $500+ card without games that push for it. 4K isn't enough of a reason for most people.
    This whole thread is just bait.
    No, it has a point. The market for high-end graphics cards isn't sustainable without specific unique features. Think of the GeForce 4 Ti and the GeForce 4 MX. One would think the GeForce 4 MX had all the same features, or was at least a GeForce 3, but no, it was a GeForce 2 rebrand. Now Nvidia is about to release graphics cards that can do ray tracing, but we know they can't do 100% ray tracing because they aren't fast enough for it. So would a GTX 2060 or a GTX 2050 be able to run ray tracing playably? The 1070 became a high-end card, and the 1060 took its place.

    If Nvidia were to segregate the haves from the have-nots, how exactly is Nvidia going to push for ray tracing in games? Especially when the previous generation is incapable of ray tracing, or so we're told. Probably by GameWorks-ing the fuck out of games, but that's no good for the industry. Anything beyond $250 is a very small minority. Without a competitor for Nvidia, we're looking at a situation similar to what happened with Intel, where we were stuck at quad cores for years.

  4. #144
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,863
    Mainstream GPUs used by the mainstream portion of gamers - more news at 11.

    I don't even need a survey to figure out that the vast majority of gamers sit on FHD monitors and use $150-250 GPUs, or whatever they've had for years, and that's what game developers gun for.

    Nvidia could charge $2000 for a 1080 Ti and it would not change jack shit - people would just be extra salty, but that's the only effect it would have.

    The whole 4K, ultra-high-end, even $500+ GPU segment has such low representation that honestly nobody would bat an eyelash if the ground opened up tomorrow and swallowed it whole, leaving no trace.

  5. #145
    Quote Originally Posted by pansertjald View Post
    So you call an 8-core CPU low to mid range?
    Considering 8-core CPUs have been out for what feels like a decade now... yes.

  6. #146
    Quote Originally Posted by scelero View Post
    Considering 8-core CPUs have been out for what feels like a decade now... yes.
    Number of cores has little to do with performance in games, particularly once you get to "enough" cores....

    which is about 3-4 to max out the most demanding Triple-A games.

    I mean, yeah, FX-series chips with 8 cores have been out since forever.

    They were routinely outperformed by Dual-core i3s with HT, but hey, 8 cores made them "high end", right?

  7. #147
    Quote Originally Posted by Amalaric View Post
    This!

    What game developer is going to make games if graphics cards cost $500 or more?

    nVidia killed the PC as a gaming platform.

    People will buy them... so they will make games... Not all games need a $500 card to play them. You can get a computer that plays VR with a $200 card.

  8. #148 - Vash The Stampede
    Quote Originally Posted by Kagthul View Post
    Number of cores has little to do with performance in games, particularly once you get to "enough" cores....

    which is about 3-4 to max out the most demanding Triple-A games.

    I mean, yeah, FX-series chips with 8 cores have been out since forever.

    They were routinely outperformed by Dual-core i3s with HT, but hey, 8 cores made them "high end", right?
    They were, yes, but not today. Modern games that make use of multiple cores now run faster on 8-core FX chips than on Sandy Bridge CPUs. AdoredTV made a video about it. It took this long, but the FX series of CPUs is finally getting good use out of its cores.


  9. #149
    Quote Originally Posted by Dukenukemx View Post
    They were, yes, but not today. Modern games that make use of multiple cores now run faster on 8-core FX chips than on Sandy Bridge CPUs. AdoredTV made a video about it. It took this long, but the FX series of CPUs is finally getting good use out of its cores.
    He didn't make a video specifically about that, but he used FX vs Sandy Bridge as an example when he argued "You can't extrapolate a given CPU's future performance in games by lowering the resolution", which was sort of the hot topic during the Ryzen launch, with many reviewers testing sub-1080p resolutions and citing the results as an indication that, over time, the gap between the 7700K and the 1800X would increase with more powerful graphics cards. Then Hardware Unboxed made a response video with some rather weird test results, AdoredTV was thrown under the bus by pretty much everyone, since it was just his word against HW Unboxed's, and hence this video was born.

    But all this is slightly off topic, though atm I don't really know what the topic is anymore.

  10. #150
    Quote Originally Posted by Gaidax View Post
    -snip-
    I was with you til the last part. You hurt my feels.

    But in all reality the only thing that would change is that the equivalent of 1070 would become the high end and we'd be having this same thread about how $300+ GPUs are killing PC gaming.

  11. #151 - Gaidax
    Quote Originally Posted by Vertigo X View Post
    I was with you til the last part. You hurt my feels.

    But in all reality the only thing that would change is that the equivalent of 1070 would become the high end and we'd be having this same thread about how $300+ GPUs are killing PC gaming.
    Hey man, I've got a 1080 Ti myself with a similar setup to yours; that doesn't suddenly make me lose all touch with reality and imagine this is what mainstream people roll with nowadays, even if the CPU is starting to show its age.

    As for the 1070 thing, maybe, but since most people use full HD screens I'd say a 1060 is all you want there nowadays, the odd game notwithstanding. Heck, I'd go ahead and say the 1050 Ti is the optimal deal for FHD, and it's what, a 200-buck GPU? IMO that is a fair price.

  12. #152 - Vash The Stampede
    Quote Originally Posted by mrgreenthump View Post
    He didn't make a video specifically about that, but he used FX vs Sandy Bridge as an example when he argued "You can't extrapolate a given CPU's future performance in games by lowering the resolution", which was sort of the hot topic during the Ryzen launch, with many reviewers testing sub-1080p resolutions and citing the results as an indication that, over time, the gap between the 7700K and the 1800X would increase with more powerful graphics cards. Then Hardware Unboxed made a response video with some rather weird test results, AdoredTV was thrown under the bus by pretty much everyone, since it was just his word against HW Unboxed's, and hence this video was born.

    But all this is slightly off topic, though atm I don't really know what the topic is anymore.
    Seems like he took down the video. He had made some errors, but the assessment that the FX-8350 is now faster in modern games is true. The point is that what we think is the future today is not the future tomorrow. This happens very often: what people think will happen doesn't happen. For example, the ATI Radeon 8500 was slower than the GeForce 4 Ti. But then the GeForce FX cards were released and most games targeted DX8.1 because it favored the FX cards, and it just so happens the Radeon 8500 was also a DX8.1 card. Eventually games required DX8.1, and that left GeForce 4 Ti owners out, which pissed off a lot of people. So in the end the slower Radeon 8500 won due to features.

    Point is, what may be faster today may not be better 3-5 years from now. Pure speed is not always the best way to go.

    Quote Originally Posted by Vertigo X View Post
    I was with you til the last part. You hurt my feels.

    But in all reality the only thing that would change is that the equivalent of 1070 would become the high end and we'd be having this same thread about how $300+ GPUs are killing PC gaming.
    That's not how people work. If a $300 card is the best card you can get, then people are fine. But this is capitalism, and there are people willing to spend more to get a slight edge in performance - or in this case, a lot more. This is why GPUs in the $100 to $250 range are extremely competitive while anything above that range is not. Hence the Nvidia 1030, 1050, 1050 Ti, 1060 3GB, and 1060 6GB, a huge selection within a price spread of $150. Nvidia's high-end lineup isn't as diverse until AMD attempts to dethrone them, and usually fails. For a while we had the 1070 and 1080; then, as the Vega cards were about to be released, Nvidia released the 1070 Ti and 1080 Ti. If you don't have the fastest high-end GPU, then you aren't going to get the sales from the people who buy them. Those people don't care how much a high-end GPU costs, so long as it's the fastest. This is also why there's no $300-ish MSRP GPU: that's a dead zone for GPU buyers. You don't see $320 or $350 MSRP GPUs. AMD hasn't tried to price GPUs in that range, and neither has Nvidia.

    Basically, $500+ GPUs are for wealthy people, or people who are fiscally irresponsible, because that's how Nvidia prices them. Remember, the GTX 970 was $320 and now the 1070 is $370, though in reality it was never really sold at $370 due to Founders Edition pricing. As much as people harp on the Intel i5s, we can see from Steam's hardware survey that until recently about 50% of people who gamed used dual cores. After Ryzen, quad cores are now the dominant CPUs on Steam. Anything beyond quad cores has super low representation. Why? Because very few people are willing to spend that much money on a 6- or 8-core, and most of the new quad cores are probably due to Ryzen's pricing, now down to $100-$120.

    An Intel 8700K with a 1080 Ti is not the rule, it's the exception. The average person probably uses an Intel i5 8400 or a 6600K with a GeForce GTX 1050 Ti or a GTX 1060. Below that you have people with G3258s and 750 Tis, and below that, first-generation Core series CPUs with GTX 660s. Developers see this, an average emerges, and that's roughly where they aim their system requirements. It doesn't benefit people who own high-end hardware to have very few people using it. You want more people to have more CPU cores and more powerful graphics cards, because then developers will throw in more polygons and other features they otherwise wouldn't, since those would only benefit a tiny percentage of their audience.

  13. #153
    Quote Originally Posted by Dukenukemx View Post
    -snip-
    Greater options for the customer are not a negative for a video game, and a game that adds top-end support ages better in the long run. Crysis, anyone?

  14. #154 - Vash The Stampede
    Quote Originally Posted by Vertigo X View Post
    Greater options for the customer are not a negative for a video game, and a game that adds top-end support ages better in the long run. Crysis, anyone?
    I don't disagree, but when Crysis was released, the meme "But can it run Crysis?" was born. It shows that the average computer in late 2007 wasn't ready to deal with that game. That's not the fault of the game, but the fault of the hardware industry. Going by the Steam hardware survey of November 2008, a year after Crysis was released, the average PC still wasn't really ready for it. The game only requires DirectX 9.0a, which by 2007 damn near everyone should have had. But this was right after Vista, which slowed down PCs, especially graphics; the GeForce FX series had been a horrible mess at running games in true DX9.0; and both ATI and Nvidia had problematic drivers for Vista. On top of that, the Xbox 360 and PS3 were a more popular alternative, so a lot of PC gamers simply didn't have the hardware needed to run Crysis properly. Around that time I think I had an ATI Radeon X1950, and I had no problem playing the game.

    It's not like this hadn't happened before in PC gaming history, but after the Xbox 360 and PS3 the expectations of gamers were much higher. Remember, Quake required a math coprocessor, which not many people had at the time, and Quake 3 was extremely demanding for its day - so much so that the ATI Rage 3D chip was thrown into desktop PCs just so they could claim to play that game, albeit very poorly. Can't play Quake on your 486 SX PC? Just use one of the plethora of upgrade chips that gave you Pentium-like functionality and speeds. In 1999 you had a number of graphics cards to choose from: ATI, Nvidia, 3dfx, Matrox, S3, and PowerVR are the ones I remember. Hell, Intel actually made graphics cards back then, and they were pretty good too. In 2007 we had ATI and Nvidia. Today we basically have AMD and Nvidia, and for most people it's just Nvidia. The only saving grace today is that DX11 is all you need to run games, and we've had DX11 since 2009. Is it any wonder why today we have $500+ graphics cards? Nvidia basically has the market to itself.


    https://web.archive.org/web/20081214...vey/videocard/

  15. #155
    Seems like AIBs are forecasting reduced demand for video cards, and prices will probably drop with it in H2 2018 (if not sooner). Though there's probably still a ~$50 markup over MSRP due to memory pricing.

  16. #156
    Quote Originally Posted by Dukenukemx View Post
    -snip-
    Crysis was neat to use as a benchmark, but not indicative of anything about the industry itself. I do agree Q3 was a big hardware pusher at the time, and a must-have for most PC gamers (which separates it from Crysis).

    Proliferation of consoles actually slowed game development, allowing PC hardware to catch up and surpass what was required by games, many of which were either ports, or had porting in mind when they were created. At this point I think the days of many games really making higher end hardware struggle mightily are kind of over unless you want 4k, which I don't think many gamers bother with at this point.

    Right now I have nothing in my Steam library that makes my system chug (34" ultrawide, GeForce 1080, etc.). Skyrim slows a bit if I have a ton of visual add-ons going, and Blender rendering can still be quite time-consuming (1 hour+ for the last still I made).

  17. #157
    The upcoming GTX 1170 is rumored to have an MSRP of $499.


  18. #158
    Quote Originally Posted by Amalaric View Post
    The upcoming GTX 1170 is rumored to have an MSRP of $499.
    Not really surprised if true, they'd still sell like hotcakes. It is a bit silly though if it doesn't have some crazy tech, granted a good $50 of the price can be attributed to inflated memory prices.

  19. #159 - Vash The Stampede
    Quote Originally Posted by mrgreenthump View Post
    Not really surprised if true, they'd still sell like hotcakes. It is a bit silly though if it doesn't have some crazy tech, granted a good $50 of the price can be attributed to inflated memory prices.
    I swear the GTX 970 was $320. People attribute this to inflation, but wages haven't increased in the past 10 years, which is odd given that housing prices and rents are skyrocketing. So I'm going to attribute it to the fact that Nvidia can and will raise prices, because it's not like AMD is a real competitor.

    We really need Intel as a third competitor in the GPU market, and I would so rock a blue Intel graphics card in my PC. That looks pretty cool to me.


  20. #160 - Deleted
    I'm gaming on a GTX 970 I bought several years ago for like $350. The last recent game I played was Far Cry 5, and it was a constant 50-60 fps at 1080p. I'm good.
