  1. #1

    How come graphics in games haven't improved that dramatically in the past 10 years?

    Was talking to my brother about the Witcher TV series and mentioned the games to him since he's not really a gamer. And it occurred to me.. The graphics in the Witcher 2 are just a small step below even some AAA games now in 2020, and that game came out in 2010. If you look in the past decade leaps such as 1990-2000 and 2000-2010, video game graphics on both console and PC took HUGE strides compared to 2010-2020. Is there a specific reason for this?

    Example: Metal Gear Solid (2000) which was known to have amazing graphics at the time VS Mass Effect 2 (2010)

    Obviously graphics don't make a good game, but with the jumps in technology over a decade, should we have expected more?

    The graphics in games like Battlefield 3, Red Dead 1 and Crysis 2 are 10 years old but still look really good. BF3 still looks like it could be a recent title:

    Last edited by Chingylol; 2020-02-11 at 12:13 AM.

  2. #2
    Greed increased. Companies want to make as many games as possible for as little as possible to maximize profits.
    If what doesn't kill you, makes you stronger. Then I should be a god by now.

  3. #3
    because capitalist greed is causing us to stagnate in every single aspect of society - technology, medicine, science, art & entertainment, etc. we've reached a dead end. it generates more profit to make the millionth version of an existing product with very minor improvement than working on something innovative
    Last edited by gd8; 2020-02-14 at 06:54 PM.

  4. #4
    Games are made with consoles in mind, and that's always outdated hardware.

  5. #5
    Are we ignoring that WoW is a 16-year-old game that started development in 2001?

    If you don't think the graphics have improved drastically... look at some of the in-game cinematics from Wrath and compare them to what we got in BfA.
    Peace is a lie. There is only passion. Through passion I gain strength. Through strength I gain power.
    Through power I gain victory. Through victory my chains are broken. The Force shall set me free.
    –The Sith Code

  6. #6
    Photo-realistic graphics in video games have diminishing returns for the amount of work required to keep improving them indefinitely. Processing power limited real-time rendering significantly in previous decades, whereas now it's more a case of the work hours required to keep making models that are 1% better each time. Chasing better and better graphics is a large part of why AAA video game development has become so expensive lately. At a certain point the money and time investment stops being worth it.

    Overall, the answer to why graphics haven't improved as much in the past decade is multi-faceted, but I'd say the TL;DR is largely that 1) photo-realistic graphics start to plateau at a certain point, 2) we're no longer seeing the exponential boom in development resources that allowed the average dev team size to increase so dramatically from the 90s to the aughts to the 2010s, and 3) not everyone is chasing higher-fidelity graphics any more, now that the industry as a whole has cottoned on to the fact that strong art styles can be just as appealing as photo-realism, if not more so, while costing less.
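    To put rough numbers on that plateau: here's a toy model (entirely my own illustration, not from any study) in which perceived fidelity grows with the log of the budget spent, so every doubling of cost buys the same small visual gain:

```python
import math

def perceived_fidelity(budget):
    """Toy model: perceived fidelity grows with the log of the budget spent."""
    return math.log2(budget)

# Each doubling of budget adds the same fixed gain while cost doubles,
# so the cost per unit of visual improvement grows exponentially.
budgets = [1, 2, 4, 8, 16, 32]
gains = [perceived_fidelity(b) for b in budgets]
for b, prev, cur in zip(budgets[1:], gains, gains[1:]):
    print(f"budget {b:>2}x -> gain from this doubling: {cur - prev:.1f}")
```

    The exact curve doesn't matter; the point is that linear visual gains demand exponential spending.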

  7. #7
    Quote Originally Posted by Chingylol View Post
    Was talking to my brother about the Witcher TV series and mentioned the games to him since he's not really a gamer. And it occurred to me.. The graphics in the Witcher 2 are just a small step below even some AAA games now in 2020, and that game came out in 2010. If you look in the past decade leaps such as 1990-2000 and 2000-2010, video game graphics on both console and PC took HUGE strides compared to 2010-2020. Is there a specific reason for this?

    Obviously graphics don't make a good game, but with the jumps in technology over a decade, should we have expected more?

    The graphics in games like Battlefield 3, Red Dead 1 and Crysis 2 are 10 years old but still look really good. BF3 still looks like it could be a recent title:



    When The Witcher 2 came out, the developers even said the graphics cards available at the time would not be able to play the game at max settings.

    The game shipped with 16x anisotropic filtering back then, when most graphics cards could only handle 8x without a few FPS drops.

    It was also one of the first games to have depth of field in cutscenes, allowing shadows, lighting and the real game world to be rendered during a cutscene.

    The game was ahead of its time.

    CD Projekt were amazed themselves at what they pulled off, and also said that they wouldn't release a downgraded port for Xbox/PlayStation, which upset a few people; a few years later they released a downgraded 720p version for the consoles. Bear in mind that The Witcher 2 shipped at 1080p on PC when 1080p wasn't yet a standard among graphics cards and 720p was the go-to.

    Just take a look at The Witcher 3 to see how good this studio is: they ported a 1080p PC game with 96 GB of data and a game world larger than any other Switch title to the Nintendo Switch, and it works. It runs at 540p in handheld mode and 720p in the dock.

    But you can't really notice, since the screen is small enough to compensate.
    Originally Posted by Ghostcrawler

    If you are trying to AE tank and a bad dps is attacking the wrong target and dies, we call that justice.

  8. #8
    They have improved considerably, especially on console. It's just not always as readily apparent, at least on PC, where they've always been able to push graphics more compared to console.

    We've had tons of new tech developed over the years, dynamic resolution scaling, DLSS anti-aliasing, HDR integration, ray tracing etc.
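    Dynamic resolution scaling, for instance, is conceptually simple: the engine watches how long frames take and lowers the internal render resolution whenever it misses its frame-time budget. A minimal sketch (the thresholds, step size and function name are my own invention, not from any particular engine):

```python
TARGET_MS = 1000.0 / 60           # frame budget for 60 fps, ~16.7 ms
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # clamp resolution scale to 50-100%

def adjust_render_scale(scale, last_frame_ms):
    """Nudge the internal resolution scale toward the frame-time budget."""
    if last_frame_ms > TARGET_MS:            # missed budget: render fewer pixels
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.8:    # comfortable headroom: sharpen up
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))
```

    A heavy 22 ms frame would push a 1.0 scale down to 0.95, so the next frame renders roughly 10% fewer pixels.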

    And you can see in that video that poly counts and meshes are lower fidelity than what you get from AAA games nowadays, and texture resolutions have improved as well, which was more a RAM limitation back then, as I understand it. Just look at the fidelity of the "damaged" parts of the wall in the thumbnail.

    Progress isn't linear though; you'll see spikes with certain advancements followed by smaller improvements over the years. The thing is, we're also going less for pure photorealism now and more for different art styles and directions, which helps games age better over time, since photorealism ages very poorly. Not to mention that you similarly get decreasing returns on investment visually.

    Though your comparison videos aren't like for like. You have a PS1 game, a PS3 game that never pushed for graphics, and then a PC game where the graphical fidelity was one of the key selling points.

  9. #9
    They did improve, but in a different way. 10 years ago we had overzealous post-processing, and a lot of effects were just textures displayed on top of the game world. Now most things are actually rendered in-engine. Even in the video you linked it's easy to see that all the blood splatter, dust, lens flares and most of the dust clouds and smoke are actually 2D effects on top of things. It works if you let the game's flow carry you, but it never holds up to scrutiny if you ever stop and look at things closer.

    And, unfortunately, small improvements like that eat ridiculous amounts of horsepower. Something as natural to any non-gamer as light actually reflecting off stuff (i.e. ray tracing) is so demanding they had to add dedicated hardware components to GPUs to handle it. The processing cost of actually simulating things for real, as opposed to building a convincing facade, is rather mind-boggling. But facades only work in games like scripted shooters, where you're not expected to look too closely. If you want slower-paced games to look as good, you have to pay the price.
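    Some back-of-the-envelope arithmetic (my own numbers, purely for scale) shows why ray tracing needed dedicated silicon:

```python
width, height, fps = 1920, 1080, 60
rays_per_pixel = 1       # just one primary ray per pixel
extra_rays = 2           # e.g. one shadow ray + one reflection ray per hit

primary = width * height * fps * rays_per_pixel
with_bounces = primary * (1 + extra_rays)

print(f"{primary:,} rays/second for primary rays alone")
print(f"{with_bounces:,} rays/second with two extra rays per hit")
```

    That's over 124 million rays per second before a single bounce, and every one of them has to be tested against the scene geometry.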

  10. #10
    They have improved a ton; it's just that, like every type of progress, it's cyclical: for some time it's slow, then it goes fast, only to slow down again before the next breakthrough.

    Another reason you may not realize graphics have improved a lot is that at some point they reached a fairly realistic state, and after that it's even harder to improve the visuals. If you keep playing new games year after year you won't notice many differences, but if you play a game from 2010 and then a game from 2020, you'll see a massive difference.

    Oh, and there is a massive difference in graphics between BF3 and BF5, for example.
    Last edited by Craaazyyy; 2020-02-11 at 12:32 AM.

  11. #11
    Quote Originally Posted by Echeyakee View Post
    They did improve, but in a different way. 10 years ago we had overzealous post-processing, and a lot of effects were just textures displayed on top of the game world. Now most things are actually rendered in-engine. Even in the video you linked it's easy to see that all the blood splatter, dust, lens flares and most of the dust clouds and smoke are actually 2D effects on top of things. It works if you let the game's flow carry you, but it never holds up to scrutiny if you ever stop and look at things closer.

    And, unfortunately, small improvements like that eat ridiculous amounts of horsepower. Something as natural to any non-gamer as light actually reflecting off stuff (i.e. ray tracing) is so demanding they had to add dedicated hardware components to GPUs to handle it. The processing cost of actually simulating things for real, as opposed to building a convincing facade, is rather mind-boggling. But facades only work in games like scripted shooters, where you're not expected to look too closely. If you want slower-paced games to look as good, you have to pay the price.
    Pretty much this. Developers have really stopped trying to push everything to its limit and have instead focused on making the small details more realistic, detailed and noticeable.

    We've also pushed resolution as high as possible, with 4K capability being a benchmark and 8K on the horizon.

    Graphics have improved dramatically over the last decade, even over the last few years, with things like RTX capabilities becoming baked into a lot of games (hurry up, Minecraft).

    I also don't think you can use BF3, RDR, and Crysis as good measurement points. Two of those games (BF3 and Crysis) were WAY ahead of readily available, consumer-affordable hardware. Naturally those two would still look fairly decent.

    Lastly, you have to think of things like physics (PhysX, if you love branding) and how that has changed the game. It used to be that when you killed random creep #43 he might have a cool death animation, but now we have the kid from Shameless zeroing womp rats with a Stormtrooper in real time. It's great.

    Edit/P.S.: How can you say graphics haven't improved when there's a video of a company using VR to replicate a woman's dead child, at least well enough for her to have a seriously emotional moment?
    Last edited by G3 Ghost; 2020-02-11 at 12:38 AM.

  12. #12
    As others have said...graphics have improved considerably. Not just in the qualitative terms...but quantitative as well.

    Also, keep in mind that a lot of these games are multiplayer and people want to run them in 4k at 60 FPS...all on a console they spent <$500 on.

    It's sort of like how they say service can be good, cheap, or fast... but you'll only ever get two of those at the same time.

  13. #13
    Short answer: consoles holding PC back; little competition in the GPU market (NVIDIA has had the best GPU for years now, uncontested); the cost of developing new engines; monitor availability (1080p vs 4K, a focus on more FPS rather than more quality, the introduction of 144Hz displays); the focus on e-sports; and the personal habit of accepting "old" games despite their dated graphics.

    Long version:

    Most games are made for PlayStation and Xbox and then ported to PC, or use the same engine for both systems while getting optimized for PC. Since the consoles only change every 7 years or so, huge improvements can be made with each new generation. Ray tracing is being introduced just now and will probably be widely used once the PS5 and the next Xbox (whatever it ends up being called) ship with ray tracing support. The next consoles will probably be on par with the best gaming PCs today, so major progress can be made then.

    The power of graphics cards did not increase between 2010 and 2020 as much as it did from 2000 to 2010. And in 1990 most people did not even own a PC. Take NVIDIA as an example: in 2000 their best GPU had up to 64 MB of VRAM and clocked at 120 MHz. In 2010 their best GPU ran at 772 MHz (roughly a 540% increase) with 1.5 GB of VRAM (a 2300% increase). In 2020 NVIDIA's best GPU clocks at 1350 MHz (about a 75% increase) with 11 GB of VRAM (about a 630% increase). Of course that's only one way of measuring it, but you can see that NVIDIA's rate of improvement fell dramatically. And AMD is not even competitive at this point, which may have contributed to this development.
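    Redoing that arithmetic as decade-over-decade multiples (using the same figures as above, which are the thread's rough numbers, not official spec sheets):

```python
# Approximate flagship NVIDIA specs as quoted in the post above.
specs = {
    2000: {"clock_mhz": 120,  "vram_mb": 64},
    2010: {"clock_mhz": 772,  "vram_mb": 1536},   # 1.5 GB
    2020: {"clock_mhz": 1350, "vram_mb": 11264},  # 11 GB
}

def growth(year_a, year_b, key):
    """Multiple by which a spec grew between two years."""
    return specs[year_b][key] / specs[year_a][key]

print(f"2000->2010: clock x{growth(2000, 2010, 'clock_mhz'):.1f}, "
      f"VRAM x{growth(2000, 2010, 'vram_mb'):.0f}")
print(f"2010->2020: clock x{growth(2010, 2020, 'clock_mhz'):.1f}, "
      f"VRAM x{growth(2010, 2020, 'vram_mb'):.1f}")
```

    Roughly 6.4x clock and 24x VRAM in the first decade versus about 1.7x and 7.3x in the second; whichever metric you pick, the growth rate fell hard.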

    Most companies use only one engine. BF3 was the first game to use Frostbite, and that engine is still in use. It probably will be for another few years, because developing such an engine is very expensive, and the video game market is highly competitive at this point in time, so a new engine is not in the budget for most games.

    Also, what resolution did we have in 2000? 480p? 1080p was established over the years and is still the standard today. 4K is on the horizon, but not nearly as widely used as 1080p. Then again, modern gaming monitors feature more than the usual 60Hz refresh rate, most notably 144Hz, which is not standard yet but cuts into the quality of games: to make a game playable at 144Hz you need more than double the graphics power you need for 60Hz. So as a company you have to decide: optimize your game for FPS, or make it beautiful? With the rise of e-sports, 144Hz displays became more important (marketing and such), and many companies optimized for performance rather than quality.
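    The frame-time math behind that trade-off is straightforward: the refresh rate fixes how many milliseconds the GPU gets per frame.

```python
def frame_budget_ms(refresh_hz):
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

b60 = frame_budget_ms(60)     # ~16.7 ms per frame
b144 = frame_budget_ms(144)   # ~6.9 ms per frame
print(f"60 Hz budget:  {b60:.2f} ms")
print(f"144 Hz budget: {b144:.2f} ms")
print(f"144 Hz leaves {b60 / b144:.1f}x less time per frame")
```

    So a 144 Hz target leaves 2.4x less time per frame, which lines up with the "more than double the graphics power" figure.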

    On the topic of e-sports: e-sports games are often optimized to be profitable, low-maintenance, long-lived and widely available. As e-sports is the gaming phenomenon of the 2010s, companies often decided to cut graphics quality to make their games available to more people. Think LoL, Dota, Fortnite, PUBG and others. Some of those titles can even be played on phones.

    And lastly: Maybe you got used to how games looked 10 years ago. Ask someone 10 years younger than you what they think about games released around 2010 and how they feel those games have aged. Maybe they will give you the same answer you would give when asked about games released in 2000.
    Last edited by LordVargK; 2020-02-11 at 01:00 AM.

  14. #14
    Return on Investment.

    It takes more resources, both to develop the assets, and for the hardware to render the assets, while providing less visible improvement to the player.

    At a certain point, it's easier to simply accept the current level of graphics and introduce other ways to make the game more realistic and immersive (that's if realism is even the goal being aspired to in the first place).

  15. #15
    Quote Originally Posted by Vegas82 View Post
    They have. Console graphics have only slightly improved because the current gen came out 7 years ago. Check back when the next gen releases.
    This generation was a big bump over last gen, though interestingly I don't think we've seen the kinds of leaps in graphical fidelity for games at the tail end of the console lifecycle compared to previous generations. Like, The Order 1886 still looks bloody fantastic on a 4K TV and that's still running at 1080p and never got a proper update to support the Pro. And when you compare to other recent games it's still visually impressive as hell, games have made bigger strides in other areas this generation.

  16. #16
    The early leaps happened because there were revolutions in hardware design that made them possible. But as that market matured, that stopped happening, and new hardware is mostly just doing the same thing, only better.

    So until (if) another revolutionary technology comes along, we're stuck in a place where slightly better art now takes a lot more work/money.

    I personally think the next revolution won't be in hardware, though; that field is far too mature by now. If there is another one, it'll be some machine-learning technique that automates parts of the art pipeline.

  17. #17
    Many good answers here, aside from the "capitalism ruins graphics!" argument. When it comes to tech, consoles are a huge limiting factor... but then again, it's not just the consoles: it's the monitors/TVs as well. You can pump up the resolution on graphics as much as you want, but if the vast majority of people consuming the media are still using hardware that won't support it or can't run it well, there's no point in making the media go above and beyond what most people can experience.

    However, this extends beyond graphics, as we're currently hitting some limitations with tech in general, or at the very least the challenges introduced when advancing beyond our current tech are larger. Maintaining the current rate of development likely isn't possible unless something radically new or game-changing is developed.
    “Society is endangered not by the great profligacy of a few, but by the laxity of morals amongst all.”
    “It's not an endlessly expanding list of rights — the 'right' to education, the 'right' to health care, the 'right' to food and housing. That's not freedom, that's dependency. Those aren't rights, those are the rations of slavery — hay and a barn for human cattle.”
    ― Alexis de Tocqueville

  18. #18
    Pretty sure a game from 2010 would look kinda dated next to a game that came out in 2020.
    #TeamLegion #UnderEarthofAzerothexpansion plz #Arathor4Alliance #TeamNoBlueHorde

    Warrior-Magi

  19. #19
    Most consoles, and even PCs, are approaching the upper limit of what they can handle. I read an article saying the new PlayStation and Xbox consoles will only be a small jump over the current gen, and that it comes down to hardware limitations, and not just for consoles. If the goal is photorealism, current technology just can't handle it yet. Sure, it can make shadows look better and add some ray tracing, but there is still a limit to what home machines can compute. Since Sony and MS are catering to those customers, it doesn't make sense for them to do anything else, because the price point would be too high (better looking = more expensive hardware) and no one would buy the product. It's a dance between keeping the customer happy and making games look better than last gen. There will, however, be some improvements with monitors/TVs/etc. that make things look better, even if the hardware in the console doesn't change much.

  20. #20
    We are currently in the midst of a technological bottleneck. We have made transistors so small (to fit in more processing power) that electrons literally start tunneling through them. Quantum computing is our next logical step, because we've gone nearly as far as we can with our current methodology of computing.
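    For a sense of scale (a back-of-the-envelope estimate of my own; "process node" names are partly marketing, so treat this as order-of-magnitude only):

```python
# Silicon-silicon bond length is roughly 0.235 nm, so a ~5 nm feature
# is only a few dozen atoms across -- not much room left to shrink.
SI_BOND_LENGTH_NM = 0.235
feature_size_nm = 5.0

atoms_across = feature_size_nm / SI_BOND_LENGTH_NM
print(f"~{atoms_across:.0f} silicon atoms across a {feature_size_nm} nm feature")
```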
