Update: A new beta build is on the way!

Engineer’s Workshop: Engine Evolution in Warlords of Draenor
Originally Posted by Blizzard (Blue Tracker / Official Forums)

Welcome to the first in an ongoing series of programming- and engineering-focused articles that, over time, will cover some of the technical nuts and bolts that go into creating and running World of Warcraft.

Before we kick this first one off, a quick warning: What follows is a fairly technical explanation for a graphical-setting change related to anti-aliasing. Most of you probably won’t notice any difference at all—this is primarily for those who tend to tinker with their hardware and graphical settings.

In short, we’re taking strides to improve the performance of World of Warcraft, while also ensuring there’s plenty of potential to further increase graphical fidelity and enhance our support of high-end CPUs and graphics hardware.

For Warlords of Draenor, we made a decision to remove Multisample Anti-Aliasing (MSAA) and instead include a new anti-aliasing technology called Conservative Morphological Anti-Aliasing (CMAA). This change is going to allow us to bring some overdue technological advancements to World of Warcraft over the course of the next few years—we’re thinking long-term with this change.

One reason MSAA remained viable for WoW over the past decade was that the GPU had the time and resources to handle it. WoW has been a CPU-bound game for much of its lifetime, but during the Warlords development cycle, we endeavored to change that. A lot of that work involved analyzing the flow of data through our code and making sure we work only on what we need for any given frame. One example: bone animation is a primary consumer of CPU time, so we now variably reduce the number of bones that need to be animated based on proximity and view (a technique sometimes called level of detail, or LOD). We've also added a job system that the engine uses to task out animation and scene management, in ways we had prototyped in Patch 5.4 but are expanding in Warlords.
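
For those curious what distance-based skeletal LOD looks like in code, here is a minimal, purely illustrative sketch; the structure names, bone counts, and distance thresholds are invented for the example and are not Blizzard's actual code or values.

Code:
    // Purely illustrative sketch -- not Blizzard's engine code. The idea: pick
    // how many bones to animate for a model from its distance to the camera,
    // so far-away characters cost less CPU per frame.
    #include <cstddef>

    struct SkeletonLod {
        float       maxDistance;  // use this LOD while the model is closer than this
        std::size_t boneCount;    // number of bones actually animated at this LOD
    };

    // Hypothetical LOD table: the full skeleton is only animated up close.
    constexpr SkeletonLod kLods[] = {
        { 15.0f, 96 },  // close-up: full skeleton
        { 40.0f, 48 },  // mid range: drop fingers, facial bones, etc.
        { 90.0f, 24 },  // far: limbs and spine only
        { 1e9f,   8 },  // very far: root and a few major joints
    };

    std::size_t bonesToAnimate(float distanceToCamera, bool onScreen) {
        if (!onScreen) return 0;  // culled models skip skeletal animation entirely
        for (const SkeletonLod& lod : kLods)
            if (distanceToCamera < lod.maxDistance)
                return lod.boneCount;
        return kLods[3].boneCount;  // unreachable given the 1e9 sentinel above
    }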

The outcome of all of this is that more than ever before, World of Warcraft relies heavily on a GPU that previously was largely free to handle things like MSAA. We explored a number of options to reconcile this increased GPU demand with the game’s anti-aliasing needs, and ultimately decided to embrace CMAA as our anti-aliasing technology for Warlords of Draenor. As with anything that can potentially change the look of the game, we vetted removing MSAA through our engineering and art teams before coming to the conclusion to swap it for CMAA. CMAA provides solid anti-aliasing at a fraction of the cost in memory and performance. It also integrates well with technologies we have planned for the future, and helps us bring those to the game sooner. We also support FXAA (Fast Approximate Anti-Aliasing), an even lighter-weight solution, as an option for our players using DirectX 9.
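
As a rough illustration of how post-process techniques like CMAA and FXAA differ from MSAA: instead of storing multiple coverage samples per pixel while geometry is rasterized, they run a filter over the finished image, looking for high-contrast edges and blending across them. The sketch below is not the CMAA algorithm itself; it is a simplified, generic edge-smoothing pass with an invented contrast threshold, meant only to show the general shape of this kind of filter.

Code:
    // Simplified post-process AA sketch (illustrative only, not CMAA or FXAA):
    // detect high-contrast edges in the already-rendered image and blend across
    // them, instead of keeping multiple samples per pixel as MSAA does.
    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct Pixel { float r, g, b; };

    static float luma(const Pixel& p) {  // perceptual brightness of a pixel
        return 0.299f * p.r + 0.587f * p.g + 0.114f * p.b;
    }

    // Blend each pixel with its neighbours wherever the local luminance
    // contrast exceeds a threshold (i.e. where a jagged edge likely is).
    void postProcessAA(std::vector<Pixel>& img, int w, int h, float threshold = 0.1f) {
        std::vector<Pixel> src = img;  // read from a copy, write in place
        for (int y = 1; y < h - 1; ++y) {
            for (int x = 1; x < w - 1; ++x) {
                const Pixel& c = src[y * w + x];
                const Pixel& l = src[y * w + x - 1];
                const Pixel& r = src[y * w + x + 1];
                const Pixel& u = src[(y - 1) * w + x];
                const Pixel& d = src[(y + 1) * w + x];
                float contrast = std::max(std::fabs(luma(l) - luma(r)),
                                          std::fabs(luma(u) - luma(d)));
                if (contrast < threshold) continue;  // flat area: leave untouched
                // A simple box blend with the four neighbours softens the edge.
                Pixel& out = img[y * w + x];
                out.r = (c.r + l.r + r.r + u.r + d.r) / 5.0f;
                out.g = (c.g + l.g + r.g + u.g + d.g) / 5.0f;
                out.b = (c.b + l.b + r.b + u.b + d.b) / 5.0f;
            }
        }
    }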

CMAA fulfills our goals of providing high-quality anti-aliasing at reduced performance cost, while giving us the extra headroom we need to further improve the graphical fidelity of the game. We don’t have to make any architectural concessions within the engine for CMAA to work, and for Warlords of Draenor we’ve already been able to implement new graphical features like target outlining, soft particles, a new shadowing technique, and refraction—and more graphical features are on the horizon for future patches and expansions.

For the launch of Warlords of Draenor, CMAA is the top-tier graphical setting available, but after release we’ll be exploring more options for players with high-performance graphics cards—and if they provide quality while still fitting into our future technology plans, we’ll take a serious look at adding them to the game.

The graphical future of World of Warcraft is a bright one, and the changes we’ve made during the development of Warlords of Draenor have laid the groundwork for us to continue making the game look better and better far into the future.

Thanks for reading!
This article was originally published in the forum thread Engineer’s Workshop: Engine Evolution in Warlords of Draenor, started by chaud.
168 Comments
  1. ablib's Avatar
    Quote Originally Posted by Ayperos View Post
    Well, my monitor is only 1080p, don't think it can go higher. Plus I play at 1600x1200, so...

    1080p is 1920 x 1080. But you're running it at 1600 x 1200? What model is your monitor? People who play at this resolution as their native resolution need to get better monitors.
  1. Paula Deen's Avatar
    Quote Originally Posted by ablib View Post
    1080p is 1920 x 1080. What model is your monitor? People who play at this resolution as their native resolution need to get better monitors.
    Wait, are you saying people who play native 1080p need to get better monitors?
  1. Gaebryel Quintyne's Avatar
    My monitor says the recommended resolution for it is 1920 x 1080. It does not go any higher. Can I still use downsampling on this monitor or it won't work 'cause I checked the link for downsampling and it requires resolution to be much higher than that I think.

    How do I know if my monitor is 1080p? Where can I find that out?
  1. Paula Deen's Avatar
    Quote Originally Posted by Gaebryel Quintyne View Post
    My monitor says the recommended resolution for it is 1920 x 1080. It does not go any higher. Can I still use downsampling on this monitor or it won't work 'cause I checked the link for downsampling and it requires resolution to be much higher than that I think.
    You can add custom resolutions that go higher via things like the Nvidia Control Panel. Go read through the link I posted; he goes in depth to explain exactly what to do, it is pretty crazy. I can play WoW at 4K downsampled (3840x2400) and I do not notice any FPS hit at all, plus anti-aliasing is no longer needed due to the downsampling.
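
A quick illustration of why downsampling makes anti-aliasing largely unnecessary: rendering at 3840x2400 and displaying on a 1920x1200 panel means every screen pixel is an average of a 2x2 block of rendered pixels, which smooths edges much like supersampling does. The sketch below is purely illustrative and is not how the game or the graphics driver actually performs the scaling.

Code:
    // Illustrative only: average each 2x2 block of a high-resolution image
    // into one output pixel, which is effectively what downsampling does.
    #include <cstddef>
    #include <vector>

    struct Pixel { float r, g, b; };

    std::vector<Pixel> downsample2x(const std::vector<Pixel>& hi, std::size_t w, std::size_t h) {
        std::vector<Pixel> lo((w / 2) * (h / 2));
        for (std::size_t y = 0; y + 1 < h; y += 2) {
            for (std::size_t x = 0; x + 1 < w; x += 2) {
                const Pixel* p[4] = { &hi[y * w + x],       &hi[y * w + x + 1],
                                      &hi[(y + 1) * w + x], &hi[(y + 1) * w + x + 1] };
                Pixel& out = lo[(y / 2) * (w / 2) + (x / 2)];
                out = { (p[0]->r + p[1]->r + p[2]->r + p[3]->r) / 4.0f,
                        (p[0]->g + p[1]->g + p[2]->g + p[3]->g) / 4.0f,
                        (p[0]->b + p[1]->b + p[2]->b + p[3]->b) / 4.0f };
            }
        }
        return lo;
    }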
  1. ablib's Avatar
    Quote Originally Posted by Paula Deen View Post
    Wait, are you saying people who play native 1080p need to get better monitors?
    Yes. People who use the term "1080p" while referencing their computer monitors are noobs. 1920 x 1080 is a terrible PC resolution. 1920 x 1200 is the closest comparable resolution. A monitor with a native resolution of 1920 x 1080 is a poor quality screen.
  1. Paula Deen's Avatar
    Quote Originally Posted by ablib View Post
    Yes. People who use the term "1080p" while referencing their computer monitors are noobs. 1920 x 1080 is a terrible PC resolution. 1920 x 1200 is the closest comparable resolution. A monitor with a native resolution of 1920 x 1080 is a poor quality screen.
    Not entirely true, I mean, my monitor is 16:10, so I play at 1920x1200, but still, 1080p looks fine, and most monitors that use it are fine too.
  1. Gaebryel Quintyne's Avatar
    Quote Originally Posted by ablib View Post
    Yes. People who use the term "1080p" while referencing their computer monitors are noobs. 1920 x 1080 is a terrible PC resolution. 1920 x 1200 is the closest comparable resolution. A monitor with a native resolution of 1920 x 1080 is a poor quality screen.
    1920 x 1200 resolution gives like black bars if you want to run anything in 16:9 screen. 1920 x 1080 is 16:9 so that should be better, no?

    Btw, I have an AMD 7870 GFX card.
  1. Ayperos's Avatar
    Quote Originally Posted by ablib View Post
    1080p is 1920 x 1080. But you're running it at 1600 x 1200? What model is your monitor? People who play at this resolution as their native resolution need to get better monitors.
    Old. Samsung SyncMaster BX2231.
  1. ablib's Avatar
    Quote Originally Posted by Paula Deen View Post
    Not entirely true, I mean, my monitor is 16:10, so I play at 1920x1200, but still, 1080p looks fine, and most monitors that use it are fine too.
    No it's pretty true. A 1920x1080 screen is of lesser quality than 1920x1200. I can see it instantly, and it drives me insane. It's not a computing resolution.

    Unfortunately, in the past 5-6 years or so, monitor manufacturers have been clinging to the "1080p" term as a reason to produce shitty monitors, because 90% of consumers think that 1080p is the best.

    Granted it's not as noticeable on 22" or smaller monitors, but on a 24, 27, or 30" monitor it's absolutely awful.
  1. krihan's Avatar
    It's kinda funny that I had never been able to pull 60+ FPS consistently with my 4770K and GTX 780 SLI setup in raids. I guess this change will make it easier to have consistent FPS without randomly dropping frames.
  1. ablib's Avatar
    Quote Originally Posted by Gaebryel Quintyne View Post
    1920 x 1200 resolution gives like black bars if you want to run anything in 16:9 screen. 1920 x 1080 is 16:9 so that should be better, no?

    Btw, I have an AMD 7870 GFX card.
    You should set your monitor at whatever the native resolution is. 1920 x 1200 is better than 1080p. 1920 x 1200 could also be called 1200p (higher is better).

    1920 x 1200 should also be set at 16:10.
  1. Paula Deen's Avatar
    Quote Originally Posted by krihan View Post
    It's kinda funny that I had never been able to pull 60+ FPS consistently with my 4770K and GTX 780 SLI setup in raids. I guess this change will make it easier to have consistent FPS without randomly dropping frames.
    I can say for sure I never dip below 50 in very crowded areas on beta; the lowest I was dipping was the pre-optimization Tanaan intro, and that was 45. They really did a good job with the optimization, despite the terrible AA changes.
  1. Kanegasi's Avatar
    Oh sweet, a blog post about all the Engineering changes.

    *reads*

    Oh...

    Sweet, a blog post about graphical changes. Legitimately interested, not sure if better than what I thought the blog was about.

    - - - Updated - - -

    Quote Originally Posted by Phabulous View Post
    "Morphological" is only a word if you're a Power Ranger.

    I ended up here thinking this would be an article on Engineering the profession. I am le disappoint.
    https://software.intel.com/en-us/art...-aliasing-cmaa
    Origin of word: https://en.wikipedia.org/wiki/Morphology_%28biology%29

    "Morph" is not just a verb from a kid's show, it's from the Greek word "morphé" which means "form". Showing only the form seen.
  1. Melchior's Avatar
    Quote Originally Posted by ablib View Post
    Yes. People who use the term "1080p" while referencing their computer monitors are noobs. 1920 x 1080 is a terrible PC resolution. 1920 x 1200 is the closest comparable resolution. A monitor with a native resolution of 1920 x 1080 is a poor quality screen.
    Most 3D games scale horizontal+, not vertical-. 1920x1080 is 16:9 which is a wider aspect ratio than 1920x1200 which is 16:10. Horizontal+ means 3D games will display the same vertical information in your game regardless of resolution, but add to the sides the wider your aspect ratio is. That means 1920x1200 has less of the game on the screen, because the sides are cut off compared to 1920x1080. The extra 120 vertical pixels are useful in the Windows UI and games that do not have aspect ratio scaling, but those games are far rarer. For most games that scale with aspect ratio, wider is better. 16:9 > 16:10 > 4:3 > 5:4

    As for the panel quality, there are plenty of IPS 1920x1080 monitors.

    Not that I particularly care, as my monitor is 2560x1440 (16:9 so woot), but your information is out of date.
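
To put rough numbers on the horizontal+ behaviour described above: with the vertical field of view held fixed, the horizontal field of view grows with the aspect ratio, so a 16:9 display shows more at the sides than a 16:10 one. The sketch below is purely illustrative; the 60-degree vertical FOV is an assumed value, not WoW's actual camera setting.

Code:
    // Illustrative horizontal+ calculation: horizontal FOV derived from a
    // fixed vertical FOV and the display's aspect ratio.
    #include <cmath>
    #include <cstdio>

    const double kPi = 3.141592653589793;

    double horizontalFovDeg(double verticalFovDeg, double aspect) {
        double v = verticalFovDeg * kPi / 180.0;
        return 2.0 * std::atan(std::tan(v / 2.0) * aspect) * 180.0 / kPi;
    }

    int main() {
        // Assumed 60-degree vertical FOV, purely for illustration.
        std::printf("16:9  -> %.1f deg horizontal\n", horizontalFovDeg(60.0, 16.0 / 9.0));   // ~91.5
        std::printf("16:10 -> %.1f deg horizontal\n", horizontalFovDeg(60.0, 16.0 / 10.0));  // ~85.5
        std::printf("4:3   -> %.1f deg horizontal\n", horizontalFovDeg(60.0, 4.0 / 3.0));    // ~75.2
    }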
  1. Mormolyce's Avatar
    Quote Originally Posted by ablib View Post
    Yes. People who use the term "1080p" while referencing their computer monitors are noobs. 1920 x 1080 is a terrible PC resolution. 1920 x 1200 is the closest comparable resolution. A monitor with a native resolution of 1920 x 1080 is a poor quality screen.
    ...What? I don't know what the hell you're talking about. "1080p" is a term used to market TVs, it means 1920x1080 - ie an aspect ratio of 16:9. Thanks to the success of big screen TVs this has become a de facto standard, so a lot of monitors are made in this ratio now rather than 16:10 (1920x1200), which was briefly popular several years ago. I'd say most monitors these days are 16:9 rather than 16:10.

    One is not "better" than the other, it's simply two different "widescreen" standards. 16:10 is slightly taller. I mean okay, you might personally prefer one over the other but that's just opinion.

    And regardless of which you have, you should always run your monitor at its native resolution, because modern PC monitors look shitty at non-native resolutions: they have to approximate them, unlike the old CRTs, which were capable of a full range of resolutions without loss.

    - - - Updated - - -

    Quote Originally Posted by Phabulous View Post
    "Morphological" is only a word if you're a Power Ranger.
    Or trained in science, or Greek.

    - - - Updated - - -

    Quote Originally Posted by ablib View Post
    You should set your monitor at whatever the native resolution is. 1920 x 1200 is better than 1080p. 1920 x 1200 could also be called 1200p (higher is better).

    1920 x 1200 should also be set at 16:10.
    It's only "better" to set your monitor to 1920x1600 (16:10) if that's your monitor's native resolution, which I think is unlikely unless you went out of your way to buy one, or it's quite old.

    - - - Updated - - -

    Quote Originally Posted by Gaebryel Quintyne View Post
    My monitor says the recommended resolution for it is 1920 x 1080. It does not go any higher. Can I still use downsampling on this monitor or it won't work 'cause I checked the link for downsampling and it requires resolution to be much higher than that I think.

    How do I know if my monitor is 1080p? Where can I find that out?
    If it says the recommended resolution is 1920x1080, that means it's 1080p.

    - - - Updated - - -

    Quote Originally Posted by Lolsteak View Post
    The downside is that it could be at the cost of a greater proportion of the playerbase.
    Yeah I'm sure millions will quit WoW because of MSAA.

    Obviously WoW is a game you play because you're a massive graphics snob.
  1. mmocd78055c23a's Avatar
    Quote Originally Posted by Lolsteak View Post
    The downside is that it could be at the cost of a greater proportion of the playerbase.
    I wouldn't call it a loss to lose a player who plays for the graphics.
    WoW has always had a cartoony look about it, and I think it would be best to keep it that way.
  1. sheppo's Avatar
    Quote Originally Posted by bluspacecow View Post
    The pro "we like our graphics sharp and not fuzzy" MSAA crowd ain't gonna like this.
    It's not an issue as long as we can force a different AA method through our graphics card control panel... some games that specify AA methods don't let you do that, e.g. any game that uses FXAA will not let you force MSAA or SSAA. I run with 8x SSAA because it actually AA's foliage and text.
  1. Granyala's Avatar
    CMAA looks crap.
    I doubt that any current GPU will be taxed out by WoW anytime soon, no matter what they add. <_<
  1. Yig's Avatar
    Quote Originally Posted by Agraynel View Post
    Excellent, so it means I can still enjoy the game with my 3 year old PC.
    WTF? I built my PC on Newegg for $800 in 2008, and beyond replacing my graphics card twice since then, I run WoW at Ultra 1080p at 30-60+ FPS.

    Three years?
  1. lb's Avatar
    Quote Originally Posted by Ulgrim View Post
    The jagged edges makes the game more savage™.
    You mean more "sawage™"
