Originally Posted by Cantii
The most common form of Anti-Aliasing, and usually the best quality-for-performance trade-off, is MSAA (Multi-Sample Anti-Aliasing). In short, it takes extra samples per pixel along polygon edges and blends them together, creating a smoother image. The most brute-force form of AA is SSAA (Supersampling Anti-Aliasing). SSAA renders the entire scene at 2x, 4x or 8x your resolution, then shrinks it back down to fit your screen, creating a much higher quality image. SSAA takes a TON of GPU power; even the highest-end cards struggle with it. It's just straight brute force.
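The supersampling idea above is simple enough to sketch. This is a toy illustration, not a real renderer: render_pixel is a hypothetical stand-in that just draws a hard diagonal edge, and the downsample is a plain box filter (real GPUs may use fancier filters).

```python
# Toy sketch of SSAA: render at a higher internal resolution,
# then average blocks of samples down to the target resolution.

SCALE = 2          # 2x SSAA: SCALE*SCALE = 4 samples per final pixel
WIDTH, HEIGHT = 4, 4  # final image size

def render_pixel(x, y):
    # Hypothetical stand-in renderer: draws a hard diagonal edge,
    # white (1.0) above the diagonal, black (0.0) below it.
    return 1.0 if x < y else 0.0

# 1) Render at SCALE times the target resolution.
hi_w, hi_h = WIDTH * SCALE, HEIGHT * SCALE
hi_res = [[render_pixel(x, y) for x in range(hi_w)] for y in range(hi_h)]

# 2) Downsample: average each SCALE x SCALE block into one final pixel.
#    Pixels that straddle the edge come out as in-between gray values,
#    which is exactly the smoothing effect AA is after.
final = [[sum(hi_res[y * SCALE + dy][x * SCALE + dx]
              for dy in range(SCALE) for dx in range(SCALE)) / SCALE ** 2
          for x in range(WIDTH)]
         for y in range(HEIGHT)]
```

The cost is obvious from step 1: at 2x SSAA you shade four times as many pixels per frame, which is why even top-end cards choke on it.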
MLAA (Morphological Anti-Aliasing) is the post-process technique AMD added to its Catalyst drivers (the technique itself came out of Intel research, so calling it AMD's proprietary AA is a stretch). It's a cheater's form of AA, but it produces a decent image without tanking performance, which is especially useful if you have a lower-end video card. MLAA runs after the frame has already been rendered and uses edge-detection to smooth out any and all edges, including the edges of text and UI elements. So if you're playing a game with high-resolution text, the text will look fuzzy with MLAA enabled.
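A grossly simplified sketch of that kind of post-process pass, to show why it catches text edges too: it only ever looks at the finished image, so any sharp contrast jump gets blended, whether it came from a polygon edge or a letter. Real MLAA classifies edge shapes (L/Z/U patterns) and computes proper coverage weights; the threshold here is an assumed value, not anything from AMD's driver.

```python
# Grossly simplified MLAA-style post-process pass on a grayscale image:
# detect edges by comparing neighboring pixels, then blend across them.

THRESHOLD = 0.5  # assumed edge-detection threshold (illustrative only)

def mlaa_like_pass(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w - 1):
            # A sharp jump between horizontal neighbors counts as an edge,
            # no matter whether it's geometry or rendered text.
            if abs(img[y][x] - img[y][x + 1]) > THRESHOLD:
                mid = (img[y][x] + img[y][x + 1]) / 2
                out[y][x] = (img[y][x] + mid) / 2
                out[y][x + 1] = (img[y][x + 1] + mid) / 2
    return out
```

Feed it a hard black/white boundary like [[0.0, 1.0]] and both pixels get pulled toward gray, which is the smoothing on a polygon edge and the fuzziness on crisp text at the same time.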
NVidia's equivalent, FXAA (Fast Approximate Anti-Aliasing), works on the same post-process principle: it's a shader pass that detects edges from luminance (perceived brightness) rather than from geometry. (NVidia's CSAA is a different animal entirely; that one is hardware multisampling with extra coverage samples.) Depending on the quality setting and the game, FXAA can produce a great-looking picture or a blurry one. It suits fast-paced games like DiRT, where everything is blowing by you and is a blur anyway. In games where the picture is more stationary, like StarCraft II, it smears fine detail and you're far better off going with something like MSAA.
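The "luminance" part can be sketched too. This shows only the first step a luma-based pass like FXAA starts from: convert each pixel to perceived brightness and flag pixels whose local contrast crosses a threshold (the real thing then filters along the detected edge direction). The Rec. 601 weights and the threshold are assumptions for illustration; actual FXAA implementations use their own weightings and tunables.

```python
# Sketch of luma-based edge detection, the first step of a
# post-process AA pass in the FXAA style.

EDGE_THRESHOLD = 0.25  # assumed local-contrast threshold (illustrative)

def luma(rgb):
    # Rec. 601 brightness weights (an assumption; FXAA variants differ).
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def find_edge_pixels(img):
    # img: 2D grid of (r, g, b) tuples with channels in [0, 1]
    h, w = len(img), len(img[0])
    L = [[luma(p) for p in row] for row in img]
    edges = set()
    for y in range(h):
        for x in range(w):
            neighbors = [L[ny][nx]
                         for ny, nx in ((y - 1, x), (y + 1, x),
                                        (y, x - 1), (y, x + 1))
                         if 0 <= ny < h and 0 <= nx < w]
            # Flag the pixel if any neighbor differs too much in luma.
            if max(abs(L[y][x] - n) for n in neighbors) > EDGE_THRESHOLD:
                edges.add((x, y))
    return edges
```

Because it keys off brightness contrast in the final frame, it happily flags (and later softens) detail you might have wanted sharp, which is where the blur complaint in slower-paced games comes from.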
What gets called AMSAA is presumably Adaptive Multi-Sampling Anti-Aliasing (AMD's Adaptive AA): regular MSAA plus extra supersampling on transparent textures like fences and foliage, which plain MSAA misses. It does a better job than MSAA but uses more resources.
Edit: Also, NVidia's SLI AA modes (SLI 8x, SLI 16x and up) ONLY work with SLI enabled, so that argument isn't even close to valid on a single card. Don't confuse those with 16x CSAA, which runs fine on one card.