[quote]I wanted to chime in here, because I think FXAA, and other similar methods, including MLAA, are the future of antialiasing on the PC. Multisample Antialiasing has suited gamers well for the last decade and a half. There have been improvements in each new GPU architecture that keep making MSAA faster, and less of a memory bandwidth hog. However, it can only be taken so far, and MSAA has some major drawbacks. Traditional MSAA only reduces aliasing on polygon edges, i.e. the edges of geometric game objects. It also cannot reduce aliasing on alpha textures, and it cannot reduce aliasing associated with specular shader programs.
As games have become more shader based, we’ve seen specular shader aliasing start to be a problem. About 7 years ago, when the original Far Cry was released, was when we were brought into the world of shader programs being used heavily in a game. We can point you to evaluations years ago, where we have screenshots that clearly display how evident shader aliasing can be. If you look at this screenshot you will see specular shader aliasing on the corners of this object. Now, this was primitive vertex shading, but still quite evident, MSAA can’t cure this. This screenshot shows very clearly how bad specular shader aliasing can be, all over the walls, and the floor. These are but a couple of examples, but it is things like this that traditional MSAA can’t help you with, while methods like FXAA and MLAA can.
[/quote]
Source: http://www.hardocp.com/article/2011/07/18/nvidias_new_fxaa_antialiasing_technology/5
I thought all forms of AA were there to address large pixels and thus the jaggies. I personally thought AA technology is but a stepping stone to high pixel density (PPI, pixels per inch), because on smartphones that have high PPI, AA isn’t needed anymore? For polygons and simple AA at least. I don’t know as much about the other, more advanced features.
If memory serves, magazine print is around 1200 dpi with the natural antialiasing of ink splatter… smartphones are nowhere near that (regardless of what Apple might say). My eyesight isn’t great and I can still see the dots in print.
WRT the PS3: click on the link that I provided above and you’ll see a lot of discussion of FSAA on it.
The goal of SSAA and MSAA is, in my opinion, primarily to supply sub-pixel accuracy. That mainly means removing/reducing shimmering of moving objects. No amount of post-processing is gonna solve that; the information has simply been lost during rasterization. Saying anti-aliasing is only about reducing jaggies is completely incorrect. FXAA/MLAA is just blurring, it doesn’t have any more pixel data compared to no “AA”. We could use something to replace MSAA, but it sure as hell isn’t blurring.
Jaggies and mosaicing are still quite visible even if the individual pixels are too small to see. Edges have a regular pattern to them that looks stuck to the screen as the scene moves, and so on. Even with 2K and 4K movie formats they do a lot of AA in their renders. Graphic artists whose final piece is printed at 1200 dpi still spend a lot of time on AA, often as a post-process from a much larger “master” render.
I don’t get it. Isn’t the commenter right? He gives screenshot examples where the aliasing is dramatically reduced. Is this just face-palming at one small section of his comment?
Although personally I find aliasing less of an issue now than it used to be, because the resolution of monitors has increased. It’s still there, it’s just far less noticeable.
The article identifies anti-aliasing with jagged edges, when the main point of it, in my opinion, is subpixel information: huge quality gains for small geometry, and less shimmering and snapping. The face-palming is about the whole article, which says 4xMSAA is about equal to FXAA. Complete utter bullshit IMO. There’s a reason there’s a bigger performance hit for proper anti-aliasing. They’re just feeding lies to console gamers to justify the crappy hardware they have to work with. For god’s sake, I’m playing my games at 18 times the geometry resolution of consoles (counting samples; that’s area though, not width AND height)! That’s 720p vs 1080p + 8xMSAA. 720p without AA is complete bullshit on a TV. HD console gaming is a huge lie.
You’re right about screen resolution though. As monitors get higher pixel density, the need for AA is severely reduced. On my 24-inch monitor I can easily see the difference between 4xMSAA and 8xMSAA (and supersampling of course). On my 17-inch laptop, I don’t need any higher than 2xMSAA to reduce shimmering to very acceptable levels. No AA at all looks like crap, though.
What annoys me most is the frequent invalid claims, the extremely harsh treatment of MSAA, and that they justify using a worse AA method by pointing out MSAA’s flaws.
[quote]Traditional MSAA only reduces aliasing on polygon edges, i.e. the edges of geometric game objects. It also cannot reduce aliasing on alpha textures, and it cannot reduce aliasing associated with specular shader programs.
[/quote]
Double false. Transparency MSAA looks infinitely better, shimmers less, is almost identical in quality to SSAA, and costs basically nothing on top of normal MSAA. You just use a custom alpha-testing shader that outputs a pseudo-random sample mask (see the sketch below). Good performance, epic quality. (It also allows order-independent transparency to some extent.) FXAA does nothing for the main problem of alpha-textured geometry, shimmering. Specular aliasing also happens to be improved a lot in deferred shading renderers, as lighting has to be done per sample anyway.
FXAA does neither of these two things any better.
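To make it concrete, here’s a minimal sketch of the kind of alpha-test shader I mean, assuming 4x MSAA and GLSL 4.00 for the writable gl_SampleMask[]; the uniform/varying names and the hash are made up for the example, not taken from any particular engine:
[code]
// Sketch only: replace the hard alpha test with a per-pixel sample mask, so
// foliage/fence edges get dithered across the MSAA samples instead of
// snapping on and off.
#version 400

uniform sampler2D u_texture;   // hypothetical names
in  vec2 v_texCoord;
out vec4 fragColor;

// Cheap hash, used as a pseudo-random per-pixel offset.
float hash(vec2 p) {
    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
}

void main() {
    const int SAMPLES = 4;                 // assuming 4x MSAA
    vec4 texel = texture(u_texture, v_texCoord);

    // Cover a number of samples proportional to alpha, dithered per pixel.
    float dither  = hash(gl_FragCoord.xy);
    int   covered = int(clamp(texel.a * float(SAMPLES) + dither, 0.0, float(SAMPLES)));

    if (covered == 0)
        discard;

    gl_SampleMask[0] = (1 << covered) - 1; // set the low 'covered' bits
    fragColor = vec4(texel.rgb, 1.0);
}
[/code]
A real implementation would also shuffle which samples get set from pixel to pixel so the pattern isn’t identical everywhere, but the idea is the same.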
[quote]If you look at this screenshot (link) you will see specular shader aliasing on the corners of this object. Now, this was primitive vertex shading, but still quite evident, MSAA can’t cure this. This screenshot (link) shows very clearly how bad specular shader aliasing can be, all over the walls, and the floor. These are but a couple of examples, but it is things like this that traditional MSAA can’t help you with, while methods like FXAA and MLAA can.
[/quote]
Neither of those screenshots uses MSAA at all. And per-vertex lighting? Yeah, MSAA is to blame for old, bad shaders. FXAA’s “subpixel precision” is, in practice, literally just if(allNeighborsExceedLuminanceThreshold()){ blurWithAllNeighbors(); } (see the sketch below). I don’t see how this would reduce shimmering (except for reducing the brightness of the shimmering, obviously). FXAA doesn’t tackle the real problem, it just makes a bad attempt at masking it.
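Here’s roughly what I mean, as a sketch. This is my reading of the “subpixel” part, NOT the actual FXAA source; u_scene and u_texelSize are made-up names and the threshold is arbitrary:
[code]
// Rough sketch of the luminance-threshold-then-blur idea.
// u_scene is the rendered frame, u_texelSize = 1.0 / resolution.
#version 130

uniform sampler2D u_scene;
uniform vec2      u_texelSize;

in  vec2 v_texCoord;
out vec4 fragColor;

float luma(vec3 c) {
    return dot(c, vec3(0.299, 0.587, 0.114));
}

void main() {
    vec3 c = texture(u_scene, v_texCoord).rgb;
    vec3 n = texture(u_scene, v_texCoord + vec2( 0.0, -u_texelSize.y)).rgb;
    vec3 s = texture(u_scene, v_texCoord + vec2( 0.0,  u_texelSize.y)).rgb;
    vec3 e = texture(u_scene, v_texCoord + vec2( u_texelSize.x, 0.0)).rgb;
    vec3 w = texture(u_scene, v_texCoord + vec2(-u_texelSize.x, 0.0)).rgb;

    float lumaMin = min(luma(c), min(min(luma(n), luma(s)), min(luma(e), luma(w))));
    float lumaMax = max(luma(c), max(max(luma(n), luma(s)), max(luma(e), luma(w))));

    // Local contrast below an arbitrary threshold: leave the pixel untouched.
    if (lumaMax - lumaMin < 0.125) {
        fragColor = vec4(c, 1.0);
        return;
    }

    // Above it: "blurWithAllNeighbors()", i.e. a plain box blend.
    fragColor = vec4((c + n + s + e + w) / 5.0, 1.0);
}
[/code]
Notice that no new information enters the image anywhere here; the shader can only redistribute the colors that survived rasterization, which is exactly why it can’t bring back subpixel detail or stop shimmering.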
[quote]NVIDIA FXAA:
Pros:
…
Image quality comparable to 4X MSAA and MLAA.
[/quote]
False. Even compared to MLAA it is a lot worse (except for the “subpixel” bullshit, if you count that as a feature). FXAA, depending on the preset, scans much shorter distances along the axes than MLAA does. For almost-horizontal/vertical edges, the quality of MLAA is a lot better. FXAA is an optimized, low-quality version of MLAA which is worse at fixing jaggies, but doesn’t distort text and has that “subpixel” thing (which AMD is apparently adding to MLAA some time in the future).
[quote]Also NVIDIA’s CSAA and AMD’s new Custom Filtering and EQAA modes. These methods do work, but they require a lot of work and are huge memory drains, and in essence still based on multi-sampling.
[/quote]
Uh, what? Sure, the gains from CSAA and EQAA are very questionable, but how do they add a lot of “work” and memory usage? Just wrong. In some cases 16xCSAA (that’s 4xMSAA + 12 coverage samples) looks better than 8xMSAA, at almost identical performance to 4xMSAA (often less than 0.5 ms of frame time difference). CSAA memory usage: IDENTICAL. Yes, completely identical. 16xCSAA (forced through the drivers AND enabled with NVFramebufferMultisampleCoverage to be sure) used exactly the same amount of memory as plain 4xMSAA (measured in MBs, though, so they could differ by a MB or so). The test was done using a GL_RGB16F MSAA/CSAA renderbuffer of extreme resolution (16004, 9004). Memory usage was measured with GPU-Z and was confirmed multiple times to be stable.
[quote]MLAA was a step in the right direction because it fixed all the problems of MSAA […]
[/quote]
… while not doing even half of what MSAA does about shimmering, and sacrificing subpixel precision completely.
[quote]FXAA improves polygon aliasing, alpha texture aliasing, and specular aliasing, while maintaining performance that is FASTER than the performance cost of 2X AA!
[/quote]
Subpixel information, custom alpha-test shaders, deferred rendering. Yes, MSAA is slower, but it actually does anti-aliasing.
[quote]The only caveat, the game developer must implement it in the game. If the game doesn’t have it, the game doesn’t have it.
[/quote]
False. It can be enabled in Nvidia Inspector, but at the moment only for OpenGL games. It even works in LWJGL. It also does not work together with MSAA (believe me, it looks hilarious).
[quote]I hope it is post-processing shader methods like FXAA and MLAA that are used more and more in the future, eventually replacing traditional MSAA. Yes, you read that correctly, I want shader based aliasing programs like these to replace traditional AA, it is the only logical answer to the antialiasing problem. We need a revolution in AA, we need something that can reduce aliasing in every aspect of the game, with a minimal performance hit […]
[/quote]
Again: subpixel precision, shimmering, etc. It’s not a real anti-aliasing method, it just removes jaggies, which are but ONE artifact of rasterization.
[quote][…] and right now, FXAA is the best option.
[/quote]
True! … Wait for it… … for consoles, because they suck.
[quote]Instead of having to worry about 2X AA, and 4X AA and whatnot, I just want to turn on “AA” and be done with it. Right now, F.3.A.R. allows this, and we will use FXAA as an alternative to 2X and 4X AA.
[/quote]
FXAA has more presets than there are MSAA/CSAA modes. As FXAA is so cheap, they don’t differ much in quality or performance, but rather in the trade-off between removing more jaggies and blurring less. The only real quality setting is how far it scans along each axis. MSAA is more costly, infinitely better looking, and much more dynamic performance-wise: hardware gets better and better, and games just a year old can enable MSAA on mid-range products. Sacrificing quality for performance and availability AND dropping MSAA completely stinks so much of consoles that I feel like puking. Please do NOT do that to my computer games.
As a last note, I have to say that FXAA/MLAA isn’t completely worthless. It just isn’t anti-aliasing as it should be. It does pretty well with jagged edges, but that isn’t what I enable anti-aliasing in games to fix. A replacement for MSAA has to have some kind of subpixel precision, or it won’t be able to replace it. 2xMSAA looks so much better than no anti-aliasing when it comes to shimmering, while not having unreasonable performance costs. It doesn’t, however, do a very good job with jagged edges, as it only has 2 colors in its gradient. The answer for the moment is a hybrid anti-aliasing method, with MSAA/SSAA/something else for subpixel precision while still using some kind of edge blender to deal with jaggies. An interesting solution along those lines is SRAA, sub-pixel reconstruction anti-aliasing: http://www.geeks3d.com/20110129/sraa-subpixel-reconstruction-anti-aliasing-nvidias-reply-to-amds-mlaa/. The idea is to salvage the best of both worlds to achieve smooth edges and less shimmering. How this works in practice remains to be seen, but I’m pretty sure BF3 will be the first game to ship with it. The game would be worth buying just to check out SRAA in my opinion, but it seems you get a pretty good game along with SRAA, too. =D
In the end, the need for anti-aliasing is something that can only be solved completely by moving away from raster graphics. But would the replacement be any better when it comes to aliasing? Probably not. Is that gonna happen anytime soon? Unlimited Detail maybe, so a few years? MSAA wouldn’t even be defined in such a renderer.
So yeah, MLAA/FXAA, congratulations on solving jaggies once and for all! Now we just need to get rid of the blurring you introduce, somehow attain subpixel precision, write better shaders to reduce shader aliasing (which shouldn’t be a post-process’s responsibility in the first place), etc. Thanks for solving probably less than 10% of the problems with aliasing while looking like 50% in still screenshots, so people think it’s awesome and godly. … How do you even sleep at night?
OH GOD WHAT HAVE I DONE I SHOULDN’T BE WRITING POSTS LIKE THIS clicks post
It plays well with deferred rendering. That’s also true for any other screen-space AA method.
It works nicely with fractional super-sampling. For example, you could scale the framebuffer by 1.3x and get a nice quality improvement with only a minor performance hit. It also becomes a nice knob to turn when you have spare computing power; that 1.3x can dynamically become 1.7x or more on high-end machines, as long as you maintain the target frame rate.
In general I agree about the disadvantages of doing screen-space AA, but I wouldn’t be so negative towards it. Especially about blurring: it’s not as bad as you describe, and the latest techniques are really clever when it comes to minimizing it. It’s just another tool in our hands, use it when and if it makes sense. As long as we’re so heavily resource-constrained, especially on consoles, it’s better than nothing.
FXAA plays nice with lots of other techniques since it’s simply a post-processing phase. But the real question is: what’s your goal? You can go with the higher-complexity rendering route or the vastly simpler post-process one. I think I’d actually go a step further than Spasi and say that FXAA is awesome when you compare quality vs cost (time & space).
I don’t like a lot of this flak against consoles; they totally rock! Less powerful, sure, but when you compare the difference in power, it’s pretty disappointing that PC games don’t look better.
True, deferred rendering is indeed problematic with MSAA. Memory usage/bandwidth is, in my opinion, the most limiting factor, but I don’t see us hitting the roof on 2GB of VRAM anytime soon. Even 1.5GB is pretty far away, and 1GB is pretty much the lowest limit for mid- to high-end gaming today. My GTX 295 at home gets capped by its 896MB of VRAM in the PlayStation 2 emulator PCSX2 if I enable insane amounts of SSAA and MSAA at the same time. Though performance drops to the low 40s at that point too, so it doesn’t really make any sense anyway.
Lighting having to be done per sample is perhaps costly, but we still gain a lot during G-buffer setup compared to supersampling. Besides, considering how many lights you can have with a deferred renderer, reducing the number of lights would not make that big a difference. I believe texture mapping is a huge performance eater in deferred shading too, so I don’t think the scaling is that bad. A rough sketch of what per-sample lighting looks like is below.
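This is only an illustration of the idea, assuming a GL 3.2 / GLSL 1.50 multisampled G-buffer; the sampler and uniform names are invented, and a real renderer would write per-sample results and resolve separately instead of averaging right here:
[code]
// Minimal sketch: shade every MSAA sample of the G-buffer and average.
#version 150

uniform sampler2DMS u_normalBuf;   // hypothetical G-buffer: normal.xyz in rgb
uniform sampler2DMS u_albedoBuf;   // hypothetical G-buffer: albedo in rgb
uniform vec3        u_lightDir;    // normalized directional light, view space
uniform int         u_samples;     // e.g. 4 for 4x MSAA

out vec4 fragColor;

void main() {
    ivec2 coord = ivec2(gl_FragCoord.xy);
    vec3  total = vec3(0.0);

    for (int i = 0; i < u_samples; i++) {
        vec3 normal = normalize(texelFetch(u_normalBuf, coord, i).xyz);
        vec3 albedo = texelFetch(u_albedoBuf, coord, i).rgb;
        total += albedo * max(dot(normal, -u_lightDir), 0.0);
    }

    // Averaging the per-sample shading results smooths shading discontinuities
    // the same way MSAA smooths geometric edges.
    fragColor = vec4(total / float(u_samples), 1.0);
}
[/code]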
The one most perfect game for FXAA is Minecraft. Compared to MLAA it didn’t distort the text, and the blurring was completely invisible since the game mostly has solid colors, giving basically supersample quality on close objects. Far-away objects still looked totally like crap due to the insane shimmer. Notch may be good at programming, but omitting mipmaps and texture filtering was the stupidest decision ever. I always turned on FXAA in Inspector when playing Minecraft, until I recently discovered that Notch also leaves alpha testing on for all block rendering. That basically means transparency supersampling can take care of the texture filtering if you turn it on, as it’ll supersample everything. Considering Minecraft’s low system requirements, I could put on 4xSSAA and still stay above 60 FPS at all times, rendering FXAA obsolete (and incompatible, due to a funny bug). It did wonders on distant geometry, especially leaves, just like proper transparency MSAA would have done if Minecraft supported it. MSAA doesn’t look good at all in Minecraft due to texture bleeding or something. Pink water and seams!
My point is that FXAA of course has its place in 3D graphics, but not as a replacement for MSAA, that’s for sure. And like I said in my previous post, the solution could be FXAA + something else, like fractional SSAA.
Right. Why use deferred rendering at all then? It’s so much easier to just do everything in multiple passes! Uses less memory too! That’s actually a pretty good comparison: MSAA vs FXAA is like deferred vs normal shading. Sure, FXAA is simpler, but MSAA is better because it makes things possible that aren’t possible with FXAA (subpixel information, just like lots of lights for deferred).
In no way is it excusable to create a machine as unbalanced as the PS3. 8 CPU cores and a graphics card weak as hell? 8 CPU cores and 512 MB of RAM, including VRAM? Don’t make me f**king laugh. Consoles aren’t easier to make games for, even though they all have identical hardware and you don’t need stuff like DirectX/OpenGL. Just look at Bad Company on the PS3. They had to use all the unused CPU cores to CULL TRIANGLES. They iterated through triangles and culled everything that wasn’t visible, in software, because the graphics card is so goddamn weak. The developers had to go to such lengths just to achieve 30 FPS. It just blows my mind.
Google some technical slides on PS3 games. Many games (Killzone for example) do occlusion culling on the CPU. They have implemented a software Z-rasterizer just to save the GPU from rendering a few extra occluded models. We’re back in the days of software rendering!
It’s even more fun listening to PS3 fanboys like my old class mate.
Me: “Have you seen how blurry the textures are in PS3 games? It only has 256MB of VRAM!”
Him: “But it has 8 CPU cores!”
Me: “o_0”
I can understand them a little, though. The GPU uses the most power in a gaming computer by a large margin, so cutting off some performance to reduce the cost of the power supply and the GPU itself makes sense (for manufacturing it, not for actually using it for anything, especially gaming). What DOESN’T make sense is why they put an 8-core CPU in there at all. My laptop’s dual core runs all the latest games at way above a console’s graphics settings. Why, oh why, did they spend so much money on the Cell? They could reuse it in the PS4 (which I’ve heard they will). That would make more sense, but it would still be too expensive and overpowered. And what kind of complete moron thought 512MB of RAM was enough for anything? CoD: Black Ops with max texture settings and 8xMSAA uses almost 4GB of memory (RAM + VRAM). Can you even understand how much worse it looks on a PS3? Like I said in the Intel thread: DICE is doing the right thing with BF3, giving consoles what they deserve without destroying the game for PC. Screw Intel Graphics Decelerators and their console cousins! DICE has my money this Christmas.
And finally, why are PC games so much cheaper than console games? About 30% cheaper? I have no idea, to be honest. It should definitely be the other way around, considering the quality of the products.
+100000000
Consoles are the reason MSAA doesn’t work on many objects in Bad Company 2. Basically only the player model, the vehicles and the terrain are anti-aliased. All buildings, trees, static debris, etc. are jagged, flickering and shimmering. Way to go!