[quote]You can find it unhelpful and unproductive all you like, but you don’t write games for a living, so your advice and arguments aren’t worth shit, I’m afraid.
[/quote]
So that was directed at me, huh?
(I realize that people are flaming everyone by “mistake” here, so I’m not trying to douse you all in gasoline by answering.)
Keep in mind that we have different interests in programming. I’d love to see you implement shadow-map ray-traced volumetric lighting using OpenGL 1.1. Or deferred shading. My interest in game making is graphics, and I think people should use the most powerful tools they have available. I’ve stated my arguments for OpenGL 3.3. I’m only targeting hardware that can actually run my stuff at 10+ FPS, and I’m ignoring Intel.
So I’m sorry, Intel card owners and Mac owners. No ray-traced volumetric lighting effects for you. Or GPU accelerated particle effects. Or deferred shading with MSAA. I’m getting sick of being told “YOU CAN’T MAKE GOOD GAMES SO USE OPENGL 1.1 AND JAVA 1.0”. I want to get into the gaming industry after university (5.5 years in the future), but I want to do game graphics most of all. I want to show my OpenGL 3.3-compliant demo with tessellation and deferred shading when I apply for a job.
I now realize that I actually AM dousing you all in gasoline, but whatever.
Uhm, I’m pretty sure Starcraft 2 uses deferred shading (and obviously MRTs) on anything above the absolute lowest setting, and have you heard of any problems related to that? Driver bugs in up-to-date OpenGL 3.3 drivers are pretty much a myth, IMHO.
First of all, MRTs cost no extra fill rate at all. It’s the same pixel, so no additional coverage checks (= filled pixels) are done, which is the whole point of MRTs. Bandwidth, however, increases linearly with the number of render targets. That doesn’t actually matter in practice, though, as I’ll explain below.
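To make the “same pixel” point concrete, here’s roughly what the fragment shader side of MRTs looks like (a minimal GLSL 3.30 sketch stuffed into a Java string; the G-buffer layout and names are just made up for illustration, not from any real engine):
[code]
// Hypothetical G-buffer fragment shader: ONE rasterized fragment
// produces three writes to three render targets, so no additional
// coverage tests happen - you only pay the extra bandwidth.
private static final String GBUFFER_FRAGMENT_SHADER =
      "#version 330 core\n"
    + "layout(location = 0) out vec4 outAlbedo;   // GL_COLOR_ATTACHMENT0\n"
    + "layout(location = 1) out vec4 outNormal;   // GL_COLOR_ATTACHMENT1\n"
    + "layout(location = 2) out vec4 outPosition; // GL_COLOR_ATTACHMENT2\n"
    + "in vec3 worldNormal;\n"
    + "in vec3 worldPosition;\n"
    + "in vec2 texCoord;\n"
    + "uniform sampler2D diffuseTexture;\n"
    + "void main() {\n"
    + "    outAlbedo   = texture(diffuseTexture, texCoord);\n"
    + "    outNormal   = vec4(normalize(worldNormal) * 0.5 + 0.5, 0.0);\n"
    + "    outPosition = vec4(worldPosition, 1.0);\n"
    + "}\n";
[/code]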
To spew some technical reasoning about the fill rate: how many new commercial games do NOT use deferred shading nowadays? Don’t you think the graphics card makers have adapted? You can see this for yourself by enabling 8x MSAA in a forward-rendered game. Your FPS will most likely not even drop to 3/4 of what you get with no MSAA (assuming a realistic test). Why? Because your graphics card has far more bandwidth and fill rate than basic forward rendering needs. HDR rendering plus deferred shading means three or four 16-bit floating-point RGBA render targets. Add antialiasing and you multiply both the fill rate (subsamples) and the bandwidth needed, and you STILL don’t get a linear drop in FPS. I’ll even dare say that ALL graphics cards are unable to use their full hardware potential without antialiasing and/or deferred shading.
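To put some very rough numbers on that (back-of-the-envelope math with assumed values, not measurements):
[code]
// Back-of-the-envelope G-buffer write bandwidth estimate.
// All numbers here are illustrative assumptions: 1080p, four fat
// RGBA16F targets, no overdraw, no compression, writes only.
public class GBufferBandwidth {
    public static void main(String[] args) {
        int width = 1920, height = 1080;
        int renderTargets = 4;     // e.g. albedo, normals, position, material
        int bytesPerPixel = 4 * 2; // RGBA16F = 4 channels * 2 bytes
        int fps = 60;

        long bytesPerFrame = (long) width * height * renderTargets * bytesPerPixel;
        double gbPerSecond = bytesPerFrame * (double) fps / (1024.0 * 1024.0 * 1024.0);

        // Prints roughly 63 MB per frame, ~3.7 GB/s at 60 FPS - a small
        // slice of the memory bandwidth on a typical discrete GPU.
        System.out.printf("G-buffer writes: %.1f MB/frame, %.2f GB/s at %d FPS%n",
                bytesPerFrame / (1024.0 * 1024.0), gbPerSecond, fps);
    }
}
[/code]
Even with four fat RGBA16F targets, the G-buffer writes are only a few GB/s, which any discrete card you’d actually target has plenty of headroom for.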
Anyway, if you need help with setting up MRTs, I’m your man. xD
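Seriously though, in case anyone actually wants it, here’s roughly what MRT setup looks like with LWJGL (a sketch under my own assumptions: three RGBA16F color targets plus a depth texture, no MSAA, error handling mostly omitted; treat it as a starting point, not gospel):
[code]
import org.lwjgl.BufferUtils;

import java.nio.ByteBuffer;
import java.nio.IntBuffer;

import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL14.*;
import static org.lwjgl.opengl.GL20.*;
import static org.lwjgl.opengl.GL30.*;

public class MRTSetup {

    /** Creates an FBO with three RGBA16F color attachments plus a depth
     *  attachment, and enables drawing to all three targets at once. */
    public static int createGBuffer(int width, int height) {
        int fbo = glGenFramebuffers();
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);

        // Three color render targets (albedo, normals, position - the
        // layout is just an example; pick whatever your lighting pass needs).
        for (int i = 0; i < 3; i++) {
            int tex = glGenTextures();
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
                    GL_RGBA, GL_HALF_FLOAT, (ByteBuffer) null);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i,
                    GL_TEXTURE_2D, tex, 0);
        }

        // Depth attachment so depth testing works during the G-buffer pass.
        int depthTex = glGenTextures();
        glBindTexture(GL_TEXTURE_2D, depthTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height, 0,
                GL_DEPTH_COMPONENT, GL_FLOAT, (ByteBuffer) null);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                GL_TEXTURE_2D, depthTex, 0);

        // Tell OpenGL the fragment shader writes to all three targets.
        IntBuffer drawBuffers = BufferUtils.createIntBuffer(3);
        drawBuffers.put(GL_COLOR_ATTACHMENT0)
                   .put(GL_COLOR_ATTACHMENT1)
                   .put(GL_COLOR_ATTACHMENT2)
                   .flip();
        glDrawBuffers(drawBuffers);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
            throw new RuntimeException("G-buffer FBO is incomplete");
        }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        return fbo;
    }
}
[/code]
After that you render your geometry pass into the FBO, then bind the three textures and do the lighting in a fullscreen pass. For a deferred G-buffer with MSAA you’d use multisample textures instead, but that’s another post.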