Been reading many articles from the past three years about how slow Intel GPUs are at keeping up with the Joneses. In my thread about our new lighting system, which uses RGBA16F textures and FBOs, you can see the painstaking effort we put forth to achieve an effect that doesn’t seem all that tricky in 2D, but it’s probably not going to work for someone with an Intel laptop from three years ago, and that is extremely disappointing.
Does anyone have any information on what Intel actually supports? I’ve read that Intel chips do support the “core” of 3.0, like RGBA16F textures and FBOs. The JOGL library we have doesn’t expose the EXT objects; we’re using GL2.GL_FRAMEBUFFER (which seemed weird to begin with, because I thought those were 3.0-only).
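For what it’s worth, here’s a minimal sketch of the kind of capability probe one could run at startup with JOGL 2 (assuming a GL2 profile and the javax.media.opengl package naming; the class name and layout are purely illustrative). The ARB/EXT extension strings are the pre-3.0 equivalents of the core features mentioned above:

import javax.media.opengl.GL2;
import javax.media.opengl.GLAutoDrawable;

// Minimal capability probe, assuming a JOGL 2 GL2 context is current.
public final class CapsProbe {

    public static void probe(GLAutoDrawable drawable) {
        GL2 gl = drawable.getGL().getGL2();

        // FBOs were promoted to core in 3.0; older chips often only report the
        // EXT extension, which JOGL seems to fold into the same unsuffixed
        // entry points (hence GL2.GL_FRAMEBUFFER and no *EXT objects).
        boolean fboArb = gl.isExtensionAvailable("GL_ARB_framebuffer_object");
        boolean fboExt = gl.isExtensionAvailable("GL_EXT_framebuffer_object");

        // RGBA16F render targets additionally need float texture support on
        // pre-3.0 hardware (ARB_texture_float, plus ARB_half_float_pixel for
        // half-float pixel transfers).
        boolean floatTex = gl.isExtensionAvailable("GL_ARB_texture_float");
        boolean halfFlt  = gl.isExtensionAvailable("GL_ARB_half_float_pixel");

        System.out.println("FBO (ARB/EXT):  " + fboArb + " / " + fboExt);
        System.out.println("Float textures: " + floatTex + ", half-float pixels: " + halfFlt);
    }
}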
I’ll continue to do research and try to post findings here. I’d hate to lose an entire potential market over something like this, especially when you consider that there are games out there that look way better than our simple 2D game and still run, albeit slowly, on stock Intel graphics.
I have access to an Intel G45 (may be more than 3 years old :P). As I recall, it supports a lot of OpenGL 2.0 functionality like FBOs… but I don’t think it supports float textures (maybe it does after a driver update).
If you have a test or some software that gathers the data you need, I’d be able to run it (well, only next week; after that I won’t be back for two weeks).
I was really surprised by the performance of this really cheap card; I was expecting something far worse (don’t expect too much though ;D)
I may have gone a little overboard about how bad Intel cards are in the lighting thread… I’m one of those people who can’t play a game under 60 FPS. I also had some nightmarish experiences with Intel cards at my last school. Let’s just say my OpenGL 1.2 game was running worse than expected, and also crashing on fullscreen. I had to code a fake fullscreen mode for our school game project just because of that. shudder The game ran at about 90 FPS for what was basically a fullscreen texture (it was a visual novel), with performance heavily influenced by the size of the texture because of the shared memory. It dropped to about 60 when a single alpha-blended character appeared on screen. The exact same scene ran at over 2000 FPS on my GTX 295 (well, not using SLI of course, so it should be roughly equal to a two-year-old €200 card). Kind of puts things in perspective.
I suppose/hope most Intel card owners don’t expect constant 60 FPS… You’ll need some good game loop logic to handle the differences in FPS though.
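Something like a fixed-timestep loop helps with that: the simulation advances in constant steps no matter how fast the machine renders, so a chip stuck at 30 FPS and one doing 200 FPS play the same game. A generic sketch, not tied to any particular library (names are illustrative):

public abstract class FixedStepLoop {

    private static final double STEP = 1.0 / 60.0; // logic updates per second
    private volatile boolean running = true;

    public void run() {
        long previous = System.nanoTime();
        double accumulator = 0.0;

        while (running) {
            long now = System.nanoTime();
            accumulator += (now - previous) / 1e9; // seconds since last frame
            previous = now;

            // Catch up on logic when rendering falls behind (slow GPU, etc.).
            while (accumulator >= STEP) {
                update(STEP);
                accumulator -= STEP;
            }

            // alpha in [0,1): how far we are between two updates, handy for
            // interpolating positions so movement still looks smooth.
            render(accumulator / STEP);
        }
    }

    public void stop() { running = false; }

    protected abstract void update(double dt);    // fixed-step game logic
    protected abstract void render(double alpha); // draw as fast (or slow) as the card allows
}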
That’s one of our biggest overall fears (frame rate swings that give the illusion of lag versus smoothness). I hope that by the time we actually release the thing, Intel has their act together, though it’s not entirely their fault when the CPU has to sacrifice cycles to do GPU work.
I have a Mobile Intel 965 Express Chipset with the latest driver from September 23, 2009. If you need any testing for OpenGL compatibility, I’m always here and on the #lwjgl channel on Freenode.
O_O OpenGL Extension Viewer says everything up to 2.0 is 100% implemented and the Shading language version is 1.10!!
Only a handful of features from 2.1 to 4.0 are implemented, though, and nothing from 4.1 or 4.2.
Sounds pretty good. It’s still sucky that your CPU is doing all of the shader calculations, and it’s going to be way less effective than any kind of dedicated GPU, but not bad nonetheless and more than I expected. Out of curiosity, where did you go on your system to see this info? I’m curious what my work computer could handle.
Oh, I was referring to looking up what kind of graphics drivers you are currently running. I know how to do it for my ATI card, obviously, but I don’t know how to see what my Intel GPU driver version is.
I had graphical artifacts in Stalker: Clear Sky during rain before updating my drivers using hacked ones from NVidia (the laptop could only use drivers from ASUS), and those drivers were really old. Like, from March 2011. Completely unacceptable. The fact that the latest graphics drivers from Intel are two years old made me cry.
Also, I don’t know which is more unbelievable: the fact that Intel cards might run shaders in software, or that you (Rejechted) seem to expect to get playable FPS with them. You might as well start counting frames in FPM, frames per minute. And like Princec said, Display.setFullscreen(true); crashes. A workaround is creating a fullscreen JFrame and adding a Canvas to it, which you then pass to Display.setParent(canvas);. HOWEVER, if your JFrame has the same resolution as the desktop, it will enter exclusive mode = crash. I made my JFrame’s size (width+2, height). That way I had 1 extra pixel on each side (to keep it centered), and voila! 50 lines of code just to work around a ****ing driver bug! All hail Intel!
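For anyone who wants to try that workaround, here is a stripped-down sketch of the idea using LWJGL 2’s Display.setParent and plain AWT/Swing (class name is illustrative; the +2 width and -1 x offset are the trick described above):

import java.awt.Canvas;
import java.awt.Dimension;
import java.awt.Toolkit;
import javax.swing.JFrame;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;

// Sketch of the "fake fullscreen" workaround: an undecorated frame that is 2 px
// wider than the desktop, so the driver never switches to exclusive fullscreen.
public class FakeFullscreen {

    public static void main(String[] args) throws LWJGLException {
        Dimension screen = Toolkit.getDefaultToolkit().getScreenSize();

        JFrame frame = new JFrame("Fake fullscreen");
        frame.setUndecorated(true);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setSize(screen.width + 2, screen.height); // 1 spare pixel per side
        frame.setLocation(-1, 0);                       // push the spare pixels off-screen

        Canvas canvas = new Canvas();
        frame.add(canvas); // BorderLayout centre, so the canvas fills the frame
        frame.setVisible(true);

        Display.setParent(canvas); // render into the AWT canvas instead of an LWJGL window
        Display.create();

        while (!Display.isCloseRequested()) {
            // ... clear, draw, etc. ...
            Display.update();
        }

        Display.destroy();
        frame.dispose();
    }
}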
Ra4king, does your card support ARB_framebuffer_object or EXT_framebuffer_object? ARB_half_float_pixel (for HDR) is just way out of its league, so don’t bother looking that one up.
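If it’s easier to check from code than with an extension viewer, LWJGL 2’s ContextCapabilities exposes a boolean per extension; a quick sketch, assuming a GL context has already been created (class name is illustrative):

import org.lwjgl.opengl.ContextCapabilities;
import org.lwjgl.opengl.GLContext;

// Quick extension check with LWJGL 2; call after Display.create() so a context exists.
public class FboCheck {

    public static void print() {
        ContextCapabilities caps = GLContext.getCapabilities();
        System.out.println("GL_ARB_framebuffer_object: " + caps.GL_ARB_framebuffer_object);
        System.out.println("GL_EXT_framebuffer_object: " + caps.GL_EXT_framebuffer_object);
        System.out.println("GL_ARB_half_float_pixel:   " + caps.GL_ARB_half_float_pixel);
    }
}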
Stalker is not a very good basis for pointing out driver bugs, given that the engine for the game is extremely buggy. I would expect that what actually happened is the game was doing something it shouldn’t, and your new drivers simply cope with it. Even the guys who wrote the engine have since said that it isn’t very good.
A few years ago, Intel chipsets were appalling, but these days they are not so bad, especially when you consider that they are still just budget chipsets. John Carmack even praised them at the last QuakeCon.
ARB_framebuffer_object and EXT_framebuffer_object are nearly identical (the ARB one is basically what got promoted to core in 3.0), but both exist for historical reasons. If it has either of them, it certainly supports framebuffers. Newer Intel cards also seem to have fixed the fullscreen bug then. Miracles happen, I guess.
Sure, Stalker is one hell of a buggy game, but the fact still remains that a driver fix/workaround was needed to not get black flickering boxes all over the screen, and NVidia handled it. If a game doesn’t work on Intel cards and a driver update could fix it, I wouldn’t expect such a fix to ever be released.
Yes, things are apparently getting a little better with Intel cards, but in my opinion they are killing PC gaming and driving people toward consoles. They are turning PCs into workstations for things like internet browsing, Word and (non-game) programming. The integrated GPUs in the new Sandy Bridge CPUs are just pathetic. Gaming computers are getting more and more distinct from “normal” computers. If everyone had a computer capable of gaming to some extent, the PC game industry would gain a lot; Intel is effectively counteracting this with the Sandy Bridge GPUs.
I don’t think many people have both a gaming computer and a console. Most people either have an expensive gaming computer, or a “normal” computer and a console. By making the “normal” PC market bigger, they are driving people to consoles. Let’s just say I don’t appreciate the thought of playing my games at less than 720p, without antialiasing, at 30 FPS, when I can get a constant 90 FPS cap with 8x MSAA and 2x transparency SSAA on a real gaming machine. That’s what’s happening with Call of Duty at the moment. DICE is doing things right with BF3, giving consoles what they can handle (and deserve).
I’m obviously quite biased on this, so just ignore me if you think my opinions are outrageous.
BF3!! I preordered my Xbox 360 copy, but now that I’m getting a new computer (gaming specs), I canceled it and I’m getting the PC copy
I’m scared though. I’m really good with the Xbox controller, using keyboard and mouse for an FPS for the first time ever is gonna be scary :S
Good boy!
You should be able to use an XBox controller with BF3 on PC, but you’ll just get pwned by every mouse+keyboard player out there. Besides, joysticks are for flight simulators.