Now it works on old Intel onboard chipsets! It is very, very slow, but that is not my fault: this kind of chipset is only compatible with immediate mode (glVertex…) and maybe with display lists.
Vertex arrays should work too. At least I had them working on exactly the same machine.
It depends on the exact model. The Intel 82810 does not support vertex arrays, but more recent chips may. In my code, I check every extension before using it; if I don’t use vertex arrays with a chip, it means they are not available. Look at my source code if you’re not convinced, inside the package called “drawer”, in the class called “DynamicVertexSetFactory”.
For example, on the “Intel 82865G” chip, both display lists and vertex arrays are available.
NEW version:
- using Java 1.6 and JOGL 1.1.0
- the “FPS counter going crazy” bug may disappear
- better “separation of concerns” in the source code
- artworks displayed in the experimental version
- less use of system memory, more use of video memory for resident textures (see the sketch after this list)
- a very small increase in speed
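About the resident textures: the idea is simply to hint to the driver that the game’s textures should stay in video memory. A rough sketch of that hint with plain GL 1.1 calls through JOGL (this helper is only an illustration, it is not taken verbatim from my code):
// Uses javax.media.opengl.GL. "textureId" is a hypothetical, already-created texture object.
void hintResident(GL gl, int textureId){
    gl.glBindTexture(GL.GL_TEXTURE_2D, textureId);
    // A priority close to 1.0 asks the driver to keep this texture in video memory;
    // the driver is free to ignore the hint.
    gl.glTexParameterf(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_PRIORITY, 1.0f);
    // glAreTexturesResident can then be used to ask the driver whether the texture really stayed resident.
}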
WARNING!
Some recent open source drivers under Linux may cause problems, such as gray rectangles flickering on the screen. These drivers don’t support alpha components inside the textures. I’m sorry, you will need to use proprietary drivers if your card itself really supports 32-bit textures.
Not a good idea. This machine (the Intel one) has Java 5 installed. I can’t test your game on this machine anymore, as I’m not permitted to upgrade it.
Vertex arrays are not an extension, they’re always available.
Are you sure you’re not getting confused with vertex buffer objects?
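Just to make the distinction concrete, here’s roughly what the two paths look like in JOGL 1.x (a sketch only, written from memory - check the javadoc for the exact overloads; the buffer and count parameters are placeholders):
// Client-side vertex arrays (core since GL 1.1): the data stays in a Java NIO FloatBuffer.
void drawWithVertexArrays(GL gl, FloatBuffer vertices, int vertexCount){
    gl.glEnableClientState(GL.GL_VERTEX_ARRAY);
    gl.glVertexPointer(3, GL.GL_FLOAT, 0, vertices);
    gl.glDrawArrays(GL.GL_TRIANGLES, 0, vertexCount);
    gl.glDisableClientState(GL.GL_VERTEX_ARRAY);
}
// Vertex buffer objects (GL_ARB_vertex_buffer_object): the data is uploaded into a GL-managed buffer.
void drawWithVBO(GL gl, FloatBuffer vertices, int vertexCount){
    int[] id=new int[1];
    gl.glGenBuffersARB(1, id, 0);
    gl.glBindBufferARB(GL.GL_ARRAY_BUFFER_ARB, id[0]);
    gl.glBufferDataARB(GL.GL_ARRAY_BUFFER_ARB, vertexCount*3*4, vertices, GL.GL_STATIC_DRAW_ARB);
    gl.glEnableClientState(GL.GL_VERTEX_ARRAY);
    gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0); // an offset into the bound buffer, not a Java buffer
    gl.glDrawArrays(GL.GL_TRIANGLES, 0, vertexCount);
    gl.glDisableClientState(GL.GL_VERTEX_ARRAY);
    gl.glBindBufferARB(GL.GL_ARRAY_BUFFER_ARB, 0);
    gl.glDeleteBuffersARB(1, id, 0);
}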
Sorry, I didn’t do this to bother anyone, and you have tested my game on many computers, for which I thank you. With Java 6 and JOGL 1.1.0, the game uses much less memory, and it is the only solution I have found to get resident textures.
PLEASE read the code before saying wrong things! “GL_EXT_vertex_array” is an extension! I know my source code is dirty, but I explained above that the answer was inside this class! I have been programming games for eight years; I really do know the difference between vertex arrays and VBOs. Of course they are not always available. Come back down to Earth! There are still people with old chipsets and old graphics cards where neither display lists nor vertex arrays are available. Ok, that becomes rarer as time goes by, but it doesn’t mean they are always available; you are totally wrong!
On the other hand, I will make an effort to clean up the code and to add many more comments before implementing the cells-and-portals algorithm, as I know my code is hardly readable.
Finally, I am really surprised that someone who has been here for a long time can say something so wrong. It is obvious that hardware has evolved; everything has not always been available. I’m sorry, Orangy Tang, but how long have you been using OpenGL?
WARNING! The source code on my website has not been updated since the beginning of July. Things have changed… but not inside the class below, which shows you how I check some extensions.
/*This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation, version 2
of the License.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 59 Temple Place - Suite 330, Boston,
MA 02111-1307, USA.
*/
package drawer;
import java.nio.FloatBuffer;
import javax.media.opengl.GL;
class DynamicVertexSetFactory implements IDynamicVertexSetFactory{

    private IDynamicVertexSetFactory delegate;

    DynamicVertexSetFactory(GL gl){
        // Prefer vertex buffer objects when the extension and all the required functions are available.
        if(gl.isExtensionAvailable("GL_ARB_vertex_buffer_object")
                && gl.isFunctionAvailable("glBindBufferARB")
                && gl.isFunctionAvailable("glBufferDataARB")
                && gl.isFunctionAvailable("glBufferSubDataARB")
                && gl.isFunctionAvailable("glDeleteBuffersARB")
                && gl.isFunctionAvailable("glGenBuffersARB"))
            delegate=new DynamicVertexBufferObjectFactory(gl);
        // Otherwise fall back to client-side vertex arrays.
        else if(gl.isExtensionAvailable("GL_EXT_vertex_array")
                && gl.isFunctionAvailable("glColorPointer")
                && gl.isFunctionAvailable("glDrawArrays")
                && gl.isFunctionAvailable("glDrawElements")
                && gl.isFunctionAvailable("glDrawRangeElements")
                && gl.isFunctionAvailable("glIndexPointer")
                && gl.isFunctionAvailable("glNormalPointer")
                && gl.isFunctionAvailable("glTexCoordPointer")
                && gl.isFunctionAvailable("glVertexPointer"))
            delegate=new VertexArrayFactory(gl);
        // Last resort: immediate mode / display lists.
        else
            delegate=new DynamicDefaultVertexSetFactory(gl);
    }

    public DynamicVertexSet newVertexSet(float[] array){
        return(delegate.newVertexSet(array));
    }

    public DynamicVertexSet newVertexSet(FloatBuffer floatBuffer){
        return(delegate.newVertexSet(floatBuffer));
    }

    public DynamicVertexSet newVertexSet(VertexSet vertexSet){
        return(delegate.newVertexSet(vertexSet));
    }
}
Man, what’s your problem, had a bad day or what?
Ok, so technically vertex arrays are an extension. But they were introduced with GL 1.1, which is usually considered the basic version everyone has. I’ve never seen a 1.0 driver in the wild, especially since having no installed drivers on Windows gives you a default software 1.1 renderer. OpenGL 1.0 is so limited it’s practically unusable anyway - it doesn’t even support texture objects, and I’ll bet you don’t check for their presence before using them (I’d check, but I can’t see any references to GL textures in your old source code).
Conversely, display lists have been present since 1.0, so your assertion that they might be unavailable is incorrect.
I use glBindTexture only inside the MD3 loader; in the other classes, I use TextureIO. No, you can have GL 1.1 and still have no support for vertex arrays. At work, I tested my game on a PC with an Intel 82810 chipset; it should be fully compatible with GL 1.1, and yet there are no vertex arrays.
Therefore, unlike you, I don’t assume everyone has at least a particular version; I prefer being careful so that there are no bad surprises. Sometimes this kind of chipset may pretend to support GL 1.1 and still prevent you from using display lists. That is why I check for them too, even though you find it useless. There is a difference between “hardware support” and “software support”: you can install the latest version of OpenGL on a PC with an old graphics card, and then some extensions or even some core functions might not be supported at all.
If I had followed your way of thinking, the “all white screen” bug would never have been corrected, it would not even have appeared: the program would have crashed directly, and users with old chipsets would have been “punished” because they don’t have hardware fully compatible with the “basic version” of OpenGL. In my view, I have to do my best to support as much hardware as possible, within limits.
You’re right on one point: I should check whether textures are supported too. If they are not supported somewhere, it would be better to at least show a popup message telling the user that something is wrong.
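Something like the following sketch would be enough (this method is only an illustration, it is not in my current code):
// Uses javax.media.opengl.GL and javax.swing.JOptionPane.
static boolean checkTextureObjects(GL gl){
    boolean available=gl.isFunctionAvailable("glGenTextures")
            && gl.isFunctionAvailable("glBindTexture")
            && gl.isFunctionAvailable("glDeleteTextures");
    if(!available)
        JOptionPane.showMessageDialog(null,
            "Your OpenGL driver does not expose texture objects; the game cannot run correctly.",
            "Unsupported graphics configuration", JOptionPane.WARNING_MESSAGE);
    return(available);
}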
Why do you speak about Windows? It is a particular case. I know many people use Windows, but I have tested the game on Mac and Linux too. As Java is good for writing software that is independent of the hardware, I think about this too; I don’t consider only the most popular operating system and graphics card configuration.
Don’t you have anything more interesting to say?
Well, I guess if you want to run on a pure 1.0 implementation then you’d better stop using TextureIO, since it uses texture objects.
[quote]No, you can have GL 1.1 and still have no support for vertex arrays. At work, I tested my game on a PC with an Intel 82810 chipset; it should be fully compatible with GL 1.1, and yet there are no vertex arrays.
[/quote]
Vertex arrays were made part of the core in 1.1. That means that anything with GL 1.1 or later must support vertex arrays. Often, once an extension is rolled into the core it is no longer listed in the extension string, since that would be redundant information (especially for very old extensions like this one). Of course, there is always the possibility of driver bugs preventing a feature from working, but that’s an entirely separate matter.
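Instead of looking for the extension, you can check the version the driver reports; roughly something like this (just a sketch, the parsing is deliberately naive):
// glGetString(GL.GL_VERSION) returns something like "1.5.2 NVIDIA 66.29".
static boolean isAtLeastGL11(GL gl){
    String version=gl.glGetString(GL.GL_VERSION);
    if(version==null)
        return(false);
    String[] parts=version.split("[. ]");
    int major=Integer.parseInt(parts[0]);
    int minor=Integer.parseInt(parts[1]);
    return(major>1 || (major==1 && minor>=1));
}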
[quote]If I had followed your way of thinking, the “all white screen” bug would never have been corrected, it would not even have appeared: the program would have crashed directly, and users with old chipsets would have been “punished” because they don’t have hardware fully compatible with the “basic version” of OpenGL. In my view, I have to do my best to support as much hardware as possible, within limits.
[/quote]
You seem to be confusing driver bugs with extensions. Disabling certain features on certain hardware is fine for working around such bugs, but that doesn’t mean you can start claiming that display list support (or vertex arrays in 1.1) is optional and should be checked for in the extension string.
I won’t fully support it; I said that, at the very least, I would show a popup if something is wrong.
Fine, so I’m even handling a separate matter! ;D That’s good!
Don’t assume that I’m confused. At work, no OpenGL feature had been disabled; I checked, so maybe it is a bug in the driver. In any case, my “optional” check prevents the game from crashing, which is why it is not useless. Display list support is not optional, but sometimes a chipset is announced as fully compatible with GL 1.1 and it is not true. Sometimes it is only a driver problem, sometimes the feature is really unsupported.
I don’t agree with you. I will do my best to work around bugs when it is possible. A part of my code works around a buffer swap problem on this kind of chip, and I won’t remove it. I can’t fix every bug in the environment, but I will do what I can.
Hmm - I see what I can do.
Have you tried testing the game without the GL_EXT_vertex_array check on the Intel chipset?
// (...)
else if(gl.isFunctionAvailable("glColorPointer")
        && gl.isFunctionAvailable("glDrawArrays")
        && gl.isFunctionAvailable("glDrawElements")
        && gl.isFunctionAvailable("glDrawRangeElements")
        && gl.isFunctionAvailable("glIndexPointer")
        && gl.isFunctionAvailable("glNormalPointer")
        && gl.isFunctionAvailable("glTexCoordPointer")
        && gl.isFunctionAvailable("glVertexPointer"))
    delegate=new VertexArrayFactory(gl);
// (...)
Good idea! First, I have to find a way to detect this kind of chipset. Then, I must design a mechanism to catch GLException and fall back to another code path if the vertex arrays are not really supported.
Nevertheless, I am not sure that I can detect which graphics card or chipset is being used. If you know how to do it, I am really interested.
Finally, on some architectures, I fear that the game might crash even though I try to catch the GLException and switch to another code path. With JOGL, I think almost all functions are reported as available, so it is not enough to check whether a function is available on a particular machine. For the vertex arrays, it will always answer “true”, won’t it?
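For the detection, the GL_RENDERER and GL_VENDOR strings can be queried, and the risky path can be wrapped in a try/catch on GLException. A rough sketch of both ideas (the factory names come from my code, but these helpers are only an illustration; in practice the exception may only be thrown later, at drawing time, so the catch would need to be around the first real drawing attempt):
// Uses javax.media.opengl.GL and javax.media.opengl.GLException.
static boolean looksLikeIntelOnboard(GL gl){
    String renderer=gl.glGetString(GL.GL_RENDERER);
    String vendor=gl.glGetString(GL.GL_VENDOR);
    return((renderer!=null && renderer.toLowerCase().contains("intel"))
            || (vendor!=null && vendor.toLowerCase().contains("intel")));
}
// Try the vertex array path, fall back to the default (immediate mode / display list) factory on failure.
static IDynamicVertexSetFactory chooseFactory(GL gl){
    try{
        return(new VertexArrayFactory(gl));
    }
    catch(GLException gle){
        return(new DynamicDefaultVertexSetFactory(gl));
    }
}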
Regarding the information available in the JOGL javadoc, I first assumed that a certain functionality would work if all the functions you need are present.
But after reading the complete “isFunctionAvailable()” javadoc, I am not sure anymore, since it only tests whether the queried functions belong to a specific core version or extension.
On the other hand, I suspect that the Intel driver may simply not report the vertex array extension because the queried functionality is available in the core version it implements, so it might be sufficient to skip the extension test and only test for the GL functions you need in this case.
You could maybe show a warning message if the extension is not reported but the functions are supported, and let the user decide whether he wants to try the potentially faster code path at his own risk.
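For example, something along these lines (just a sketch, the method name is made up):
// Uses javax.swing.JOptionPane.
static boolean userWantsFastPath(){
    int answer=JOptionPane.showConfirmDialog(null,
        "Your driver does not report vertex array support, but the functions seem to be present.\n"
        + "Try the faster rendering path anyway?",
        "Rendering options", JOptionPane.YES_NO_OPTION);
    return(answer==JOptionPane.YES_OPTION);
}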
Also keep the 90/10 rule in mind: the last 10% toward perfection take 90% of the time, so you might be better off ignoring some problems… at least until you know they are really serious and affect more than 10% of your target audience.
Currently, the game uses another drawing mode if there is something wrong with vertex arrays; it is just slower. I know that I can’t fix every bug that lies not inside the game but rather in the environment; that is not the aim of my project. Nevertheless, I will do my best to prevent the game from crashing. Don’t forget that I plan to use a cells-and-portals algorithm in a few months; then the game will be hugely faster than now. ;D
Finally, as I still have some problems with my driver, I won’t be able to modify any visual effect this week and maybe next week. I will use this time to improve the readability and the design of the code. I plan to reach the following goals:
- more comments, at least a complete description of each class, each attribute and each method, to make the source code more readable
- respect the MVC design pattern to make the creation of the multiplayer online mode easier
- decompose the project into components to improve the reusability and the maintainability of the source code
The following components will be created:
- interfaces : component containing all the interfaces of the project
- main : main component of the project which uses the other component
- sound : component handling the whole sound system (using JOGG + JORBIS)
- tools : component with miscellaneous utilities to convert files from Vincent Stahl’s format into an OpenGL-friendly format and to modify the way the textures are built (priority handling)
- drawer : component used to build, modify, draw and delete sets of vertices, texture coordinates and normal coordinates
- md3 : component used to load the MD3 model
- context : component handling the OpenGL context (available extensions…) and the sound context
If I succeed in installing my driver, I will go on trying to display all objects in the experimental mode.
I have a lot of things to do ;D
The source code will be updated tomorrow. If someone quickly needs my MD3 loader to be optimized, I can do that first.
Link removed
Warning! Recent graphics cards with proprietary drivers seem to work correctly with my game under Mandriva Linux 2007. If you don’t want to have problems with 32-bit textures in any game, you should invest in at least a recent ATI card. Can someone tell me whether there is anything wrong with NVIDIA cards under Linux, please?
The component architecture is almost finished. I had the choice between two kinds of component architecture:
- component, link, provided/required interface
- component, port, connector
The first kind is really fine when the provided interface and the required interface are exactly the same. It is very simple and it prevents redundancy of interfaces (the same interface appearing several times, only with a different name or in a different package). However, the component is not independent: it always depends at least on the component which contains all the interfaces, which complicates testing the component and decreases the reusability of your code, as a programmer can’t take only your component and connect it directly to another project. I advise using this kind of architecture only for very simple projects.
The second kind is better suited when the provided interface and the required interface are different; then you use an adapter. It is more complicated and it might force you to duplicate interfaces uselessly. However, the component is fully independent; it can be tested separately and reused more easily, as a programmer can take your component and put it in his own project (if it is well documented). I advise using this kind of architecture for complex projects and open source projects where maintainability and reusability are very important. The sketch below illustrates the idea.
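Here is a tiny illustration of the second kind (all the names are invented for the example, they are not my real components):
// Required interface declared by the consuming component (its "port").
interface SoundPort{
    void playEffect(String name);
}
// Provided interface of an existing, independent sound component.
interface SoundSystem{
    void play(String resourcePath);
}
// Connector (adapter): adapts the provided interface to the required port,
// so neither component has to depend on the other one's interfaces.
class SoundConnector implements SoundPort{
    private final SoundSystem system;
    SoundConnector(SoundSystem system){
        this.system=system;
    }
    public void playEffect(String name){
        system.play("/sounds/"+name+".ogg");
    }
}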
I chose the second one and it works. The source code will be updated on Monday. I will try to handle unbreakable objects in the experimental version this weekend.
Warning! Important information about the future of the game
Money can be a serious obstacle in the development of a project. Nevertheless, my game will always be free (open source and free of charge). I haven’t given up on the multiplayer online mode; I only need to find a solution to share a dedicated server with some people. If that requires money, I will pay for it directly if I can afford it. I won’t charge the users; I don’t plan to create fees or anything like that. I know that my game is still ugly and slow, and I won’t ask for money even once it becomes more interesting. I consider video games to be works of art, and in a fair world everyone should have equal access to art and culture. The only way I know to be sure of giving everyone equal access to my game is to keep it free of charge. This project is open source because I consider that everyone should have equal access to information, whatever the subject (sciences, history…). Allowing people to access information easily is a benefit for the whole society.
I understand that many people here want to live from their games. I respect this choice; they are not doing evil, they dream of living from their passion, and that’s beautiful. I respect their ambition. Many of them would agree with charging the customer; it’s their viewpoint, and it is the most obvious way of getting money. However, I’ve made a different, complicated choice: I want to avoid charging customers even though the development of my game has a real cost. As I’m not only a dreamer, I have found a realistic solution to provide free access to gaming without advertisement or sponsoring. I will announce it on my own website in a few days.
game executable :
Link removed
Link removed
source code :
Link removed