LWJGL: how to use procedurally generated BufferedImages

Sorry, I switched some stuff around. :frowning:

Which smiley icon do I use for embarrassed?

Here are the results of the diagnostic. Adding the static reference to GL20 got rid of the error on the reference to the GL_SHADING_LANGUAGE_VERSION. And yes, the text and reference now match up.

No shaders whatsoever.

OpenGL version: 1.3.1072 WinXP Release
GLSL version: null
Vendor + Graphic Card: ATI Technologies Inc. RADEON 7000 DDR x86/SSE2

My old computer also seems to insist on ATI graphic cards. We tried putting in a PCI board, a GeForce something with OpenGL 4.2, and no go. The only ATI board for sale at the shop where I usually go was $120, and “only” supported OpenGL 3.2.

At the time, I decided not to get it, figuring I’d wait to get a wholly new computer rather than spend a significant fraction on a somewhat obsolete graphics board. But at that point, I didn’t realize that I didn’t even have OpenGL 2.1. So maybe I need to go back and get that or a similar board.

I was guessing that for a casual game aimed at PC desktops, implementing some graphics with the most widespread OpenGL version would help free up some extra CPU. But from davedes’s comments, I might as well just stick with Java2D rather than try to implement what I have with nothing better than OpenGL 1.3.

Example of what I’d like to be able to do: I use lines/rectangles to draw a “pavilion” which includes a smoke effect, made from 40 procedurally generated graphics, each with a gradation of alpha, that get overlaid. On a scene change, using Java2D, I call the following code to fade the pavilion and smoke effects:


		Composite composite = g2.getComposite();
		g2.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, pavViz/255f));
		// ... draw the pavilion and smoke here, then restore:
		g2.setComposite(composite);

“pavViz” goes from a maximum of 255 down to 0, which allows everything drawn in the block to fade to nothing.
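In case it helps anyone searching later, here is the same idea as a self-contained, headless sketch. The class name and the fixed pavViz value are just for illustration:

```java
import java.awt.AlphaComposite;
import java.awt.Color;
import java.awt.Composite;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class FadeDemo {
    public static void main(String[] args) {
        int pavViz = 128; // fades from 255 (fully visible) down to 0 (invisible)

        BufferedImage canvas = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        Graphics2D g2 = canvas.createGraphics();

        // Black background
        g2.setColor(Color.BLACK);
        g2.fillRect(0, 0, 64, 64);

        // Save the current composite, then draw the "pavilion" at partial alpha
        Composite composite = g2.getComposite();
        g2.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, pavViz / 255f));
        g2.setColor(Color.RED);
        g2.fillRect(16, 16, 32, 32);

        // Restore the original composite so later drawing is unaffected
        g2.setComposite(composite);
        g2.dispose();

        int rgb = canvas.getRGB(32, 32);
        int red = (rgb >> 16) & 0xFF;
        System.out.println("red channel after ~50% fade: " + red); // roughly 128
    }
}
```

Restoring the saved composite at the end matters; otherwise everything drawn afterwards inherits the fade.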

If I can’t do this in OpenGL except via shaders, then I either give up entirely on using OpenGL, for now, or I try to get up to the lowest common denominator that would allow this to be done somewhat efficiently. Am I correct in thinking that the minimum would be OpenGL 2.0?

I come back to OpenGL 2.0 as a reference because of this link I read, with stats on Minecraft users:
http://www.java-gaming.org/topics/new-project-so-early-in-development-i-have-no-name-for-it/28032/msg/255212/view.html#msg255212

But if there is a way to get a boost over Java2D and be able to efficiently change the alpha component of graphics while using OpenGL <= 1.3, then maybe I can deal with that, and be happy that the 9% users relying on “crap Intel cards” can also play the game.
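From what I have read since, fixed-function blending (in OpenGL since 1.1) should be able to do exactly this kind of global fade without shaders: with blending enabled, glColor4f(1f, 1f, 1f, alpha) modulates whatever is drawn next (given the default GL_MODULATE texture environment). The GL11 calls in the comments are my guess at the LWJGL equivalent; the runnable part just sanity-checks that the blend equation matches Java2D’s SRC_OVER:

```java
// The fixed-function route I think would replace the AlphaComposite call:
//
//   GL11.glEnable(GL11.GL_BLEND);
//   GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
//   GL11.glColor4f(1f, 1f, 1f, pavViz / 255f); // tints everything drawn next
//   // ... draw the pavilion quads ...
//
// The blend equation that applies is out = src * a + dst * (1 - a),
// which is the same math as Java2D's SRC_OVER. A quick sanity check:
public class BlendMath {
    // One color channel of GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending
    static float blend(float src, float dst, float alpha) {
        return src * alpha + dst * (1f - alpha);
    }

    public static void main(String[] args) {
        // Red pavilion pixel (1.0) over black background (0.0) at half fade
        System.out.println(blend(1f, 0f, 0.5f));  // 0.5
        // Fully faded out: the background shows through unchanged
        System.out.println(blend(1f, 0.25f, 0f)); // 0.25
    }
}
```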

(I understand the stats from that post are stale. But I don’t know how far off the numbers are now. I suspect a significant number of those “crap Intel card owners” are buying into mobile rather than upgrading their desktops.)

This tutorial works brilliantly! Dropped it into mdesl.test package and there was not one complaint or even suggestion from Eclipse.

[quote]Try this, it uses no shaders at all and should support GL 1.0:


[/quote]
I’m going to take a closer look at the code…

Throwing in the towel on LWJGL for now. Am going to stick with Java2D.

I’ve just read some more about the newest versions of OpenGL, and the extent to which ‘immediate mode’ is considered obsolete.

It is hard enough to master all this “old style” stuff, without the disincentive of knowing that there will be yet another set of hurdles mastering the more current OpenGL. Better to come in when I’m ready to rock at the higher level.

Can’t afford a new computer right now. :frowning: Am just going to see what I can do within the limits of Java2D.

Seriously… why not give LibGDX a try like I said earlier? It supports GL10, and you can upgrade your game to GLES20 later with minimal changes to your code. You can still get down to the metal and learn about “raw OpenGL” if you fancy. And not only will it improve your performance, it will also let your game port to other platforms.

Look at it this way: For games, immediate mode has indeed been deprecated for years. But, realistically, Java2D was never even a contender. :wink: So you are stepping backwards by choosing Java2D.

Also keep in mind that Minecraft was made with GL10, as were other successful Java games like some of Puppy Games’ earlier works (unless I’m mistaken).

Awww that’s sad to hear. The GL 3.2 card would have still been an excellent tool to learn modern OpenGL.

Oh well, maybe someday.

Libgdx also has to wait for a new computer, I think. Putting in all the supporting functionality really slowed Eclipse down drastically. A better computer wouldn’t object as much. If I give it another try, I think I will first just run it without ANY of the Android or HTML support, just the root project and the desktop version.

I really didn’t enjoy my experience with Libgdx at all. I think it is an awesome achievement and a tremendous boon to the Java gaming community, but it also kind of set my teeth on edge. Personality thing, perhaps: I also wrote my own sound engine and procedural synthesizer rather than go with existing sound libraries. (This was before, and about the same time, that TinySound was made, which actually seems pretty good to me.)

I liked the idea of LWJGL being a thin wrapper and that I’d be able to see what was going on that was so tough and strange about OpenGL. I thought since I was able to figure out and work with much of javax.sound.sampled, I’d be able to deal, but I think it is fair to say that OpenGL is a more difficult jump.

Yeah, maybe I should have pulled the trigger on that ATI card with OpenGL 3.2. Will rethink that. There are some games I can’t play any more since I replaced the old failed card with the current cheapo. Big mistake.

Anyway, I’m going to let the bruises on my head heal and my thoughts clear for at least a few days–there’s lots of other stuff to work on in the meantime.

Everyone that has been helping–keep up the great work, and thank you!!