Swing slowness... myth?

The daft thing is, I can (and do) go out and pay £200+ for a high-end graphics card, yet since most of Java2D leaves it sitting doing nothing whilst it does all of its drawing in software, I’m left feeling mightily cheesed off.

Yes Swing is pretty fast for what it does, but fast software effects are a moot point when I’ve got dedicated hardware that Java seems determined to use only as a glorified heat-sink. :o

It’s a shame I can’t get an LWJGL display embedded within a Swing framework, as this would be ideal for game tools. As it is, I have to create a GUI toolkit myself and be self-contained (not necessarily a bad thing, but I won’t get anything near as complete widget-wise).

GL4Java would appear to be a good compromise, but it’s not very user friendly and nowhere near as reliable from what I’ve tested.

I usually shoot for 20 FPS. Anything faster is unnoticeable by the human eye. Disney animated flicks, for example, run at 24 FPS. If your game is ticking over faster than this, maybe you should have it spend the spare cycles doing cool stuff (special lighting effects, etc.).

Sigh, not the whole “anything over 24fps is unnoticeable by the human eye” argument again.

I can spot the difference between 50Hz and 60Hz, but beyond 75Hz it all looks the same :) Almost anyone will spot the difference between 20fps and 30fps a mile off.

TVs can get away with 25 or 30 ‘real’ fps (50Hz or 60Hz interlaced) because of persistence of vision, motion blur, colour bleeding and such. The myth of anything higher being unneeded was started by tv companies etc. when the standards were first set. Remember how 3dfx claimed that 16-bit colour is more than the human eye can tell? Again wrong, and soon we shall have 128-bit colour ;D

Ideally you want to hit the refresh rate of your display. Anything more is indeed wasted, but less will be noticed.

[quote]and soon we shall have 128-bit colour
[/quote]
Reminds me of the time when I fought for dithered 24-bit images because of undesirable banding effects. Yes, 128 bits would be nice. :)

[quote] because of persistence of vision, motion blur, colour bleeding and such.
[/quote]
Cartoon animation never uses motion blur or color bleeding. Please describe “persistence of vision”.

[quote]
The myth of anything higher being unneeded was started by tv companies etc. when the standards were first set.
[/quote]

It’s believed the human eye can see up to 220 unique FPS. Disney uses 24 FPS in their animation, but in truth they only use 12 unique frames shown twice each. They could duplicate that again and crank it up to 48 FPS, but you won’t tell the difference since your eye still reduces it down to the 12 unique frames. The rest are just dupes. On that note, who out there is creating games with more than 20 unique frames of animation per second?

[quote]Cartoon animation never uses motion blur or color bleeding
[/quote]
I beg to differ - manga and anime have been adding motion blur for ages, and it has become one of their distinctive features. For those of you who can’t stand overdone and poorly drawn manga (i.e. me >:( ) I’ll direct you to X-Men: Evolution for a more subtle and well-done application of such effects.

Colour bleeding will happen anyway due to archaic methods of transmitting images and the lower quality of TVs compared to monitors - it’s a basic necessity: you don’t need to focus on single pixels on a TV, but you do need vibrant colours and a larger viewing distance. Persistence of vision refers to the many different effects that combine to make us believe a series of still images is in fact moving.

[quote]On that token, who out there is creating games with more than 20 unique frames of animation per second?
[/quote]
Lots of people. Discrete animation frames are rapidly becoming a thing of the past, and more sophisticated methods such as boned animation are replacing them. Not only do they make life easier on the artists, but they allow on-the-fly generation of new animation states by blending pre-built ones. And as vertex shaders become more widespread, they can be achieved with less memory and a minimal performance hit. See Black & White for a real-world example of the above.
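To illustrate the blending idea, here’s a toy sketch only - the class, the pose arrays and all the numbers below are made up, and this isn’t how Black & White actually does it. Each pose is just an array of joint angles, and a new animation state is generated on the fly by interpolating between two pre-built ones.

[code]
// Toy illustration of pose blending: poses as arrays of joint angles, all values made up.
public class PoseBlend {
    // Linearly interpolates two poses; weight = 0 gives poseA, weight = 1 gives poseB.
    static float[] blend(float[] poseA, float[] poseB, float weight) {
        float[] result = new float[poseA.length];
        for (int i = 0; i < poseA.length; i++) {
            result[i] = poseA[i] * (1 - weight) + poseB[i] * weight;
        }
        return result;
    }

    public static void main(String[] args) {
        float[] walk = { 10f, -20f, 35f }; // made-up joint angles for a "walk" pose
        float[] run  = { 25f, -40f, 60f }; // made-up joint angles for a "run" pose

        // A half-walk, half-run pose generated on the fly rather than stored on disk.
        float[] jog = blend(walk, run, 0.5f);
        System.out.println(java.util.Arrays.toString(jog));
    }
}
[/code]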

[quote]Cartoon animation never uses motion blur or color bleeding. Please describe “persistence of vision”.
[/quote]
Until we all get HDTVs, our traditional low-res low-quality CRT solutions provide motion blur and colour bleeding for free! ;D

Persistence of Vision is basically the way your retina/brain combination holds on to images - even after an image disappears, your brain still tells you it’s there. This adds to the illusion that low frame rates are acceptable.

http://amo.net/NT/05-24-01FPS.html :

In an animation or a cartoon, each frame or image of the 24/30 frames per second is perfect, there is no blur in the image - EVER.

True, the display device may color bleed. But Disney does not add color bleeding.

I’ve always heard that called animation. I guess I am not as big a geek as I first thought :).

[quote]http://amo.net/NT/05-24-01FPS.html :

In an animation or a cartoon, each frame or image of the 24/30 frames per second is perfect, there is no blur in the image - EVER.
[/quote]
Well, that’s true for situations where they don’t put in any motion blur. ;D I don’t think a comment like that can be considered law - it may hold true for most cartoons, but not all of them.

The motion blur is actually applied to the individual frames as a post-production tweak I suspect.

The difference between 25 and 50fps is an unbridgeable gulf. 25 looks awful for 2D but seems more acceptable in 3D. It’s the same kind of difference that shows up when you try using timer-based animation instead of tick-based in 2D (sketch below).

Back OT: if we could just filter out any games that don’t actually require 50fps then we could get back to discussing the speed of active rendering, thanks :) However I’m of the general feeling that anything that doesn’t run at 50fps feels like a major step backwards from the C64, and that ain’t good when I’ve spent a grand on a computer.

Cas :)
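To make the tick-based vs. timer-based point concrete, here’s roughly what I mean - the Sprite class and its numbers are made up purely for the example:

[code]
// Made-up example of the two approaches to 2D movement.
class Sprite {
    float x;

    // Tick-based: the loop runs at a fixed rate (say 50 ticks per second),
    // so every tick moves exactly the same distance - perfectly even motion.
    void tickUpdate() {
        x += 2.0f; // 2 pixels per tick
    }

    // Timer-based: speed is in pixels per second and each frame moves by however
    // much time actually elapsed, so jitter in the measured time shows up as
    // jitter in the movement.
    void timerUpdate(float pixelsPerSecond, long elapsedMillis) {
        x += pixelsPerSecond * (elapsedMillis / 1000.0f);
    }

    public static void main(String[] args) {
        Sprite a = new Sprite();
        Sprite b = new Sprite();
        // 50 even ticks vs. 50 slightly uneven frame times covering the same second:
        // both end up at x = 100, but the timer-based steps are uneven along the way.
        for (int i = 0; i < 50; i++) {
            a.tickUpdate();
            b.timerUpdate(100f, (i % 2 == 0) ? 15 : 25); // jittery, 20 ms on average
        }
        System.out.println("tick-based x = " + a.x + ", timer-based x = " + b.x);
    }
}
[/code]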

That’s another problem with the API as it currently stands - none of the Java2D stuff is hardware accelerated, so using VolatileImage gives no benefit at all.

In fact, if you perform an AlphaComposite of two VolatileImages it is slower than doing it with two BufferedImages (TYPE_INT_ARGB) - rough comparison sketch below.

Crazy really ::)
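Something like this is the kind of quick-and-dirty comparison I mean - the sizes, iteration count and 50% alpha are arbitrary, and a proper test would also call validate() and check contentsLost() on the VolatileImages:

[code]
import java.awt.*;
import java.awt.image.*;

// Rough comparison sketch only: composites one image onto another repeatedly
// with AlphaComposite and times the VolatileImage vs. BufferedImage cases.
public class CompositeTest {
    public static void main(String[] args) {
        GraphicsConfiguration gc = GraphicsEnvironment.getLocalGraphicsEnvironment()
                .getDefaultScreenDevice().getDefaultConfiguration();

        int w = 512, h = 512, iterations = 200; // arbitrary numbers

        VolatileImage vSrc = gc.createCompatibleVolatileImage(w, h);
        VolatileImage vDst = gc.createCompatibleVolatileImage(w, h);
        BufferedImage bSrc = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
        BufferedImage bDst = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);

        System.out.println("VolatileImage: " + time(vDst.createGraphics(), vSrc, iterations) + " ms");
        System.out.println("BufferedImage: " + time(bDst.createGraphics(), bSrc, iterations) + " ms");
    }

    // Composites src onto the destination 'iterations' times at 50% alpha
    // and returns the elapsed wall-clock time in milliseconds.
    static long time(Graphics2D g, Image src, int iterations) {
        g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, 0.5f));
        long start = System.currentTimeMillis();
        for (int i = 0; i < iterations; i++) {
            g.drawImage(src, 0, 0, null);
        }
        g.dispose();
        return System.currentTimeMillis() - start;
    }
}
[/code]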

I think the opposite is true. I also think Orangy was talking more along the lines of 3D (with his example of B&W) while I was going on about 2D. 2D looks fine at 24 frames per second. Do this with 3D and it looks god awful. Reason being that with 3D animation you can easily produce 60 unique frames of animation per second while I know of no 2D game out there using over 20 unique frames per second.

The word “unique” is key here. As I said earlier with my example of Disney, they produce 12 unique frames (or cels) of animation per second. They duplicate these frames to get the desired 24 FPS.

Commercial example: Diablo II runs at 25 FPS. B&W runs pretty good at 50 FPS. If you were to crank Diablo II up to something higher you would not notice any difference since the same number of unique frames of animation are still played. With each increase in FPS for B&W you can get smoother animation.

[quote]2D looks fine at 24 frames per second. Do this with 3D and it looks god awful.
[/quote]
Nope, quite the opposite. I was referring to both 2D and 3D, but as Cas mentions you can get away with a lower fps in a 3D game because the overall movement is non-linear compared to a 2D game, where any dropped frames will stick out like a sore thumb. It’s just that non-frame-based animation is more common in 3D because it makes more sense.

Diablo may ‘only’ have 24 frames of animation, but you can bet that they’re scrolling the background at the same rate as the refresh rate to get it nice and silky smooth. Or play Baldur’s Gate 2 and compare how smooth the characters are (traditionally animated sprites) against the spell effects (particle & vector based animation).

I’m with OrangyTang here. The point is that 2D can be far better observed by the eye.
But also for 3D, it really depends. While MS Flight Simulator is fine with 20 FPS, Q3A would be unplayable. So it largely depends on how fast the camera moves.
For 2D, not only the frame rate but also the constancy of the frame rate is highly important, because it directly relates to the smoothness of animations. That’s where Java sucks …

The background moves as fast as the characters: 25 FPS. This is the developers’ ideal setting. As many of you who have played Diablo II know, even keeping to that consistent frame rate would have been nice :).

The big difference there could be that Baldur’s Gate has over 108,420 frames of character and monster animation. Part II added another 100,000+ frames (see the game’s technical FAQ). The more unique animation frames you have, the more ‘silky’ the end result can be.

Can’t agree more. Diablo could possibly run at 200 fps on a good computer, but the designers purposely cut this down to a self-imposed 25 FPS. This is because movement and animation in games is often based on the desired frames per second. It’s all about timing. This also prevents the game from becoming too fast in the future when you have better hardware. Think of them as self-imposed speed caps.
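Something along these lines, presumably - a minimal sketch of a self-imposed cap, where the 25 FPS target and the updateAndRender() method are just placeholders rather than anything Blizzard actually wrote:

[code]
// Minimal sketch of a self-imposed 25 FPS speed cap.
public class CappedLoop {
    static final int TARGET_FPS = 25;
    static final long FRAME_MILLIS = 1000 / TARGET_FPS; // 40 ms budget per frame

    public static void main(String[] args) throws InterruptedException {
        while (true) { // the usual endless game loop
            long start = System.currentTimeMillis();

            updateAndRender(); // one frame of game logic and drawing (placeholder)

            // Sleep away whatever is left of the 40 ms budget, so the game runs
            // at the same speed on faster hardware instead of racing ahead.
            long remaining = FRAME_MILLIS - (System.currentTimeMillis() - start);
            if (remaining > 0) {
                Thread.sleep(remaining);
            }
        }
    }

    static void updateAndRender() {
        // placeholder for the real game's per-frame work
    }
}
[/code]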

Ah, that’s why motion in Diablo 2 is so horribly jerky :P

Have you got another example? Diablo 2 isn’t the best-written of games ;D (aside from the god-awful netcode, it’s packed full of logic bugs as well)