4k demo contest?

Yeah, given that we have the same amount of time as the 4k games contest, I bet we will see some nice ones.
I’ll give it a try if I have the time.

Could we increase the running time to 3 or 4 minutes, including loading/precalc? Btw, the maximum running time at Breakpoint is 8 minutes:
http://breakpoint.untergrund.net/compos_pc.php

Of course, people should remember that it is better to make it short and interesting than long and boring.

True. I was just seeing if non-standard/undocumented parts of the Sun library could be used.

Will an executable jar be allowed? It would be nice to go fullscreen without signing.

So it’s looking like the 4k rules, but non-interactive with a maximum run time TBD.
Is non-interactive mandatory? I was toying with some sort of user influence - but then the demo would vary depending on the user.

Sound would be an interesting challenge. Generating waveforms is quite easy, but a score of reasonable length would use a fair number of bytes. Maybe some sort of sound-to-light demo - but it might end up looking too much like Windows Media Player. :o
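For what it's worth, generating a basic waveform really is only a few lines. A minimal sketch via javax.sound.sampled (the sample rate, pitch and square-wave formula here are just placeholders):

```java
import javax.sound.sampled.*;

// Minimal sketch: one second of an 8-bit square wave played through javax.sound.sampled.
public class ToneSketch {
    public static void main(String[] args) throws Exception {
        float rate = 22050f;
        AudioFormat fmt = new AudioFormat(rate, 8, 1, true, true); // 8-bit signed mono
        byte[] buf = new byte[(int) rate];                         // one second of samples
        for (int i = 0; i < buf.length; i++) {
            double phase = i * 440.0 / rate;                       // 440 Hz
            buf[i] = (byte) (phase % 1.0 < 0.5 ? 100 : -100);      // square wave; use sin() for a softer tone
        }
        SourceDataLine line = AudioSystem.getSourceDataLine(fmt);
        line.open(fmt);
        line.start();
        line.write(buf, 0, buf.length);
        line.drain();
        line.close();
    }
}
```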

What about capturing sound from the microphone and doing something with that? Would that count as an external resource and therefore be banned? E.g. you play your favourite music and the demo works out some effects to go with it. Microphone access would require signing the app though, so no applets.
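A rough sketch of the capture side, assuming an application (or signed Web Start) context where recording is allowed - in an unsigned applet the open() call is where the security restriction bites:

```java
import javax.sound.sampled.*;

// Rough sketch: read one block of samples from the default capture device.
public class MicSketch {
    public static void main(String[] args) throws Exception {
        AudioFormat fmt = new AudioFormat(22050f, 16, 1, true, true);
        TargetDataLine mic = AudioSystem.getTargetDataLine(fmt);
        mic.open(fmt);
        mic.start();
        byte[] block = new byte[4096];
        int read = mic.read(block, 0, block.length);   // blocks until the buffer is full
        // ...derive an amplitude (or a crude FFT) from 'block' and drive the visuals from it...
        mic.stop();
        mic.close();
        System.out.println("captured " + read + " bytes");
    }
}
```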

One of the main problems I see is limiting the number of different library calls (they use loads of bytes). We may well end up with the same approach as used in 4k games - i.e. do everything in an array and then blit it to the screen, which is a shame as some creative use of ConvolveOp would be interesting. It would also be interesting to code so that Java2D offloads most of the work to the underlying graphics card. It's a shame Java2D doesn't support shaders - but then, all those library calls eat bytes.
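For reference, the "everything in an array" approach is roughly this (window setup and the per-pixel formula are just placeholders):

```java
import java.awt.*;
import java.awt.image.*;

// Rough sketch of the usual 4k loop: write into the int[] backing a BufferedImage, then draw it.
public class BlitSketch {
    public static void main(String[] args) throws Exception {
        int w = 640, h = 480;
        BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        int[] pix = ((DataBufferInt) img.getRaster().getDataBuffer()).getData();

        Frame f = new Frame();
        f.setSize(w, h);
        f.setVisible(true);

        for (int t = 0; ; t++) {
            for (int y = 0; y < h; y++)
                for (int x = 0; x < w; x++)
                    pix[y * w + x] = ((x ^ y) + t) & 0xFF;    // any per-pixel effect goes here
            f.getGraphics().drawImage(img, 0, 0, null);        // no double buffering, 4k-style
            Thread.sleep(20);
        }
    }
}
```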

Debatable. Not the most user-friendly approach, having to download a jar file and double-click on it. I've had problems where jar files open up in Notepad or something.

[quote]So it’s looking like the 4k rules, but non-interactive with a maximum run time TBD.
[/quote]
There are two time limits really:

  • time limit on how long it takes to procedurally pre-generate or pre-render graphics
  • time limit on how long it takes to play the demo

So, all in all, we might be talking about 30-40 seconds for playback and perhaps 10-15 seconds max for initialization. This is more of a guideline.

Even if it is not "real" exclusive fullscreen, you can already do that without signing anything: http://www.java-gaming.org/index.php/topic,20058.0.html
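I haven't re-checked exactly what the linked thread does, but the usual no-signing trick is just an undecorated frame sized to the screen, something like:

```java
import java.awt.*;

// Pseudo-fullscreen sketch: an undecorated frame covering the screen,
// without entering exclusive fullscreen mode.
public class PseudoFullscreen {
    public static void main(String[] args) {
        Frame f = new Frame();
        f.setUndecorated(true);
        f.setSize(Toolkit.getDefaultToolkit().getScreenSize());
        f.setVisible(true);
    }
}
```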

Play time is most likely equal on most systems, but loading time probably varies greatly among CPUs.

There are some very nice Java demos at http://www.comp.eonworks.com/demo/demo.html.
Ok, they are several megabytes, so well out of scope, but pretty inspirational.

[quote]but it might end up looking too much like windows media player.
[/quote]
People use it, don't they? :wink:

I guess we should only aim for play time, anywhere between 30-60 seconds.

That reminds me, I did this in '98:
http://www.scene.org/file.php?file=/mirrors/hornet/demos/1998/h/heltsten.zip&fileinfo

Came third at The Gathering after Digital Nerds and Komplex:
http://www.scene.org/file.php?file=/demos/groups/komplex/java/forward.zip&fileinfo

I must say those are the nicest software-rendering applets from the Java 1.1 era.

impressive :slight_smile:
I think there will be lots of tunnels and plasmas in the 4k competition.
The forward demo didn't work for me - after loading the images and the blue points, nothing happened (FF & IExplore).

Plasma? Ha!

Telling a story in 4k is much harder!

Not sure that I’ll manage a story. Unless it involves lots of plasma ;D
Tentative plan is a soft polysynth driving an event sequencer driving textures (e.g. plasma) and a 3D mapper (e.g. tunnel).
If it’s raining this weekend, I might see if that goes into 4k.

Darn it, I just saw you were back, Alan… long time no siege since LWJGL 16K.

Hi Riven, thanks for remembering. Many people from then are still here (hi folks!). Since the LWJGL 16K comp I've learnt Japanese and travelled around Japan (twice), then learnt to dance (still enjoying that).

Programming-wise, I've updated my 'Storm the Castle' 3D game engine so that it scales better with larger game maps. There's still one part that doesn't scale: it loops through the entire object set when determining which objects are close enough to require drawing. That really needs changing to group objects dynamically into a grid of sets, limiting the check to adjacent squares (roughly the sketch below). A better level editor and a new game built with the engine would be nice; probably Japanese historical, since I've now got loads of pictures of shrines, temples and castles. I'm also looking at Java ME's M3G library to see what's in there. Not enough time for all of this, of course (so mostly it won't happen), which of course won't be helped if this comp goes ahead :slight_smile:
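In case it's useful to anyone, the grid-of-sets idea boils down to something like this (the cell size, key packing and flat x/z coordinates are all made up, and negative coordinates would need floor division rather than plain integer division):

```java
import java.util.*;

// Hypothetical sketch: bucket objects by grid cell, then only check the 3x3
// block of cells around the viewer instead of looping over every object.
class ObjectGrid<T> {
    private final int cellSize;
    private final Map<Long, List<T>> cells = new HashMap<Long, List<T>>();

    ObjectGrid(int cellSize) { this.cellSize = cellSize; }

    private long key(int cx, int cz) { return ((long) cx << 32) | (cz & 0xFFFFFFFFL); }

    void add(T obj, int x, int z) {
        long k = key(x / cellSize, z / cellSize);
        List<T> list = cells.get(k);
        if (list == null) cells.put(k, list = new ArrayList<T>());
        list.add(obj);
    }

    // Objects in the cell containing (x, z) and its eight neighbours.
    List<T> near(int x, int z) {
        List<T> out = new ArrayList<T>();
        int cx = x / cellSize, cz = z / cellSize;
        for (int i = -1; i <= 1; i++)
            for (int j = -1; j <= 1; j++) {
                List<T> list = cells.get(key(cx + i, cz + j));
                if (list != null) out.addAll(list);
            }
        return out;
    }
}
```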

A quick thrash on the synth has produced a workable tune: 12 bars, about 40 seconds of techno. The soft synth from my LWJGL 16K entry can be modified to play it. The main change is making it multi-channel, so it can have several voices - at least a saw lead (triangle wave) and a crash cymbal (white noise). Not done yet, but that can wait until (if?) the comp goes ahead, as no issues are expected.

I've watched and rewatched the 'Digital Nerds' demos, particularly the plasma tunnel bits, and coded up a basic tunnel by precomputing the 3D mapping - ~10ms/frame at full screen. Dynamically computing the mapping would allow viewpoint rotation (the Nerds did it), but my code was too slow (~100ms/frame), even with partial precomputation. Worse, my plasma texture generation lacked that 'wow' factor. I definitely have to find a much better texture generation algorithm, but I'm not sure where to start - maybe fractals.
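For the curious, my static-mapping version is essentially this shape (the texture, constants and per-frame offsets are placeholders; the resulting pix array gets blitted as in the earlier sketch):

```java
// Sketch of a static tunnel mapping: precompute (angle, depth) per pixel once,
// then each frame just index a texture with a time offset.
public class TunnelSketch {
    public static void main(String[] args) {
        int w = 640, h = 480, tex = 256;
        int[] angle = new int[w * h], depth = new int[w * h], pix = new int[w * h];
        int[] texture = new int[tex * tex];
        for (int i = 0; i < texture.length; i++)
            texture[i] = ((i & 0xFF) ^ (i >> 8)) * 0x010101;       // placeholder texture

        // one-off precalc of the per-pixel mapping
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++) {
                double dx = x - w / 2.0, dy = y - h / 2.0;
                double r = Math.sqrt(dx * dx + dy * dy) + 1e-6;
                angle[y * w + x] = (int) (Math.atan2(dy, dx) / (2 * Math.PI) * tex) & (tex - 1);
                depth[y * w + x] = (int) (tex * 32 / r) & (tex - 1);
            }

        // per frame (t = time): look up the texture with scrolling offsets
        int t = 0;
        for (int i = 0; i < pix.length; i++)
            pix[i] = texture[((depth[i] + t) & (tex - 1)) * tex + ((angle[i] + t / 2) & (tex - 1))];
    }
}
```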

The ‘Digital Nerds’ metaball demo was also excellent. No idea how that works, which is probably why I like it so much.

Edit: Any thoughts on how much heap space we can use? If we use Web Start, we can override the default heap size.

Dynamic tunnels are usually implemented by interpolating 16x16 or 8x8 blocks. What demo had the metaballs?
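By interpolating I mean roughly this: evaluate the expensive mapping only at the 8x8 block corners and lerp across each block (map() here is just a stand-in for the real per-pixel function):

```java
// Rough sketch of per-block interpolation of a per-pixel mapping.
public class BlockInterp {
    static double map(double x, double y) {               // stand-in for the expensive mapping
        return Math.atan2(y - 240, x - 320);
    }

    public static void main(String[] args) {
        int w = 640, h = 480, B = 8;
        double[] values = new double[w * h];
        for (int by = 0; by < h; by += B)
            for (int bx = 0; bx < w; bx += B) {
                double a00 = map(bx, by),     a10 = map(bx + B, by);
                double a01 = map(bx, by + B), a11 = map(bx + B, by + B);
                for (int y = 0; y < B; y++) {
                    double fy = y / (double) B;
                    double left  = a00 + (a01 - a00) * fy;
                    double right = a10 + (a11 - a10) * fy;
                    for (int x = 0; x < B; x++)
                        values[(by + y) * w + bx + x] = left + (right - left) * x / (double) B;
                }
            }
    }
}
```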

Your synth sounds interesting. I’ve managed to generate some sounds but not written a sequencer yet. Not sure how I will do it.

I'm planning on doing a simple flat-shaded software renderer with some post-processing effects. Generating some interesting geometry will be challenging. So far I've managed to generate a bunch of rotating boxes. Looks kind of crap.

There were metaballs in the Apex demo. After googling around, I've found some information on the algorithm - most interesting and worth a go. I tried interpolation on the tunnel, but the code bloat has largely dissuaded me, bearing in mind the 4k limit. Still, lots can be done with a static mapping, and it leaves plenty of processor time for other effects.
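From what I've read, the 2D version of the algorithm boils down to something like this (ball positions, radii and threshold made up; the pix array would be blitted as in the earlier sketch):

```java
// Sketch of 2D metaballs: sum a falloff from each ball at every pixel, threshold the total.
public class MetaballSketch {
    public static void main(String[] args) {
        int w = 320, h = 240;
        int[] pix = new int[w * h];
        double[][] balls = { {160, 120, 40}, {210, 150, 30}, {110, 90, 25} }; // x, y, radius
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++) {
                double field = 0;
                for (double[] b : balls) {
                    double dx = x - b[0], dy = y - b[1];
                    field += b[2] * b[2] / (dx * dx + dy * dy + 1e-6);
                }
                pix[y * w + x] = field > 1.0 ? 0xFFFFFF : 0x000000; // inside vs outside the blob
            }
    }
}
```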

[quote]Your synth sounds interesting. I’ve managed to generate some sounds but not written a sequencer yet. Not sure how I will do it.
[/quote]
Mine stores a sequence of notes (note, duration, delay to next note) in an integer array, with 16 bits per note. I create a blank byte array to hold the audio. For each note I determine the start and stop points in that array, then calculate and add samples at the appropriate frequency for the note. Attack and decay to zero amplitude are built in to avoid clicks.
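In code, that scheme ends up roughly like this (the bit packing, tick length and envelope constants below are made up for the example, not what's actually in my entry):

```java
// Rough sketch: render a packed note sequence into a byte array of 8-bit samples.
public class SequencerSketch {
    public static void main(String[] args) {
        int rate = 22050, tick = rate / 8;                         // samples per tick
        // each int: note (8 bits) | duration in ticks (4 bits) | gap to next note in ticks (4 bits)
        int[] seq = { (60 << 8) | (4 << 4) | 2, (64 << 8) | (4 << 4) | 2, (67 << 8) | (8 << 4) | 0 };
        byte[] audio = new byte[rate * 40];                        // room for ~40 seconds
        int pos = 0;
        for (int ev : seq) {
            int note = (ev >> 8) & 0xFF;
            int dur  = ((ev >> 4) & 0xF) * tick;
            int gap  = (ev & 0xF) * tick;
            double freq = 440.0 * Math.pow(2, (note - 69) / 12.0); // MIDI-style pitch
            for (int i = 0; i < dur && pos + i < audio.length; i++) {
                double env = Math.min(1.0, i / 500.0) * (1.0 - (double) i / dur); // attack, decay to zero
                audio[pos + i] += (byte) (Math.sin(2 * Math.PI * freq * i / rate) * env * 60);
            }
            pos += dur + gap;
        }
        // 'audio' can then be played back with a SourceDataLine, as in the waveform sketch above.
    }
}
```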

[quote]I’m planning on doing a simple flat shaded software renderer with some post processing effects. Generating some interesting geometry will be challenging. Now I’ve managed to generate a bunch of rotating boxes. Looks kind of crap.
[/quote]
I’ll look forward to seeing the shapes you come up with. All the best.