New Visualization Platform

Hello,

Sorry if this is too commercial, but I thought there was a good chance the people reading this list would find the Meson platform interesting:

http://www.kaon.com/software/swmeson.html

It’s a language with a virtual machine that is implemented in Java 1.1. Its primary purpose is to provide a platform for visualization of 2D and 3D graphics (like Shockwave) but without the need for a browser plugin. We’ve been using this platform for a couple of years at places like Dell and Cisco, but this is the first time we’ve made it available to the world at large.

Just thought you might like to know…

-Joshua Smith
CTO/Alpha Geek
Kaon Interactive
jesmith@kaon.com

Welcome to JGO! Yes, that’s very interesting stuff - and runs extremely well from what I can see.

Are you very tightly focused on 3D models, or have you given some consideration to terrain/environment rendering?

Our corporate focus is definitely “Product Visualization”, not immersive environments. However, I’m personally interested in seeing what things people can do with Meson, so I’d be glad to give you pointers if you wanted to experiment in that direction.

One of our customers creates 3D models of room interiors, and we quickly put together a “walk through” style UI to facilitate looking at those models on the web. The engine can definitely handle immersive scenes.

Collision detection and terrain following could be added by any competent Java programmer using a “Gluon.”
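Roughly speaking, a terrain-following Gluon would just wrap logic like the following. This is a plain-Java sketch under my own assumptions about the data layout; it is not the Meson/Gluon API, and the class and field names are hypothetical:

```java
// Illustrative sketch only -- not the Meson/Gluon API.
// Terrain following: each frame, clamp the camera's height to the
// interpolated terrain height under its (x, z) position plus an eye height.
final class TerrainFollower {
    private final float[][] heights; // regular grid of terrain heights
    private final float cellSize;    // world-space size of one grid cell
    private final float eyeHeight;   // how far the eye sits above the ground

    TerrainFollower(float[][] heights, float cellSize, float eyeHeight) {
        this.heights = heights;
        this.cellSize = cellSize;
        this.eyeHeight = eyeHeight;
    }

    /** Bilinearly interpolated terrain height at world (x, z). */
    float heightAt(float x, float z) {
        float gx = x / cellSize, gz = z / cellSize;
        int ix = Math.max(0, Math.min(heights.length - 2, (int) gx));
        int iz = Math.max(0, Math.min(heights[0].length - 2, (int) gz));
        float fx = Math.max(0f, Math.min(1f, gx - ix));
        float fz = Math.max(0f, Math.min(1f, gz - iz));
        float h00 = heights[ix][iz],     h10 = heights[ix + 1][iz];
        float h01 = heights[ix][iz + 1], h11 = heights[ix + 1][iz + 1];
        float h0 = h00 + (h10 - h00) * fx;
        float h1 = h01 + (h11 - h01) * fx;
        return h0 + (h1 - h0) * fz;
    }

    /** Camera height to use when standing at world (x, z). */
    float followY(float x, float z) {
        return heightAt(x, z) + eyeHeight;
    }
}
```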

My biggest concern about making something more Quake-like is that we do not use an object-resolution frustum clipper (or a LOS clipper, for that matter). These are completely unnecessary when you are looking at objects from the outside, but become critical to performance when you drop into a large-scale scene. Again, a Gluon could add this kind of management.
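To make the frustum-clipping point concrete: “object-resolution” just means testing each object’s bounding volume against the view-frustum planes and skipping whole objects that fall outside, roughly like this (again, an illustrative plain-Java sketch under my own assumptions, not Meson code):

```java
// Illustrative sketch only -- not Meson code.
// Object-resolution frustum culling: reject a whole object when its
// bounding sphere lies entirely outside any single frustum plane.
final class FrustumCuller {
    // Planes stored as {a, b, c, d}, with a*x + b*y + c*z + d >= 0 meaning "inside".
    // Typically six of them: near, far, left, right, top, bottom.
    // Plane normals (a, b, c) are assumed to be unit length.
    private final float[][] planes;

    FrustumCuller(float[][] planes) {
        this.planes = planes;
    }

    /** True if a bounding sphere (cx, cy, cz, radius) is at least partly inside. */
    boolean isVisible(float cx, float cy, float cz, float radius) {
        for (int i = 0; i < planes.length; i++) {
            float[] p = planes[i];
            float dist = p[0] * cx + p[1] * cy + p[2] * cz + p[3];
            if (dist < -radius) {
                return false; // completely outside this plane -> don't draw it
            }
        }
        return true; // inside or straddling every plane -> draw it
    }
}
```

A LOS (occlusion) clipper would sit on top of this, rejecting objects that pass the frustum test but are hidden behind walls or terrain.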

-Joshua

You do realise that this falls squarely under the patent that has been causing all the fuss recently (the view around an object) :stuck_out_tongue:

Apart from that, it’s very nice. I like the glass :slight_smile:

Au contraire. I am quite certain that we’re clear of this patent. The gist of it (by my reading) is that you effect pan/rotate/zoom by re-transforming the projected coordinates, instead of going all the way back to the original 3D coordinates. Neat trick. If I had thought of it, I probably would have implemented it as an optimization. Glad I didn’t think of it! :wink:
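For contrast, here is the conventional way of doing it - going all the way back to the original 3D coordinates and re-transforming and re-projecting them whenever the view changes. This is just an illustrative plain-Java sketch of the standard pipeline, not Meson code:

```java
// Illustrative sketch only -- not Meson code.
// The conventional pipeline: when the user pans/rotates/zooms, rebuild the
// view transform, re-transform the original model-space vertices into view
// space, and re-project. Nothing is derived from the previous frame's
// already-projected 2D coordinates.
final class SimpleProjector {
    // 3x3 view rotation (row-major) plus a translation; updated when the camera moves.
    float[] rot = { 1, 0, 0,  0, 1, 0,  0, 0, 1 };
    float tx, ty, tz;
    float focalLength = 500f; // simple pinhole projection
    float screenCx, screenCy; // screen centre in pixels

    /** Projects one model-space vertex to screen space; returns {sx, sy}. */
    float[] project(float x, float y, float z) {
        // model space -> view space
        float vx = rot[0] * x + rot[1] * y + rot[2] * z + tx;
        float vy = rot[3] * x + rot[4] * y + rot[5] * z + ty;
        float vz = rot[6] * x + rot[7] * y + rot[8] * z + tz;
        // view space -> screen space (perspective divide; no clipping, assumes vz > 0)
        float sx = screenCx + focalLength * vx / vz;
        float sy = screenCy - focalLength * vy / vz;
        return new float[] { sx, sy };
    }
}
```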

Actually, the optimization in the patent is far more trouble than it’s worth on modern processors, and I think it’s unlikely that the technique is still in use.

Also, I imagine that integrating a clipper into a pipeline that is re-projecting 2D coordinates must be a real mess.

-Joshua

This stuff is absolutely fantastic! And so very small! The glass jar is particularly nice.

Cas :slight_smile:

Thanks. If you like that glass, you’ll love this:

http://www.kaon.com/software/swgallery.html

The ring was exported from LightWave as VRML and read into vSpace Master; we tuned up the materials and lights, exported it as a web tour, and voilà!

-Joshua

[quote]Thanks. If you like that glass, you’ll love this:

http://www.kaon.com/software/demo/ring.html
[/quote]
Ah, be careful who you forward that page to… I just got this response from a friend:

[quote]Au contraire. I am quite certain that we’re clear of this patent. The gist of it (by my reading) is that you effect pan/rotate/zoom by re-transforming the projected coordinates, instead of going all the way back to the original 3D coordinates. Neat trick. If I had thought of it, I probably would have implemented it as an optimization. Glad I didn’t think of it! :wink:

Actually, the optimization in the patent is far more trouble than it’s worth on modern processors, and I think it’s unlikely that the technique is still in use.

Also, I imagine that integrating a clipper into a pipeline that is re-projecting 2D coordinates must be a real mess.

-Joshua
[/quote]
Sadly that’s not it. It’s a (very broad) patent on having a view space, and it describes storing the object positions in one space (model space) and transforming them to a view space and then to the screen - i.e. what everyone does.

In particular, the patent covers the user panning around an object and zooming - exactly the sort of interface you use.

You probably also fall foul of US patent 4,742,474 (frame buffers consisting of arrays of numbers!!!).

The patents are all amazingly broad, and quite probably covered by prior art, but it seems this is one of those companies that makes a ‘business’ of buying duff patents and strong-arming other companies into stumping up for licenses.

Sorry, I’m kinda out of the loop on these things. These patents that everyone’s been talking about recently - they’re “American” patents, right?

Kev

[quote]Sorry, I’m kinda out of the loop on these things. These patents that everyone’s been talking about recently - they’re “American” patents, right?
[/quote]
Yes, they’re US patents. :wink:
Please be aware, however, that the European Union is thinking about “inventing” just this kind of so-called software patent, too. Heaven forbid.

Also, it’s possible that some mighty US patent owners are going to try to enforce their patents outside the USA, too. A US senator committed to the Hollywood movie industry recently said he’d try to make US law valid outside the US as well. (I’m not sure how he’d manage that, though.)

Oh, and now I’m off topic for this thread.

If it were merely that, it would be invalid on its face. Sutherland did that in the early '60s.

Again, panning around an object is not novel. What is claimed is the algorithmic approach to panning around the object, and my reading of the patent is that he is doing it by re-projecting already-projected coordinates.

Read the patent before you start repeating hysterical claims. It’s a hardware patent. A software program cannot infringe a hardware patent.

-Joshua

Runs like a charm :slight_smile:
Is there a way to zoom in on the example applets… I wanna see that bi-linear filtering up close :wink:
“Stable-scene 256x oversample progressive anti-aliasing” - 256x seems like overkill, but it’s nevertheless impressive.
And a question… Do you have a demo of the “Real-time, dynamically generated shadows”?

[quote]Runs like a charm :slight_smile:
Is there a way to zoom in on the example applets… I wanna see that bi-linear filtering up close :wink:
“Stable-scene 256x oversample progressive anti-aliasing” - 256x seems like overkill, but it’s nevertheless impressive.
And a question… Do you have a demo of the “Real-time, dynamically generated shadows”?
[/quote]
Look at the camera we did for Sony Style (in the yellow box on this page):

http://www.kaon.com/software

You can zoom way in on that one. Also, the pen demo on this page:

http://www.kaon.com/software/swmeson2.html

zooms in automatically when you start typing.

256 oversamples is actually critical when you use as much texture as we typically do on the models we build for e-commerce customers (unless you want to rely on the mip-maps). Anything less and you can get serious moiré. Also, keep in mind that most people surfing the web use a much lower screen resolution than your typical gamer, so any jaggies along edges are really easy to see.

Another way to think of it: with 24-bit color (as most people use these days), each channel has 8 bits, i.e. 256 levels, so you need 256 oversamples before an edge between black and white can hit every intermediate shade.
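To spell out the arithmetic (a toy illustration, not our renderer): with N coverage samples per pixel, an edge pixel between black and white can only take N + 1 distinct grey values, and an 8-bit channel can display 256, so you need on the order of 256 samples before the ramp along an edge stops banding:

```java
// Toy illustration of why the sample count matters for 24-bit output.
// With N coverage samples per pixel, an edge pixel between black and white
// can only produce N + 1 distinct grey levels; an 8-bit channel can show 256.
public class OversampleLevels {
    /** 8-bit grey value for a pixel with `covered` of `samplesPerPixel` samples lit. */
    static int greyLevel(int covered, int samplesPerPixel) {
        return Math.round(255f * covered / samplesPerPixel);
    }

    public static void main(String[] args) {
        int[] sampleCounts = { 4, 16, 64, 256 };
        for (int i = 0; i < sampleCounts.length; i++) {
            int n = sampleCounts[i];
            System.out.println(n + " samples/pixel -> " + (n + 1)
                    + " distinct grey levels on a black/white edge");
        }
        // With only 4 samples, the edge can't show anything between these five greys:
        for (int covered = 0; covered <= 4; covered++) {
            System.out.println("4 samples, " + covered + " covered -> grey "
                    + greyLevel(covered, 4));
        }
    }
}
```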

All the shadows you see in our demos are dynamically generated. Again, that Sony demo is a good one to look at to see the “dynamic” aspect of things. You can control the resolution of the shadow map using the Meson language (the Sony example is using a pretty low-res map, 24x24 pixels – actually, that’s a great way to see the linear filtering in action!). For an example of a really high-res shadow, take a look at the ring on this page:

http://www.kaon.com/software/swgallery.html

-Joshua