After Java

Modern engines generate machine code.

With WebGL you can make games with PS1-like computations and PS3-like shaders on top. That's plenty of power, if you ask me.

Regular Canvas is similar to Flash. We’ll see some hardware acceleration in the future though.

Also, both (i.e. Canvas and WebGL) will get faster once some bottlenecks are removed. The compositing currently takes a big chunk of CPU cycles for no good reason. This will get fixed.

WebGL does work fine on modern handsets.

HTML5 in Firefox: WebGL Demo On the Nokia N900 (YouTube)

Android handsets will be fine, too. They support OpenGL ES 2.0, and the V8 JS engine (the one Chrome uses) works there as well. Oddly enough, the JS engine is essentially the same as the most powerful one on the desktop, whereas the Java VM is pretty crippled compared to the ones you know from desktops (they intend to speed it up a bit, though).

It’s sorta unfair, but as things are, JavaScript is in pretty good shape on those weak devices.

I was referring to the additional complexity needed in 3D games. Think physics, etc. Whatever performance JavaScript has, it was not made for such things. Running such things on the desktop (in JavaScript) is also problematic, and in the article I've read, the iPhone JavaScript is 10 times slower than on a netbook. The 3D display may work properly (it's hardware accelerated, no question), but with CPU performance 10 times slower than a low-end desktop it will not keep up with the additional computations a 3D game needs. Android may be better, but I doubt even it has enough processing power.

Sure, running games in the cloud is expensive. That is why I think there will be special versions of game engines made for running in the cloud: something like a single game instance synthesizing the display for multiple clients, sharing resources (textures, models), rendering to a framebuffer and compressing the stream as well. One such instance could possibly act as a multiplayer server too (reducing that part of the lag to zero). I think games specifically written to run in a "game cloud" will need a lot fewer resources per player served than a stock desktop game run on the server. That is why I think special game engine versions and new "cloud game" APIs and SDKs will matter; a rough sketch of the idea follows.
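
Something like this minimal, hypothetical sketch of the "one simulation, many rendered streams" shape (the names CloudGameInstance and ClientView are made up, and java.util.zip's Deflater merely stands in for a real video codec):

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.Deflater;

// Hypothetical "cloud game" instance: one simulation shared by many viewers,
// each viewer getting its own rendered and compressed frame stream.
public class CloudGameInstance {

    /** One connected player: a camera into the shared world plus an outgoing queue. */
    static class ClientView {
        double camX, camY;                                       // this player's viewpoint
        final List<byte[]> outgoingFrames = new ArrayList<>();   // stand-in for a network socket
    }

    // Shared world state: a single physics/AI simulation serves every client.
    private double ballX = 0, ballVel = 2;
    private final List<ClientView> clients = new ArrayList<>();
    // One reusable framebuffer; textures and models would likewise be loaded once and shared.
    private final BufferedImage framebuffer = new BufferedImage(320, 240, BufferedImage.TYPE_INT_RGB);

    void addClient(ClientView c) { clients.add(c); }

    /** One server tick: simulate once, then render and encode once per client. */
    void tick() {
        // The simulation runs exactly once, no matter how many clients are watching.
        ballX += ballVel;
        if (ballX < 0 || ballX > 320) ballVel = -ballVel;

        // Per client: render that client's view into the shared framebuffer and compress it.
        for (ClientView c : clients) {
            Graphics2D g = framebuffer.createGraphics();
            g.clearRect(0, 0, 320, 240);
            g.fillOval((int) (ballX - c.camX), (int) (120 - c.camY), 10, 10);
            g.dispose();
            c.outgoingFrames.add(compress(framebuffer));
        }
    }

    /** Placeholder "codec": zlib over raw pixels. A real service would use a video codec. */
    private static byte[] compress(BufferedImage img) {
        int[] px = img.getRGB(0, 0, img.getWidth(), img.getHeight(), null, 0, img.getWidth());
        byte[] raw = new byte[px.length * 4];
        for (int i = 0; i < px.length; i++) {
            raw[i * 4]     = (byte) (px[i] >>> 24);
            raw[i * 4 + 1] = (byte) (px[i] >>> 16);
            raw[i * 4 + 2] = (byte) (px[i] >>> 8);
            raw[i * 4 + 3] = (byte)  px[i];
        }
        Deflater deflater = new Deflater(Deflater.BEST_SPEED);
        deflater.setInput(raw);
        deflater.finish();
        byte[] buf = new byte[raw.length + 256];
        int n = deflater.deflate(buf);
        deflater.end();
        byte[] out = new byte[n];
        System.arraycopy(buf, 0, out, 0, n);
        return out;
    }

    public static void main(String[] args) {
        CloudGameInstance game = new CloudGameInstance();
        game.addClient(new ClientView());
        game.addClient(new ClientView());
        for (int i = 0; i < 10; i++) game.tick();   // 10 ticks, 2 clients, 1 simulation
        System.out.println("frames queued per client: " + game.clients.get(0).outgoingFrames.size());
    }
}
```

The point being that the simulation and the shared assets are paid for once per instance; only the per-viewer rendering and encoding scales with the number of clients.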

I think it's only a matter of time until JavaScript gets some competitors in the browser. I personally would love to see something along the lines of C--, a language designed to be an intermediate language. Google is also working on Native Client to let users safely run native apps in their browser. Something like that will probably end up being the best solution.

I just can't see "the cloud" working for games. We still have a massive asymmetry between network bandwidth and local CPU power, and there is no indication that's ever going to change. In other words, networks are still the bottleneck.

And that's leaving out costs. As Riven says, it does not scale. Everyone thinks bandwidth is cheap. It's not: for a server with decent uptime (a real Service Level Agreement, not some "promise") and serious bandwidth, it's cheaper to master and post DVDs than to sell online. I have run the numbers. Peering agreements only help so much.

Then on top of all that is what people call HD for live streaming. It's not even VGA half the time: pixelated and blocky. A Blu-ray disc has a peak bandwidth of about 40 Mbit/s, and that's for 1080p; really it's optimized for 720p/1080i. How many FPS gamers do you know who run at that low a resolution? (I do, but many are well above that now.) So rendering at high resolution and then compressing it on the fly, with very poor quality and high latency, down to a level of crap… it's not going to take off.

Sure, it may work for FarmVille… But above 720p60, I can always give my clients a better game for less cost by giving them a full client-side app.

Also, this "the cloud" crap was called "net computing" about 10-15 years ago. We were all going to have dumb terminals within the decade. It didn't sell at all.

Net speed is just one thing that got better, but the big difference is that we have a lot more and better mobile devices now. We already have the "dumb terminals": they're called Android phones, the iPhone, the iPad and netbooks. Mobile net speed will get up to 50-100Mbps, and that will be more than enough to stream video. Also consider that mobile devices have smaller displays, so a small rendering (and streaming video) resolution is enough. In the days of "net computing" there were no applications on the net; now we have Google and a myriad of other Web-based applications. There are even programming IDEs on the Web now.

I'm not disputing that a desktop (or game console) game beats streaming video. It's just that PC sales are declining and mobile device sales are exploding. Mobile devices are too weak to run proper 3D games, but they are (thanks to hardware acceleration) fast enough to show streaming video.

This. “The cloud” has degenerated into a marketing euphemism. If big corporations said “store your private data on our servers” people would run a mile. Call it “the cloud” and suddenly it’s hip and trendy.

Today Steam distributes 20 petabytes of data a month and is able to make a profit. Bandwidth costs have been going down over the last decade, and that's expected to continue. Typically with optic fibre it's the interface which is the expensive part; that's one reason why it's not being pushed for use in homes. However, that too is changing. Light Peak is an example where an optic fibre interface is becoming cheap enough for use on home devices (even though it's still more expensive than USB). There are already houses with optic fibre broadband connections, and I see no reason why this won't be commonplace in the future.

Even if OnLive (and similar) services don’t work in practice due to latency, with increased bandwidth you can have the user download a game on the fly (or at least just the bits they are using) in realtime.

What about using a hidden Java applet to perform the expensive computations (or even run the full game code), then feeding JS with data to do rendering, play sounds, etc.?
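
Something like this, perhaps: a hedged sketch (the class and method names are invented) that relies on LiveConnect letting page JavaScript call public methods on an applet, so the heavy lifting stays in Java while JS handles Canvas/audio.

```java
import java.applet.Applet;

// Hypothetical "hidden compute applet". From the page, JavaScript drives it like:
//
//   <applet id="engine" code="PhysicsApplet" width="1" height="1"></applet>
//   <script>
//     var engine = document.getElementById('engine');
//     engine.step(16);                       // advance the simulation by ~16 ms
//     var positions = engine.getPositionsCsv();  // pull results back as a string
//     // ...parse the string and draw the scene with Canvas/WebGL...
//   </script>
public class PhysicsApplet extends Applet {

    // Toy simulation state: a handful of particles falling under gravity.
    private final double[] x  = new double[64];
    private final double[] y  = new double[64];
    private final double[] vy = new double[64];

    /** Advance the simulation by dtMillis; intended to be called from JavaScript each frame. */
    public void step(double dtMillis) {
        double dt = dtMillis / 1000.0;
        for (int i = 0; i < y.length; i++) {
            vy[i] += 9.81 * dt;                                   // gravity
            y[i]  += vy[i] * dt;
            if (y[i] > 1.0) { y[i] = 1.0; vy[i] = -vy[i] * 0.5; } // bounce off the "floor"
        }
    }

    /** Return positions in a flat form JS can consume without marshalling Java objects. */
    public String getPositionsCsv() {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < x.length; i++) {
            if (i > 0) sb.append(';');
            sb.append(x[i]).append(',').append(y[i]);
        }
        return sb.toString();
    }
}
```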

It was. But if you look around now, you will see a different picture. Companies are saving tons of cash by running their services on Google's or Amazon's cloud instead of their own data centers. It's not "just hype": there is a huge number of programming articles about writing apps to run on clouds, application frameworks becoming cloud-compatible, and discussions among users, with both good and bad reports. It shows that the cloud is not an empty phrase any more. It is different from what the "net computing" initiative was. "Net computing" focused on the client side, and there were neither suitable client machines nor services usable from the Web. It was a sales slogan, but now "the cloud" is a working business for both Google and Amazon.

[quote]Mobile net speed will get up to 50-100Mbps…
[/quote]
Only if you can change the laws of physics… With a given S/N ratio there is only so much data you can fit per Hz. Either that or you are the only person on the block with a mobile device. The only way you could get that to work is with cell sites the size of an orange.

I have 8 Mbit at home, completely unlimited, for €25. Many people up north get fiber at that price (but don't ask how much the beer costs!). Home bandwidth is quite high and the last mile is good enough in many places. But that's not where the bandwidth shortage is.

Quake 3 still has many active servers… Let's assume we have a server with 16 people connected. You are not going to get away with YouTube quality here, but let's assume there is some serious codec magic going on and you can get 1-frame latency at 10 Mbit/s (I seriously doubt 720p60 at 40 Mbit/s would be good enough). Assume 50% occupancy, and we have 12 h × 16 players × 3600 s/h × 10 Mbit/s ≈ 864 GByte per day per server (only 16 players!), or about 26 TByte per month for one server. Compare that to Quake Live, which uses about 5 MByte per hour for 16 people on a server (i.e. 60 MByte for 12 hours). Even if the encoding could get the bit rate down to 1 Mbit/s, it's still massive compared to a normal system. So Quake Live will be cheaper… and we haven't even started talking about the hardware required to render and encode this data, the cooling, or the rent. And let's not forget that a pipe 1000x larger than your competitor's is going to cost more too.
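
As a sanity check, here is that arithmetic as a tiny snippet you can tweak (the class name is made up; the numbers are just the assumptions above):

```java
// Back-of-the-envelope traffic estimate for video-streamed vs. thin-client multiplayer.
public class StreamingCostEstimate {
    public static void main(String[] args) {
        int players = 16;                    // one Quake-3-sized server
        double occupancyHoursPerDay = 12;    // 50% occupancy
        double mbitPerSecPerPlayer = 10;     // optimistic "codec magic" bitrate per stream

        double mbitPerDay = players * occupancyHoursPerDay * 3600 * mbitPerSecPerPlayer;
        double gbytePerDay = mbitPerDay / 8 / 1000;     // Mbit -> MByte -> GByte
        double tbytePerMonth = gbytePerDay * 30 / 1000;

        // Quake Live style thin client: ~5 MByte per hour for the whole 16-player server.
        double thinClientGbytePerDay = 5 * occupancyHoursPerDay / 1000;

        System.out.printf("video streaming: %.0f GByte/day, about %.0f TByte/month%n",
                gbytePerDay, tbytePerMonth);
        System.out.printf("thin client    : %.3f GByte/day%n", thinClientGbytePerDay);
    }
}
```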

Sure it can be done… but it will never be cheaper.

And mobile devices are not too slow for 3D. Older-style 3D works fine. Their 3D capabilities will increase faster than their network capabilities.

[quote=“delt0r,post:90,topic:35232”]
YouTube is streaming video constantly, all day every day. It's not what I would call good quality, but tonnes of people find it acceptable and the quality is improving. YouTube is also expected to make a profit this year.

Maybe a subscription fee isn't enough; maybe there will need to be more advertising in games as well to help pay for the service. But YouTube is a real-life example proving that it is possible to stream tonnes of video footage, despite the bandwidth costs, and still make a profit. Maybe it's not profitable right now, but presuming it gets enough customers I would be surprised if it wasn't profitable within a few years.

Well, I think WebGL shows that the current direction is doing only the rendering on the client (mobile devices have enough graphics processing power to render 3D in a quality suitable for their small screens) and leaving everything else (AI, physics and, obviously, multiplayer) to the servers. This is making the best of current “terminals” and bandwidth strengths and limitations.
This actually bodes well for Java (the platform, not necessarily the language) as it is by far the most successful server platform.
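
A hedged sketch of what the server half of that split could look like in Java (the names are hypothetical and the network transport is left out; a real service would push these snapshots to WebGL clients over something like WebSockets):

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

// Hypothetical authoritative server: AI and physics live here, and the browser
// client only interpolates and renders the compact snapshots it receives.
public class AuthoritativeGameServer {

    static class Entity { float x, y, z, yaw; }

    private final List<Entity> entities = new ArrayList<>();

    AuthoritativeGameServer(int count) {
        for (int i = 0; i < count; i++) entities.add(new Entity());
    }

    /** Server-side simulation step: all the "heavy" game logic stays on the server. */
    void simulate(float dt) {
        for (Entity e : entities) {
            e.yaw += 0.5f * dt;                     // toy AI: everyone slowly turns
            e.x += (float) Math.cos(e.yaw) * dt;    // toy physics: drift along heading
            e.z += (float) Math.sin(e.yaw) * dt;
        }
    }

    /** Pack a snapshot the browser can hand straight to its WebGL renderer. */
    byte[] snapshot() {
        ByteBuffer buf = ByteBuffer.allocate(4 + entities.size() * 16);
        buf.putInt(entities.size());
        for (Entity e : entities) {
            buf.putFloat(e.x).putFloat(e.y).putFloat(e.z).putFloat(e.yaw);
        }
        return buf.array();
    }

    public static void main(String[] args) {
        AuthoritativeGameServer server = new AuthoritativeGameServer(32);
        for (int tick = 0; tick < 3; tick++) {
            server.simulate(1f / 20f);              // e.g. a 20 Hz simulation rate
            byte[] snap = server.snapshot();
            System.out.println("snapshot bytes to send: " + snap.length);
            // in a real service this would go out to each connected client here
        }
    }
}
```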

As for the client, well, JavaScript seems to be the direction, though I wish it weren't so. At least I hope we'll get something like GWT for JavaFX - something that would compile JavaFX Script to HTML/JavaScript.
The reasons for this, I think, are not so much Java’s limitations - it would have been far cheaper (for Google, for example) to fix Java’s loading time on the client rather than develop a whole new standard with far fewer capabilities - but rather simple politics. A huge company such as Google cannot afford to stake its future on a platform controlled by another company.
If Java were to be completely freed, perhaps Google and others would be happy to use it rather than JavaScript, but where is the upside for Oracle?

[quote]YouTube is a real-life example proving …
[/quote]
I think you missed the bit where "YouTube quality" won't cut it. Also, YouTube can encode outside real time. You can't do that with an interactive FPS game, which means much higher bandwidth for a given quality.

Where I used to work we had a weekly video conference with another team in a different country. The footage was always of a high standard and practically real-time. BBC iPlayer is another example; it too streams video at much better quality than YouTube in almost real time. I expect the delays in both of my examples would only be acceptable for the slowest-paced games, but they are examples of real-time video streaming being done, today, to a high standard. Clearly the price has not prevented them either, and there are plenty of other services offering something similar.

A gaming version would need to improve on this, to have much lower latency during compression and decompression. However hardware is still getting more powerful year on year and GPGPUs are starting to become more common in servers. You might need high-end hardware but again I see no reason why you can’t do this today, where the majority of the latency is in the network.

For mobile devices with small screens, YouTube quality is enough. Yes, it must be compressed in real time, as opposed to the multi-pass encoding that is possible offline, but game display output is also much more predictable than random videos uploaded to YouTube. The MPEG-4 standard even contains provisions for encoding videos with well-known properties, such as known frame movements and even 3D functionality. I don't know how much of that made it into H.264 or Theora, but it should eventually. That may provide compression speed and quality well beyond what can be achieved with arbitrary videos.

First of all, most games do not need heavy physics calculations (the PlayStation 1 and the N64 were perfectly fine without them). Also, this isn't about AAA games with multi-million-dollar budgets.

This is what it’s all about:

[…] in the article I’ve read, the iPhone JavaScript is 10 times slower than on a netbook.

Depending on how they benchmarked it, they may have measured a completely different thing. E.g. if you do DOM manipulation, the JS code only contributes something like 10% of the time. It also didn't compare it to Java. On Android devices, JavaScript (V8, which generates native code) is probably about as fast as Java (Dalvik, which currently has no JIT).

Also, iPhone = last/current generation. Who the f- cares? It’s the wrong time frame.

Android 2.2 benchmark (with JIT).

Ah good. I expect there's plenty more speed where that came from. Just remember the speed increases we saw going from the 1.0 -> 1.1 -> 1.4 -> 1.6 JDKs.

Cas :slight_smile:

[quote]Android 2.2 benchmark (with JIT).
[/quote]
ooooooohhhhhh… pretty!

Maybe it’s time to dust off the Android SDK and try it out again!

Now we just need a Dalvik web plugin :slight_smile: