Why is Java not like heaven for AAA game companies?

  • Everything that isn’t Microsoft is OpenGL, because Microsoft doesn’t license DirectX to anyone else. That isn’t a big deal, since most gamers are using Windows and not Linux/Mac/etc. It is a big deal in the console market - but as you can see, Microsoft is doing pretty well with the Xbox.

There is a reason most gamers use Windows, and that’s because Microsoft has been after gamers for a while - Linux and Mac were never big on games.

  • It isn’t fair to compare OpenGL to DirectX because, as I said, DirectX covers far more than OpenGL (which acts strictly as a graphics layer and nothing else).

  • Understandably, Japan isn’t big on the Xbox One - simply because their culture demands a different sort of game genre. Microsoft is trying to capture this market with Killer Instinct etc.

The PS4 was incredibly disappointing to me.

This is ancient history, back in the 20th century, but Java was originally intended as a browser language. Applets were going to rule the Web. Java was going to be what JavaScript is today.

Unfortunately, Java needed Swing and was a megabyte download–at a time when most people had dial-up. HTML worked much better. You couldn’t write programs in HTML, but the bosses didn’t care, they only knew that HTML was fast and Java was ssslllooowwwww. So Java became a laughing stock, and this persists in a lot of (older?) minds today.

I did work briefly on an applet web app in the late ’90s. It’s still going strong without maintenance 15 years later. People who chose HTML have bounced around from ActiveX to Silverlight to Flash to immense JavaScript atrocities at great expense, but Java is still a joke (to the older crowd) except to those in the back rooms who have been using it. Its rep’s been improving lately; it’s five times better than anything else (except maybe C++ for high frame-rate games), and it’s getting hard for people to overlook that.

You really don’t want to read too much into the difference between OpenGL and DirectX these days; they’re functionally almost identical, and only a very small part of the code (and therefore of the team) really needs to know much about either.

Cas :slight_smile:

The difference is in politics and support. DirectX has more of a user / code base. There are more game engines supporting DirectX. There’s also mighty Vole pushing it.

DirectX was a bane to computing because it wasn’t needed at the time of its inception. DX was object-oriented, which made it hard to port to other platforms back then.

OpenGL and D3D are virtually identical in all respects, graphics-wise.

How could you be disappointed in something you haven’t tried yet?

While I am sure we could find some way to argue your first two points, they don’t contradict anything I said in my post.

I am disappointed in the PS4 (and Sony in general) because of the incompetence and poor business ethics they’ve displayed recently. Microsoft isn’t perfect in that respect, but its participation in the console war has been much more mature.

I was led to this conclusion after Sony’s PS4 PR essentially consisted of 50% bashing MS (implicitly calling them out at Gamescom over the DRM policies and ‘inconsistency’ of an unreleased and unofficial product), and after Sony couldn’t even figure out how to safely preload GTA4 content (or maybe it was their inability to promptly notify users when their personal data was ripped out of the PS+ databases). The other 20% was them riding the irrational internet rage regarding the Xbox One.

Finally, all I see in the PS4 is better hardware. The thing is, when it comes to developing gaming consoles, hardware alone means nothing! Game consoles are designed to run cheaper hardware at an over-spec clock rate with optimized games - i.e., arguing hardware is pointless; it just tells us who achieved what more optimally (who has the crappier hardware with competitive performance).

That sounds like an absolute nightmare for revision control and documentation.

.net and the CLR was already able to do more or less this. It turned out to not be particularly popular though.

Cas :slight_smile:

They didn’t say anything untrue, if this offended anyone then they are clearly biased and most likely a fanboy. Being upset about that is irrational, a company stating what’s advantageous about their product over the competition is how advertising works.

There was nothing irrational about it, Microsoft’s product was going to be shit and potential buyers were really upset.

Yeah because playing COD on NES is the same as playing COD on PS4/XBOX360!!!

That’s such a silly statement, I don’t even feel like replying how silly it is.

Your statements are very subjective to what you like/want/feel, there’s no point in arguing further if you are going to argue subjectively and act like it’s objective.

My statement isn’t silly, you completely missed the point of it. Not only that, but you clearly have no concept of why game consoles exist.

The point is this: the games need to be competitive, and it doesn’t MATTER how they achieve that level of competitiveness. What matters is that they OPTIMIZE price against performance and software tuning. It is a very common and ignorant misconception that hardware alone is significant in a gaming console - it is the product of hardware power, software performance and architecture tuning TOGETHER that is important, not the individual components. If you can achieve competitiveness via one of those three, the other two become less significant.

It is not subjective unless you want to over-engineer a console and lose millions of dollars on it (wait, didn’t Sony do that for PS3?)

Don’t call my statement silly, it makes complete sense and it has been a concept behind developing game consoles for a very long time. It is the entire purpose of a game console and it is what sets them apart from your desktop computer. Game consoles are CHEAP, OPTIMAL and PREDICTABLE. They shouldn’t need THE BEST hardware to be competitive.

Calling my statements subjective doesn’t make them so. I laugh at the ignorant fools who purchase a console based on hardware specification like they’re out shopping for a desktop computer.

Finally, these game consoles have unique architectures that were designed in different ways. I am blown away by the ignorance of people picking at random hardware components in either console, deriving an abstract idea of their function and applying it as “hur durr x is better than y” when really they have no idea in the world. Not even the engineers at Microsoft or Sony have a full picture - the hardware is benchmarked, reduced, benchmarked, reduced until it reaches the predefined performance objectives with MINIMAL and CHEAP hardware plus clever architecture and software tuning tricks, because it is way cheaper to distribute software optimizations than it is to distribute expensive hardware. It is a constant cycle.

Even in that regard, it’s completely false to say that the PS4 is nothing but upgraded hardware. If that were the case, it would be just as fair to say the Xbox One is the exact same thing.

Games make the console, and the PS4 definitely has a lot of games that make it tempting. To that end, though, it’s entirely a matter of opinion. If I cared about the games the Xbox One had, I’d be more interested in it. This isn’t a point that can really be argued, though.

I didn’t say the PS4 is nothing but upgraded hardware. I said all I see is a PS3 with upgraded hardware. Like you said, on that respect, it is entirely subjective.

I now understand what “talking to a wall” means.

Adding to the point about tools being in C++, from my perspective it seems that many AAA developers reduce costs by resorting to third-party engines, like the Unreal Engine, and as long as those keep using the same technology, users of said engines are somewhat locked in place.

I’m also of the opinion that a negative for Java in the eyes of developers might be the relative ease by which it can be decompiled.

I, personally, am acquiring a sort of “bias” against Java based on the fear that Oracle might end up screwing Java devs up for the sake of monetizing the language. Let’s hope I’m wrong on that one! :clue:

Let’s not forget that Java’s performance only arrives after JIT compilation, which not only needs warmup but also carries the memory footprint and CPU overhead of running an optimizing compiler at runtime. Hope you’re into load times. Sure, there’s AOT compilation, but then where’s the value proposition over C++ again?

Well, it’s managed to start with.

Java doesn’t need JIT on a game console because the game console is 100% predictable - it would always JIT to the same thing.

There is Java the language, and then there is HotSpot, the Java runtime environment. There is no reason why you couldn’t save the HotSpot state and cross-compile it to a new platform (ignoring the obstacle of huge hardware differences). Runtime recompilation is an option, not a necessity.

Most companies also make games for consoles, and the JVM would be a burden on a console.
The industry is hooked on C++ right now; only indies can take the risk of using Java.
Even though Android has Java, it’s easier to use C++ to port from iOS.

Java programs do not run in a VM any more. Java compilers target a virtual machine, meaning a virtual instruction set, not an interpreter. It’s called an intermediate representation because source code is transformed into an intermediate format before being translated again to native code. Compilers can be made to target different instruction sets, and some compile directly to native code. Android also uses an intermediate virtual instruction set, which is different from the JVM specification. By your logic, C and C++ would also be unsuitable for consoles.

The two major open-source C compilers, GCC and Clang, also use virtual machine specifications. GCC uses two intermediate languages called GENERIC and GIMPLE. Clang targets the intermediate instruction set of a made-up machine, LLVM IR (“LLVM” originally stood for “Low Level Virtual Machine”). The JVM, GIMPLE, and LLVM IR all use the same strategy. (Did I also mention IL for the .NET platform?) It enables x source languages to target y platforms with only x + y translators instead of x * y unique compilers. Most generic optimizations are done on the intermediate representation in order to further reduce code duplication.
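The x + y versus x * y saving is easy to see with concrete numbers; here is a toy sketch (the language and platform counts are made-up examples, not from the post above):

```java
// Toy illustration: with a shared IR, each new language needs one
// front end and each new platform one back end, instead of a full
// compiler for every (language, platform) pair.
public class TranslatorCount {
    public static void main(String[] args) {
        int languages = 5;  // e.g. C, C++, Rust, Swift, Haskell via one IR
        int platforms = 4;  // e.g. x86-64, AArch64, RISC-V, WebAssembly
        System.out.println("Unique compilers needed: " + (languages * platforms)); // 20
        System.out.println("Translators via one IR:  " + (languages + platforms)); // 9
    }
}
```

Adding a sixth language then costs one front end instead of four whole compilers.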

People hear virtual machine and they think interpreter. They should think hypothetical computer. They are as well defined as a real computer architecture (usually better because there are no undocumented quirks) and are designed to be portable and make optimization easy. The JVM is special because it is a computer that supports heap allocation without using pointers, which has a lot of benefits, though there are also some flaws in the platform that are unfortunate in hindsight. The JVM is a fully functional machine specification. Real world physical computers (hardware) that conform to the specification exist and run Java bytecode directly, but the norm is for bytecode to be (re)compiled to native machine code before it is run.
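To make “hypothetical computer” concrete: compile a trivial method and disassemble the class file with `javap -c` (the standard JDK disassembler), and what comes out is stack-machine instructions for that virtual architecture, not an interpreter script. The class name here is just for illustration:

```java
public class Add {
    // javap -c Add shows this method as JVM stack-machine bytecode:
    //   iload_0   (push first int argument)
    //   iload_1   (push second int argument)
    //   iadd      (pop two ints, push their sum)
    //   ireturn   (return top of stack)
    static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        System.out.println(add(2, 3)); // prints 5
    }
}
```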

Patent nonsense. The JIT compiles methods that have run several thousand times, and it uses the profiling information it collects during bytecode execution in order to do so. That’s why methods in the server VM require more calls, running as bytecode, before they’re ever compiled. And it’s still designed to deoptimize back to bytecode whenever it needs to, such as when newly loaded classes invalidate its speculative assumptions.
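That warmup behaviour is observable from the command line via HotSpot’s `-XX:+PrintCompilation` flag; a minimal sketch (the class name and loop count are arbitrary choices for illustration):

```java
// Run as: java -XX:+PrintCompilation Warmup
// work() starts out interpreted as bytecode; once its invocation count
// crosses HotSpot's threshold, it appears in the compilation log,
// compiled using the profile gathered while interpreting.
public class Warmup {
    static long work(long n) {
        return n * 31 + 7;
    }

    public static void main(String[] args) {
        long acc = 1;
        for (int i = 0; i < 20_000; i++) {
            acc = work(acc);
        }
        // Use acc so the loop can't be eliminated as dead code.
        System.out.println("done, acc computed over 20000 calls");
    }
}
```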

JVM Bytecode is starting to resemble something like an IR, but that doesn’t mean it is one.