Not contrary to me, certainly! I disagree with you that module systems or Maven are complex bollocks in general (Gradle obviously is!). Work with either long enough and the gains generally outweigh the pains. But yes, Java 9+ modules do seem to introduce a massive amount of complexity without solving very much.
I am intrigued to know what adoption of modules is going to be like when Java 11 is finally out.
I can totally understand @princec’s stance and I share it. Jigsaw or the Java 9 Module System is for modularizing the JRE and thus being able to only include the used parts of it in order to keep the footprint of a deployed application small, which is definitely a thing for cloud-based (micro-)services. But that’s about it.
Java probably does not want to get left behind in the language/platform popularity fight when it comes to microservices or cloud services, where JavaScript with Node is apparently gaining kilometers every day. So, everything needs to be small, including the JRE.
But user applications already use other means for modularity, be it a big folder of jar files or dependency management systems like Maven, Ivy or Gradle. So Java 9’s Module System does not help here. It only becomes a necessity if you want to strip down the JRE, since then you need to modularize your own application and declare its JRE module dependencies as well.
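As a sketch of what that extra declaration looks like (module and package names here are made up for illustration):

```java
// module-info.java for a hypothetical application module.
// Only the JRE modules listed here end up in a jlink-stripped runtime.
module com.example.app {
    requires java.sql;            // a JRE module the application actually uses
    requires java.desktop;        // another JRE dependency, e.g. for Swing
    exports com.example.app.api;  // the only package other modules may see
}
```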
As for myself, I (and the German customers I work for) am soooo many lightyears away from migrating to any cloud or microservice architecture, and everything is deployed on a big server, either in-house or on a single VPS in a data center with massive amounts of HDD space, so it is never a concern whether the deployed application weighs in at 20MB or 200MB. That’s why I currently ignore the Module System completely. Heck, some customers still run on Java 5, most on 7 and veeery few on 8. The reason: it just freakin’ works! Some don’t even use Java at all but run COBOL on IBM mainframe systems, which works even better.
Also, I NEVER understood the “we must strongly encapsulate the internal JDK classes even more” thing. Can someone please explain to me why the JDK classes, my application classes and all my controlled dependencies - which I know I am using and know what they are doing - must be shielded from each other? When I use an internal API, that is freakin’ my own decision/fault. Yes, of course I can --add-opens everything to everything, but why the hell is this not the default? Who needs to be protected from whom, and why exactly? This is exactly like the SecurityManager, which no one actually uses. Is anyone here using the Java SecurityManager? It was only invented to protect users from running unknown and potentially malicious code downloaded from somewhere off the Internet and run in an Applet. Has anyone actually done this? Certainly no one does this today, and everyone knows their dependencies. Java is not an “Internet language” where everyone runs untrusted/unknown/insecure code all the time.
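For reference, this is the kind of ceremony being complained about: on Java 9+, code that reflects into encapsulated packages typically needs launcher flags like these (the jar name is hypothetical):

```shell
# Open java.lang for deep reflection and re-export an internal package
# to code on the classpath (the "unnamed module"):
java --add-opens java.base/java.lang=ALL-UNNAMED \
     --add-exports java.base/jdk.internal.misc=ALL-UNNAMED \
     -jar myapp.jar
```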
Unfortunately I have to work in JavaScript/Node.js/TypeScript in my day job now, and what I see is that because JavaScript and “doing it strictly” are almost opposites, everyone does everything. Therefore I have to deal with bugs when one of my dependency’s dependency’s dependencies fails because of wrongly architected visibility/permissions.
Of course, hacking has always been a habit, but nowadays it’s getting more popular, especially in the aforementioned field.
I might be too young with my 32 years and 15 years of work experience, but I do love how strict the new module system is. And as it’s not obligatory to use, everyone will be happy. And I think trying to stay on track with other languages in today’s development rush is good, because I don’t want to see more Node.js applications on the server side than I see now.
Disclaimer: My opinion too is, that the module system shouldn’t have been introduced, at least not in the way it happened.
Ahhh come on, you know why! Did you forget about Unsafe? It’s the best example of why one wants to decide which classes to expose from a module and which not.
Same for every other dependency that wants to give you an API and hide all implementations from you. Or if you operate on a platform where other people add their own modules. Once you work in such an environment, you learn to appreciate it, at least that’s what I think. I think it’s a typical problem that people tend not to understand as long as they don’t have such a need themselves. I also guess there wouldn’t be so much discussion about it if the module system had existed from the beginning, because then there wouldn’t be any incompatibility/breaking-change issues.
Yes, I work for a company that uses it. But honestly I have no knowledge about it, I think we use it for file access restriction and reflection restriction.
Counter question: Do you always make all your fields public? Or your constructors, and never use factory facades? The answer is simply encapsulation. Not what against whom, but simply everything from everyone. It makes everything more reliable and reduces leaking of things.
It was already possible to create your own JRE without jlink before Java 1.9, and it was already possible to ship your game with a custom JRE before Java 1.9. Java 1.8 introduced compact profiles, and in the past you could already take a binary build of OpenJDK and remove the “useless” stuff (nsigma already mentioned that, more or less).
I find the module system a bit difficult to use, and it’s not reliably supported enough in most major IDEs (Eclipse, NetBeans, …) to encourage me to use it in my projects.
Some internal APIs will become impossible to use in a later release, while some of them have no replacement. I’m sticking with Java 1.8 for now.
I’m not entirely clear on what the module system achieves?
Static class dependencies are automatically resolved at compile time and listed in the constant pool of each class.
Dynamic class dependencies (e.g. Class.forName(String) ) aren’t made less fragile by this module system, you just get an extra layer of runtime errors.
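A tiny illustration of that point (the plugin class name below is made up): the string passed to Class.forName is opaque to both the compiler and the module system, so the failure only shows up at runtime.

```java
public class DynamicDeps {
    public static void main(String[] args) {
        // Static dependency: checked at compile time via the constant pool.
        java.util.List<String> xs = java.util.List.of("ok");
        System.out.println(xs.get(0));

        try {
            // Dynamic dependency: a made-up class name. Neither javac nor
            // the module system can verify this string; it fails at runtime.
            Class.forName("com.example.plugin.Impl");
        } catch (ClassNotFoundException e) {
            System.out.println("resolved only at runtime: " + e.getMessage());
        }
    }
}
```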
It seems like this explicit declaration of module dependencies is fragile data duplication that increases the manual overhead of maintaining a code base.
Moreover it seems like it’s treading on the toes of packages, and jars.
Is the sole gain just added security against versioning conflicts?
I honestly had no idea this even existed until I read this thread
One of Java’s great strengths was the ability to, if necessary, completely bypass the compile-time checks for access to stuff and get at it with reflection if you needed to. It is after all, your computer, doing your bidding. I don’t see that someone should be allowed to say “no you can’t see this” and actually have it strictly enforced by the JVM. It’s my bastard machine, I’ll do with it what I will. The 0.01% of users for whom this might be an issue should maybe have come up with some other solution that didn’t involve buggering everything up for everyone else.
And as for modularisation… as @gouessej says, we’ve had a trivial solution for years. You just delete native binaries and remove packages from rt.jar until your JRE is small enough. It was trivial. I even had a utility which took the output from -verbose:class and automatically stripped everything else out of rt.jar to make it even more trivial. Until I just simply stopped bothering when bandwidth and disk space became effectively unlimited. All that really needed doing in this respect was a little moving around of classes to fix some odd cross-package dependencies that had been in the JDK base libraries for years.
As for the “exports” feature of the module system… it would have been perfectly well served by a class annotation suggesting to IDEs that certain classes be hidden from code suggestions by default. Then you can have a jar with all sorts of internal classes of no real interest to anyone else not cluttering up your ctrl-space suggestions list.
No I have not at all forgotten about sun.misc.Unsafe, since like every freakin’ library depends on it and I use it myself. This is the biggest question mark in my head: Why does everyone argue about Unsafe being bad and should never have been introduced and should be put behind ten meters of concrete. I completely do not get that. It always had a stable API (albeit undocumented, but who cared) and it served its freakin’ purpose. But NO, Oracle/the JCP suddenly thought that eventually, Unsafe should be hidden away, which thankfully did not yet happen (even with the current JDK 12 EA), because everyone is relying on it and it never was an issue.
But noooo… we have to create object-oriented freakin’ abstractions for half of the things Unsafe exposed in a much cleaner/more direct way: VarHandle access to off-heap ByteBuffers, which will take Oracle months if not years to compile down to native code efficient enough that it is not like 2x slower than Unsafe. So they decided to put massive amounts of effort into destroying what already worked, and into coming up with unnecessary abstractions for it.
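For the curious, a minimal sketch of the VarHandle replacement being described here, using only the standard Java 9+ API (class name invented):

```java
import java.lang.invoke.MethodHandles;
import java.lang.invoke.VarHandle;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class VarHandleOffHeap {
    // A VarHandle viewing any ByteBuffer as a sequence of little-endian ints.
    // The int coordinate below is a byte offset into the buffer.
    static final VarHandle INT =
            MethodHandles.byteBufferViewVarHandle(int[].class, ByteOrder.LITTLE_ENDIAN);

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocateDirect(64); // off-heap memory
        INT.set(buf, 0, 42);          // plain write at byte offset 0
        INT.setVolatile(buf, 4, 7);   // volatile write at byte offset 4
        System.out.println((int) INT.get(buf, 0));
        System.out.println((int) INT.getVolatile(buf, 4));
    }
}
```

This covers plain and volatile off-heap access without sun.misc.Unsafe; how close it gets to Unsafe’s performance is exactly the JIT question the post is complaining about.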
Beforehand, I would like to say again that I don’t agree with the JPMS and all the Unsafe stuff.
You answered the question yourself. It was and is a dangerous implementation detail that should be completely hidden from everyone. If it had never been accessible somehow, no core JVM framework would rely on it and we wouldn’t have any problems now, because the problems would never have existed.
No, it never had an API. And it is no API. It is a mixed bag of different, partly totally unrelated things that don’t conform to the rest of the Java platform.
Don’t get me wrong: People’s demands are totally right and understandable, we all have them. For example they want fast and unchecked direct buffer access. Then the answer is not Unsafe, but an official API with defined boundaries and with a common understanding. And this API won’t give you the possibility to create an object instance without calling its constructor - if a demand for this feature is strong enough, this should be another official API, well designed and accepted by everyone.
Of course, since everything was fucked up already but worked extremely well, I tend to just leave it as it is and give people all the Unsafe. And if it hadn’t been available all this time, maybe the JVM wouldn’t be as strong as it is today, because people (really, not only officially) lacked features they sincerely wanted to have.
All that said, this is really just one example of why modules make sense: Encapsulation should be a decision of the writer, not the user.
Oh, I disagree about that. Encapsulation is a hint. The fact is, if there’s code running on my computer, I want absolute and total control over that code. Most of Java’s historical and entirely stupid, avoidable problems and massive overengineering stem from that one, utterly stupid decision to get Java to run as applets in browsers. All that security manager stuff… all that encapsulation… all that “no you can’t have access to Unsafe because it’s unsafe and you might hurt yourself”… all that refusal to just simply add a runtime flag to the VM to turn off bounds checking… that was all borne of a totally pointless desire to attempt (and indeed, fail) to get signed or unsigned Java code to run in the browser.
Every other use case for Java’s security mechanisms has been and still is and always will be a total, complete, waste of everyone’s time. If I ruled the world right now I’d go through the whole of OpenJDK and remove every last scrap of code security from it. It has only ever been a massive hindrance to Java’s adoption.
That’s not just about security though, it’s about efficiency and provability of code. It’s another reason Java has been successful. I’d rather Java adopted more from things like Rust than attempted to chase C/C++ down a rabbit hole. Who needs a VM flag to turn off bounds checking when the VM can do that itself.
I agree, the module system appears to be a gross duplication of permissions. In particular the need for explicit ‘exports some.package;’ for every single module’s packages and sub-packages that are meant to be public. And then on top of that, any other module using that first module’s packages must also declare that it ‘requires firstmodule;’ in its module-info.java file. Thank goodness, at least not every package needs to be explicitly required.
I actually couldn’t believe the need for export statements so I checked out the jdk 9 source for the java.base module and yes, there’s pages and pages of boilerplate export statements:
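From memory, the shape of that boilerplate is roughly the following (heavily abridged; the real java.base module-info.java is far longer, and the exact qualified-export target lists vary by JDK version):

```java
// Heavily abridged sketch of java.base's module-info.java
module java.base {
    exports java.io;
    exports java.lang;
    exports java.lang.invoke;
    exports java.math;
    exports java.net;
    exports java.nio;
    exports java.util;
    exports java.util.concurrent;
    // ... dozens more packages, plus qualified exports of internals, e.g.:
    exports jdk.internal.misc to jdk.unsupported;
    // ...
}
```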
The idea of small custom JVMs and the ability to ditch bloated APIs like AWT and Swing is fantastic. I suppose the module concept helped the JVM developers achieve this.
But for us consumers of the JVM, it’s not clear how multiple layers of encapsulation security in the private/protected/public classes and fields and now the matching ‘export’ and ‘requires’ statements in multiple module-info.java configuration files achieves anything more than duplication. Or more accurately, triplication.
Hm, it makes me a little sad that the topic is discussed that harshly here. We’re talking about modules and information hiding, and I thought it is generally accepted that it’s better to be private to others by default. And explicitly open up things one wants to be an interface for others. Not agreeing on that makes a discussion some kind of hard - really reminds me of young programmers making everything public because who cares about visibility. Encapsulation and information hiding is key to reliable, safe software.
Also, I think we are mixing things up here. We again have a lively discussion about Unsafe, which is just an example of something where it probably would have been better if it had never been reachable by anyone (and again, I’m not saying we shouldn’t have features like unchecked buffer access). And the security stuff doesn’t relate to JPMS directly in the same way. If I write a module that runs in an environment at a customer, alongside modules from other third-party companies, then my module shouldn’t be allowed to write to their config directories. And I shouldn’t be allowed to use reflection at runtime to make everything accessible - simple as that.
Whether or not the module system should be in the JDK, it isn’t a duplication of existing permissions. You’ll see the same mechanism in OSGi and the NetBeans module system, and maybe elsewhere. And it’s really useful! There is no way within the private/protected/public system to define code that should work together but not be exposed as API across different modules. Unless you use the horrible approach of splitting packages across modules / JARs, which is error prone and explicitly not supported by the JDK module system. If you’re writing an application framework or library (or JDK!) that has optional or OS specific parts this handles not exposing implementation details to the world. It also allows you to have packages that serve multiple other packages in your API.
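A hedged sketch of that cross-module sharing (all names invented): a qualified export lets two of your own modules cooperate on an internal package without making it public API, which plain private/protected/public cannot express.

```java
// module-info.java for a hypothetical library core module: the internal
// package is shared with the OS-specific module but hidden from everyone else.
module com.example.lib.core {
    exports com.example.lib.api;                                  // public API
    exports com.example.lib.internal to com.example.lib.windows;  // qualified export
}
```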
I agree. I was reading a blog that proposed making fields and methods accessible as public by default rather than protected.
It wouldn’t matter if this were implemented now anyway, since this low-level public encapsulation is overruled in Java 9 and onward if the public method’s class’s package’s module has no export statement for it in its module-info.java.
Would be interesting to hear James Gosling’s opinion about modules, but I can’t find anything on Google.
Ok, I see your point about there being no way of hiding things without modules. Some see strict encapsulation as a feature, but princec and I regard it as a pain in the neck. Means I have to copy your entire source code if I want to make some small change that you didn’t foresee in your library.
There is still triplication of permissions to make a method public beyond the module. The method must be public, the method’s module must export the package with an exports statement in its module-info.java file, and the module where I’m using the library must have a requires statement in its module-info.java. And I also have to include the library module on the module path when compiling and running. Quite laborious for little gain, in my opinion. Hopefully the tooling will eventually make it invisible.
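Spelled out, the three pieces look like this (module and package names invented; each snippet lives in its own file):

```java
// 1) In the library: the method itself must be public.
package com.example.lib.api;
public class Greeter {
    public String greet() { return "hello"; }
}

// 2) In the library's module-info.java: the package must be exported.
module com.example.lib {
    exports com.example.lib.api;
}

// 3) In the consumer's module-info.java: the library must be required.
module com.example.app {
    requires com.example.lib;
}
```

And on top of that, com.example.lib still has to be on the --module-path for both javac and java.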
Not necessarily, it means you have to do something explicit to depend on implementation details - eg. --add-exports or --add-opens. This is a good thing! It’s not about disabling all access, just an obvious here-be-dragons, and a good pointer to the fact that it’s probably going to break if you upgrade. I have a number of native binding libraries that I maintain that have a lowlevel package where all the actual interaction with the native layer happens, and is definitely not intended to be a stable API. No matter how much you tell people to keep their mitts off it, I’m still answering queries about breakage!
All encapsulation should be overridable by the consumer - I agree with you both there. In fact, have been arguing that exact point in a review of the Apache NetBeans module system recently. Explicit but possible is what I’m arguing for.
As for patching code, if that’s what you mean rather than access, then absolutely that should be done and compiled with the whole source code IMO.
Java 11 has been modularized, so the whole JVM has been split into small modules (in jmod format), and JavaFX 11 distributes its own jmods. So in order to use JavaFX 11, you have to combine the JavaFX jmods with the default jmods and create your own customized JRE using the jlink tool.
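A sketch of that jlink invocation (the download location of the JavaFX jmods is hypothetical; adjust paths for your setup):

```shell
# Assuming the JavaFX 11 jmods were unpacked into ./javafx-jmods-11:
jlink --module-path "$JAVA_HOME/jmods:javafx-jmods-11" \
      --add-modules javafx.controls,javafx.fxml \
      --output my-runtime

# The stripped runtime contains only the requested modules and their deps:
my-runtime/bin/java --list-modules
```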