In XML too, the syntax of the devil himself.
Cas
You simply have to consider how miserable development has gotten, when Maven and Gradle become the appropriate tools.
On the other hand… any successful programming language’s ecosystem will become bloated to the point where things fall apart, and you need a full-fledged toolchain to produce hello-world-like applications. Take JavaScript, for example: in the last decade it kinda escalated a teeny bit, to where the toolchains rival Java’s.
When you program for big business, you’ll find that you hardly ever write business logic anymore; it’s all tying this framework to that, transforming an object into a slightly different object, and flinging it across the network in some shape or form, all abstracted behind 5-ish layers of abstraction, which you have to drill through occasionally.
You know you’ve had an invigorating day at work, when you wrote a for-loop.
Ok, so… Scar is great, and everyone needs to know about it.
Cas
I’m really a bit confused by the kind of statements I’m reading here from experienced people???
First of all, there’s a huge difference between your own private projects and professional software development.
In professional projects you always need to ask whether a decision makes sense from your company’s point of view. There are also some foundations that one has to accept, no way around it (exceptions included, as always). For example, there is the need for a CI pipeline and release automation, as well as a platform-independent, reliable build. Using Eclipse -> export as jar depends on Eclipse, which most likely means your CI can’t build the project, means that your colleague with NetBeans can’t build the project, and most likely means that you have no platform-independent build. Someone who makes such a decision has (most likely) no place in a company, and I hope I don’t have to explain that.
In professional projects, one also has to take care of standards - if a tool gets the job done acceptably and is an established standard, then it’s okay to use it and benefit from a large job market, well-known and established knowledge, etc. That’s much the same reason Java still exists.
Ehm… okay. What’s so terrible about them? No, seriously, don’t start listing what those tools may not do completely right; I have 7 years of experience with them. The thing is, there is always the need for a build tool (if you’re not using scripts only), one way or the other. History has taught us that we must not cripple what can be done in a build. Not every project is simple. Not every project has to be simple.
What a build tool has to do is expose those capabilities in a sensible manner. For example, if you have a non-trivial build, Maven’s declarative approach fails miserably… when people want to execute their own logic, they’ll go and put it in a plugin - looks declarative, but… yeah. When they want to fiddle their logic somewhere between two things, they start to shift Maven phases around. For those cases, Gradle’s approach is better: you declaratively define your build graph with tasks. Once learned, everybody can understand that. You can write your build as code - script files in Groovy or Kotlin, logic in whatever JVM language you want (for example the same as your project’s main language!). What could be negative about this approach, honestly? Of course, there are some things you have to read up on, because - surprise - if you want to solve a complex problem, the solution is most times not easy. For simple, small builds, Gradle or Maven files are very small and very easy to understand…
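To make the “build graph with tasks” idea concrete, here is a minimal sketch of a Groovy build script; the task names and the docs folder are invented, it just shows the declarative wiring plus a bit of code where code is actually needed:

task generateDocs {
    doLast {
        // the imperative bit lives inside the task action (execution phase)
        new File(buildDir, 'docs').mkdirs()
        println 'generating docs...'
    }
}

task bundleDocs {
    // declarative wiring: bundleDocs always runs after generateDocs
    dependsOn generateDocs
    doLast {
        def docs = new File(buildDir, 'docs')
        println "bundling ${docs.listFiles()?.length ?: 0} files"
    }
}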
Most complaints I hear about Gradle are from people who can’t read even the simplest Groovy code, because… surprise, they are most often Java developers claiming that they shouldn’t have to learn anything else.
I think there is a huge difference between Gradle, which allows you to write a very small piece of logic as a simple local function (as you would with code in your project, too) and a medium-sized piece of logic as a task or plugin directly in your project (buildSrc) without the need to publish and maintain a separate artifact, and Maven’s “you have to make a separate plugin project for everything”. In Gradle, you can choose the appropriate way to organize your build, just like your code.
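As a rough sketch of the buildSrc route (the class and task names are invented), a medium-sized piece of logic can live right in the repository as a plugin, with nothing to publish or version separately:

// buildSrc/src/main/groovy/DocsPlugin.groovy
import org.gradle.api.Plugin
import org.gradle.api.Project

class DocsPlugin implements Plugin<Project> {
    @Override
    void apply(Project project) {
        // the plugin contributes a task to every project it is applied to
        project.tasks.register('printProjectInfo') { task ->
            task.doLast {
                println "${project.name} ${project.version}"
            }
        }
    }
}

In build.gradle you then just write apply plugin: DocsPlugin, since everything under buildSrc is automatically on the build script classpath.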
One of my favorite examples is having a service you need for your tests running on a random port. It’s a simple requirement that many projects have. In Maven, the official way is to use the build-helper-maven-plugin, which requires you to write at least 22 lines of XML for the plugin usage. Afterwards, you have to check the documentation to find out how to use the generated port(s) as a property (because it is not obvious and there is no standard for this). As far as I know, it doesn’t work with IDE autocompletion, even though that could be an XML-based build’s biggest advantage here.
In Gradle, it’s really the most simple thing: a property. ext.myPort = new ServerSocket(0).getLocalPort(). Every Java/Groovy/JVM developer knows this class, a property/variable is the standard way to do this in any programming language, and it wins over XML 1:22 in lines. It is okay-ish supported by IDEs.
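A minimal sketch of how that could be wired into the test task (the system property name is just an example, and the socket is closed again so the port is actually free for the service to use):

// build.gradle - reserve a free port at configuration time
def socket = new ServerSocket(0)
ext.myPort = socket.localPort
socket.close()

// assumes the java plugin's standard test task
test {
    systemProperty 'service.port', myPort
}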
This is just one example of something that is daily business in builds. I’m not even going to start on how much simpler a single if-statement is compared to Maven’s complex profile logic (which really no one understands besides the Maven users who got used to this pain). The foundations of Maven prevent it from being an adequate solution for more complex builds.
And I really don’t want to condemn Maven in any way - I’ve pushed Maven to its limits myself. It’s way better than the nothing/Eclipse/Ant approach, it is an established tool, and it can handle medium-sized/complex projects okay-ish. I would most times be totally happy to use it in production and in my spare-time projects (which I still do sometimes). It’s just that we have to be honest about “right tools for the right job” - and Maven is quite limited here.
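For comparison, this is the kind of one-liner I mean - a plain Groovy if in the build script instead of a profile (the property name is invented):

// build.gradle - enable the slow integration tests only when asked for
if (project.hasProperty('withIntegrationTests')) {
    test {
        include '**/*IT.class'
    }
}

Run it with gradle test -PwithIntegrationTests and the condition is true; otherwise the block is simply skipped.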
Not sure why you’re confused about the statements of experienced people here
First you say that these tools fit their purpose, then you list quite a few dreadful issues with these tools (which I could copy and paste from your informative posts).
Either way, it’s certainly true that we couldn’t work without these tools, but enough people in a typical dev team turn your pom file into this (excuse my French) clusterf**k, and Maven is partially to blame here - the way they designed it is not intuitive. Newbies craft poms with copy/paste and checking whether it works; senior developers… do the same. Very few people in a team (if any) can actually design/maintain a proper hierarchy of pom files. The design of the tool, and the terminology used, is… not ideal. Surely you can agree on that.
As usual the problem is “let’s design another DSL!”, and that’s where Gradle and Maven and Ant are all exasperating. The dependence on external sites for builds is also massively vexing and calls into question the professional credentials of anyone using them (“sorry, can’t do the build today, looks like crappylibrary.org is down again”), but that’s an orthogonal rant for another day.
I am considering forking Scar to add THJSON configuration support… hey ho.
Cas
Of course, I agree with everything you said, Riven
I just tried to say that no tool is perfect in every regard - Maven, for example, is not suited for complex projects. But it’s an established standard, so people coming from outside will probably be able to handle your project very fast. And it is suited for use with CI pipelines; there are many, many plugins that you don’t have to implement by yourself. So it may be a proper choice for small enterprise projects. Of course there is some investment when one wants to use tools like Gradle properly.
Saying that either Maven or Gradle is the devil’s work, that nobody needs build tools, or that builds shouldn’t be so complex that you need Gradle… those are statements I can’t understand.
Nah, I have to second that. The way Ant and Maven define a DSL is via XML schemas. Groovy itself as a language is sufficient for build scripts (or Kotlin, yay!), no need for extra constructs - what makes Gradle look like a “DSL” is just how you use the interfaces Gradle provides, like project, tasks, configurations etc. It’s really more like using a framework than a DSL, because with Gradle there is no DSL; it’s just Groovy with the Gradle framework. It’s simply easier to say “it’s a DSL”, because it reads like one. The same goes for Scala and sbt (I think). So in the end it’s just a framework, one that can for example be used fantastically with Kotlin, or with Java in a Java-based plugin as well.
I think you got it wrong - repository doesn’t mean “external site”. It means you can have different kinds of repositories… flat folder repositories, network shares, local Artifactory instances caching the outside world. If you have problems with your build because external services aren’t available, you’re doing it wrong.
Have to admit that I don’t know this tool, so feel free to show it to us in its own thread
you mean to configure your build in your own DSL… 8)
Personally I’m really liking the approach that Jonathan Blow is taking with his Jai programming language. The build process is actually part of the language (and written using the language itself). He argues that languages like C++ (and Java by extension) are under-specified by not including the build process as part of the specification, and he demonstrates several pretty cool usages of Jai, building his current project with the integrated build process to produce a working executable (the build process even has the ability to specify a game icon for the executable).
A decent video summarising the above, and why he took some of the decisions in designing Jai, can be found here. He makes some interesting and controversial arguments, including about the problems he considers languages like C++ & Java to have. Whether he actually succeeds or not remains to be seen.
I see you bothered to read the recommended usage guidelines before arguing there was a huge difference, then!
My first thought too! Not that we’re really talking DSLs anyway. Although I think I like the idea.
@kappa - interesting link - bookmarked for later perusal.
I had similar thoughts when I was first learning Maven. I’m definitely not a Maven expert, but I can maybe provide a bit more context.
If your project does not use any libraries, or if you have an existing setup and nobody else needs to run your code, then Maven is probably overkill.
But if your project does use libraries, or if other people are going to run your code, or if you want to streamline your setup process, that’s where Maven comes in handy.
Imagine this scenario: I have a project I’ve been working on by myself. I use about 10 different libraries, and I keep them all in a lib directory in my project. I’m not sure which version of the libraries I’m using, but that’s okay, because whatever I have works. I also have a series of steps that I follow in Eclipse to export everything as an application.
But now I have a friend / coworker who wants to work on my project with me. How do I get them set up? I could just send them my lib directory, but that’s not very scalable: what if I actually have 100 coworkers? What if one of the jar files is very large? So I give my friend a list of libraries to download: but wait, which version of the library was I using again? On top of that, how do I transfer over my export settings? Or what happens if I have a possible contributor who absolutely hates Eclipse and wants to use a different IDE?
Enter Maven. With Maven, I no longer need a lib directory full of jar files. Instead, I use a pom.xml file that lists dependencies, and Maven will fetch and add them to my classpath automatically. It will also track versions, so updating which version of a library I’m using is now a one-line change, and everybody will get the new version. I also no longer need to keep track of my Eclipse-specific way of deploying: I just put the build rule inside my pom.xml file, and Maven will take care of that. Now everybody on my team has the same deployment steps.
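For illustration, a dependency entry in the pom.xml is only a handful of lines (the JUnit coordinates here are just an example); bumping the version number in this one place updates it for everyone who builds the project:

<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.13.2</version>
    <scope>test</scope>
  </dependency>
</dependencies>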
Here’s a real-world example: if I have a web project that I deploy using Google App Engine, I actually need about 50 jar files, an Eclipse plugin, the Cloud SDK, and to run some tools inside the SDK when I want to deploy. Sharing that project is going to be super annoying. However, I can use Maven to keep track of all of that, and deploying is as simple as executing one Maven command.
You can use Eclipse and Maven at the same time. Eclipse actually works very well with Maven. Maven isn’t an editor. It’s a tool that you can use alongside your editor.
That’s actually one of its selling points. If your project is set up using Maven, then you can use any editor you want: Eclipse, IntelliJ, a basic text editor, whatever.
I’m happy to talk about this stuff. I definitely remember being very confused about Maven when I started reading about it.
Nah, I’m sorry, but I read this guide several times, months before this discussion, as well as most other resources available about Gradle. And my statements are totally in line with it. My statement was that Gradle allows you to choose the appropriate form and place to write your build logic. I (hopefully :P) never said that declarative should not be the very first choice. But I did say that it’s a fallacy to think that everything has to be a (separately distributed) plugin. In their first example, they show a conditional task definition. This is something that fits into a plugin.
One of their next examples is about minimizing the work that is done in the configuration phase:
task printArtifactNames {
    doLast {
        def libraryNames = configurations.compileClasspath.collect { it.name }
        logger.quiet libraryNames
    }
}
They use a local variable to store what they want to log. This is imperative code, but the use of a local variable is totally suitable here. If the logic to gather the items to log took two or three lines and could be reused by another task, there would be nothing wrong with just extracting a script-local function and reusing it.
In their example, they use the apply plugin syntax, which is not declarative either… that’s what I meant by “not everything is perfect in every regard”, because with complex things, one can’t do everything right every time. And of course, people can misuse code in a build, just like literally everything can be abused, but please, please let’s not talk about abusing features or when to make exceptions from a very strong rule.
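To show what I mean by a script-local function, a sketch along the lines of their example (the second task is invented, and the java plugin is assumed so that compileClasspath exists):

// build.gradle - extract the lookup into a script-local function and reuse it
def compileClasspathNames() {
    configurations.compileClasspath.collect { it.name }
}

task printArtifactNames {
    doLast {
        logger.quiet(compileClasspathNames().toString())
    }
}

task countArtifacts {
    doLast {
        logger.quiet("${compileClasspathNames().size()} artifacts on the compile classpath")
    }
}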
But now I have a friend / coworker who wants to work on my project with me. How do I get them set up? I could just send them my lib directory, but that’s not very scalable: what if I actually have 100 coworkers? What if one of the jar files is very large? So I give my friend a list of libraries to download: but wait, which version of the library was I using again? On top of that, how do I transfer over my export settings? Or what happens if I have a possible contributor who absolutely hates Eclipse and wants to use a different IDE?
No, no, no! You commit the lib directory that you use that you know works and that’s all there is to it. It doesn’t matter how many co-workers you have, or how big the libraries are. Those are irrelevant considerations. What is relevant is a repository that checks out in its entirety from a single location and works, 100% of the time, every time. There is nothing more annoying than trying to build some source code and some dicky network issue breaks the whole damned thing.
Cas
You commit the lib directory that you use that you know works and that’s all there is to it. It doesn’t matter how many co-workers you have, or how big the libraries are. Those are irrelevant considerations. What is relevant is a repository that checks out in its entirety from a single location and works, 100% of the time, every time. There is nothing more annoying than trying to build some source code and some dicky network issue breaks the whole damned thing.
There is no such thing as 100% availability. No server provider will give you 100% uptime. And you obviously need an internet-accessible server (be it for SCM repository or Maven repository or whatever) when you collaborate with multiple people.
So, from a “get a working build” perspective, there is no difference between the SCM repository server being down and the Maven repository server being down. It’s just that the latter is muuuuch much less likely if we are talking about Maven Central. So even if you host your own SCM repo on your own server, checking in all libraries, it will have a much worse uptime than a global Maven repository. Even if you take down your own VPS with the SCM repository for just one minute because you need to install some OS update or whatever, from that moment on it will have a worse uptime than Maven Central.
For example, BitBucket (aka. AWS Cloud North America) has the f**kin’ worst uptimes I have ever seen. I have a customer who hosts all their sources on BitBucket and it is a freakin’ nightmare. There are outages or timeouts literally eeevery day…
So in the end: You are ALWAYS better off pulling your dependencies from a global Maven repository. And release versions there stay constant/immutable.
That’s absolutely wrong. Our internal SVN servers are 100% reliable (in that even if they explode into flames we can simply restore the live backup onto a new instance - they are located on-site*). When we need to get a build done from scratch onto a clean machine (which happens a fair amount), we need go no further than our LAN. And then we know what is built today is exactly what was built yesterday.
I cannot believe any actual software development company would rely on external servers to be able to build their product? Like, for example Github. Mental. Who on Earth would trust their crown jewels to an external provider? Any sane CIO would be spitting feathers if he discovered it.
Cas
I’ve just realised, having replied, that there may be two entirely different kinds of developers here… there are those like me who work for a software company whose business it is to make proprietary software and sell it, and we work in offices and have a LAN and trusted vLANs and so on to collaborate with the other offices around the world… and there are those who work on GitHub/SourceForge and those sorts of places, essentially building large collaborative open-source projects where nobody’s paying for anything. And yes, I do get why you’d use Maven for that.
Cas
I cannot believe any actual software development company would rely on external servers to be able to build their product? Like, for example Github. Mental. Who on Earth would trust their crown jewels to an external provider? Any sane CIO would be spitting feathers if he discovered it.
Actually, many companies do. First and foremost probably Netflix and Amazon. In this particular case, however, it is an international company with many sub-companies around the globe which needed centralized access to a single source repository, because it’s a more-or-less unified piece of software being worked on by multiple companies.
Hosting it in one company’s location was not an option because of multiple reasons:
No, no, no! You commit the lib directory that you use that you know works and that’s all there is to it. It doesn’t matter how many co-workers you have, or how big the libraries are. Those are irrelevant considerations. What is relevant is a repository that checks out in its entirety from a single location and works, 100% of the time, every time. There is nothing more annoying than trying to build some source code and some dicky network issue breaks the whole damned thing.
I am a little bit torn on this one. Getting a working lib directory without some kind of dependency-management software like Maven or Gradle can be a tedious task in bigger projects, so I very much like the declarative style of managing dependencies (and their transitive dependencies). On the other hand, I once tried to work on a private project while travelling by train, one I hadn’t worked on from that notebook before. It took me 2 hours to get it to compile, because I could only download dependencies while we stopped long enough at a train station.
Also, the size of binary files does matter for some setups. Plain Git, for example, stores a full copy of a binary file for every committed change, and to make matters worse, a fresh clone also fetches the complete repository including the history and all branches of a project. So even deleting a binary file (without deleting its history) won’t help much. I also worked as a DevOps engineer for a very large German company, and disk space was very expensive in their infrastructure.
I have worked with both - lib folders and Maven/Gradle dependencies - in the past. Both work, but I usually prefer the dependency style over the lib folder. But since, over the last 10 years, I was the person who designed, audited and fixed build systems using Ant, Maven, Gradle and Jenkins (putting poorly written Jenkinsfiles into the mix is even more fun) for multiple dev teams, I am a bit biased.