Advantages of Ant?

Yes, I’ve experienced npm hell! >:( From what I remember, in Maven you have to specify an exact version for every dependency you build against, so that every build is reproducible. That’s how external dependencies should be handled, IMO. The problem @princec outlines shouldn’t be possible with Maven.
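For illustration, pinning a dependency to an exact version in a Maven pom.xml looks something like this (the artifact and version number here are just examples, not a recommendation):

```xml
<!-- Exact version, no ranges: every build resolves the same artifact. -->
<dependency>
    <groupId>org.lwjgl</groupId>
    <artifactId>lwjgl</artifactId>
    <version>3.1.0</version>
</dependency>
```

Because artifacts published to Maven Central are immutable, an exact version like this resolves to the same bytes on every machine.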

… oh, and while I maintain a number of libraries that use Maven for their builds, my main project uses Ant for reasons similar to @gouessej’s, and I really can’t be bothered trying to find workarounds. Make of that what you will. ;D

What’s wrong with npm? It’s great.

Usually it’s a good idea to use exact versions for your dependencies regardless, because sometimes people get semver wrong, whether unknowingly or not, and break stuff.
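As a sketch (package names and versions are just examples), the difference in package.json is whether a range operator is present:

```json
{
  "dependencies": {
    "left-pad": "1.1.3",
    "lodash": "^4.17.0"
  }
}
```

The first entry is pinned to exactly 1.1.3; the caret in the second allows any 4.x release at or above 4.17.0, so a broken upstream patch release can slip into your build without you changing anything.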

Speaking of npm, the ecosystem has gone through a few task runners itself: grunt, gulp, bundlers (browserify, webpack, rollup…) and more recently npm scripts to name a few.

npm is indeed great, and I’m currently releasing a bunch of modules. In package.json I use open semver ranges against the current major version for all dependencies, then use npm shrinkwrap when doing releases. Problem solved. Before a release I test with the open semver ranges, update anything that needs updating, then run shrinkwrap again as necessary.
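As a rough sketch of what that workflow produces (module names, versions, and URLs are illustrative): with an open range like lodash@^4.0.0 in package.json, running npm shrinkwrap records the exact versions that were actually installed into npm-shrinkwrap.json, which then takes precedence on subsequent installs:

```json
{
  "name": "my-module",
  "version": "1.0.0",
  "dependencies": {
    "lodash": {
      "version": "4.17.4",
      "from": "lodash@^4.0.0",
      "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.4.tgz"
    }
  }
}
```

So you get open ranges while developing, but exact, reproducible versions for anyone installing the published release.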

I have a standardized build/test npm module, typhonjs-npm-build-test, that automates everything (including building documentation) in a coherent way across all TyphonJS modules. In the next update I’ll be adding automation for the shrinkwrap process to the publish script. Publishing will fail if the shrinkwrap changes, since npm-shrinkwrap.json should be committed to the repo.

Of all things, I’ve completely overhauled ESDoc, adding tons of features and a much more flexible architecture. It supports Babylon for ES6+ doc generation, with TypeScript coming soon, along with other things like Flow and internationalization of the generated docs. I’ll also soon be releasing a CI support module for building docs on commit and uploading them to a docs-specific repo for all organizations / repos across a larger multi-repo effort (like TyphonJS), plus a container-based doc server and web app that automatically pulls down changes and deploys them online as commits happen, in addition to hosting older versions.

The next version of typhonjs-npm-build-test will include automation of the shrinkwrap process and switch from ESDoc to TJSDoc (presently a placeholder, for the next few days!).

Apparently I can’t shake the tools creation bug… ::slight_smile:

We simply do not allow any external dependencies in a build, period. The entirety of the repository must be sufficient to produce all artefacts.

Cas :slight_smile:

Wow!

So you’ve got an entire VM image containing OS & apps checked in? ;D

Yes.

Cas :slight_smile:

Sorry, I didn’t follow the whole discussion, but I have to ask you, Cas: your dependencies are checked into your source code repository? As in “they are versioned, together with your own code”??

I thought that we got out of the nineties already :smiley:

Dependencies in Maven Central (and other repositories) are immutable, so “everything is in the repo” is roughly equivalent to “the repo is enough to build the project, provided you have internet”. Taking care of somebody’s internet connection at build time is like taking care that their HDD has enough space: it’s not your problem.

Sure, your argument about a man in the middle or other network-related issues would still be valid.

Yes, they are.

When a software problem costs $100,000 a day (which it does, where I work), we need to be able to fix, and build, without issue. We had one instance last year where we couldn’t get to kernel.org. That pointless outage cost our customer another day of downtime.

We also don’t like things unexpectedly changing while we’re not looking.

Cas :slight_smile:

Yes, that’s why companies host their own artefact servers as a proxy: to have internal releases and to be able to keep working without internet. Are you sure this shouldn’t be the way you tackle the problem? I mean… come on, how big is your repository?!
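For instance, pointing every Maven build at an internal proxy is a one-time entry in settings.xml (the host name here is hypothetical; any Nexus- or Artifactory-style server that caches Central would do):

```xml
<settings>
  <mirrors>
    <mirror>
      <id>internal-proxy</id>
      <!-- Hypothetical internal artefact server that caches Maven Central. -->
      <url>https://artifacts.example.com/repository/maven-public/</url>
      <mirrorOf>*</mirrorOf>
    </mirror>
  </mirrors>
</settings>
```

Once an artefact has been fetched through the proxy, builds keep working even if the outside connection goes down.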

When would that ever happen? If you use immutable dependencies, nothing is going to change.

The size of the repository is irrelevant - data has to be somewhere.

You may be able to declare immutable dependencies, but either you can’t guarantee the actual contents are immutable, or you’re back to having things change unexpectedly behind your back. Or right in front of you, for that matter.

I’m not sure I see why anyone sees a problem with this cast-iron, failsafe approach to repository management.

When we want to update a dependency, we check in a new version of whatever thing we want to have updated, test the crap out of it, and then forget about it ever going wrong on us again.

Cas :slight_smile:

Because it’s not cool. Even a HelloWorld has to depend on half the internet (and download it) to be built, or otherwise it’s just not fancy enough.
And as we all know: the internet is static. Maven Central, Packagist, and whatever else comes to mind will always exist from now until the end of time. And so will your internet connection.

It’s not a matter of coolness; it’s a matter of by reference versus by value. Nowadays, with DVCS, you don’t want the disadvantages of checking binary files into your repository, because it is simply not necessary: your repository is code that changes over time. And what changes over time is your dependency declaration, not the dependency itself. Since the dependency is shared between your colleagues, or even the rest of the world, it makes sense to have it available via a shared artefact repository. It is then not versioned in your repository, which is conceptually more right than wrong. And as you should know, the internet is not necessary to use a dependency, because it is cached locally or by your company’s artefact server. And if Maven Central ever goes away, hell will have frozen over, so let’s talk about it when that happens.

I don’t want to convince anybody. If you don’t experience problems with your solution, just keep going :smiley: But there are better ways to solve them. And checking half the internet into a private repository is better than downloading it once? xD

I’m using Maven and the like myself, so there’s no need to convince me. I just don’t think it’s all great with no disadvantages. Relying on external or internal servers is a disadvantage. It might not matter in most cases, or the benefits might outweigh it. However, I can walk into my basement, grab my 25-year-old Amiga disks, and compile their contents on the actual machine or an emulator, because it’s all there. Try that with any application of today in 10, 20, or 30 years… good luck! Again, that might not matter, but it shows a problem.