The tutorial here has an explanation about it.
You are correct that Sys.getTime() returns milliseconds on some platforms. However, since LWJGL is cross-platform, it's not guaranteed that every platform will behave the same way, and the value could even change on current platforms in future LWJGL releases.
Ticks can have lower accuracy than milliseconds (this can be hardware related; some old arcade machines had hardware timers that ran at 60 ticks a second, since the games on them only needed 60 fps) or higher accuracy than milliseconds (such as nanoseconds, or finer in the future). It all depends on the underlying platform.
So Sys.getTime() returns a number of ticks (whatever the implementation underneath thinks is best for the platform), and Sys.getTimerResolution() tells you how many of those ticks there are in a second. From those two values you can calculate whatever timer resolution you require (milliseconds, microseconds, nanoseconds, etc.).
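As a sketch of that calculation: the helper below (`ticksToMillis` is a made-up name, not part of LWJGL) shows the arithmetic. In real code the two inputs would come from `Sys.getTime()` and `Sys.getTimerResolution()`; here they are plain parameters so the example runs standalone.

```java
public class TimerDemo {
    // Convert a platform tick count to milliseconds, given the number
    // of ticks per second reported by the platform's timer.
    // In LWJGL code: ticksToMillis(Sys.getTime(), Sys.getTimerResolution())
    static long ticksToMillis(long ticks, long ticksPerSecond) {
        // Multiply before dividing to avoid losing sub-second precision.
        return (ticks * 1000L) / ticksPerSecond;
    }

    public static void main(String[] args) {
        // Example: a timer running at 60 ticks per second (like the old
        // arcade hardware mentioned above). 120 ticks = 2 seconds.
        System.out.println(ticksToMillis(120, 60)); // prints 2000
    }
}
```

Note the multiply-then-divide order: dividing ticks by ticksPerSecond first would truncate to whole seconds on a coarse timer. For very high-resolution timers you may want to guard against overflow of the multiplication.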