LibGDX’s LWJGL (desktop) backend uses Display.sync, but only if the following condition is met:
graphics.vsync && graphics.config.useCPUSynch
(It should be noted that Display.sync is used by default, since both of those flags are initialized to true.)
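A rough sketch of the end-of-frame logic that condition implies (this is not the actual LwjglGraphics source; the field names `vsync`, `useCPUSynch`, and `targetFPS` just mirror the condition quoted above):

```java
import org.lwjgl.opengl.Display;

// Hedged sketch of the backend's frame-limiting decision, assuming the
// condition quoted above; not copied from LibGDX.
class SyncSketch {
    boolean vsync = true;        // graphics.vsync (defaults to true)
    boolean useCPUSynch = true;  // graphics.config.useCPUSynch (defaults to true)
    int targetFPS = 60;

    void endOfFrame() {
        // Display.sync sleeps on the CPU to cap the frame rate, but it is
        // only reached when BOTH flags are set.
        if (vsync && useCPUSynch) {
            Display.sync(targetFPS);
        }
    }
}
```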
IMO Slick’s approach was more reasonable: you set a target frame rate, which is passed to Display.sync, and you can independently enable/disable hardware vsync with setVSync(boolean). A sketch of that separation follows.
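Something along these lines (illustrative only, not Slick's actual container code; Slick exposes setTargetFrameRate(int) and setVSync(boolean) on its game container):

```java
import org.lwjgl.opengl.Display;

// Sketch of the Slick-style design: the CPU frame cap and hardware vsync
// are two independent knobs.
class FrameTiming {
    private int targetFrameRate = -1;   // -1 = uncapped (hypothetical field)
    private boolean vsync = false;

    void setTargetFrameRate(int fps) {
        targetFrameRate = fps;           // only affects Display.sync below
    }

    void setVSync(boolean enabled) {
        vsync = enabled;
        Display.setVSyncEnabled(enabled); // hardware vsync toggled directly
    }

    void endOfFrame() {
        // CPU-side cap applies regardless of the vsync setting.
        if (targetFrameRate > 0) Display.sync(targetFrameRate);
    }
}
```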
Currently, LibGDX’s unusual implementation could lead to bugs where Display.setVSyncEnabled is never reset.