Is this a design flaw in the NIO classes? I can’t find any API that guarantees I’ll be informed when a SocketChannel gets disconnected… the best I can do is discard the whole point of NBIO and periodically read from every SC, ignoring the Selector!?!
There’s no “OP_DISCONNECT” to listen for, and it seems that most of my SCs just drop silently, without ever triggering an OP_READ event (which is what I would have expected on a peer close).
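For reference, here’s a minimal, self-contained sketch of what I *expected* to work: when the peer closes, the Selector should report the channel readable and `read()` should return -1. (The class name and the loopback setup are just for the demo; the real app obviously accepts remote connections.)

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;

public class DisconnectDemo {
    public static void main(String[] args) throws IOException {
        // Listen on an ephemeral loopback port.
        ServerSocketChannel server = ServerSocketChannel.open();
        server.bind(new InetSocketAddress("127.0.0.1", 0));

        // A stand-in "remote" client that connects and immediately closes.
        SocketChannel client = SocketChannel.open(server.getLocalAddress());

        // Accept the connection and register it for OP_READ.
        SocketChannel accepted = server.accept();
        accepted.configureBlocking(false);
        Selector selector = Selector.open();
        accepted.register(selector, SelectionKey.OP_READ);

        client.close(); // peer disconnects

        // The close should make the channel selectable as readable,
        // and read() should then return -1 (end-of-stream).
        selector.select(5000);
        for (SelectionKey key : selector.selectedKeys()) {
            if (key.isReadable()) {
                ByteBuffer buf = ByteBuffer.allocate(64);
                int n = ((SocketChannel) key.channel()).read(buf);
                if (n == -1) {
                    System.out.println("disconnect detected");
                    key.channel().close();
                    key.cancel();
                }
            }
        }
        selector.close();
        server.close();
    }
}
```

That’s the textbook pattern: a closed peer shows up as an OP_READ event whose `read()` returns -1. The problem is that in my app this event often simply never fires.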
I thought this was a bug in 1.4.0/1.4.1 that had been fixed by now, but I’m still seeing it with 1.4.2.
The funny thing is that you won’t realise it’s happening, and it’s never been a problem for me before (because I didn’t care). But now I have an app with OTHER code that depends on being notified when an SC disappears. I store info for each SC as it is accepted, which I then need to supplement with data like “total time connected” - which I can’t find out unless I get a close/disconnect notification.