The Artima Developer Community

Legacy Java Answers Forum
June 2000



This page contains an archived post to the Java Answers Forum made prior to February 25, 2002. If you wish to participate in discussions, please visit the new Artima Forums.



Posted by Sri on June 21, 2000 at 10:50 PM

> I've created a nice client/server application working with sockets. So far, so good. The only problem is that the server must detect when a client disconnects. The client can disconnect programmatically (e.g. by calling a method such as client.disconnect()), or something can happen that disconnects the client (the client process is killed). In both cases, the server must be informed so it can deallocate the resources meant for that client. My idea was to connect the client to a second server port and let the server read from the client's InputStream. The read method blocks until the client disconnects (because the client doesn't write anything), and then I can deallocate all the resources on the server. But this blocking read is quite unhealthy, because the thread where the read occurs uses the processor in a pretty selfish way. I have no solution; maybe you can help me.
> Thanks.

You can trap the SocketException that is thrown when a socket
connection is lost and perform the necessary cleanup in the
catch block. Note that an orderly close by the client does not
throw; in that case read() returns -1, so check for end-of-stream
as well to cover both kinds of disconnect.
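A minimal sketch of the idea (the class name, port handling, and the SO_LINGER trick used to simulate an abruptly killed client are my own assumptions, not from the original post): the server blocks in read() without spinning the CPU, treats -1 as an orderly disconnect, and catches SocketException for an abrupt one, freeing per-client resources in either case.

```java
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketException;

public class DisconnectDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical demo server on an ephemeral port.
        ServerSocket server = new ServerSocket(0);

        // Simulated client: SO_LINGER with timeout 0 forces an abortive
        // close (TCP RST), approximating a killed client process.
        Thread client = new Thread(() -> {
            try {
                Socket s = new Socket("localhost", server.getLocalPort());
                s.setSoLinger(true, 0);
                Thread.sleep(200);   // give the server time to block in read()
                s.close();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        });
        client.start();

        Socket conn = server.accept();
        try (InputStream in = conn.getInputStream()) {
            int b = in.read();       // blocks without consuming CPU
            if (b == -1) {
                // Orderly shutdown: client called close()/disconnect().
                System.out.println("client disconnected");
            }
        } catch (SocketException e) {
            // Abrupt loss: killed process, network failure, abortive close.
            System.out.println("client disconnected");
        } finally {
            conn.close();            // deallocate per-client resources here
            server.close();
        }
        client.join();
    }
}
```

Either path, EOF or exception, lands in code that can release the client's resources, so no second monitoring port or busy-waiting thread is needed.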




Copyright © 1996-2009 Artima, Inc. All Rights Reserved. - Privacy Policy - Terms of Use - Advertise with Us