How can I detect that a client has disconnected from my server?
I have the following code in my AcceptCallBack method:
static Socket hand
This is in VB, but it seems to work well for me. It looks for a 0-byte return like the previous post.
Private Sub RecData(ByVal AR As IAsyncResult)
    Dim Client As Socket = CType(AR.AsyncState, Socket)

    ' A dead socket shows up as not connected with nothing left to read.
    If Client.Connected = False And Client.Available = 0 Then
        Debug.Print("Detected Disconnected Socket - " + Client.RemoteEndPoint.ToString)
        Exit Sub
    End If

    Dim BytesRead As Int32 = Client.EndReceive(AR)

    ' A graceful shutdown by the client is reported as a 0-byte read.
    If BytesRead = 0 Then
        Debug.Print("Detected Disconnected Socket - Bytes Read = 0 - " + Client.RemoteEndPoint.ToString)
        UpdateText("Client " + Client.RemoteEndPoint.ToString + " has disconnected from Server.")
        Client.Close()
        Exit Sub
    End If

    ' ByteData is the class-level receive buffer that BeginReceive filled.
    Dim msg As String = System.Text.Encoding.ASCII.GetString(ByteData, 0, BytesRead)
    Erase ByteData
    ReDim ByteData(1024)

    ' Queue the next read on the socket that raised this callback.
    Client.BeginReceive(ByteData, 0, ByteData.Length, SocketFlags.None, New AsyncCallback(AddressOf RecData), Client)
    UpdateText(msg)
End Sub
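For context on where a callback like this gets wired up: the question's AcceptCallBack method is the usual place to queue the first BeginReceive, which is what makes the receive callback fire at all. Below is a minimal sketch of that wiring in C# (to match the question's language); the listener socket, the ClientState class, the 1024-byte buffer, and the RecvCallback name are all illustrative choices of mine, not from the answer above.

using System;
using System.Net.Sockets;
using System.Text;

class AsyncServer
{
    // Per-connection state: the accepted socket plus its own receive buffer.
    private class ClientState
    {
        public Socket Socket;
        public byte[] Buffer = new byte[1024];
    }

    private static void AcceptCallback(IAsyncResult ar)
    {
        Socket listener = (Socket)ar.AsyncState;
        Socket client = listener.EndAccept(ar);

        // Queue the first read; RecvCallback re-arms itself after each message.
        var state = new ClientState { Socket = client };
        client.BeginReceive(state.Buffer, 0, state.Buffer.Length, SocketFlags.None,
                            RecvCallback, state);

        // Keep accepting further clients on the listening socket.
        listener.BeginAccept(AcceptCallback, listener);
    }

    private static void RecvCallback(IAsyncResult ar)
    {
        var state = (ClientState)ar.AsyncState;
        int bytesRead = state.Socket.EndReceive(ar);

        if (bytesRead == 0)
        {
            // A graceful close by the client shows up as a 0-byte read.
            Console.WriteLine("Client disconnected: " + state.Socket.RemoteEndPoint);
            state.Socket.Close();
            return;
        }

        Console.WriteLine(Encoding.ASCII.GetString(state.Buffer, 0, bytesRead));

        // Queue the next read on the same client socket.
        state.Socket.BeginReceive(state.Buffer, 0, state.Buffer.Length, SocketFlags.None,
                                  RecvCallback, state);
    }
}

Note that a hard reset from the client (rather than a graceful close) typically surfaces as a SocketException thrown by EndReceive, so production code would wrap that call in a try/catch as well.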
Here is another workaround for that, which I've found quite useful!
If you use asynchronous methods for reading data from the network socket (I mean the BeginReceive / EndReceive methods), then whenever a connection is terminated one of two situations appears: either a message arrives with no data (you can see it with Socket.Available; even though BeginReceive is triggered, its value will be zero), or the Socket.Connected value becomes false inside that callback (don't try to call EndReceive then).
I'm posting the function I used; I think it shows what I mean better than I can explain it:
private void OnReceive(IAsyncResult parameter)
{
    Socket sock = (Socket)parameter.AsyncState;

    if (!sock.Connected || sock.Available == 0)
    {
        // Connection is terminated, either by force or willingly.
        return;
    }

    sock.EndReceive(parameter);

    // To handle further commands sent by the client.
    // The "..." zones might change in your code.
    sock.BeginReceive(..., ..., ..., ..., new AsyncCallback(OnReceive), sock);
}
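For completeness, the "..." zones are the usual BeginReceive arguments (buffer, offset, size, flags). One way they might be filled in, assuming a class-level byte[] buffer field (a name I've made up for this sketch, not part of the answer):

private readonly byte[] buffer = new byte[1024];

private void StartReceive(Socket sock)
{
    // Queue the first asynchronous read; OnReceive re-arms itself the same way.
    sock.BeginReceive(buffer, 0, buffer.Length, SocketFlags.None,
                      new AsyncCallback(OnReceive), sock);
}

If the server handles several clients at once, each socket would need its own buffer rather than one shared field.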