Question:
My datagram socket is not throwing a SocketTimeoutException despite the timeout being set, and I'm not sure how to resolve this. The code is as follows. If the socket doesn't receive any messages, it times out on the first pass through the loop. However, once it has successfully received a message a couple of times, it won't time out later when .receive() is called.
DatagramSocket serverSocket = new DatagramSocket(serverSyncPort);
serverSocket.setSoTimeout(200);
while (true)
{
    receiveData = new byte[1024];
    receivePacket = new DatagramPacket(receiveData, receiveData.length);
    try
    {
        serverSocket.receive(receivePacket);
    }
    catch (SocketTimeoutException e) {}
}
Answer 1:
From the javadocs:
If the timeout expires, a java.net.SocketTimeoutException is raised, though the DatagramSocket is still valid. The option must be enabled prior to entering the blocking operation to have effect. The timeout must be > 0. A timeout of zero is interpreted as an infinite timeout.
Check whether the timeout is actually enabled, as highlighted in the quote above.
Also, check the value returned by getSoTimeout() afterwards to verify what it really is.
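As a minimal, self-contained sketch of that check (using an ephemeral port via new DatagramSocket(0) instead of the asker's serverSyncPort, which is not shown in the question), the following sets the timeout, verifies it with getSoTimeout(), and confirms that receive() times out on every iteration when no sender is present:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.SocketTimeoutException;

public class TimeoutCheck {
    public static void main(String[] args) throws Exception {
        // Port 0 binds an ephemeral port; the question's serverSyncPort is unknown
        try (DatagramSocket serverSocket = new DatagramSocket(0)) {
            serverSocket.setSoTimeout(200);
            // Verify the option actually took effect, as the answer suggests
            System.out.println("soTimeout = " + serverSocket.getSoTimeout());

            byte[] receiveData = new byte[1024];
            for (int i = 0; i < 3; i++) {
                DatagramPacket packet =
                        new DatagramPacket(receiveData, receiveData.length);
                try {
                    serverSocket.receive(packet);
                    System.out.println("received " + packet.getLength() + " bytes");
                } catch (SocketTimeoutException e) {
                    // With no sender, each receive() should give up after ~200 ms
                    System.out.println("timeout on iteration " + i);
                }
            }
        }
    }
}
```

If getSoTimeout() prints 200 and every iteration logs a timeout, the socket option is working; the original problem would then lie elsewhere (for example, a sender actually delivering packets within the window).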
Answer 2:
A call to receive() on this DatagramSocket will block only for the amount of time set via setSoTimeout(). If the timeout expires, a java.net.SocketTimeoutException is raised, though the DatagramSocket is still valid. Here the timeout is set to 200 milliseconds, so it should work.
Which environment are you running this on? It's possible this is the expected behaviour on Windows, as the SocketTimeoutException comes through fine on Solaris/Linux environments.
Source: https://stackoverflow.com/questions/13832260/datagramsocket-not-throwing-sockettimeout-java