I have a real-time application with clients using websockets to connect to a Spring Framework server, which is running Spring Boot with embedded Tomcat. I want the server to quickly detect when a client has stopped responding, for example because of a network disconnect, and close the websocket session.
The approach I eventually took was to implement an application-layer ping-pong protocol.
- Every `p` seconds, the server sends a ping message to the client.
- The client replies to each ping with a pong message.
- If the server sends `n` successive ping messages without receiving a pong response, it generates a timeout event, so a dead connection is detected after at most `n*p` seconds.
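A minimal sketch of how the server side of this scheme might look, using Spring's `TextWebSocketHandler`. The `"ping"`/`"pong"` payloads, the `PING_INTERVAL_SECONDS` and `MAX_MISSED_PINGS` constants, and the close status are illustrative assumptions, not the exact code from my application:

```java
import org.springframework.web.socket.*;
import org.springframework.web.socket.handler.TextWebSocketHandler;

import java.io.IOException;
import java.util.Map;
import java.util.concurrent.*;

public class PingPongHandler extends TextWebSocketHandler {

    private static final long PING_INTERVAL_SECONDS = 5; // p
    private static final int MAX_MISSED_PINGS = 3;       // n

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    // Missed-ping counter per session id.
    private final Map<String, Integer> missedPings = new ConcurrentHashMap<>();

    @Override
    public void afterConnectionEstablished(WebSocketSession session) {
        missedPings.put(session.getId(), 0);
        // Every p seconds, send a ping; after n unanswered pings, close the session.
        // A real implementation would keep the ScheduledFuture and cancel it
        // in afterConnectionClosed.
        scheduler.scheduleAtFixedRate(() -> {
            try {
                int missed = missedPings.merge(session.getId(), 1, Integer::sum);
                if (missed > MAX_MISSED_PINGS) {
                    session.close(CloseStatus.SESSION_NOT_RELIABLE); // timeout event
                } else if (session.isOpen()) {
                    session.sendMessage(new TextMessage("ping"));
                }
            } catch (IOException e) {
                // The send or close failed: the connection is already broken.
            }
        }, PING_INTERVAL_SECONDS, PING_INTERVAL_SECONDS, TimeUnit.SECONDS);
    }

    @Override
    protected void handleTextMessage(WebSocketSession session, TextMessage message) {
        // A pong from the client resets its missed-ping counter.
        if ("pong".equals(message.getPayload())) {
            missedPings.put(session.getId(), 0);
        }
    }
}
```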
This works, but it adds protocol logic to both the client and the server and costs bandwidth and processing time. There should be a much simpler way of implementing this using timeouts in the underlying TCP connection.