Abstract:
When a TCP connection between a client behind a NAT and a server is idle for a long
time, it may be torn down when the NAT binding times out. To keep the connection alive,
the client device must send keep-alive packets over the connection while it is otherwise
idle. To reduce resource consumption, keep-alive packets should be sent as late as
possible within the NAT binding timeout; this longest safe interval is called the optimal
keep-alive (KA) interval. Because network equipment is configured differently, the optimal
KA interval is not identical across networks and therefore must be detected dynamically.
In this thesis, we employ several search approaches to dynamically detect the optimal
KA interval: binary search, exponential search, and hybrid search, which combines aspects
of the binary and exponential techniques. We present a theoretical analysis of these
techniques and compare them through simulation-based experiments. Based on the theoretical
studies and the experimental results, we
conclude that hybrid search should be used in detecting the optimal keep-alive interval of a TCP
connection.
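As a rough illustration only, and not the thesis's exact algorithm, the hybrid search mentioned above can be sketched as follows. The sketch assumes a hypothetical `probe(interval)` primitive that reports whether the NAT binding survives `interval` seconds of idleness; in practice each probe costs one idle period plus a liveness check:

```python
def find_optimal_ka_interval(probe, low=1, tolerance=1):
    """Hybrid search sketch for the optimal keep-alive interval.

    Phase 1 (exponential): double the candidate interval until a probe
    fails, bracketing the unknown NAT binding timeout.
    Phase 2 (binary): narrow the bracket down to `tolerance` seconds.
    `probe(t)` must return True iff the binding survives t idle seconds.
    """
    # Exponential phase: grow the interval until the binding times out.
    interval = low
    while probe(interval):
        low = interval        # largest interval known to be safe so far
        interval *= 2
    high = interval           # first interval at which the binding expired

    # Binary phase: shrink [low, high) to within the tolerance.
    while high - low > tolerance:
        mid = (low + high) // 2
        if probe(mid):
            low = mid         # binding survived; optimum is >= mid
        else:
            high = mid        # binding expired; optimum is < mid
    return low                # largest tested interval that stayed alive
```

With a simulated NAT whose binding timeout is 300 s (`probe = lambda t: t <= 300`), the exponential phase brackets the timeout between 256 s and 512 s, and the binary phase then converges to 300 s.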