Best way to measure latency between a client and a server

The reason I ask is that the way I have been doing it is to measure how long it takes for the initial message to reach the server, once.
But what if the signal gets better or worse over time?

What is the best way to measure changing latency?
Should I take the 3 latest packets, see how long each one took to get here, and average them?
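
For reference, a minimal sketch of that "average the latest samples" idea might look like the following, assuming you already measure each round trip yourself and just need to smooth the results (the class and method names are illustrative, not from any particular library):

```java
import java.util.ArrayDeque;

// Rolling average over the last N round-trip times (illustrative helper).
public class RttTracker {
    private final ArrayDeque<Long> samples = new ArrayDeque<>();
    private final int maxSamples;

    public RttTracker(int maxSamples) {
        this.maxSamples = maxSamples;
    }

    // Call this each time a ping reply comes back.
    public void addSample(long rttMillis) {
        samples.addLast(rttMillis);
        if (samples.size() > maxSamples) {
            samples.removeFirst(); // drop the oldest sample
        }
    }

    // Average RTT over the kept samples, in milliseconds.
    public double averageRtt() {
        if (samples.isEmpty()) {
            return 0.0;
        }
        long sum = 0;
        for (long s : samples) {
            sum += s;
        }
        return (double) sum / samples.size();
    }
}
```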

I would really appreciate some help

Thank you for your help

You could use an external program like Wireshark or Fiddler to monitor network traffic. This is what you can see with Fiddler:

ACTUAL PERFORMANCE

ClientConnected: 16:11:10.534
ClientBeginRequest: 16:11:10.700
GotRequestHeaders: 16:11:10.700
ClientDoneRequest: 16:11:10.958
Determine Gateway: 0ms
DNS Lookup: 0ms
TCP/IP Connect: 0ms
HTTPS Handshake: 0ms
ServerConnected: 16:11:10.567
FiddlerBeginRequest: 16:11:10.958
ServerGotRequest: 16:11:10.958
ServerBeginResponse: 16:11:11.004
GotResponseHeaders: 16:11:11.255
ServerDoneResponse: 16:11:11.256
ClientBeginResponse: 16:11:11.255
ClientDoneResponse: 16:11:11.256

Don’t know if it’s the best way, but for Daedalus I measure it every 4 seconds and update the current ping by doing:

ping = ping*0.3f + newPing*0.7f;

The 0.3f and 0.7f weights depend on how reactive you want to be to a ping change :slight_smile:
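
In context, that exponential-smoothing update might look roughly like this (a sketch only; the surrounding timer and ping-measurement plumbing are assumptions, not code from Daedalus):

```java
// Exponentially smoothed ping, updated every few seconds with a fresh RTT sample.
public class SmoothedPing {
    private float ping = 0f;          // current smoothed ping in ms
    private boolean firstSample = true;

    // newPing is the latest measured RTT in milliseconds.
    public void update(float newPing) {
        if (firstSample) {
            ping = newPing;           // seed with the first measurement
            firstSample = false;
        } else {
            // Weights from the post above: the heavier weight on newPing
            // makes the value react faster to a change in latency.
            ping = ping * 0.3f + newPing * 0.7f;
        }
    }

    public float getPing() {
        return ping;
    }
}
```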

Thank you.
This was what I needed.

And I need to measure it in the code, Troncoso.

When my game client connects to the server, I exchange 5-10 UDP sockets in less than a second to sync both machines. The client then runs with the same clock as the server. After that you could still estimate the RTT during the game, but it is less necessary since the server sends packets with the real timestamp of when the event happened (moving objects, damage, etc.). The client then knows whether it is in the past or not :slight_smile:
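
One common way to do that kind of sync (not necessarily what the post above does) is an NTP-style estimate: for each of the initial exchanges, record the client send time, the server timestamp echoed in the reply, and the client receive time, then keep the offset from the exchange with the smallest RTT. A rough sketch:

```java
// NTP-style clock offset estimate from a handful of initial exchanges (sketch).
public class ClockSync {
    private long bestRtt = Long.MAX_VALUE;
    private long clockOffset = 0;     // estimated serverTime - clientTime, in ms

    // clientSend: client time when the request left,
    // serverTime: server timestamp echoed back in the reply,
    // clientRecv: client time when the reply arrived.
    public void addExchange(long clientSend, long serverTime, long clientRecv) {
        long rtt = clientRecv - clientSend;
        if (rtt < bestRtt) {
            bestRtt = rtt;
            // Assume the one-way trip took half the RTT.
            clockOffset = serverTime - (clientSend + rtt / 2);
        }
    }

    // Convert a server timestamp into the client's clock.
    public long toClientTime(long serverTimestamp) {
        return serverTimestamp - clockOffset;
    }

    public long getBestRtt() {
        return bestRtt;
    }
}
```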

Why would you ever open multiple sockets on one client?