“Latency” is the amount of time that passes between a user sending a request and that user receiving the response. In other words, latency describes the delay on a network or internet connection.
With “Low Latency”, requested resources arrive almost instantly, with no noticeable lag after the request is made. With high latency, the experience is different: there is a significant delay between the moment the request is sent and the moment the resources are returned.
https://www.mirrorfly.com/blog/what-is-low-latency/
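To make the idea concrete, here is a minimal sketch (in Python, using only the standard library) of how round-trip latency for a single HTTP request could be measured. The URL in the example is just a placeholder, and a real measurement would normally average many requests rather than relying on one sample:

```python
import time
import urllib.request

def measure_latency(url: str) -> float:
    """Return the elapsed time, in milliseconds, between sending a request
    and receiving the complete response."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()  # wait until the full response body has arrived
    end = time.perf_counter()
    return (end - start) * 1000.0

if __name__ == "__main__":
    # Placeholder endpoint; substitute any server you want to test.
    latency_ms = measure_latency("https://www.example.com/")
    print(f"Round-trip latency: {latency_ms:.1f} ms")
```

A low-latency connection would report only a few tens of milliseconds here, while a high-latency one could take hundreds of milliseconds or more for the same request.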