What Is Latency?

Date First Published: 25th February 2022

Topic: Computer Networking

Subtopic: Data Transmission Technologies

Article Type: Computer Terms & Definitions

Difficulty: Medium

Difficulty Level: 5/10

Learn more about what latency is in this article.

Latency is the amount of time it takes for data to travel from its source to its destination; it is a measure of delay. For example, if Server A starts sending data to Server B at 06:00:00.000 and Server B receives it at 06:00:00.050, the latency is the difference between the two timestamps: 00:00:00.050, or 50 milliseconds.
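The worked example above can be reproduced in a few lines of Python. This is just an illustration of the arithmetic, using the same hypothetical timestamps (the date is arbitrary):

```python
from datetime import datetime

# Hypothetical send and receive timestamps from the example above
sent = datetime.fromisoformat("2022-02-25T06:00:00.000")
received = datetime.fromisoformat("2022-02-25T06:00:00.050")

# Latency is simply the difference between the two timestamps
latency_ms = (received - sent).total_seconds() * 1000
print(f"Latency: {latency_ms:.0f} ms")  # Latency: 50 ms
```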

Latency is useful for measuring how quickly a webpage or application will load and how fast data can be transferred from one point to another. High latency results in slow load times, has a negative impact on SEO, and causes users to give up waiting for the website or application to load.

Note:

Latency can never be completely eliminated. Every time a network packet travels over a network, there is always some latency, even if only a few milliseconds, but it is important to minimise it.

The ping response time is indicative of latency, since it measures the minimum time needed to send the smallest possible amount of data and receive a response. Ideally, the ping response time should be low; a higher ping response time means it will take longer for data to reach its destination.
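A rough version of this measurement can be sketched in Python. A true ICMP ping requires raw sockets (and usually elevated privileges), so this sketch instead times a TCP handshake, which is dominated by the same round-trip network latency. The host and port here are only placeholders:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Approximate the round-trip time to a host, in milliseconds,
    by timing how long a TCP connection takes to be established.

    This is not a real ICMP ping, but the handshake time is a
    reasonable stand-in for network latency.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the handshake completing is all we need to time
    return (time.perf_counter() - start) * 1000

# Example usage (hypothetical host):
# print(f"{tcp_rtt_ms('example.com'):.1f} ms")
```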

What Affects Latency?

The factors that affect latency are:

  • The distance between the two devices. This is the most obvious factor. A client making a request to a server on a different continent will experience much higher latency than if the server were nearby, because the requests have to travel further.
  • The internet connection and the router. A slow router can delay data transfer from connected devices. In some cases, high latency originates on the user's side rather than the server's side.
  • Congestion and excessive traffic. If too many people use a website or network at the same time, the servers slow down and latency increases greatly.
  • The ISP. ISPs can sometimes intentionally slow down traffic or bandwidth once a certain amount of traffic has been used. This is known as throttling.
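The effect of distance alone can be estimated from the speed of signals in optical fibre, roughly 200,000 km/s (about two-thirds of the speed of light in a vacuum). The distances below are illustrative, not measurements:

```python
# Light in optical fibre travels at roughly 200,000 km/s,
# which puts a hard lower bound on one-way latency.
FIBRE_SPEED_KM_PER_S = 200_000

def one_way_delay_ms(distance_km: float) -> float:
    """Minimum one-way propagation delay over fibre, in milliseconds."""
    return distance_km / FIBRE_SPEED_KM_PER_S * 1000

print(one_way_delay_ms(100))     # nearby server: 0.5 ms
print(one_way_delay_ms(10_000))  # intercontinental: 50.0 ms
```

Real latency is higher still, since packets also pass through routers and switches along the way, but this shows why a distant server can never match a nearby one.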

Disk Latency

Outside of computer networking, there is another type of latency: disk latency, the amount of time between data being requested from a storage device and that data being returned. It is affected by rotational latency and seek time.

Rotational latency is the time it takes for the sector of the disk that data is to be read from or written to to rotate under the drive's read-write heads.

Seek time is the time taken for the physical movement of the drive head into position to read or write data.

SSDs (solid-state drives) have no spinning platters, so they avoid rotational latency entirely and have much lower latency than traditional HDDs (hard disk drives).

Disk latency is also the reason why reading or writing large numbers of small files is often much slower than reading or writing a single nearby file: each access can incur its own rotational and seek delays.
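Average rotational latency can be worked out directly from a drive's rotation speed: on average, the wanted sector is half a revolution away from the heads. The RPM figures below are common drive speeds used for illustration:

```python
def avg_rotational_latency_ms(rpm: int) -> float:
    """Average rotational latency in milliseconds.

    One full revolution takes 60 / rpm seconds; on average the
    target sector is half a revolution away.
    """
    return (60 / rpm / 2) * 1000

print(f"{avg_rotational_latency_ms(7200):.2f} ms")   # 7,200 RPM desktop drive
print(f"{avg_rotational_latency_ms(15000):.2f} ms")  # 15,000 RPM server drive
```

A 7,200 RPM drive averages about 4.17 ms of rotational latency per access, which adds up quickly across thousands of small reads.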

