Date First Published: 25th February 2022
Topic: Computer Networking
Subtopic: Data Transmission Technologies
Article Type: Computer Terms & Definitions
Difficulty: Medium
Difficulty Level: 5/10
Learn more about what latency is in this article.
Latency is the amount of time it takes for data to travel from its source to its destination; it is a measure of delay. For example, if Server A started to send data to Server B at 06:00:00.000 and Server B received the data at 06:00:00.050, the latency would be the difference between these two times, which is 00:00:00.050 (50 milliseconds).
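As a concrete illustration of this definition, here is a minimal Python sketch that computes the latency from the hypothetical timestamps in the example above (the date used is arbitrary):

```python
from datetime import datetime

# Timestamps from the example above: Server A sends at 06:00:00.000
# and Server B receives at 06:00:00.050 (both times are hypothetical).
sent_at = datetime.fromisoformat("2022-02-25 06:00:00.000")
received_at = datetime.fromisoformat("2022-02-25 06:00:00.050")

# Latency is simply the difference between the two timestamps.
latency = received_at - sent_at
print(f"Latency: {latency.total_seconds() * 1000:.0f} ms")  # -> Latency: 50 ms
```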
Latency is a useful measure of how quickly a webpage or application will load and how quickly data can be transferred from one point to another. High latency results in slow speeds, has a negative impact on SEO, and causes users to lose patience while waiting for the website or application to load.
Latency can never be completely eliminated. Every time a network packet travels over a network, there is always some latency, even if it is only a few milliseconds, but it is important for it to be minimised.
The ping response time is indicative of latency, since it measures the time needed to send a small packet of data to a destination and receive a response. Ideally, the ping response time should be low, since a higher ping response time means that data takes longer to travel to its destination and back.
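As a rough illustration, the sketch below estimates a round-trip time by timing a TCP handshake rather than an ICMP echo request (the real ping tool uses ICMP, which usually requires elevated privileges); the host example.com and port 443 are placeholders:

```python
import socket
import time

def estimate_rtt(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Roughly estimate round-trip latency by timing a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # Completing the handshake is enough; no data is sent.
    return (time.perf_counter() - start) * 1000  # milliseconds

if __name__ == "__main__":
    print(f"Approximate RTT to example.com: {estimate_rtt('example.com'):.1f} ms")
```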
The factors that affect latency include:

- The physical distance between the sender and the receiver.
- The transmission medium, such as fibre-optic cable, copper cable, or a wireless connection.
- The number of routers and other devices the data passes through on its way to the destination.
- The amount of congestion on the network.
Disk latency is another type of latency, found outside of computer networking. It is the amount of time it takes for data to be requested from a storage device and returned. Disk latency is affected by the rotational latency and the seek time.
The rotational latency is the amount of time it takes for the sector of the disk that data is to be read from or written to to rotate under the read/write heads of the disk drive.
The seek time is the amount of time it takes for the drive head to physically move to the position where data is to be read or written.
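To illustrate how these two delays add up, the following sketch estimates the average rotational latency from a drive's rotational speed and adds an assumed seek time; the 7,200 RPM and 9 ms figures are purely illustrative:

```python
def average_rotational_latency_ms(rpm: int) -> float:
    """Average rotational latency: on average the platter must make
    half a revolution before the target sector passes under the head."""
    seconds_per_revolution = 60 / rpm
    return (seconds_per_revolution / 2) * 1000

# Hypothetical 7,200 RPM drive with a 9 ms average seek time.
rotational = average_rotational_latency_ms(7200)  # ~4.17 ms
seek_time = 9.0
print(f"Average rotational latency: {rotational:.2f} ms")
print(f"Approximate access latency: {rotational + seek_time:.2f} ms")
```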
SSDs (solid-state drives) have much lower latency than traditional HDDs (hard disk drives) because they have no spinning platters or moving heads.
Disk latency is the reason why reading or writing a large number of small, scattered files is often much slower than reading or writing a single contiguous file of the same total size.
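The sketch below illustrates this by timing the reading of one 1 MB file against 1,000 files of 1 KB each; because the operating system caches files that were just written, the result mostly reflects per-file overhead rather than raw disk latency, so treat it as illustrative:

```python
import os
import tempfile
import time

def time_reads(paths):
    """Return the total time, in seconds, spent opening and reading each file."""
    start = time.perf_counter()
    for path in paths:
        with open(path, "rb") as f:
            f.read()
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as tmp:
    # One 1 MB file versus 1,000 files of 1 KB each (1 MB of data in total).
    big = os.path.join(tmp, "big.bin")
    with open(big, "wb") as f:
        f.write(os.urandom(1024 * 1024))

    small = []
    for i in range(1000):
        path = os.path.join(tmp, f"small_{i}.bin")
        with open(path, "wb") as f:
            f.write(os.urandom(1024))
        small.append(path)

    print(f"One large file:    {time_reads([big]) * 1000:.1f} ms")
    print(f"1,000 small files: {time_reads(small) * 1000:.1f} ms")
```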