Slow internet is a bummer. It’s the quickest way to ruin your mood. Most of the time, we start reconfiguring our devices in search of a fix, without realizing that slow internet is usually a symptom of poor network performance.
Network performance isn’t only about how fast your internet speed is. There are multiple factors that play a key role in determining the internet speed delivered to your devices. Latency, bandwidth, and throughput are terms you might have come across while trying to find out how to optimize your connection for gaming. Each of these attributes allows us to monitor and evaluate network performance.
So, how do you actually measure network performance? How do you interpret the metrics which determine the speed of data transfer across a network? What’s the difference between them?
Well, to put it simply: although bandwidth, latency, and throughput are often mistaken for synonyms, they are quite different once you understand them better.
Bandwidth is essentially the maximum amount of data that can theoretically be transmitted at any given time. As such, it measures how much data can be sent and received at once. A connection with higher bandwidth, for instance fiber-optic, allows you to send and receive more data at any given point in time, whereas a copper-based DSL connection comes with limited bandwidth by default.
It is common to mistake bandwidth for speed, but while it can affect the rate of data transfer, bandwidth is linked with the capacity of the network and is not a measure of its speed. It simply tells us how “wide” the communication “band” is. A high-capacity communication link, quite like a water hose with a big diameter, will allow more data to flow per second, as opposed to a hose that is smaller in diameter.
Imagine a highway with a fixed speed limit: if all vehicles on the road must travel at the same speed, the only way to get more cars through is to make the highway wider. Likewise, if you download a 10-megabit file over a connection with 1 Mbps of bandwidth, it takes 10 seconds. Increase the bandwidth to 10 Mbps, and the same file downloads in 1 second. (Mind the units: a 10-megabyte file is 80 megabits, so at 1 Mbps it would actually take 80 seconds.) The speed at which each bit travels remains the same; only the “width” of the communication “band” increases, making it possible for more data to be carried through in one second.
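The arithmetic behind that example is simple division, as long as the units match: bandwidth is quoted in bits per second, while file sizes are usually quoted in bytes. A quick sketch in Python:

```python
def download_time_seconds(file_size_megabits, bandwidth_mbps):
    """Transfer time for a file, assuming the link is fully utilized."""
    return file_size_megabits / bandwidth_mbps

# A 10-megabit file over a 1 Mbps link:
print(download_time_seconds(10, 1))       # 10.0 seconds
# The same file over a 10 Mbps link:
print(download_time_seconds(10, 10))      # 1.0 second
# Careful with units: a 10-megabyte (MB) file is 80 megabits,
# so at 1 Mbps it actually takes 80 seconds.
print(download_time_seconds(10 * 8, 1))   # 80.0 seconds
```

Note that this is an idealized upper bound: it assumes the link delivers its full rated bandwidth, which, as the throughput section below explains, is rarely the case.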
So, higher bandwidth does not translate into faster network speed; you only perceive it that way. Increased bandwidth allows more data to be transferred at a time, but it does not increase the speed at which the network moves each bit.
Bandwidth can be symmetrical or asymmetrical depending on the type of connection you have. Fiber-optic networks deliver symmetrical bandwidth, while coaxial and copper networks typically do not. What that means is that with a fiber internet connection you can transfer the same amount of data downstream and upstream at a time, but with a cable or DSL connection you can transfer more data while downloading than while uploading.
Simply put, latency is the measure of delay, which accounts for the round trip of data traveling from A to B and back to A. The longer it takes for information to come back to your device, the higher the latency, and the slower the network feels. This is why satellite internet is the most infamous for high latency: the digital signal must travel all the way to an orbiting satellite in space and back, unlike when you are connected to a wired terrestrial network.
Latency is measured in milliseconds, because the process of transmitting and retrieving data is awfully quick. The various internet types have different default latency levels owing to variations in network infrastructure: satellite internet tops them all at an estimated 800 ms, while fiber-optic broadband boasts the lowest at an estimated 12 ms. This is why satellite internet is not considered the right option for serious gamers who cannot afford to lose to lag, and why hard-core gaming is best done on a fiber-optic connection, which keeps ping spikes to a minimum. “Ping” is simply another term used to refer to network latency.
So, you could be on a high-speed internet connection with ample bandwidth, but if latency is high, you are bound to face delays. Even if the file itself takes only 5 ms to download, with latency at 100 ms you will wait that long before the download even begins.
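You can get a feel for your own latency by timing a round trip yourself. The sketch below approximates it by timing a TCP handshake with Python's standard library; this is not a true ICMP ping like the `ping` command uses, and the host in the usage comment is just a placeholder:

```python
import socket
import time

def rough_latency_ms(host, port=443, timeout=2.0):
    """Approximate round-trip latency by timing a TCP handshake.
    Not a true ICMP ping, but it illustrates the idea: the farther
    away the host, the longer the handshake takes."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # we only care how long the connection setup took
    return (time.perf_counter() - start) * 1000

# Example (needs network access; the host is a placeholder):
# print(f"{rough_latency_ms('example.com'):.1f} ms")
```

Run against a nearby server and a far-away one, and the difference in the numbers is the latency gap described above.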
Throughput is where many users get confused. Like bandwidth, it describes the amount of data that can be transmitted at a time. The difference is that throughput is a real-time measurement, whereas bandwidth tells us the theoretical maximum.
Many factors affect the throughput of your connection. For certain types of traffic, packet loss translates into low throughput, because lost packets have to be retransmitted. For other types, it is high latency that makes throughput take a dip, because the previous packet must be acknowledged before the next one is dispatched. Network congestion is also critical to throughput: the more users on the network, the lower the throughput is bound to be.
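The acknowledgment effect can be quantified: a sender that must wait one round trip for each window of data to be acknowledged can never exceed window size divided by round-trip time. A minimal sketch of that bound, using the classic 64 KB TCP window as an illustrative figure (modern stacks scale windows much higher) and decimal units for simplicity:

```python
def window_limited_throughput_mbps(window_kb, rtt_ms):
    """Upper bound on throughput when the sender must wait one
    round trip for each window of data to be acknowledged:
    throughput <= window size / RTT."""
    window_megabits = window_kb * 8 / 1000    # KB -> megabits
    return window_megabits / (rtt_ms / 1000)  # megabits per second

# A 64 KB window over a 12 ms round trip (fiber-like latency):
print(window_limited_throughput_mbps(64, 12))   # ~42.7 Mbps
# The same window over an 800 ms round trip (satellite-like latency):
print(window_limited_throughput_mbps(64, 800))  # 0.64 Mbps
```

Same window, same bandwidth, wildly different throughput: this is the arithmetic reason latency caps how much data actually gets through.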
Throughput is measured in bits per second (or sometimes in packets per second) and reflects real-time network performance. Using throughput to measure network speed often helps root out the problems that cause sluggishness in a network.
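Because throughput is just data actually moved divided by time actually taken, measuring it needs nothing more than a clock. A minimal sketch, where `transfer_func` is a placeholder for whatever transfer you want to time:

```python
import time

def measured_throughput_mbps(transfer_func):
    """Time a transfer and report the real throughput in Mbps.
    transfer_func is a placeholder: it should perform the transfer
    and return the number of bytes it moved."""
    start = time.perf_counter()
    bytes_moved = transfer_func()
    elapsed = time.perf_counter() - start
    return (bytes_moved * 8) / (elapsed * 1_000_000)
```

Compare the number this reports against your plan's advertised bandwidth: the gap between the two is exactly the bandwidth-versus-throughput distinction described above.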
As we said earlier, the difference between bandwidth and throughput is that the former is theoretical while the latter is a real-time measure of how much data is transferred at a time. Bandwidth tells you how much data can be transmitted; throughput tells you how much data actually is. For this reason, throughput often proves a far better measure when it comes to evaluating network performance.
With that said, the influence bandwidth has on network performance is significant—high bandwidth connections allow more users to connect without deterioration in network performance—one reason why fiber optic connections can handle multi-device connectivity better than other internet types.
Many people confuse bandwidth and latency due to the link between data transmission and time. To understand the impact of both on network performance, it’s important to understand the difference between the two.
Again, as we said earlier, bandwidth relates to the network’s capacity to transmit data in a given time frame, while latency measures the time it takes for that data to get transmitted. Bandwidth tells you how much data can theoretically transfer at a time, whereas latency reflects how quickly data packets arrive. That is why latency is measured in milliseconds and bandwidth in Kbps, Mbps, or Gbps.
Let’s look at how bandwidth and latency each affect your online experience in practice. When it boils down to the importance of one over the other, it helps to define the purpose your network serves. For gaming, latency has the bigger impact, whereas for streaming video, bandwidth is more critical: gamers cannot afford a lag in response time, while video streaming demands downloading large amounts of data for an extended period of time.
Throughput, as we said earlier, is the actual amount of data that gets transferred at a time, so it is bound to be affected by a lagging response time. If it takes longer for data packets to complete the round trip from A to B and back to A, the amount of data transferred per second drops as well. In other words, latency slows down throughput, which in turn adversely affects network performance.
So, yes. Latency does affect throughput. In fact, it is one of the key factors you must take into account when trying to work on network performance.
How far the device is from the server it is communicating with impacts latency.
So, you may be on a high-speed, high-bandwidth connection in New York while trying to retrieve information from a server in Japan. And, your pal could be on a slower connection, with lower bandwidth, yet able to retrieve information quicker from a server in San Francisco. Only because of the physical distance between the user and the server they are trying to connect with.
Faulty or outdated hardware such as hubs, switches, and routers can contribute to increased latency. For this reason, it is necessary to keep your hardware up to date.
A few easy measures, such as preferring a wired terrestrial connection, keeping your networking hardware up to date, and connecting to geographically closer servers, can reduce network latency quite a bit.
Bandwidth, latency, and throughput are confusing terms, but they are not difficult to distinguish or understand once you know what you are looking for. Understanding these aspects, and keeping a check on them, can go a long way toward helping you optimize network performance.