Why Is Bandwidth Measured in Bits Instead of Bytes?

Technology is confounding enough without having to deal with ambiguous nomenclature. So why do we have both bits and bytes, and what do they measure? Does the distinction of terminology serve a practical purpose, or is techno-god just out to torment us?

Bandwidth is measured in bits instead of bytes because although networks are used to transfer many bytes of data, they do so one bit at a time. Moreover, since a bit is the smallest unit of data, measuring bandwidth in bits gives you the most precise measure of a network’s data-carrying capacity.

This article will look into the difference between bits and bytes. Hopefully, by the time we’re through, you’ll be able to tell your megabits from your megabytes like a pro.

Megabits per Second vs. Megabytes per Second (Mbps vs. MB/s)

Binary code flowing inside fiber optic cables, showing digital data signals running in parallel.

People often confuse the terms megabits per second (Mbps) with megabytes per second (MB/s). It doesn’t help that each measure can be expressed in terms of the other and used in closely related areas of computing. 

But do they refer to the bandwidth of networks, the data transfer rates of disks, the speeds of connections, or all of these? To decipher these questions, we’ll need to start with what bits and bytes are and build from there.

Bits vs. Bytes Explained

Hard drive disk sectors.

Both bits and bytes are information units, but a bit (binary digit) is smaller than a byte, and there are 8 bits in a byte. In fact, a bit is the smallest unit of digital information, a single point of data that can exist in only one of two states – one or zero, on or off, however you choose to think of it. 

Almost all computing today relies on programming executed based on this binary architecture. The exception is quantum computing, which we discuss in many of our articles like “Does Quantum Computing Use Binary Systems?” or “Is There a Quantum Internet?”

Like a bit, a byte is a measure of information. To differentiate the two, it may help to think of a byte as a unit of memory: a byte consists of 8 bits arranged in a specific sequence.

Why 8? Well, it turns out that's how many binary digits you need to encode a single text character. Each bit can be a 0 or 1 – true or false, yes or no – and 1 byte = 8 bits.

Note that bits are atomic units that cannot be broken down further. A byte, on the other hand, is a string of 8 bits in a specific order, and changing the sequence of bits in a byte changes the information the byte carries.
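The point that bit order matters can be seen in a quick Python sketch. The helper names below are purely illustrative; the code renders the number 65 (the ASCII code for "A") as its 8-bit string, then reverses the bit order to show that the same 8 bits in a different sequence encode a different value.

```python
# A byte is 8 ordered bits; changing the order changes the value.

def byte_to_bits(value: int) -> str:
    """Render an integer 0-255 as its 8-bit binary string."""
    return format(value, "08b")

def bits_to_byte(bits: str) -> int:
    """Interpret an 8-character bit string as an integer."""
    return int(bits, 2)

original = byte_to_bits(65)      # 65 is the ASCII code for "A"
reordered = original[::-1]       # same 8 bits, reversed order

print(original, "->", bits_to_byte(original))    # 01000001 -> 65
print(reordered, "->", bits_to_byte(reordered))  # 10000010 -> 130
```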

What Are Megabits and Megabytes?


As the volumes of data that computers handle rose dramatically from the earliest days of computing, engineers needed shorthand measures for them. This is where the kilo-, mega-, giga-, and tera- prefixes come in: they're shorthand for exponentially larger volumes of information.

The following table shows the standard prefixes as applied to bits and bytes:

| Prefix | Multiplier (in words) | Multiplier (numeric) |
|--------|-----------------------|----------------------|
| Kilo   | Thousand              | 1 × 10^3             |
| Mega   | Million               | 1 × 10^6             |
| Giga   | Billion               | 1 × 10^9             |
| Tera   | Trillion              | 1 × 10^12            |
| Peta   | Quadrillion           | 1 × 10^15            |
| Exa    | Quintillion           | 1 × 10^18            |
| Zetta  | Sextillion            | 1 × 10^21            |
| Yotta  | Septillion            | 1 × 10^24            |
| Ronna  | Octillion             | 1 × 10^27            |
| Quetta | Nonillion             | 1 × 10^30            |

Prefix multipliers for bits and bytes, as approved by the International System of Units (SI).

It follows that a megabit is a million bits and a megabyte is a million bytes.

Megabits per Second and Megabytes per Second Explained


Megabits and megabytes provide a reasonable measure of volumes of data. But, when transferring data across networks or retrieving it from memory, we also need to be able to add a time measure to evaluate the rate at which such transfers happen. This is where megabits per second (Mbps) and megabytes per second (MB/s) come in. 

Since a byte encodes a single character of textual information, file sizes and data on hard disks have long been measured in bytes. That's why most text files measure in KB, typical images in MB, and high-definition videos often run to several GB.

Inevitably, data transfer rates between devices came to be measured in bytes per second – including megabytes per second, written MB/s or MBps. This is why most hard disk transfer speeds are rated in MB/s.

However, while memory is measured in bytes, data crosses a network serially – i.e., one bit at a time. In transit, those bits do not necessarily maintain strict byte order; they are reassembled into their original order at the other end of the connection.

Because data is transferred bit-by-bit serially over a network, network bandwidth has come to be evaluated in bits, including in megabits per second or Mbps. This is why the bandwidth of your internet connection is usually described as Mbps and not MB/s.

However, the information described in one measure can easily be stated in the other. That is, since 8 bits = 1 byte, it follows that 1 bit = ⅛ or 0.125 bytes and 1 bps = 0.125 B/s. Therefore, 1 Mbps = 0.125 MB/s.
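The conversion above is simple enough to sketch in a few lines of Python (the function names here are just for illustration): divide Mbps by 8 to get MB/s, and multiply MB/s by 8 to get Mbps.

```python
# 8 bits = 1 byte, so Mbps and MB/s differ by a factor of 8.

def mbps_to_mb_per_s(mbps: float) -> float:
    """Convert megabits per second to megabytes per second."""
    return mbps / 8

def mb_per_s_to_mbps(mb_per_s: float) -> float:
    """Convert megabytes per second to megabits per second."""
    return mb_per_s * 8

print(mbps_to_mb_per_s(100))   # 12.5 -> a 100 Mbps link moves at most 12.5 MB/s
print(mb_per_s_to_mbps(550))   # 4400 -> a 550 MB/s SSD reads data at 4400 Mbps
```

Note how the same hardware figure looks very different depending on the unit: this is why a "100 Mbps" internet plan will never download files at 100 megabytes per second.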

Now that we’ve come to grips with what Mbps and MB/s refer to, let’s take a look at how they convey practically useful information. You’ll find both terms are quite relevant to tasks you do every day.

What Does Megabits Per Second (Mbps) Measure?

IT specialists working in data center room.

Megabits per second are usually used to measure the bandwidth of networks. 

Although many use the terms interchangeably, bandwidth is not the same as speed. Bandwidth refers to the carrying capacity of a network – the maximum number of bits per second it can transfer – as opposed to its actual speed.

Besides bandwidth, the speed of a connection also depends on its latency. Latency is the time it takes for a message to travel across the network and elicit a response. Latency, in turn, depends on a number of other factors, such as how you are connected to a network and the number of other users active on it currently.

For instance, Ethernet connections can reach speeds of 10 Gbps or so, whereas wireless connections, such as WiFi, are limited to 6.9 Gbps or less. Latency also has a hard lower limit set by the speed of light, because it is not physically possible to send signals faster.

The table below shows bandwidths for commonly used networks today: 

| Network                       | Maximum Bandwidth |
|-------------------------------|-------------------|
| 4G Cellular                   | 100 Mbps          |
| 5G Cellular                   | 20 Gbps           |
| DSL Broadband                 | 40 Mbps           |
| Cable Broadband               | 300 Mbps          |
| Fiber Broadband               | 10 Gbps           |
| MPLS                          | 10 Gbps           |
| SD-WAN (Multiple Connections) | 20 Gbps           |

Bandwidths for commonly used networks.

So this explains typical network bandwidths. But how much do you really need? Well, that depends on what you are using your network for. For instance, transferring large amounts of video data will require more bandwidth than just browsing text-only internet pages.

Here’s a list of bandwidth requirements to access some commonly used services:

| Service                      | Bandwidth Requirement |
|------------------------------|-----------------------|
| Streaming 4K video           | 25 Mbps               |
| Streaming 1920 × 1080 video  | 5 Mbps                |
| Streaming 1280 × 720 video   | 3 Mbps                |
| Livestreaming webcam         | 0.5 Mbps              |
| Screen sharing               | 150 Kbps              |
| VoIP calls                   | 80 Kbps               |

Bandwidth requirements for common services. See our article “How Many GB for a 2-Hour Video?”

As you can see, even the slowest connections nowadays are fast enough to enjoy some of the highest-quality services available. However, remember that actual speeds depend on other factors, including distance to servers, network congestion, and connection type.
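To see what these bandwidth figures mean in practice, here is a rough back-of-the-envelope estimate of download time. The file size and bitrate below are illustrative, and real transfers will be slower than this best case because of protocol overhead, latency, and congestion.

```python
# Best-case download time: file size in megabytes, link bandwidth in Mbps.

def download_seconds(file_size_mb: float, bandwidth_mbps: float) -> float:
    """Return the minimum transfer time in seconds, ignoring overhead."""
    file_size_megabits = file_size_mb * 8  # convert megabytes to megabits
    return file_size_megabits / bandwidth_mbps

# A 4.5 GB (4500 MB) HD movie over a 25 Mbps connection:
minutes = download_seconds(4500, 25) / 60
print(f"{minutes:.0f} minutes at best")  # 24 minutes at best
```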

In addition, speeds can vary depending on the direction a signal travels. For instance, upload speeds are typically slower than download speeds on most networks.

What Does Megabytes Per Second (MBps) Measure?

Hard Disk Drive HDD, Solid state drive SSD, and M2 SSD.

Megabytes per second measure the rate at which data is read from or written to storage devices. The speed of your storage device can significantly impact your system’s performance. Of course, the type of connection will also have a limiting impact.

With increasingly faster disks available today, you’ll want to pay attention to these numbers. Typical data transfer rates for disks in use today are listed in the following table: 

| Storage Device           | Data Transfer Rate |
|--------------------------|--------------------|
| 7200 RPM HDD (SATA III)  | 100 MB/s           |
| SSD (SATA III)           | 550 MB/s           |
| SSD (M.2)                | 550 MB/s           |
| SSD (NVMe)               | 3500 MB/s          |

Typical data transfer rates for storage drives.
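These rates translate directly into how long it takes to move a large file. The sketch below uses the typical figures above (real drives vary) to compare how long reading a 10 GB (10,000 MB) file would take on each device.

```python
# Compare read times for a 10 GB file at typical drive transfer rates.

rates_mb_per_s = {          # typical figures; real-world drives vary
    "7200 RPM HDD": 100,
    "SATA III SSD": 550,
    "NVMe SSD": 3500,
}

file_size_mb = 10_000  # 10 GB expressed in megabytes
for device, rate in rates_mb_per_s.items():
    print(f"{device}: {file_size_mb / rate:.1f} s")
# 7200 RPM HDD: 100.0 s
# SATA III SSD: 18.2 s
# NVMe SSD: 2.9 s
```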

Key Takeaways:


Bandwidth is measured in bits instead of bytes because it is a measure of the data-carrying capacity of networks. Since data is transferred across networks serially – i.e., bit by bit – it makes sense to measure bandwidth in terms of the maximum number of bits a network can move per second. 

Bytes, on the other hand, refer to strings of information whose order affects their content. While they can be broken up into bits and transferred across a network, they must be reassembled into their original structures to retain fidelity and accurately convey their original content.

We discuss bandwidth, latency, megabits per second (Mbps), and megabytes per second (MB/s) in detail in some of our other informative articles, like “Examining 5G Technology For Smartphones” and “Does WiFi Need To Be Plugged Into a Cable Outlet?”

John Mortensen

I am a project manager, tech writer, and science enthusiast who loves to study the latest technology, such as AI, comedy, quantum computers, smartphones, headphones, and software.
