Bufferbloat
All of us have run into systems that worked perfectly up to a point but then broke down after that, simply because some resource turned out to be inadequate beyond that point. Or, to put it in engineering terms, the solution wasn’t “scalable” beyond a point.
But have you ever encountered a scenario where a surplus of a resource caused a degradation of performance? If you’ve used the Internet, then yes, you’d have run into that scenario, writes Brian Christian in his book, Algorithms to Live By.
Here’s how the modem was designed to work way back (a rough sketch in code follows the list):
1) You want to upload a really big file.
2) The network is busy, so your modem can’t transmit just yet.
3) So the modem stores the content in its buffer, planning to transmit whenever the network allows.
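If you like to think in code, here’s a rough sketch of those three steps. The class, names and sizes are invented purely for illustration; no real modem works this way in Python, of course.

class ModemBuffer:
    """A toy model of the buffering described in the steps above."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes   # how much data the modem can hold
        self.queued = 0                  # bytes currently waiting for the network

    def offer(self, chunk_bytes):
        """Try to queue a chunk: True if accepted, False if the modem pushes back."""
        if self.queued + chunk_bytes > self.capacity:
            return False                 # buffer full: the sender finally hears "stop"
        self.queued += chunk_bytes
        return True                      # accepted: the sender thinks it has been sent

    def drain(self, bandwidth_bytes):
        """Transmit up to bandwidth_bytes once the network allows it."""
        sent = min(self.queued, bandwidth_bytes)
        self.queued -= sent
        return sent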
Sounds good, right? What could possibly go wrong with that approach?
Aha, when the modem was originally designed, memory was expensive, so the amount of data it could store in its buffer was relatively small. Beyond that, it would refuse to accept more data. But as memory got cheaper and cheaper, the buffer size grew bigger and bigger.
Now for the key point in all this: the data still wasn’t getting transmitted until the network had the bandwidth! In other words, all the content you thought you were uploading was just sitting in the buffer. How would you perceive things? That the upload was taking forever.
But, you argue, what’s buffer (memory) size got to do with it? Wouldn’t a busy network delay things whether the buffer was small or large? Aha, the difference is that the buffer size determined how long it took before the modem told you, “Hey, I am jammed up. Try uploading more stuff only after I’ve transmitted what’s already in the queue”. With small buffers, that feedback came early. With today’s larger buffers, we don’t get such feedback for a long time, and during that period we think the data is being transmitted when in reality it’s just sitting in the modem!
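A tiny simulation, with completely made-up numbers, shows how the buffer’s size decides when that “I am jammed up” feedback arrives while the network stays busy:

CHUNK = 1_000  # bytes per chunk the user tries to upload (arbitrary figure)

def chunks_before_pushback(buffer_capacity):
    """Count how many chunks the modem swallows, while nothing drains,
    before it finally refuses one and the user gets feedback."""
    queued, accepted = 0, 0
    while queued + CHUNK <= buffer_capacity:   # the network is busy: nothing leaves
        queued += CHUNK                        # the chunk just sits in the buffer...
        accepted += 1                          # ...yet looks "uploaded" to the user
    return accepted

print(chunks_before_pushback(8_000))        # small, expensive memory: pushback after 8 chunks
print(chunks_before_pushback(8_000_000))    # big, cheap memory: pushback after 8,000 chunks

The logic is identical in both calls; only the amount of memory changes, and with it, how long the user is kept in the dark.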
Notice how we got to this problem? Because the buffer got larger, thanks to the falling cost of memory! Christian puts the impact of this nicely:
“The most prevalent critique of modern communication is that we are ‘always connected’. But the problem isn’t that we’re always connected; we’re not. The problem is that we are always buffered. The difference is enormous.”
There’s even a term for this problem: bufferbloat. Or, as Christian writes: “We used to reject; now we defer.”