Where did providers come up with how much “bandwidth” is needed to watch a video?  Not to bury the lede…  It’s about money.

Frontier says you need to purchase a 25 Mb/s plan to properly view a 4K video (AKA a TV show, movie, etc.).  Comcast, AT&T, and most other internet providers say the same thing.  We’ll wait and discuss the content providers like Hulu, Amazon, Netflix, Apple, and especially AT&T / DirecTV, later.

The first question, and the one that prompted this short diatribe, is: Have they ever heard of math?

Let the math begin!  Assume the higher Cinema standard of 4096px X 2160px for a 4K image.  That’s 8,847,360 dots on the screen for a single image.  If one dot were one bit, eight bits per byte equals 1,105,920 bytes (1.11 MB, roughly twice the RAM a typical IBM PC of the early 1980s had, and that’s just for a single 4K image).  What about frames displayed per second (~24 fps)?  Now it’s up to 26.5 MB every second.  And then there’s color too.  Assuming RGB at one byte per channel, every dot needs three bytes instead of one bit, which multiplies everything by 24: roughly 637 MB every second.  Holy cow, that’s something like eight DVDs’ worth of data every minute!  25 Mb/s can’t possibly be enough!  It’s looking like over 2 TB per hour for uncompressed 4K video.
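Here’s that back-of-the-envelope math as a quick sketch (Python, purely for illustration), using the same assumptions as above: Cinema 4K, RGB at one byte per channel, 24 fps, and no compression at all:

```python
# Naive size of uncompressed Cinema 4K video: 4096 x 2160 pixels,
# 3 bytes of RGB color per pixel, 24 frames per second.
WIDTH, HEIGHT = 4096, 2160
BYTES_PER_PIXEL = 3
FPS = 24

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL   # ~26.5 MB per frame
per_second = frame_bytes * FPS                   # ~637 MB/s

print(f"Per frame:  {frame_bytes / 1e6:.1f} MB")
print(f"Per second: {per_second / 1e6:.0f} MB/s ({per_second * 8 / 1e9:.1f} Gb/s)")
print(f"Per minute: {per_second * 60 / 1e9:.0f} GB")
print(f"Per hour:   {per_second * 3600 / 1e12:.1f} TB")
```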

Wait…  What was that thing about compression?  Oh, that’s right.  The size of an uncompressed video frame, multiplied by frame rate, with color added, etc., is not an accurate calculation of the bandwidth needed for 4K video.

So how big would a one-minute uncompressed video file be?  A nice number to connect it to human experience is the DVD (about 5 GB), and as figured above, a minute of uncompressed 4K fills roughly eight of them.  OK, what about compressed 4K video?  Well, it depends on a lot of factors, including what’s being videoed, the compression method, quality settings, camera hardware capability, etc.  But again, going with something familiar to latch onto, a nice round 100 MB for a minute of 4K video is about right.  And again, it could be quite a bit smaller or larger, depending on many factors.

OK, let’s cut to the chase and use that 100 MB per minute of video as a base.  Realistically, that number is a bit larger than what movies from Netflix, Hulu, etc. consume, but for the sake of this story, let’s err on the big side.  OK, more math: 100 MB for a minute of 4K video is 800 Mb (bits); divide by 60 seconds and you get about 13 Mb/s.
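Spelled out in code so the bytes-to-bits and minutes-to-seconds conversions don’t get lost:

```python
# 100 MB of compressed 4K video per minute, expressed as a bitrate.
mb_per_minute = 100                 # working figure from above
megabits = mb_per_minute * 8        # 800 Mb per minute
print(f"{megabits / 60:.1f} Mb/s")  # ~13.3 Mb/s
```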

Hulu and some other content providers support this calculation by citing a number around 16 Mb/s.  That’s close enough to the calculated amount, with some headroom for the factors that affect the peak bandwidth actually available to a user.

But wait (again)!  How about instead of calculating bandwidth, actually measuring it?  As it turns out, minus some spikes for buffering, an average 4K-ish stream (in Amazon terminology, “Best Video”) runs at more like 5 Mb/s.  So 5 Mb/s to 10 Mb/s is a range of bandwidth that will suffice for most video.
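If you’d like to take that measurement yourself, here’s a minimal sketch; it uses the third-party psutil package (my choice for illustration, not necessarily how anyone else measured it) to sample received bytes once per second while a video plays.  Note it counts all traffic on the machine, so pause other downloads first:

```python
# Sample network throughput once per second while a stream plays.
import time
import psutil

prev = psutil.net_io_counters().bytes_recv
for _ in range(30):                                # watch for 30 seconds
    time.sleep(1)
    now = psutil.net_io_counters().bytes_recv
    print(f"{(now - prev) * 8 / 1e6:6.1f} Mb/s")   # bytes/s -> megabits/s
    prev = now
```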

OK, how about some context?  In the late 1990s, SBC (soon to be AT&T) and other DSL providers were offering a mouthwatering 6 Mb/s download speed.  Compared to 56K dial-up, that was the holy grail.  In the 1980s, Ethernet speeds on coaxial cable were advertised at 10 Mb/s.  Not to leave you hanging, but there’s another entire story here, so that’s enough context to think about for now.

And in researching this story, I ran across this article: https://www.evdodepotusa.com/how-much-data-does-4k-video-streaming-use/, which gave me an idea for the next story in this series.  Go ahead, read it.  Its focus is on the size of 4K videos (very large), their bandwidth usage, and mobile devices such as phones and tablets.  The article purports to be concerned about bandwidth usage.  Here are the things to focus on:

4K video (i.e., really high-resolution video), bandwidth, and mostly mobile devices using bandwidth on whatever data plan the subscriber has.

What they don’t mention or ask is this: Why is someone watching a 4K video on a mobile phone?  Think about it…  Top-of-the-line phones have a resolution topping out at about 1400px X “maybe 3000px”.  Sure, that’s approaching “4K”, but it’s a 6ish-inch screen.  What if the resolution were 14,000px X 30,000px on the same 6-inch screen?  Answer?  Two things: 1) amazing technology and 2) totally useless.  What is the point of having resolution that exceeds that of an IMAX theater on a 6-inch screen?  Answer?  There is no point.  It is a complete and total waste.  Why?  Because the human eye can’t see the difference.
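To put a number on “the human eye can’t see the difference,” here’s a rough sketch.  It assumes 20/20 vision resolves about one arcminute of angle and a phone held about 12 inches from the eye; both are commonly quoted ballpark figures, not hard limits:

```python
# Compare screen pixel density against what the eye can resolve.
import math

VIEWING_DISTANCE_IN = 12.0               # assumed phone-to-eye distance
ARCMINUTE_RAD = math.radians(1 / 60)     # ~20/20 visual acuity

# Finest pixel pitch resolvable at that distance, in pixels per inch.
eye_limit_ppi = 1 / (VIEWING_DISTANCE_IN * math.tan(ARCMINUTE_RAD))

def screen_ppi(w_px, h_px, diagonal_in):
    """Pixels per inch of a w x h panel with the given diagonal."""
    return math.hypot(w_px, h_px) / diagonal_in

print(f"Eye limit at 12 in:    ~{eye_limit_ppi:.0f} PPI")
print(f"1440 x 3040 on 6 in:   ~{screen_ppi(1440, 3040, 6):.0f} PPI")
print(f"14000 x 30000 on 6 in: ~{screen_ppi(14000, 30000, 6):.0f} PPI")
```

That works out to roughly 285 PPI as the eye’s limit at a foot away; a 1440px X 3040px flagship is already around 560 PPI, and the hypothetical 14,000px X 30,000px panel would be over 5,000 PPI.  That’s the “totally useless” part, in numbers.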

Yup, that’s right.  We’re approaching a glass ceiling where technology is out-performing the audience it is serving.  Visual definition beyond what the human eye can see is pointless.  And can you see where this is going next?  We’ll be needing upgrades to the human eye soon.

Sorry, too much there; that will be a future story.

And yet another subject for another article might be internet providers’ bandwidth “guarantee.”  Spoiler alert: there is no guarantee (unless you pay for a non-consumer-level / business package).  They only state “…up to X Mb/s”.

 
