I’ve never known cable providers to fail at broadcasting live TV. M*A*S*H (not live), among many others, drew 70-100+ million viewers, and some broadcasts had 80%+ of the entire nation tuned to a single network without issue. I’ve never seen buffering during a Super Bowl broadcast.
Why do streaming services struggle when too many people watch at the same time, while cable television never did? What’s the technical difficulty for a network that has improved over time yet can’t keep up with audience numbers from decades ago for live television?
I hate ad-based cable television, but I never had issues with it growing up. Why can’t current ‘tech’ meet the same needs we seemingly solved long ago?
Just curious about what changed in data transmission that made it more difficult for the majority of people to watch the same thing at the same time.
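For a rough sense of the scale I’m asking about, here’s a back-of-envelope sketch. The per-viewer bitrate and audience size are illustrative assumptions, not measurements:

```python
# Back-of-envelope: unicast streaming capacity grows with audience size,
# while a broadcast signal does not. All figures are assumed for illustration.

viewers = 100_000_000   # roughly a M*A*S*H-finale-scale audience
stream_mbps = 5         # assumed bitrate of one HD unicast stream

# Unicast: every viewer gets their own copy of the data.
unicast_total_tbps = viewers * stream_mbps / 1_000_000
print(f"Unicast: ~{unicast_total_tbps:.0f} Tbps of aggregate capacity")

# Broadcast: one signal on a shared channel serves every receiver,
# so required transmission capacity is constant regardless of viewers.
print(f"Broadcast: {stream_mbps} Mbps, independent of audience size")
```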
Yes, at least some WiFi adapters can. Software used to attack WiFi connections, like aircrack, does this by listening to and logging (encrypted) packets without authenticating to the access point, and then attempting to recover the encryption key. You could just send unencrypted traffic the way you do today, and such software could theoretically receive it.

However, this probably won’t provide any great benefit. As far as I know, merely being connected to a WiFi access point shouldn’t generate much traffic, so you could have a very large number of computers authenticated to the access point (just set it not to use a password) without any real drawback relative to having the same machines snooping on unencrypted traffic.
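As a minimal sketch of what passive capture looks like, assuming a Linux adapter already placed in monitor mode (the `wlan0mon` interface name is an assumption; `airmon-ng start wlan0` typically creates it) and using the scapy library:

```python
# Passive 802.11 capture sketch: receive frames off the air without ever
# authenticating to an access point. Requires root and a monitor-mode
# interface; "wlan0mon" is an assumed name for illustration.
from scapy.all import sniff
from scapy.layers.dot11 import Dot11

def handle(pkt):
    # In monitor mode we see raw 802.11 frames from any nearby device
    # on the currently tuned channel.
    if pkt.haslayer(Dot11):
        print(pkt.addr2, pkt.summary())

# Capture 100 frames, then stop.
sniff(iface="wlan0mon", prn=handle, count=100)
```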
A WiFi adapter cannot listen to multiple frequencies concurrently (well, unless things have changed recently), so this won’t let you easily receive data from multiple access points at once, if you’re thinking of having them all send data simultaneously.
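Because one radio tunes one channel at a time, capture tools usually compensate by channel-hopping: sampling each channel briefly in turn, so they only ever see a slice of each. A rough sketch of that, assuming a Linux system with the standard `iw` CLI and the same hypothetical `wlan0mon` interface:

```python
# Channel-hopping sketch: retune a single monitor-mode radio across the
# non-overlapping 2.4 GHz channels. Interface name and dwell time are
# assumptions; `iw` requires root. Run until interrupted (Ctrl-C).
import subprocess
import time

INTERFACE = "wlan0mon"   # assumed monitor-mode interface
CHANNELS = [1, 6, 11]    # non-overlapping 2.4 GHz channels
DWELL_SECONDS = 0.5      # time spent listening on each channel

while True:
    for ch in CHANNELS:
        subprocess.run(
            ["iw", "dev", INTERFACE, "set", "channel", str(ch)],
            check=True,
        )
        # Any capture tool on INTERFACE now sees only this channel.
        time.sleep(DWELL_SECONDS)
```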