
I don't understand the desire (fetish?) for high-speed home Internet connections.

I have 25 Mbps up. 10 Mbps down. Have had it for years. It's fine.

It's fine when both my wife and I are working from home and doing calls. It's fine for software development. It's fine for email and web browsing, and everything other than downloading maddeningly large files, 99% of which shouldn't be that large anyway. It's fine for watching streaming shows. Maybe if our kids turn out to be YouTube addicts when they're older we'll upgrade; maybe we won't for that reason.

What are people doing with their higher-speed Internet connections that makes it valuable to have such fast ones??!



Pulling or pushing Docker images, downloading LLM models, installing AAA games from Steam. There are so many use cases that you won't see if you're just doing email and web browsing with a little bit of video streaming.

It's also helpful for off-site backup. I believe off-site backup is very important, and having gigabit upload is very helpful for this.

> I don't understand the desire (fetish?)

If you don't need it then you should be happy with what you've got, but calling other people's uses a "fetish" is unnecessary. And weird.


I agree that calling it a fetish is weird, but I also have a hard time believing ordinary people are pulling and pushing Docker images all the time.

The reality is that the reason these high speed internet political initiatives fail is that for most people internet access is a solved problem, and there isn't the critical mass of people to push through legislation.

Which is not to say that for a minority, it's not a solved problem, but the desires of those in a minority situation don't get prioritized in the democratic process.


Yes, you're right. "Fetish" was too weird and dismissive, but I can't edit it now.

Perhaps "irrational obsession" would be a better term.

I don't understand why so many people are obsessed with maximizing technical specifications of their computers. I want a computer that works predictably and reliably with some minimum acceptable level of performance.

Overclockers void their warranties and spend thousands of dollars to squeeze out 50% more performance than what they'd get from off-the-shelf CPUs… and for what? To play games at a slightly higher framerate, sometimes, but often just to show screenshots of benchmarking rates for bragging rights.

I view people who are obsessed with Internet connection bandwidth similarly.


Gigabit is incredible if you are a dev


> installing AAA games from Steam.

It's weird how insanely large games are now, hundreds of gigabytes. I can't remember which game my son was playing, maybe DCS or some other milsim, but it was around a 300 GB download. That's roughly 64-65 DVDs.
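As a sanity check on that comparison (assuming single-layer 4.7 GB DVDs; the 300 GB figure comes from the comment above):

```python
# How many single-layer DVDs (4.7 GB each) would a 300 GB game download fill?
game_gb = 300
dvd_gb = 4.7  # capacity of a single-layer DVD
dvds = game_gb / dvd_gb
print(f"{dvds:.1f} DVDs")  # about 63.8, i.e. roughly 64-65 discs
```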


It's Stockholm syndrome from what I can tell.


Why?


The total bandwidth up/down is only part of the story.

I was on a cell modem until very recently. Just the latency difference between gigabit fiber and anything else is noticeable for me. When a website loads a ton of resources on a single page, some of those requests are serialized rather than parallelized, so they run back to back. The longer the serial chain, the more times you pay your round-trip time. This is especially true with auth providers that bounce you away from a site and back, or with online purchases that go through external sites like PayPal. All of that time adds up.
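That multiplication of round-trip time can be sketched with a toy model (the RTT values and chain length below are illustrative assumptions, not measurements):

```python
# Toy model: each request on a page's critical path must wait for the
# previous response, so the chain pays one full round trip per request.
def serial_load_ms(rtt_ms: float, chain_length: int) -> float:
    return rtt_ms * chain_length

chain = 8  # e.g. HTML -> CSS -> font, plus an auth redirect and back
print(serial_load_ms(12, chain))    # fiber:           96 ms of pure latency
print(serial_load_ms(35, chain))    # cell at idle:    280 ms
print(serial_load_ms(1000, chain))  # cell under load: 8000 ms
```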

So, my home connection is now down to 11.9 ms to google.com, and my wifi adds another 5 ms. I started a timeline recording and hit the Google homepage: it took 900 ms to load the front page in Safari. On a good day with my cell hotspot, my latency is 35 ms at idle and goes way up (sometimes into the seconds) when pushing bandwidth.

Video calls with 1000ms and higher latency are ... difficult. Especially when everyone else is in the sub 100ms range.


Yep, latency's also a big deal if you play competitive multiplayer games. With DOCSIS you get roughly 11 ms ±3 ms added to every packet no matter what, because it's shoehorned onto existing cable infrastructure. Fiber is much better in this regard.

Ping to my public IP's gateway address:

  30 packets transmitted, 30 received, 0% packet loss, time 29031ms
  rtt min/avg/max/mdev = 1.449/1.915/2.212/0.166 ms


I've been consulting a long time, and my home is my office.

Besides the uses other people have suggested, here are some uses I would have for a fast symmetrical connection:

- Backing up data to my home/office NAS while away.

- Remoting to my workstation desktop from any location, for any reason.

- Using my home as a Tailscale exit node for clients for whom it's already a hassle to allowlist my home office's IP, so I can work from anywhere.

- Switching my NixOS configuration using the caches in my home office where my custom derivations are built.

I have 90 Mbps down and 20 Mbps up. All of the above is workable, but it would be great (amazing, even) if it were faster.

The remote places I would do this from:

- the doctors' waiting room because we have teenagers

- the bleachers of the pool for the diving lessons because we have teenagers

- the in-laws spare bedroom where we're visiting for an extended time during school holidays but not work holidays because we have teenagers.

Some of us have different needs, arising from choices that are optimal for other aspects of our lives but leave us with a slower asymmetric connection at home.


I use my home connection for VPN access remotely. I back up snapshots of data every day. I like to be able to download games and Linux ISOs practically on demand. I work from home and often enough faster speeds can avoid several minutes of additional waiting in a day.

This connection is shared as well. My partner relies heavily on cloud syncing. We both like to stream 4K HDR video. I like being able to get devices updated and ready to use with minimal time spent waiting for downloads.

I also live in NZ, where multi-gigabit fibre connections are often cheaper than what Americans have to pay for a fraction of the bandwidth. It’s not a notable financial burden or anything, and it’s not like we have data caps to worry about. It’s very much a situation where the use cases naturally find themselves once the option is there.

Also, 25/10 Mbps is painfully slow for a shared home connection in the modern day. There are videos on YouTube that can push a higher bitrate than that. The absolute slowest plan my ISP even offers is 100/20 Mbps for about $35 USD per month, while the most common/baseline plan for most households in NZ is 500/100 Mbps, after the fibre carriers continued to increase speeds at the lowest tiers.


I have gigabit symmetrical fiber at home thanks to a group of local tech folk who built out the network. The biggest change for me is that I rely more on my NAS at home over a WireGuard tunnel for things I would have used the cloud or a hosting service for before.

Going to work? No worries about forgetting a USB stick or portable SSD. I can always just fire up Wireguard and grab it from home.

Sharing Jellyfin access with family and friends has also been fun.


Working from home on raw and cooked SKA data and visualisations served remotely by supercomputing centres, and team co-editing of multiple raw RED cinema camera channels.

Essentially any job that involves massive, fat data streams and ends up with a real-time collaborative hybrid-remote team.


> What are people doing with their higher-speed Internet connections that makes it valuable to have such fast ones??!

Honestly, most people go for the shiniest number they can afford.

But I will say that as a software developer who has had to fix bugs on random branches of very substantial software projects (web browsers), there is a tradeoff between recompiling the whole project and simply downloading the binaries that the CI system has already built. When I had to jump six months or a year into the past to test some old build, gigabit service was the difference between a few minutes to download the binaries and 20–30 minutes to recompile them myself.

But these days 100Mbps is really more than enough.

An interesting technology that might become a killer app in a year or two is 4D Gaussian splats. This is a way of creating photorealistic animated 3D scenes that the user can move around in, viewing them from all angles (without any need for artists to build geometry or paint textures). Right now streaming them in real time needs about 500 Mbps. I'm sure that a few years of iteration on compression techniques will lower it significantly, but it'll always be more expensive than mere 2D images (even animated 2D images). For reference, most streaming services use about 15-25 Mbps for a 4K television stream.


> > What are people doing with their higher-speed Internet connections that makes it valuable to have such fast ones??!

> Honestly, most people go for the shiniest number they can afford.

Right. It's a costly and irrational anxiety or obsession over a headline number that likely doesn't matter for most people. https://news.ycombinator.com/item?id=47677199

> But I will say that as a software developer who has had to fix bugs on random branches of very substantial software projects (web browsers), there is a tradeoff between recompiling the whole project and simply downloading the binaries that the CI system has already built.

Right. I thankfully don't do that kind of development much, but I agree that I'd want a faster connection if I did. I would certainly expect to have it at the office, though I don't find a need for it at home.

In the past when I worked as a developer for FAANG and related spinoff companies, I did most of my development on "cloud desktops" or VMs which had much faster connections. This was mainly in order to have more CPU, RAM, and storage for builds, although the additional network bandwidth was important as well.

> For reference, most streaming services use about 15–25Mbps for a 4K television stream.

To me, 720p seems incredibly high-quality. I think I'm living out something like https://m.xkcd.com/606, although perhaps on a 20-year lag rather than a 5-year lag.


> I have 25 Mbps up. 10 Mbps down. Have had it for years. It's fine.

Do you mean the other way around, 25Mbps Down and 10 Mbps up?

It is nice to have, especially when it doesn't cost much. That is why I am perfectly OK with PON rather than dedicated fibre. You only need the 1 or 10 Gbps speed for maybe a 10-minute window per month.

I do think 25 Mbps for a household is quite low. On a 5 Mbps video file I want the first 10 seconds buffered instantly, which 50 Mbps does, while I'm loading multiple pages in the background. Multiply that by a few more users in the family. It is perfectly usable, as you said, if you don't mind waiting.

Otherwise I think 50-100 Mbps per person is generally the point where we see diminishing returns.


Yes, I reversed the up/down bandwidths as you noticed, but didn't see the mistake until I could no longer edit.

> Otherwise I think 50 - 100Mbps per person is generally the point we see law of diminishing returns.

Right. Whether we think the diminishing returns are at 10 or 20 or 50 or 100 Mbps per user, there are diminishing returns.

The vast, vast majority of residences simply do not need symmetric 25 Gbps bandwidth, and it would be a massive waste of resources to try to build out a residential network providing that level of bandwidth, rather than prioritizing universal accessibility of 50 or 100 Mbps.

I'd liken it to the overprovisioning of EV batteries, particularly in North America. Many, many car owners would be perfectly satisfied with a car with a range of only (say) 60 miles or 100 km, and overall EV cost and adoption rate is hurt by the fact that leading-edge manufacturers, especially Tesla, were only building EVs with range of 5x that.


I don't want to wait six hours to download a game patch that's 40 GB or whatever, because that's sadly the norm. With 1 Gbit I can do anything, and it doesn't induce latency or cause connection-quality issues with anything else, because almost nothing can come close to saturating it, with a few exceptions (Steam being the main one). I can also seed at high speed to private trackers. It'd be an effort to max out a 25 Gbit connection at home, that's for sure.
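For a rough sense of the waiting involved (decimal units, ignoring protocol overhead, so real-world times run a bit longer):

```python
# Time to download a 40 GB patch at various line rates.
def hours(size_gb: float, speed_mbps: float) -> float:
    return size_gb * 8000 / speed_mbps / 3600  # GB -> Mbit, then seconds -> hours

for mbps in (10, 25, 100, 1000):
    print(f"{mbps:>4} Mbps: {hours(40, mbps):5.2f} h")
# At 10 Mbps the patch takes ~8.9 h; gigabit cuts it to ~5 minutes.
```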


- Downloading local LLM models

- Downloading games, movies etc.

- Updating software.

- Doing remote data backups or restoring from them.

- Browsing the internet. Fiber still makes a noticeable difference especially in badly optimized websites.


Maybe I've just got deep scars from the 90's, where I'd wait 15-25 minutes sometimes to download a single mp3.

I have a Fios connection here at home, and it seems entirely sufficient. Even with AAA Steam games, I hit 'download', go grab a snack in the kitchen, and it's done. My server does incremental backups to S3 every night, but it's not like I'm sitting there watching it.

I download a new large model maybe once every other week. It takes a few seconds, maybe minutes. I don't really notice either way? 25x faster doesn't seem like it would make any difference.


Past 200Mbps down I typically see very little real benefit.

That said, I do find myself downloading packages and watching 4K video all day long. 25 Mbps is noticeably slower the majority of the time. You can get by, of course, the same way you can compile an Xcode project on a 2019 Intel Mac (I think macOS 26 still supports Intel?), but it's a significantly nicer experience on more recent M-series machines.

Who likes to sit around waiting on downloads/compilation?

Now I'm realizing you said 25Mbps up, 10Mbps down. Wow, assuming this isn't a mistake, 10Mbps is slow enough to make even normal web browsing start to chug IME.


Uploading 15 minute videos to YouTube, downloading hundreds of gigabytes of 3D assets, updating large applications, streaming movies for a 4k projector, frequently downloading beta OS updates, etc, etc, etc.


> I have 25 Mbps up. 10 Mbps down. Have had it for years. It's fine.

This is dog slow; you can't even stream YouTube at 4K resolution with that. Downloading a reasonably recent game would also take close to an entire day. Not everyone needs a full symmetric Gbps, perhaps, but 100 Mbps is kind of a baseline nowadays, and more is better for faster downloads.


> you can't even stream Youtube at 4K resolution with that

Not something I care about.

I don't have a 4K TV. I'm not sure if I've ever seen a 4K TV. I've definitely seen a very high-resolution TV at a friend's house, and it was kind of cool, but not something interesting enough to motivate me to spend $500 or $20/month to acquire one of my own.

I hardly watch TV. My TV has a native resolution of 720p, I think, and 720p videos look really high-quality to me on the occasions when I watch.


An example from someone who has lived at a condo in Asia. Around 8 PM the internet becomes unusable. Everyone's back home and they want to watch their favorite series. If you need to work at that time, e.g. if you work US hours, you are screwed.

P.S. I experienced this at different condos in different countries in South East Asia.


Are you sure that isn't wifi interference?


Most places in Asia, this is due to massive oversubscription. No relation at all to wireless spectrum.


That's easy to claim, but there are a lot of places where everyone is surrounded by everyone else's wifi routers. If you have 9 routers that you share walls with and even more that can reach you, wifi starts to break down, but people will blame their service provider.


I've been there a bunch, my colleague has lived there. We work in the telco area. My own experiences I would question, his I don't.

It's oversubscription.

Can I provide citations or proof? No. That's extremely hard to do with oversubscription in general, no telco will admit their exact ratio without being forced to. Sometimes you can reverse engineer it from peering relationships, but that doesn't allow identifying bandwidth constraints on medium haul.


> It's oversubscription.

You said that already, but repeating something isn't evidence. Oversubscription is something that happens with cable internet on a node-by-node basis, so saying the problem is one thing and only that doesn't make any sense. Not only that, but people will sign up for hundred-megabit to gigabit internet when they only really need to watch some streams that use 3 Mbps each.

You can actually figure out oversubscription if you ping nodes, especially over time.

There are more factors like international bandwidth, lack of caching servers in smaller countries so bandwidth has to be international, cable signal levels etc.

None of these come close to wifi contamination. If you have two neighbors trying to watch tv over wifi and you're trying to watch tv over wifi, you're sunk. Now take that to being surrounded by a dozen people, all watching tv over wifi and all watching videos on phones and tablets.

Unless someone is in a house, more isolated from their neighbors, it is going to be a much bigger problem for almost everyone.

You can say 'oversubscription' because your buddy said that, and even that can have some truth while still being a marginal issue next to the real problems. Even in places with great internet, people get a single wifi router, put all their computers and TVs on it, then blame their ISP.


You seem to be making arguments with significantly less of a connection to the actual scenario being discussed. And you're explaining oversubscription to two network engineers, one of whom has presented at APRICOT and APNIC.

I guess I should've focused less on oversubscription and made clear that we know it's not spectrum utilization. For that, we have the equipment to measure, and we did, and it's not the problem.


> For that, we have the equipment to measure, and we did, and it's not the problem.

It has been a problem for basically everyone living in apartments that had network problems that I've seen. If you measure at the wrong time it's going to look fine. You have to be there when people are watching video over wifi.

Again, just because people can't get their full bandwidth, it doesn't mean oversubscription is the actual bottleneck.


Dude, how oblivious are you. You're {man,nerd}splaining. Hard. Did you miss the "network engineer" part? Do you think our first step in debugging performance issues seen on wifi would be anything other than grabbing a cable?


> Did you miss the "network engineer" part?

I get that you don't want to actually confront what I'm saying by pulling out a label and doing the appeal to authority routine, but that isn't real evidence or information.

You're still ignoring that two things can be true, but it doesn't mean they are both equal contributors when you take a birds eye overview of entire countries.

You also aren't explaining why wifi wouldn't be the primary bottleneck when you're surrounded by dozens of routers with shared walls and everyone is watching multiple video streams.

If you go into an apartment 10 floors up anywhere with internet you see dozens of wifi networks.


So sorry, you're right, wifi interference was slowing down the wired ethernet connection on the router that had no wifi.

/eot


It seems like you're taking a single instance and generalizing it to millions of people.


Not a network engineer by any measure - but I think if it was wifi contamination, it wouldn't get worse in the evenings. The routers are on 24/7. Thoughts?


That's why professionals don't call this "wifi contamination"; we call it spectrum utilization/congestion. The problem isn't the number of wifi APs: the impact of beacons (i.e. idle APs) is, while not zero, quite limited and only visible in extreme cases. The actual problem is traffic, which consumes available spectrum while being carried.

It's a factor of RF bandwidth, time and space. With some non-obvious parts:

- setting your TX power too high makes you consume spectrum in a larger area, harming your neighbors. Don't yank the TX power to maximum just because it "feels" like that should be better; there is no difference between MCS (= speed/rate) 11 with 10 dBm of headroom and MCS 11 with no headroom: you get ≈120 Mbit/s either way.

- conversely, using old APs, devices, or stretching the wifi connection too far consumes excessive spectrum since you'll get a bad data rate and use much more time to convey the same data. Due to this, a repeater can in fact improve performance for devices not even using it, by getting rid of low-MCS traffic.

- don't use wide channels when you don't need the performance. A 160MHz channel means 160MHz width of picking up interference. While chipsets are somewhat intelligent about this, if you're fine with ≈200 Mbit (single MIMO stream) there's no point in going wider than 40 MHz.

- multicast is death. It's a very common misconception that wifi requires you to send multicast traffic at the lowest possible rate. It doesn't, but almost all low-end implementations are lazy and do just that. "Lowest possible" in this case means the lowest rate the BSS is configured to support. If you have 802.11b enabled, that's generally 1 Mbit/s. Disable it, and you get the lowest 802.11n rate, which is ≈6.5 Mbit/s. If you need to deal with a lot of multicast, disabling some low MCSes might also be worth it to raise that even further, but then those MCSes are no longer available to cover far-away devices. Then again, you may not want that to begin with (see above).

- it's highly dependent on building characteristics; thick stone/steel walls block much more RF energy than drywall or wood.

- if you can, just use cables. If it doesn't help you, it might still help your neighbors.

So… yeah, a lot of people consume media in the evening, and that does make it much worse.

P.S.: MCS indices: https://mcsindex.com/
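The airtime point in the list above can be made concrete with a toy calculation (the rates are illustrative Mbit/s figures, ignoring protocol overhead):

```python
# Time on air to move the same payload at different wifi data rates.
# A far-away or legacy device at a low rate occupies the shared spectrum
# far longer than a nearby device at a high MCS.
def airtime_s(payload_mbit: float, rate_mbps: float) -> float:
    return payload_mbit / rate_mbps

payload = 100  # Mbit, e.g. a few seconds of high-bitrate video
fast = airtime_s(payload, 120)  # good signal, high MCS
slow = airtime_s(payload, 6.5)  # lowest 802.11n rate
print(f"slow client uses {slow / fast:.1f}x the airtime")  # ≈18.5x
```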


Thanks for that. I thought the AP being on 24/7 was enough, but it makes sense that actual traffic is what makes the difference.


> it wouldn't get worse in the evenings

Why not? People come home and start using their internet. They watch TV over wifi, use their PCs, watch videos on their phones, everyone uses what they have at the same time.

Thoughts?


I was going to respond on topic, but you might be a bit too snarky for my taste


Was the "snark" using your exact words?


640k ought to be enough for anybody?


Running speedtests to see that number go up. Mostly Hacker News though :D


The optionality of consuming services from places other than internet titans for one would be nice.


What exactly does that have to do with the bandwidth of one's home Internet connection?



