Hacker News

One /24 of IPs hammering on your website, each rate-limited to 2 rps, is still a combined ~500 rps. I’m not sure many sites can sustain that.
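The arithmetic behind that claim, for anyone checking: a /24 leaves 32 − 24 = 8 host bits, i.e. 256 addresses, so 2 rps per address adds up to roughly 500 rps in aggregate.

```python
# Back-of-the-envelope math: a /24 contains 2^(32-24) = 256 addresses,
# so a per-IP limit of 2 requests/second still allows ~500 rps combined.
addresses_in_slash24 = 2 ** (32 - 24)  # 256
per_ip_rps = 2
combined_rps = addresses_in_slash24 * per_ip_rps
print(combined_rps)  # 512
```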


For a public website? Well, if you don't have thousands of pages, the solution can be as simple as installing Varnish, which is good practice anyway. If you actually have enough unique paths for an unauthenticated botnet to saturate the cache, that's a bit more complicated.
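For reference, a minimal sketch of what that Varnish setup might look like; the backend host/port, the `/admin` path, and the 5-minute TTL are all assumptions, not a drop-in config:

```vcl
# Minimal default.vcl sketch for caching an unauthenticated public site.
vcl 4.1;

backend default {
    .host = "127.0.0.1";   # assumed app server address
    .port = "8080";
}

sub vcl_recv {
    # Strip cookies on public pages so requests are cacheable.
    if (req.url !~ "^/admin") {
        unset req.http.Cookie;
    }
}

sub vcl_backend_response {
    # Cache public pages for a few minutes even if the backend
    # doesn't send caching headers.
    if (bereq.url !~ "^/admin") {
        set beresp.ttl = 5m;
    }
}
```

With every public page served from cache, the botnet never reaches the application server, which is the point of the suggestion.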


Many of the sites hosted on Vercel, I suppose. If sites are hosted behind nginx/Varnish I’d be surprised if they couldn’t handle an order of magnitude more.


Yeah, the playbook for serverless is to target developers who don't know anything about infrastructure, lock them in with proprietary APIs, and then hit them with a huge bill once they have any real traffic.


If you're using nginx as a proxy, as the commenter above suggested, and you're serving static or cached pages (which most public pages should be), it can do over 10k RPS even on my N100 mini PC; the limit there is actually the 1 Gbit NIC.
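A rough sketch of that nginx caching-proxy setup; the upstream address, cache path, zone name, and TTLs are assumptions:

```nginx
# nginx as a caching reverse proxy for a public site.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=pages:10m
                 max_size=1g inactive=10m use_temp_path=off;

server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:8080;  # assumed app server
        proxy_cache pages;
        proxy_cache_valid 200 301 5m;
        # Serve a stale copy if the backend is down or busy,
        # which is exactly what helps under a request flood.
        proxy_cache_use_stale error timeout updating;
        # Collapse concurrent cache misses into one backend fetch.
        proxy_cache_lock on;
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```

`proxy_cache_lock` matters here: without it, a burst of misses for the same URL all hit the backend at once instead of waiting for a single fetch to populate the cache.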



