For a public website? If you don't have thousands of pages, the solution is as simple as installing Varnish, which is good practice anyway. If you actually have enough unique paths for an unauthenticated botnet to saturate the cache, that's a bit more complicated.
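For context, a minimal Varnish setup really is just a short VCL file in front of the app server. This is a sketch only; the backend port and the `/admin`/`/login` cookie exceptions are assumptions you'd adjust for your site:

```vcl
# Sketch of a minimal default.vcl (Varnish 4.x+ syntax).
# 127.0.0.1:8080 and the /admin|/login paths are placeholder assumptions.
vcl 4.1;

backend default {
    .host = "127.0.0.1";
    .port = "8080";   # your app server
}

sub vcl_recv {
    # Strip cookies on public pages so responses are cacheable
    if (req.url !~ "^/(admin|login)") {
        unset req.http.Cookie;
    }
}

sub vcl_backend_response {
    # Give uncacheable-by-default responses a short TTL as a floor
    if (beresp.ttl <= 0s) {
        set beresp.ttl = 5m;
    }
}
```

Even a short TTL like this means a flood of requests for the same pages hits the backend roughly once per TTL window instead of once per request.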
Yeah, the playbook for serverless is to target developers who don't know anything about infrastructure, lock them in with proprietary APIs, and then hit them with a huge bill once they have any real traffic.
If you're using nginx as a proxy like the commenter above suggested, and you're serving static/cached pages (which should cover most public pages), it can do over 10k RPS even on my N100 mini PC (the limit there is actually the 1 Gbit NIC).
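For anyone wanting to try this, a micro-cache in nginx is only a few directives. A hedged sketch; the cache path, backend port, and TTLs here are assumptions, not a tuned production config:

```nginx
# Sketch of an nginx micro-cache in front of an app server.
# Paths, port, and TTLs are placeholder assumptions.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=microcache:10m
                 max_size=1g inactive=10m use_temp_path=off;

server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:8080;   # your backend
        proxy_cache microcache;
        proxy_cache_valid 200 301 302 1m;   # short TTL keeps content fresh
        # Serve stale entries while one request refreshes the cache,
        # so a thundering herd never reaches the backend
        proxy_cache_use_stale updating error timeout;
        proxy_cache_lock on;
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```

With `proxy_cache_lock on`, concurrent misses for the same URL collapse into a single backend request, which is what lets a small box absorb bot traffic at those RPS numbers.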