Hacker News

LLM inference is much more geographically fungible than electricity, so maybe routing it around isn't worth the complexity yet, and there is enough latency-insensitive load on average globally.



