Hacker News
lxgr | 7 days ago | on: Claude March 2026 usage promotion
LLM inference is much more geographically fungible than electricity, so maybe it's just not worth the complexity yet, and there is enough (not highly latency-sensitive) load on average globally.