Hacker News

You decouple the workloads from human interaction (i.e., when you submit the job to the queue vs. when it is scheduled to execute), so that when they run is not a consideration, if possible. The economic incentives encourage solving this; where it can't be solved, providers bucket customer cohorts by their willingness (or unwillingness) to pay for access during peak times.
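The decoupling described above can be sketched as a priority queue: submission time and execution time are separated, and customers who pay for peak access are scheduled ahead of jobs deferred to off-peak. This is a minimal illustrative sketch; all class and field names here are hypothetical, not any provider's actual scheduler.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Job:
    priority: int                       # 0 = paid peak access, 1 = defer to off-peak
    submitted_at: int                   # tie-breaker: earlier submissions run first
    payload: str = field(compare=False) # job contents don't affect ordering

class JobQueue:
    """Toy queue decoupling job submission from scheduling."""
    def __init__(self):
        self._heap = []

    def submit(self, job):
        heapq.heappush(self._heap, job)

    def next_job(self):
        return heapq.heappop(self._heap)

q = JobQueue()
q.submit(Job(priority=1, submitted_at=1, payload="deferred batch eval"))
q.submit(Job(priority=0, submitted_at=2, payload="paid peak request"))
assert q.next_job().payload == "paid peak request"  # peak payer runs first
```

A real scheduler would also weigh queue age and capacity forecasts, but the core idea is the same: the heap ordering, not the human's submit time, decides when work runs.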



Sure, but if I ask the LLM a question, I'd like it to respond now, instead of tonight.

Certainly, interactive workloads aren't realistic candidates for time shifting, but agentic coding likely is. Package everything up and ship it as a job, and get a bundle back asynchronously.

I don't know, my agentic coding is pretty interactive. Maybe once the plan is done, sure. That would be interesting, though OpenAI already does this with batch workloads.
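The batch-workload pattern mentioned above can be sketched as follows, assuming the general shape of OpenAI's Batch API: each line of a JSONL file is one request, the file is uploaded, and results come back asynchronously within a completion window. The prompts and `custom_id` values here are made up for illustration.

```python
import json

def make_batch_line(custom_id, prompt, model="gpt-4o-mini"):
    """Build one JSONL line targeting the /v1/chat/completions endpoint."""
    return json.dumps({
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    })

# Package the non-interactive work up as a batch of requests.
lines = [make_batch_line(f"task-{i}", p)
         for i, p in enumerate(["refactor module A", "write tests for B"])]
batch_jsonl = "\n".join(lines)

# Submitting would then look roughly like this (requires an API key, not run here):
#   from openai import OpenAI
#   client = OpenAI()
#   batch_file = client.files.create(file=open("requests.jsonl", "rb"),
#                                    purpose="batch")
#   client.batches.create(input_file_id=batch_file.id,
#                         endpoint="/v1/chat/completions",
#                         completion_window="24h")
```

The trade-off matches the thread: you give up interactivity (responses arrive within the window, not immediately) in exchange for off-peak pricing.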



