> I get what you're trying to say, though. The less world knowledge you provide the LLM, which it otherwise lacks, the worse its outputs will be

... No, that wasn't what I was trying to say at all. I'm saying that the tokens an LLM produces seem to work much worse as inputs than the tokens a human would produce, regardless of what they actually appear to say.


