Hacker News

"at least not for the kind of things I want to do."

Can you share?

I have an experimental project where I was asking various LLMs/tools (ChatGPT, Cursor, Google, Lovable) to implement an old game for me. They all failed spectacularly in various ways. For example, when trying to debug an issue, they got into a loop, making the same set of mistakes over and over again. Or they "solved" a problem by removing the implementation, or claimed something was fixed when all they had done was stop checking for the error. It's been disastrous.

I've had better success with LLMs as just a supercharged search engine, but only after I went through several rounds of adding instructions to prevent hallucinations and lies.
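To give a flavor of what I mean by those instructions: a rough sketch of the kind of rules I ended up with (the exact wording, and where you put them, e.g. a Cursor rules file or ChatGPT custom instructions, will depend on your tool):

    - If you are not certain that an API, flag, or option exists,
      say so explicitly instead of guessing.
    - When citing a feature, name the documentation page or source
      where it is described.
    - Prefer answering "I don't know" over inventing a plausible-
      sounding answer.
    - Do not claim something is fixed or tested unless the fix and
      the test are shown in full.

Nothing magic about these particular rules; the point is that I had to iterate on them several times before the hallucination rate dropped to something usable.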

I also asked one to create a tutorial for me to follow for a complicated game I'm trying to understand. It lied repeatedly, making up features and telling me to set options that just didn't exist.

My boss loves LLMs and claims they have really improved his productivity, but the stuff he's talking about is all JavaScript. When he (or I, for that matter) tries to use them with Java, the quality of the results drops off dramatically.



