I could get behind some of this hate directed at Vercel’s CEO or even Cursor’s, but Deno is sort of like a breath of fresh air amid the myriad of parasitic tech out there. Still, why so much hate? Who hurt you? What’s going on?
Folks labeling this "AI generated" might be jumping the gun, considering OP described the process as taking him the last couple of months and then some.
Call it what you want, but I think this sits better with "AI assisted", and really well supervised at that, full of the human intent behind it. Then again, labels are strange: we call algorithmic and synthesizer-assisted music "electronic" music these days, and we still praise musicians who take the time, through endless Moog / Ableton fine-tuning sessions, to find the perfect loop patterns for their craft.
I could definitely feel the connection between the human author side of this post, thank you for sharing it!
> we still praise musicians who take the time through endless Moog / Ableton fine-tuning sessions to find the perfect loop patterns for their craft.
There are still plenty of purists who will not consider this a "craft". But it's always been that way. The electric guitar itself was a controversial transition in music. Bob Dylan was famously criticized heavily for going electric.
But that was a long time ago, and people got over it. And they will again this time.
Dylan going electric was not about instrument choices. It was about abandoning the radical folk music tradition that Seeger, Guthrie, etc. had revived.
Bingo. The problem with this take is that the people pissing and moaning in the mid-'60s were right. Early Dylan sounds good. The texture of an acoustic guitar draws focus to songcraft and away from objectively bad execution. Dylan's vocals were always bad, but they went from charmingly bad to just-plain-bad with the transition to electric. The bigger sound was not flattering for him. With 60 years of hindsight, folk still remains a largely acoustic genre because the sound is flattering to the rest of the genre too. That isn't to say that all folk should be acoustic, it's just that you have to come correct otherwise. I find later Dylan annoying despite loving his early records, and I was born 30 years after everyone stopped caring.
The only consensus among serious music fans is that there is no consensus among serious music fans. Source: me, serious music fan.
A lot of things about Dylan got empirically better throughout the '70s, I'll give you that. Deeper concepts, more challenging structure, yada yada yada.
The problem is that I don't decide what I listen to based on anything empirical. If I'm standing around thinking "man, I want to listen to Bob Dylan today," I'm thinking of Freewheelin'. You could say "well that's just you," but we both know it isn't. A third group probably thinks of Highway 61 or something.
Same thing goes for a lot of artists. Master of Puppets is the best Metallica album empirically, but if I'm thinking "gee I want to listen to Metallica today," I'm playing Ride the Lightning, or And Justice for All.
In any case, I think all of this subjectivity might suggest that Dylan going electric was a bad comparison for AI generated art, lol.
I can't tell someone not to listen to their favourite Dylan, but in reality we serious music fans often do decide based on empirical factors; we just don't think about it. The "deeper concepts, more challenging structure, yada yada yada" are all things that make the music more interesting and satisfying for your brain, and that's why we keep going back to them.
Some people appreciate simplicity more, or folk more than rock, but many people, with Dylan, just like the social commentary more than the music itself.
Ghost produced isn't a good parallel here. The "ghost" in ghost produced comes from the NDA when acquiring a track from a different producer, who, most often, is human.
My impression was that the "ghost" comes less from the formal NDA and more from the fact that somebody else produces/writes the work and is uncredited for doing so. Then, the "author"/"producer" passes it off as their own work.
Yeah, that's true, plus the fact that it is a work for hire where the agreement implies the original author cannot claim credit, hence, they turn into a ghost.
I think the difference here is that the AI isn't a "work for hire" setup, it's more like a tool. It would be closer to buying algo sample packs, using Apple's Logic Pro AI drummer for part of the work, or other drum machines for example, and working your way around to glue them together into a composition.
The parallel would need to be between "tool" <—> "composition", rather than "author" <—> "composition" imo.
Maybe it's because I think your comment throws away a lot of relevant context from OP's submission on HN.
He says he spent months on this piece and then some, I think it's safe to assume here that this was well supervised, guided, thoughtful and full of human intent despite the AI-assisted part.
In short, I think calling it "AI generated" takes all the human effort that went into these months and the ingenious creativity of OP towards crafting this piece!
Reading it, I get the feeling the author worked the story the way Tom Hartmann works those agricultural machines. The AI gave input, but the author was tweaking it with human knowledge and wisdom.
Yes, although what I think is different in this setup here is the OpenShell gateway override, as they mention:
> NemoClaw installs the NVIDIA OpenShell runtime and Nemotron models, then uses a versioned blueprint to create a sandboxed environment where every network request, file access, and inference call is governed by declarative policy. The nemoclaw CLI orchestrates the full stack: OpenShell gateway, sandbox, inference provider, and network policy.
I think this means you get a true proxy layer with a network gateway that lets you stop in-flight requests with policies you define, so it's not just their hardware but the combination of it plus the OpenShell gateway and network policies.
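To make the "stop in-flight requests with policies you define" idea concrete, here's a minimal sketch of what a declarative egress policy and first-match evaluator could look like. This is purely illustrative; the rule format, field names, and default-deny behavior are my assumptions, not NemoClaw's or OpenShell's actual policy schema.

```python
import fnmatch

# Hypothetical policy format (NOT the real OpenShell schema): rules are
# checked in order, first match wins, with an explicit catch-all deny.
POLICY = [
    {"action": "allow", "host": "api.anthropic.com", "methods": {"POST"}},
    {"action": "allow", "host": "*.pypi.org",        "methods": {"GET"}},
    {"action": "deny",  "host": "*",                 "methods": {"*"}},
]

def evaluate(host: str, method: str) -> bool:
    """Return True if the gateway should let the request leave the sandbox."""
    for rule in POLICY:
        host_match = fnmatch.fnmatch(host, rule["host"])
        method_match = method in rule["methods"] or "*" in rule["methods"]
        if host_match and method_match:
            return rule["action"] == "allow"
    return False  # unreachable with the catch-all rule; kept as a safety net
```

The gateway would call something like `evaluate()` on every intercepted request (after TLS decryption), which is what gives you a real enforcement surface instead of trusting the agent to behave.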
I also think the reason they are doing this is to try and get some moat around these one-click deployments and leverage their GPU-for-rent type of thing, instead of having you go buy a Mac Mini and learn "scary" stuff (remember, the user market here is pretty strange lol).
I like how these companies will name their products OpenShell or OpenVINO or whatever, as if anyone else will ever contribute to them beyond bugfixes. The message is "Come use and contribute to our OPEN ecosystem (that conspicuously only works on our hardware)! Definitely no vendor lock-in here!"
It's not something like Mesa. It's open source in the same way Chromium or Android is open source: a single company is the major contributor and decides the architecture and direction the whole ecosystem will go.
What are the odds that Intel would ever use any of this open source Nemo stuff, or vice versa? If they did, it would be a complete rewrite that favors their own hardware ecosystem and reverses the lock-in effect. When you write code that integrates with it, you're writing an interface for one company's hardware. It's not a common interface like Vulkan. I call it the CUDA effect.
Right, the gateway layer is the genuinely interesting part. Intercepting every outbound network call before it leaves the sandbox gives you a real enforcement surface, not just "trust the app to behave". The problem is that the threat model is still inverted for the security critics in this thread: the agent is the client, so the dangerous calls are the ones going out to your authenticated services (Gmail, Slack, whatever), and a gateway that filters those is only as good as your policy definitions. One misconfigured rule and you're back to square one.
The GPU rental angle makes total sense too. This is basically Nvidia saying "don't buy Mac Mini, rent ours" wrapped in enough infrastructure glue to make it feel like a platform.
OpenShell is the gem here indeed. A lot of good ideas, like a network sandbox that does TLS decryption and a policy engine to set the rules. However:
> Credentials never leak into the sandbox filesystem; they are injected as environment variables at runtime.
The LLM will easily leak these credentials out. So the creds should be outside the sandbox, and the only thing the sandbox should see is a connection API that opens a socket/file handle.
Alternatively, where it needs an API key, it should be one bound to the endpoint using it, e.g. the way a ticket-granting ticket is used to create a bound ticket.
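The endpoint-bound key idea can be sketched with HMAC-based key derivation: a master secret (the "ticket-granting ticket") derives a per-endpoint key, so a key leaked by one endpoint is useless against any other. This is an illustrative construction of the general pattern, not any specific protocol.

```python
import hashlib, hmac

# Held only by the credential service, never handed to any endpoint.
MASTER_SECRET = b"master-secret"

def bound_key(endpoint: str) -> bytes:
    """Derive a key that is only valid for one named endpoint."""
    return hmac.new(MASTER_SECRET, endpoint.encode(), hashlib.sha256).digest()

def sign(endpoint: str, payload: bytes) -> bytes:
    """Authenticate a request using the endpoint-bound key."""
    return hmac.new(bound_key(endpoint), payload, hashlib.sha256).digest()

def verify(endpoint: str, payload: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(endpoint, payload), tag)

tag = sign("gmail.googleapis.com", b"GET /messages")
```

Because the tag was derived for `gmail.googleapis.com`, replaying it against a different endpoint fails verification, which is exactly the containment you want if the agent leaks it.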
A copy on write filesystem would be an interesting way to sandbox writes, but there is difficulty in checking the diff.
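To show both halves of that point, here's a toy overlayfs-style copy-on-write layer: all writes land in an upper layer over a read-only base, and deletions become whiteout markers. The naive "diff" is just the upper layer, which illustrates the difficulty: it can't distinguish a real change from a file rewritten back to its original contents, and deletes need special handling.

```python
WHITEOUT = object()  # marker that hides a lower-layer file (overlayfs-style)

class CowFS:
    """Toy copy-on-write overlay; not a real filesystem, just the idea."""
    def __init__(self, lower: dict):
        self.lower = dict(lower)  # read-only base layer
        self.upper = {}           # every write lands here

    def read(self, path: str) -> str:
        if path in self.upper:
            if self.upper[path] is WHITEOUT:
                raise FileNotFoundError(path)
            return self.upper[path]
        return self.lower[path]

    def write(self, path: str, data: str) -> None:
        self.upper[path] = data

    def delete(self, path: str) -> None:
        self.upper[path] = WHITEOUT  # whiteout instead of touching the base

    def diff(self) -> dict:
        """Naive diff = the upper layer, whiteouts and all. Note it also
        flags files rewritten back to their original contents."""
        return self.upper

fs = CowFS({"/etc/hosts": "127.0.0.1 localhost"})
fs.write("/workspace/out.txt", "result")
fs.delete("/etc/hosts")
```

The sandbox's writes are fully contained in `upper`, but reviewing `diff()` meaningfully (renames, no-op rewrites, whiteouts) is where the real work is.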