I'm surprised Python is on that list. TypeScript doesn't seem like a terrible choice, as it can leverage a vast ecosystem of packages, has concurrency features, a solid type system, and decent performance. C++ lacks as robust a package ecosystem, and Python doesn't have inbuilt types, which makes it a non-starter for larger projects for me. Rust would have been a great choice for sure.
Python and C++ have been used for countless large projects, each for many more than TypeScript. It's all about trade-offs that take into account your tasks, the coders available at the project's commencement, environment, etc.
People like to put companies that are household names on pedestals, but the choices they make are mostly guided by what their people can do and which choices give them the most value for free. They mostly operate the way smaller companies do, but with a bigger R&D budget to address issues like scale that the larger market has little incentive to solve.
Also, this product is like a year old… it has barely hit its teething phase. I wouldn’t be surprised if the core is still the prototype someone whipped up as a proof of concept.
I reckon some believe these companies are basically magical, and are utterly astonished when they’re shown to be imperfect in relatively uninteresting ways. I’m a lot more concerned about the sanity of the AI ecosystem they operate in than the stability of some front-end Anthropic made.
I mostly mentioned it because it is pre-installed on some (Linux) systems. Though of course if you're trying to obfuscate the source code, you need to bundle an interpreter with the code anyway.
But it has historically been used for big programs, and there are well-established methods for bundling Python programs into executables (e.g., PyInstaller or Nuitka).
They have an annoying sandbox issue which pollutes your repository root with a set of empty files. Not the cleanest tool, but the paradigm is a big upgrade over previous AI coding tools.
Anthropic acquired Bun. Clearly, Bun is not a runtime for C++, Rust, or Python. For an engineering project, strongly typed TypeScript was basically the only possible choice for them.
Is Anthropic's acquisition of Bun alone still not enough to infer their tech stack? What more obvious signals would be needed?
Also, honestly, given the speed constraints of large models, it makes almost no difference what language an agent is written in. The small performance differences between programming languages do not even begin to matter compared with network latency, let alone the speed at which a large model streams tokens.
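To put that in rough numbers (every figure below is an illustrative assumption, not a measurement): a model streaming at tens of tokens per second dominates wall-clock time, while the per-token overhead of even a "slow" glue language is microseconds. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope comparison (all numbers are illustrative assumptions,
# not benchmarks): time spent waiting on a model stream vs. time spent in
# the agent's own glue code.

TOKENS_PER_SECOND = 75             # assumed model streaming rate
RESPONSE_TOKENS = 1_000            # assumed size of one model response
GLUE_OVERHEAD_PER_TOKEN_S = 5e-6   # assumed extra per-token cost of a slower language

stream_time_s = RESPONSE_TOKENS / TOKENS_PER_SECOND        # time spent streaming
glue_time_s = RESPONSE_TOKENS * GLUE_OVERHEAD_PER_TOKEN_S  # time spent in glue code

# Language overhead as a fraction of total wall-clock time:
overhead_fraction = glue_time_s / (stream_time_s + glue_time_s)
print(f"streaming: {stream_time_s:.1f}s, glue: {glue_time_s * 1000:.1f}ms, "
      f"overhead share: {overhead_fraction:.5%}")
```

Under those assumptions the glue-code overhead is well under a tenth of a percent of total time, which is the commenter's point: network latency and token streaming swamp it.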
The flip side of that is now you're running old software and CVEs get published all the time. Threat actors actively scan the internet looking for software that's vulnerable to new CVEs.
Not all distributions work with a staging repository, and it's not really intended for this purpose either.
Besides, there's always a way to immediately push a new version to stable repositories. You have to, in order to deal with regressions and security fixes.
I know not all do, but Debian/Ubuntu/Fedora do, and while the intended purpose of multi-stage releases is not necessarily security but stability, it still helps with security too, because third parties can inspect and scan packages while they are not yet in stable.
Most of the supply-chain vulnerabilities that ended up in npm would have been mitigated by mandatory testing/stable branches. Of course there needs to be some way to skip testing, but that should be rare, cumbersome, and audited, as it is in Linux distributions too.
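As a toy sketch of that staged-release gate (the names and numbers here are made up for illustration, not any distribution's actual policy): a version is only promoted to stable after a minimum soak period in testing, which gives third parties a window to scan it, with an audited fast-track for security fixes.

```python
from datetime import datetime, timedelta

# Toy model of a testing -> stable promotion gate (illustrative only;
# real distributions have far more elaborate migration policies).
SOAK_PERIOD = timedelta(days=10)  # assumed minimum time a version must spend in testing

def promotable(entered_testing: datetime, now: datetime,
               audited_skip: bool = False) -> bool:
    """A version reaches stable only after soaking in testing,
    unless an audited emergency skip (e.g. a security fix) applies."""
    if audited_skip:
        return True
    return now - entered_testing >= SOAK_PERIOD

now = datetime(2024, 6, 20)
fresh = datetime(2024, 6, 19)   # uploaded yesterday, e.g. a compromised release
soaked = datetime(2024, 6, 1)   # has been sitting in testing for weeks

print(promotable(fresh, now))                     # False: still soaking
print(promotable(soaked, now))                    # True: soak period elapsed
print(promotable(fresh, now, audited_skip=True))  # True: audited fast-track
```

The point of the soak window is exactly the mitigation described above: a freshly uploaded malicious version sits in testing long enough for scanners and humans to notice it before any stable user installs it.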
Is "AUR" now just how we name unaudited software repositories?
Just to note, if we're talking about Linux distributions: there's also COPR in Fedora, OBS for openSUSE (and a bunch of other stuff; OBS is awesome), and Ubuntu has PPAs. I'm sure there are many more similar solutions.