Yes and no. In this case you don't need a local VSCode instance to connect to your VSCode container, just a browser. It's more similar to self-hosted GitHub Codespaces, but I believe code-server was actually released first.
Personally I've fiddled enough with podman that that part of it doesn't bother me. I don't like Docker/Podman Desktop (which I think is the setup devcontainers assume) and just want to use the docker/podman daemon/CLI, so that's something else I don't want to have to mess with. I know you can also point devcontainers at a remote or local docker daemon, but that means giving it full control of the daemon, which is a nonstarter for me from a security standpoint. I like to have my podman containers really locked down in terms of capabilities, userns, gvisor, etc., and given that devcontainers just want full control of docker, I'm guessing they take none of that into consideration (or that it would be painful to graft on).
I also really don't trust VSCode's remoting protocols (especially remote tunnels, which are basically just a reverse shell). So for me it's a matter of having more control over container runtime security (the docker/podman ecosystem usually treats it as an afterthought in favor of convenience, and devcontainers appear to be no different), and not depending on anything on the machine I'm accessing it from. Since VSCode just becomes another self-hosted web app this way, it's pretty much the exact same experience no matter where I'm connecting from.
I personally think stopping the use of AI is silly. You could apply this logic to any specialized tool we use to make life easier.
Why would you use a car? Studies show walking is better.

Why would you use a calculator? Studies show mental math sharpens your brain.

Why would you type on a computer? Studies show writing on physical paper is better for you.
Fundamentally, people tend to use AI to get a result and then just use it without thinking critically about it. That makes sense; we take the path of least resistance. Why care about how something works when it just works? This leads to cognitive decline.
Rather, I propose that you use AI as one of the greatest search engines ever invented. Think of AI (LLMs) as a dataset of fixed knowledge that can talk back to you and actually help you find the information you want.
When I was looking into making the Selenium scripts I had to write for work more robust, I came up with a solution: create an object for each page and use that object to navigate and interact with the page. I then had a conversation with Gemini Pro about this, where it told me that what I was trying to achieve is just the Page Object Model (POM), a pre-existing concept in the world of test automation. This led to a deep dive into its advantages/disadvantages, and I then pitched it at work as a better way to write Selenium code, and it was accepted.
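For anyone unfamiliar with the pattern, here's a minimal sketch of the Page Object Model idea. The page names, locators, and method names are all illustrative (not from the original post); the point is that each page's locators and interactions live behind one object, so test code never touches raw selectors:

```python
class LoginPage:
    """Wraps all locators and interactions for a hypothetical login page.

    `driver` is any Selenium-style WebDriver exposing
    find_element(by, value); locators are (strategy, value) tuples.
    """

    # Locators live in one place, so a UI change means one edit here,
    # not a hunt through every test script.
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("id", "submit")

    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        """Fill in credentials, submit, and hand back the next page object."""
        self.driver.find_element(*self.USERNAME).send_keys(username)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
        return DashboardPage(self.driver)


class DashboardPage:
    """Page object for the page reached after a successful login."""

    GREETING = ("css selector", ".greeting")

    def __init__(self, driver):
        self.driver = driver

    def greeting_text(self):
        return self.driver.find_element(*self.GREETING).text
```

A test then reads as a sequence of user intentions (`LoginPage(driver).login(...)`) rather than a pile of element lookups, which is where most of the robustness win comes from.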
Likewise, you should not use AI to vomit code at you, but as a rubber duck that quacks back [0].
One difference with AI is that you become dependent on private corporations for an ever-changing tool.
A calculator from 1970 works the same as the one on your phone. A typewriter from 1930 has roughly the same layout as your laptop. The AI you use today will be different in 30-90 days. In addition, only 5-10 companies in the world can create the AI you use for work, versus decent competition in the regular computing space, with a FOSS operating system and kernel available.
Local AI is possible but not in the same league. To boot, these tools are VC-subsidised with no profitability plan. You are the product again.
2 companies make airplanes. Around 10 make cars. Entertainment comes from a handful of streaming services. 2 make a phone OS. I could go on.
I guess what I'm saying is that you're right. There's a limited number of AI providers. But just about any technology I use, or the appliances I buy, or transport, or whatever, is taken from a very small pool of suppliers.
So yeah, your argument is completely true, but it's also true for almost everything else.
There isn't a limited number of aircraft suppliers. My son built one at a public high school in Australia, and we (a small exploration company) flew 19 airframes for several decades, sourced from many suppliers.
In the same manner, there will not be a limited number of AI suppliers forever, and the hardware to train today's monster models will become affordable and available to more and more, smaller and smaller groups as time progresses.
Location: Chennai, India
Remote: Preferred
Willing to relocate: Yes, mostly if the pay is good
Technologies: Java, Python, applied AI/LLM applications, Golang, SQL, Linux administration, Docker, Git, JavaScript/TypeScript
Resume/CV: https://home.alles-tools.com/resume (please solve the captcha)
Email: suvarnanarayanan2g (at) gmail.com
I'm a backend developer for the most part, but I also do full-stack development if the UI isn't highly complex. Before I got laid off, I was at Autodesk.
I have quite a bit of experience with LLM-based applications: taming them into giving saner answers. The best one I've built so far lets you attach any data source (SQL/NoSQL/files, etc.) and then just ask questions about it. Literally any question (even outside the schema), and it never makes things up.