Their poll is just one example; it's not the only poll I picked. CNN has multiple polls listed on its site, and this is the first I found that gives the party breakdown.
My Claude can create and reuse a 100% deterministic script that generates invoices tailored exactly to my needs. No way any other tool can match that. You guys live in ancient times.
I was responding to GP, who stated "capitalism means ownership by investors", and then used that definition to make the argument that "it is the investors who ruin your product."
So in this context, the "theoretical fact" as you put it, is relevant.
More generally, I would argue that such things are relevant because we're all using words to communicate, and it helps a great deal if we all understand what those words and phrases actually mean.
I was talking about Actually Existing Capitalism, not hypothetical scenarios that don't obtain in the world we're subjected to. It is a diagnosis, not a theoretical law of nature.
When it comes to communicating, the meaning of a word is how it is used in practice. Use creates meaning. In practice, capitalism means ownership by investors. In the world that exists, the Capitalist Class is the Investor Class, and the dynamics created by this reality have caused non-hypothetical products to be ruined time and again.
Proprietorships and partnerships where the management is also the owner(s) are an older economic system that predates the innovations of modern capitalism. Even businesses that begin this way commonly seek investment capital, get bought out, or go public. The ones that don't, such as law firms, are considered stable and therefore uninteresting, but private equity will eventually gobble them up too.
If you look at the products that have avoided enshittification, they are the ones that have avoided or subjugated investors.
Paul Kinlan published a blog post a couple of days ago [1] with some interesting data showing that output tokens account for only 4% of token usage.
It's a pretty wide-reaching article, so here's the relevant quote (emphasis mine):
> Real-world data from OpenRouter’s programming category shows 93.4% input tokens, 2.5% reasoning tokens, and just 4.0% output tokens. It’s almost entirely input.
My own output token ratio is 2% (a 50% saving on the expensive tokens; I include thinking in this, which is often more). I have similar system prompt content for tone and output formatting.
Yes, but with prompt caching cutting the cost of input by 90%, and with output tokens not being cached and costing more than input tokens, what do you think that results in?
I think we still skew back to an insanely high input token ratio when you consider agentic loops. For example, when the tools I use do a web fetch, a search, or another tool call, it pulls in an incredibly high number of new input tokens.
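To put rough numbers on the caching question above, here's a minimal sketch of the cost arithmetic. All prices, the cache-hit rate, and the cache discount are illustrative assumptions (not any provider's real pricing); the token split uses the OpenRouter ratios from the quoted post.

```python
def cost_breakdown(
    input_tokens: int,
    reasoning_tokens: int,
    output_tokens: int,
    input_price_per_m: float = 3.00,    # assumed $/1M uncached input tokens
    output_price_per_m: float = 15.00,  # assumed $/1M output tokens
    cache_hit_rate: float = 0.90,       # assumed fraction of input served from cache
    cache_discount: float = 0.90,       # cached input assumed to cost 90% less
) -> tuple[float, float]:
    """Return (input_cost, output_cost) in dollars."""
    # Blend cached and uncached input into one effective per-token price.
    effective_input_price = input_price_per_m * (
        cache_hit_rate * (1 - cache_discount) + (1 - cache_hit_rate)
    )
    input_cost = input_tokens / 1e6 * effective_input_price
    # Reasoning tokens are assumed billed at the output rate, as on most current APIs.
    output_cost = (reasoning_tokens + output_tokens) / 1e6 * output_price_per_m
    return input_cost, output_cost

# 1M tokens split per the quote: 93.4% input, 2.5% reasoning, 4.0% output.
inp, out = cost_breakdown(934_000, 25_000, 40_000)
print(f"input: ${inp:.2f}  output+reasoning: ${out:.2f}")
```

Under these assumed numbers, output plus reasoning is only ~6.5% of tokens but ends up the larger share of the bill, which is the point of the question: token counts and dollar costs skew in opposite directions once caching kicks in.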
You haven't been using the LSP API then. There have also been multiple breaking changes over the last five years, including breaking compatibility with established default vim keybindings.
A documented breaking change does not mean the application is unstable.
The Neovim developers have been extremely clear that part of the process of getting to 1.0 is finalising the API, and that there will be breaking changes en-route.
I have never experienced this many breaking changes in stable software. There's a reason nvim still hasn't hit 1.0.
To be clear, it's fine to have breaking changes. Especially if you're working towards something substantial.
But nvim and its plugin ecosystem seem to be altogether too keen to change absolutely everything and adopt all bleeding edge developments. Even when a mature system would serve the purpose just as well.
We may mention them in `:help news-breaking` for visibility, but that's only because I don't care about pedantry. API breakage != UI changes (e.g. mappings).
You've picked a high bar there.