
> This is the dumbest, most pointless military conflict in American history.

You've picked a high bar there.


Genuine question: is Fox News a credible, unbiased source for these figures in particular?

Their poll is, and this is not the only poll I picked. CNN has multiple polls listed on its site, and this is the first I found which gives the party breakdown.

I believe the phrase you're reaching for is "carrying out extra-judicial executions of civilians alleged to have been smuggling drugs."

Or, shorter: NAZIS!

My clients love non-deterministic invoices.

But if they use "AI" to approve them, you can pray that your "AI" generates an invoice for 100x the amount owed and their "AI" pays!

My Claude can create and reuse a 200% deterministic script that generates invoices tailored exactly to my needs. No way any other tool can match that. You guys live in ancient times.

lol, exactly xd

I believe capitalism is defined by private ownership (not necessarily by investors) of the means of production.

Cool, but how is this theoretical fact relevant to the real world?

I was responding to GP, who stated "capitalism means ownership by investors", and then used that definition to make the argument that "it is the investors who ruin your product."

So in this context, the "theoretical fact" as you put it, is relevant.

More generally, I would argue that such things are relevant because we're all using words to communicate, and it helps a great deal if we all understand what those words and phrases actually mean.


I was talking about Actually Existing Capitalism, not hypothetical scenarios that don't obtain in the world we're subjected to. It is a diagnosis, not a theoretical law of nature.

When it comes to communicating, the meaning of a word is how it is used in practice. Use creates meaning. In practice, capitalism means ownership by investors. In the world that exists, the Capitalist Class is the Investor Class, and the dynamics created by this reality have caused non-hypothetical products to be ruined time and again.

Proprietorships and partnerships where the management is also the owner(s) are an older economic system that predates the innovations of modern capitalism. Even businesses that begin this way commonly seek investment capital, get bought out, or go public. The ones that don't are considered stable and therefore uninteresting, such as law firms, but private equity will eventually gobble them up too.

If you look at the products that have avoided enshittification, they are the ones that have avoided or subjugated investors.


If you don't use any AI assistance when coding I suspect you're already in the minority.

If you refuse to use software that was built with any AI assistance, well... good luck finding an operating system to run.



And we know they're right, because that lawyer signed a contract on TV saying he'd be liable if they were wrong.

Paul Kinlan published a blog post a couple of days ago [1] with some interesting data showing that output tokens account for only 4% of token usage.

It's a pretty wide-reaching article, so here's the relevant quote (emphasis mine):

> Real-world data from OpenRouter’s programming category shows 93.4% input tokens, 2.5% reasoning tokens, and just 4.0% output tokens. It’s almost entirely input.

[1]: https://aifoc.us/the-token-salary/


My own output token ratio is 2% (a 50% saving on the expensive tokens; I include thinking in this, which is often more). I have similar tone and output-formatting content in my system prompt.

That's actually useful to know and it aligns with what I see (I wrote the cost post)

Yes, but with prompt caching decreasing the cost of the input by 90%, and with output tokens not being cached and costing more, what do you think that results in?

However, output tokens are 5-10 times more expensive, so it ends up a lot more even on price.

Even more than that in practice once you factor in prompt caching

I think we still skew back to an insanely high input token ratio when you consider agentic loops. For example, when I see the tools I use do a web fetch or a search or other tool use, it's an incredibly high number of new input tokens.
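The back-of-envelope arithmetic in this subthread can be sketched out. This is a rough illustration only: the token shares come from the quoted OpenRouter data, but the relative prices, the cache discount, and the cached fraction are all assumptions, not any provider's actual pricing.

```python
# Hypothetical cost breakdown: input dominates token *count*,
# but pricier output tokens and cached input shift the *cost* balance.

# Token shares from the quoted OpenRouter data (percent of total tokens).
input_tokens = 93.4
reasoning_tokens = 2.5   # reasoning is typically billed as output
output_tokens = 4.0

# Assumed relative prices per token (illustrative, not real pricing).
input_price = 1.0
output_price = 7.5       # "5-10 times more expensive": midpoint assumed
cache_discount = 0.9     # assume cached input is discounted 90%
cached_fraction = 0.8    # hypothetical share of input served from cache

effective_input_price = input_price * (1 - cache_discount * cached_fraction)
input_cost = input_tokens * effective_input_price
output_cost = (reasoning_tokens + output_tokens) * output_price
total = input_cost + output_cost

print(f"input share of cost:  {input_cost / total:.0%}")
print(f"output share of cost: {output_cost / total:.0%}")
```

Under these assumed numbers, the roughly 6.5% of tokens that are output/reasoning end up accounting for the majority of the bill, which is the point being made about caching tipping the balance even further.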

I've found Neovim to be remarkably stable, even when building from main.

You haven't been using the LSP API then. There have also been multiple breaking changes over the last five years, including breaking compatibility with established default vim keybindings.

A documented breaking change does not mean the application is unstable.

The Neovim developers have been extremely clear that part of the process of getting to 1.0 is finalising the API, and that there will be breaking changes en-route.


I have never experienced this many breaking changes in stable software. There's a reason nvim still hasn't hit 1.0.

To be clear, it's fine to have breaking changes. Especially if you're working towards something substantial.

But nvim and its plugin ecosystem seem to be altogether too keen to change absolutely everything and adopt all bleeding edge developments. Even when a mature system would serve the purpose just as well.


Changing default mappings is not a "breaking" change.

It is. And iirc, neovim themselves mark them as such.

We may mention them in `:help news-breaking` for visibility, but that's only because I don't care about pedantry. API breakage != UI changes (e.g. mappings).

API breakage != UI breakage, yes, ofc. Because API != UI.

But the UI is also an interface, and the user is part of the total system. That system interface is broken if you change default mappings.

It doesn't matter if the interfacing component is software or a user.


I would argue that ambiguity and uncertainty slow down reading, and more importantly comprehension, far more than a few additional characters.

It depends on whom you are optimizing for. Someone who knows the language, but not this system/codebase, or someone who works in this area often?
