Hacker News | btown's comments

This would be true if the algorithm changes were limited to for-you feeds. But the larger problem is that the set of people willing to pay for X is boosted in replies. So if that set of people, which tends towards a certain political bias, is hostile towards a poster, that poster will be driven away from posting on X.

The net result is that X's breaking news is skewed, in the same way that the (infamous) survivorship-bias meme of bullet holes marked on the WWII plane only shows part of the story - the people who have departed the platform aren't posting, and thus X is only breaking news from a subset of people.

This might be fine for certain types of topics. For understanding the zeitgeist on culture and politics, though, you can't filter your way towards hearing from voices that are no longer posting at all.


I don't care about culture and politics on X; in fact, it's something I actively block. By discussion I mean tech news and trends, i.e. how someone is using the latest AI model or what new project was created, that sort of stuff. The people I follow provide me that, not politics. If you're there for politics, then I agree with your point: look elsewhere.

To be sure, the problem isn't that the plugin injects behavior into the system prompt - that's every plugin and skill, ever.

But this is just such a breach of trust, especially the on-by-default telemetry that includes full bash commands. Per the OOP:

> That middle row. Every bash command - the full command string, not just the tool name - sent to telemetry.vercel.com. File paths, project names, env variable names, infrastructure details. Whatever’s in the command, they get it.

(Needless to say, this is a supply chain attack in every meaningful way, and should be treated as such by security teams.)

And the argument that there's no room in a CLI flow for opt-in telemetry is absurd: their readme https://github.com/vercel/vercel-plugin?tab=readme-ov-file#i... literally has you install the Vercel plugin by calling `npx` with https://www.npmjs.com/package/plugins, which is written by a Vercel employee and could add this opt-in at any time.
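For illustration only (the actual installer is a Node CLI, and all names and paths below are hypothetical), an opt-in telemetry flow really is just a few lines in any language - a first-run prompt plus a persisted choice that defaults to off:

```python
import json
import os
from typing import Optional


def parse_consent(answer: Optional[str]) -> bool:
    """Interpret a first-run prompt response; None means non-interactive (CI), defaulting OFF."""
    if answer is None:
        return False
    return answer.strip().lower() in ("y", "yes")


def save_consent(path: str, enabled: bool) -> None:
    """Persist the choice so the user is asked exactly once."""
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        json.dump({"telemetry": {"enabled": enabled}}, f)


def load_consent(path: str) -> bool:
    """Opt-in semantics: a missing or malformed config keeps telemetry off."""
    try:
        with open(path) as f:
            return bool(json.load(f)["telemetry"]["enabled"])
    except (OSError, KeyError, ValueError):
        return False
```

An installer would call `parse_consent(input("Share anonymous telemetry? [y/N] "))` on first run and persist the result with `save_consent`; the point is only that such a prompt fits comfortably in a terminal install flow.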

IMO Vercel is not a good actor. One could make a good argument that they've embrace-extend-extinguished the entire future of React as an independent and self-contained foundational library, with the complexity of server-side rendering, the undocumented protocols that power it, and the resulting tight coupling to their server environments. Sadly, this behavior doesn't surprise me.

EDIT: That `npx plugins` code? It's not on GitHub; it exists only on npm. And as of v1.2.9 of that package, if you search https://www.npmjs.com/package/plugins?activeTab=code it literally already sends telemetry to https://plugins-telemetry.labs.vercel.dev/t, on an opt-out basis! I mean, you almost have to admire the confidence.


I’ll just say that as someone who was on the React team throughout these years, the drive to expand React to the server and the design iteration around it always came from within the team. Some folks went to Vercel to finish what they started with more solid backing than at Meta (Meta wasn’t investing heavily into JS on the server), but the “Vercel takeover” stories that you and others are telling are lies.

Similarly, one of the great things about Python (less so JS with the ecosystem's habit of shipping minified bundles) is that you can just edit source files in your site_packages once you know where they are. I've done things like add print statements around obscure Django errors as a poor imitation of instrumentation. Gets the job done!
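As a minimal sketch of that workflow (using the stdlib `json` package as a stand-in for Django), finding where an installed package's source lives before editing it in place:

```python
import importlib.util

# Locate the on-disk source of an installed package so you can edit it directly.
spec = importlib.util.find_spec("json")  # swap in "django" for the scenario above
print(spec.origin)  # absolute path to the package's __init__.py in site-packages (or the stdlib)
```

From there it's just opening that path in an editor and sprinkling `print()` calls around the error site.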

Benchmarks are meaningless until the pelican benchmark comes out: https://simonwillison.net/

> ended up enabling groq

For those reading fast: this isn't a reference to xAI's Grok, this is Groq.com - with its custom inference chip, and offerings like https://groq.com/blog/introducing-llama-3-groq-tool-use-mode... and https://console.groq.com/landing/llama-api


Really liked Groq for its speed, but it seems like after Nvidia bought it, it has been discontinued...

Right? This feels like an "arms race" similar to scraping vs. anti-scraping: countermeasures will be developed, likely by actors entirely disconnected from what you're doing and aimed at blocking something else in the ecosystem, and you'll need to re-engineer your approach entirely. Rinse and repeat.

(The amount of innovation in anti-anti-scraping that's resulted from "sneaker bots" - automated scalping of limited-edition shoe releases - is astounding, and somewhat relevant here in how an environment can become adversarial in ways that impact broad ecosystems. I suppose the equivalent here would be environmental ads that seek to penetrate noise-cancellation in a similar way.)

I suppose, though, that all this is good news for a company that wants to turn your bicycle bell into a subscription product!


I don't see why this would become an "arms race." There's no particular competitive value in filtering out this ONE sound.

I think there's a broader indication of an arms race between noise cancellation systems and things that want to be heard, like advertising. And this just-happening-to-exist bandpass that the DuoBell is depending on could easily become collateral damage in that fight.

I was going to make a joke about advertisers working some kind of ultrasonic modulation into their audio so it breaks ANC (I'm aware this wouldn't really work), but then I thought: what's more likely, advertisers doing that, or advertisers partnering with 80% of ANC chip makers to just let them bypass with specific tone markers...

Then we'll be hacking our headphones with 3D-printed clip-ons involving a particular brand of coffee filter that happens to attenuate the "clear freq" enough for the headphones to miss it.
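Nobody outside these hypothetical chip makers knows what a "clear freq" marker would actually look like, but detecting one fixed marker tone (or failing to, once it's attenuated) is computationally cheap. A stdlib-only sketch using the Goertzel algorithm, which measures the power at a single frequency bin:

```python
import math


def goertzel_power(samples, sample_rate, target_hz):
    """Relative power at one frequency bin (Goertzel algorithm, single-bin DFT)."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2


# Synthetic check: a 1 kHz "marker" tone mixed with ordinary audio content vs. the content alone.
rate = 16000
t = [i / rate for i in range(1024)]
marker = [0.5 * math.sin(2 * math.pi * 1000 * s) + 0.5 * math.sin(2 * math.pi * 300 * s) for s in t]
no_marker = [0.5 * math.sin(2 * math.pi * 300 * s) for s in t]

has_tone = goertzel_power(marker, rate, 1000) > 10 * goertzel_power(no_marker, rate, 1000)
```

A bandpass clip-on that knocked the marker down far enough would push that power ratio below whatever threshold the chip uses, which is exactly the cat-and-mouse scenario above.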


"Dad, why do our coffee filters advertise that they can run fast Fourier transforms?"

"Well, kid, back in the year 2026, there was this bicycle bell..."


Are the GNOME features planned to be ported to macOS? Frozen columns and cancelable queries are pretty vital things!

Yes. Definitely, those two will be ported soon, among some others.

Shipped those two today.

PWAs were more than an experiment - they were even mentioned in Apple keynotes (IIRC). And PWA sandboxing was every bit as stable as ordinary website sandboxing.

They were killed because app store operators realized PWAs bypassed their ability to police payments - transactions that could not be monitored and (effectively) taxed.

This was a technology that could have been successful in any environment where a merchant's freedom-to-request-direct-payment was protected. In such an environment, incentives would have shifted: native apps would become a burden on developers as well as on Apple and Google's review processes, and PWAs would flourish.

But that's not the environment we were in! And arguably, even post Epic's litigation, we aren't fully.


In the same way that coding agents don’t replace the need for an IDE, content generation needs to support arbitrary human-to-agent handoffs, where the human can say “this is the wrong direction; I sketched what I want it to look like, see how it’s different and apply that pattern.”

And, in the broadest sense, that human interface is a CMS; the agent is just another editor, albeit one that happens to read and write raw data rather than using a WYSIWYG (or similar) editor.


> coding agents don’t replace the need for an IDE

Depending on who you talk to, they may not agree. (I am not in this camp but I am certainly aware of people who are.)


FWIW this was the status quo (a webpage could ping arbitrary localhost ports, though CORS prevented it from reading the responses) - but it is changing.

This is partially in response to https://localmess.github.io/ where Meta and Yandex pixel JS in websites would ping a localhost server run by their Android apps as a workaround to third-party cookie limits.

Chrome 142 launched a permission dialog: https://developer.chrome.com/blog/local-network-access

Edge 140 followed suit: https://support.microsoft.com/en-us/topic/control-a-website-...

And Firefox is in progress as well, though I couldn't find a clear announcement about rollout status: https://fosdem.org/2026/schedule/event/QCSKWL-firefox-local-...

So things are getting better! But there was a scarily long time when a rogue JS script could blindly poke at localhost servers with crafted payloads, hoping to find a common vulnerability and gain RCE or trigger data exfiltration via other channels. I wouldn't be surprised if this had been used in the wild.
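The in-the-wild version was browser JS inferring open ports from fetch() timing even when CORS blocked the response body, but the blind-probing idea itself is trivial. A defensive sketch in Python (the port list is illustrative) for checking which localhost ports answer on your own machine:

```python
import socket

# Ports commonly used by local dev servers and debug endpoints (illustrative list):
# React/webpack dev servers, Vite, Django/simple HTTP, generic proxies, Node inspector.
COMMON_DEV_PORTS = [3000, 5173, 8000, 8080, 9229]


def probe_localhost(ports, timeout=0.2):
    """Return the subset of ports accepting TCP connections on 127.0.0.1."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex(("127.0.0.1", port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports
```

Running `probe_localhost(COMMON_DEV_PORTS)` shows what a rogue script on any webpage could previously enumerate from inside your browser; the new permission prompts gate exactly this class of request.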

