> With RSS, you subscribe directly to websites, blogs, or news outlets, meaning there is no middleman algorithm deciding what you see.
This enters a failure mode very soon, especially because most people using RSS-like technologies subscribe to more sources than they can realistically read through. Like it or not, _the algorithm_ does serve a purpose in prioritization and discovery. The trouble, IMO, is with the objectives these recommendation and ranking algorithms are optimized for.
A middleman/aggregator who is paid by subscribers would be incentivized to serve users' interests; a marketplace-like aggregator will always face trade-offs.
Algorithms other than FIFO are fine when they serve you. Way back when I had a mail reader (Gnus) that used a Bayesian classifier to predict which emails I might especially want to read, based on past reading experiences. That was nifty! An RSS reader could do the same, on my own machine, based on my own preferences and not some marketer’s. I’d like that an awful lot.
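The idea above (not the actual Gnus implementation, which I don't have in front of me) can be sketched as a tiny naive Bayes ranker that runs entirely on your own machine: it learns word statistics from titles you opened versus titles you skipped, then scores incoming items by the log-odds that you'd read them. All names here are illustrative.

```python
# A minimal sketch of local, Bayesian feed ranking: train on which
# item titles the user read or skipped, then score new items.
import math
from collections import Counter

class LocalRanker:
    def __init__(self):
        self.read = Counter()     # word counts from items the user read
        self.skipped = Counter()  # word counts from items the user skipped
        self.n_read = 0
        self.n_skipped = 0

    def train(self, title, was_read):
        words = title.lower().split()
        if was_read:
            self.read.update(words)
            self.n_read += 1
        else:
            self.skipped.update(words)
            self.n_skipped += 1

    def score(self, title):
        # Log-odds that the item will be read, with add-one smoothing.
        total_r = sum(self.read.values()) or 1
        total_s = sum(self.skipped.values()) or 1
        vocab = len(set(self.read) | set(self.skipped)) or 1
        s = math.log((self.n_read + 1) / (self.n_skipped + 1))
        for w in title.lower().split():
            s += math.log((self.read[w] + 1) / (total_r + vocab))
            s -= math.log((self.skipped[w] + 1) / (total_s + vocab))
        return s

ranker = LocalRanker()
ranker.train("Rust async runtime internals", True)
ranker.train("Celebrity gossip roundup", False)
items = ["Async I/O in Rust", "More celebrity gossip"]
ranked = sorted(items, key=ranker.score, reverse=True)
```

The point is that the training signal, the model, and the ranking all live client-side; no marketer's objective function is involved.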
You sort of can with very little work. When I used RSS more I had a "primary" folder and a number of secondary folders. I always looked at the primary; I'd dip into various secondaries when I had the time.
I do sort of agree with the general premise. The kind of social media that replaced RSS is largely dead.
This is already a problem with things like Mastodon - as soon as you subscribe to some more "spammy" accounts such as news outlets, all the other content is drowned out.
So yes, having some kind of re-ranking _algorithm_ can be a good thing, whether we like it or not.
Semi-popular YouTube channels regularly get offers from someone who wants to buy the channel. There are people and companies that put up good/useful content for a while to gain subscribers and then shift focus. There have been several cases where someone has lost control of their account/password because of a "hack". Likely there are more that I'm not aware of.
The solution I've found, whether with RSS or other feed-based platforms (e.g., Mastodon / the Fediverse), is 1) to organise feeds (by topic in RSS, by interest level generally); 2) to ruthlessly prune feeds, particularly in my high-interest list/category/tag; and 3) to park any voluble feeds in their own "voluble / noise" group. There they can drown out each other, but not the lower-volume, higher-quality feeds.
Interest level works far better than category for social-media feeds, if only because few people (as opposed to organisations) tend to stick to a given topic. On Google+, one feature I used for my own outbound content was its own classification system, such that my tech posts went to a tech channel, science to science, news/current events, etc. to their own. Those following me could choose which of those they were interested in or not.
Yes, but this isn't a solution for most people. Most people don't want to do that active gardening. It's like when Google Plus expected everyone to put their contacts into various circles and keep them up to date.
Funny thing was that I'd developed this strategy on G+. It took me years to arrive at it.
And no, G+ "Circles" were hardly straightforward or convenient to use.
But the notion of restricting your highest priority follow list to a small set (10 -- 50 profiles or feeds, and less is definitely more here) is key.
Over time that list will likely grow, though mostly because many of the feeds fall silent or post only infrequently. Pruning the departed actually takes some work, and is something I'll do maybe once a year or so.
The way I'd arrived at this though was that I'd gotten desperately sick of G+ (and Google) at one point, and having initially followed many, many profiles with abandon, I pruned off virtually all of them, leaving a small core I particularly cared about. Ironically, as I was trying to make myself less dependent on the network, not more, my stream quality improved immensely. Virtually all of the annoying bullshit, even if only vaguely annoying, vanished. The people I was left with largely knew me and interacted with me regularly, and had things to say I found interesting.
G+ is gone, but I've carried through that strategy to Mastodon (still relatively active, though I've taken a break much of this year) and Diaspora* (dying its slow death, but something I'll still check into a few times a year). A small but interesting curation still proves quite compelling. A key realisation was that the voluble streams which do occasionally produce an interesting insight will almost always have those forwarded by others I do follow directly, allowing me to rely on them (or their own upstreams) for curation.
It's also given me insight into mechanics from the age of print newspapers and magazines: when a local region had its own publication (as with newspapers), syndication or curation would gather content from elsewhere, and major stories tended to get carried locally. It might seem that distant publications produced exemplary content, but in truth what I'd read was creamed off the very top, and digging further into such a source often proved disappointing. I keep that in mind with current social / algorithmic / stream-based media. Economics of print publication mean that that former behaviour is largely lost to us now, but high-quality periodic publications (The Economist, Atlantic, Foreign Policy, and the like) can remain worth picking up and reading even now, should you happen across a physical storefront actually carrying them. Might even choose to subscribe should the desire be strong enough.
To be clear, G+ Circle Management was a tedious, largely unrewarding PITA, both in general and in the specific mechanisms provided through the G+ interface. Pruning, though, was often quite rewarding.
It's probably a good strategy, but Hacker News is a bubble and most people aren't going to do this absent a very hard limit in the app itself (like Path did).
App-based limits/nudges are all but certainly the only way to see widespread adoption. But the tactic arises naturally out of both attention scarcity and the tendency for high-salience (or high-appeal) messages to be widely distributed regardless.
More problematic is when you're searching for needles in haystacks / nuggets of gold: sparse signal in high-noise environments.
Didn't Bluesky solve this problem already by allowing anyone to publish their own algorithms?
I feel like user-generated sorting algorithms would be a great fit for RSS. Power users would get the ability to tweak their feeds to their liking, while other users would have a lot to choose from.
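One way this could look in practice (a sketch, not any existing reader's API, and all names here are made up): treat each "algorithm" as nothing more than a scoring function over feed items. Users could publish scorers, and the reader just sorts with whichever one is selected.

```python
# Sketch of user-pluggable ranking for a hypothetical RSS reader.
# An "algorithm" is just a scoring function; the reader sorts by it.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Item:
    title: str
    feed: str
    published: datetime

def newest_first(item):
    # The default, FIFO-like behaviour: plain reverse chronology.
    return item.published.timestamp()

def starred_feeds_first(starred):
    # Returns a scorer that boosts items from feeds the user starred,
    # falling back to recency within each group.
    def score(item):
        boost = 1e12 if item.feed in starred else 0.0
        return boost + item.published.timestamp()
    return score

def rank(items, scorer):
    return sorted(items, key=scorer, reverse=True)

now = datetime(2024, 1, 1)
items = [
    Item("Big news", "the-verge", now),
    Item("Quiet essay", "personal-blog", now - timedelta(days=3)),
]
# Same items, two different user-chosen algorithms:
chronological = rank(items, newest_first)
personalised = rank(items, starred_feeds_first({"personal-blog"}))
```

Because a scorer is just a function of the item, sharing one is as simple as sharing a snippet, which is roughly the property that makes Bluesky's custom feeds work.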
I'm now building an RSS reader that is specifically designed around an algorithm that learns which sources you like the most. It also slightly adjusts the rankings for high/low-frequency feeds, so subscribing to The Verge won't lead to you skipping updates from some personal blogs.
And I now use it far more than I ever used Reeder.
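One plausible way to do the frequency adjustment described above (my own guess at the approach, not the author's actual code): damp each item's base score by its feed's posting rate, logarithmically rather than linearly, so a site posting fifty times a day is demoted but not hidden, while a monthly personal blog floats up.

```python
# Sketch: normalize item scores by feed posting frequency so
# high-volume feeds don't drown out low-volume ones.
import math

def adjusted_score(base_score, posts_per_day):
    # log1p keeps the penalty gentle: doubling a feed's volume
    # does not halve its visibility.
    return base_score / (1 + math.log1p(posts_per_day))

# Two items with the same base relevance, from very different feeds:
verge_item = adjusted_score(1.0, posts_per_day=50)    # busy news site
blog_item = adjusted_score(1.0, posts_per_day=0.03)   # ~monthly blog
```

The exact damping curve is a tuning choice; the key property is that visibility grows sublinearly with volume.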