I thought, oh, great, this is the key to making FB decent again.
But after a while of using it, it usually shows almost nothing. Sometimes literally nothing. People are still posting stuff though that I can see with the default view. I almost think I may have triggered some sort of adversarial logic that is trying to force me to use the engagement-driven ordering.
Edit: In fairness, it could be that they are hostile to uBlock Origin, either deliberately or as emergent behavior. I wouldn't mind ads in principle, normal ads like you used to have in print media, but I often get ones that are very unpleasant if I don't use the blocker.
I have experimented with Chronological order in the past, and it almost invariably provides a worse feed. It's like email with no spam filters.
The grim reality is that most of the content that is uploaded to Facebook is repetitive and uninteresting. In my personal experience the friends who used to post the most interesting things have mostly stopped, and all that remains is the chaff.
Yeah this is less a statement on chronological ordering and more a product of subscribing to all of your uncles. I do think chronological ordering can help you diagnose this problem quicker.
It's always baffling to me to read about people who complain about their Facebook feed because their friends post shit, and because they like pages who post shit...
Facebook can't magically fix your feed for you, you have to curate it yourself, how come people don't understand that?
It pisses me off when someone reassigns the blame to the user and not to the cancerous Facebook shit and its scumbag leadership. Just because WE are too stupid to "curate it ourselves".... Here are the three top search results when searching (in DDG) for: Facebook timeline psychological experiment
Now then.. are we all so stupid, or is Facebook run by a bunch of scumbags who care only for "user engagement" no-matter-what? And it may be that some people get confused by its practices, then understand the dishonesty that is causing the confusion and give it up (abandon FB), while others get trapped in it, so net-net FB made more £€¥$ and kept the bad practices.
As a science experiment enthusiast I don't mind facebook running the odd experiment. The issue is whether they use the knowledge and power for good or ill. I'll give them a 6/10 for that.
Stuff on your timeline doesn't appear there because Facebook picks stuff at complete random to display to you. Your feed is built from the sum of what your friends post, and what your liked pages post. Granted, Facebook does an enormous amount of shenanigans to highlight the most engaging/enraging posts from that source, and they're manipulating your friends to re-post engaging/enraging things they've seen, so it can spread to you.
But at the end of the day, the source for your feed is what your friends post. If all your friends post nothing but cute baby pictures, your feed is going to be 100% cute baby pictures, and nothing else. If your friends post nothing but cute cat pictures, it's caturday forever on your feed. And if your friends keep posting political bullshit that you hate, then that's what you're going to keep seeing.
The position I don't understand is the one where people endlessly complain about how Facebook has turned to shit, and then expect Facebook to magically fix it for them? That's never going to happen. Either curate your feed, or stop using Facebook. Those are your options.
I disagree. My friends post almost nothing, but what they do post I want to see. What they do a lot of, however, is like and comment on pages. FB, for some reason, thinks that's interesting to see. However, I cannot stop FB from doing so. I cannot say I don't want to see things this person comments on; instead I have to mute them.
I don't care that they comment on some post to win some product, or tag a friend in some viral video. I don't want my feed to be full of fake engagement. But I guess that's all Facebook is now, as no one really posts anything anymore, so that's what they show to make it seem like things happen.
Of course FB doesn't pick stuff at random; that's the problem! At complete random would be almost as good as chronological. I certainly wouldn't complain.
Saying if everyone posted only agreeable stuff it would be fine is an impossible scenario. Nobody has friends that all post exactly the same thing all the time, and nobody is going to eliminate all their friends that aren't exactly identical. And you know, if you got nothing but saccharine positive thoughts and "law of attraction" style BS, that would be unbearable too.
The well known saying is that "the poison is in the dose". Facebook can concentrate virtually anything into a toxic dose.
It's the same argument as posting content to any audience-limited platform. E.g. LinkedIn.
If I have interesting content, that I wish the maximum number of people to see, then why wouldn't I make it internet-accessible?
Consequently, what's left is (a) people who actually want access to the userbase (e.g. advertisers) and (b) people too technologically limited to avail themselves of other options.
You can tweak the engagement algorithms all you like, but boring content is still boring content.
I think this is partly a byproduct of the non-chronological feed: the algo rewards quality (via visibility/engagement boosts), but it doesn't punish quantity, which incentivizes high-volume posting with no regard for your average signal:noise ratio.
The resulting noise keeps users reliant on the algo for a coherent experience.
When everyone sees a chrono feed by default, their own conduct is informed by the behavior they like/dislike in others. For example, I remember ~10 years ago when posting a picture of your lunch every day was widely understood to be poor form, precisely because it diluted everyone else's experience. It's a bit ironic that it's become a legitimate use case for instagram.
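A toy sketch of the incentive described above (all field names, weights, and numbers are hypothetical, not Facebook's actual scoring): ranking purely on per-post engagement rewards a strong post, but nothing in the score penalizes an author for flooding the feed with low-signal posts.

```python
def engagement_score(post):
    # Rank purely on per-post engagement; an author's posting volume
    # never enters the score, so quantity goes unpunished.
    return post["likes"] + 2 * post["comments"] + 3 * post["shares"]

posts = [
    {"author": "quality_poster", "likes": 50, "comments": 10, "shares": 5},
    {"author": "firehose", "likes": 8, "comments": 2, "shares": 1},
    {"author": "firehose", "likes": 9, "comments": 1, "shares": 0},
    {"author": "firehose", "likes": 7, "comments": 3, "shares": 1},
]

ranked = sorted(posts, key=engagement_score, reverse=True)

# One strong post tops the feed, but the high-volume poster still
# fills every remaining slot.
print([p["author"] for p in ranked])
# → ['quality_poster', 'firehose', 'firehose', 'firehose']
```

A per-author volume penalty (e.g. dividing each score by that author's post count) would change the incentive, but as written the scorer makes high-volume posting costless.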
Yep, this has happened to me for years. Occasionally it will show me days-old posts from friends and then say that there are no more stories. If I switch back to the regular view, it will show me many new stories.
My guess is they aren't ruining this experience on purpose, but they aren't keen to fix it because they don't want you using this mode.
I find it almost unbearable the way it selects the worst of what everyone posts. I know, it's easy to blame the person and not the algorithm. But I think people underestimate how much manipulating context can warp perceptions and how vulnerable anyone who posts a fair amount is.
I get to feeling almost persecuted and then I look at someone's posts in context chronologically and it seems quite normal. Just by selecting a small fraction of posts, there is an illusion of a personality that doesn't exist. Context matters so much. It's not humanly possible to overcome the effect of what you don't see other than by challenging it, going out and looking for a more representative sample.
Yeah, look, I mean, I only follow groups like a specific model vehicle owners group for my local state / country and, say, in my own context as an example, permaculture and mushroom identification groups, which are usually well moderated to exclude poor behaviour / encourage good behaviour.
Having said that, I do see how the most divisive posts and comments get more attention.
And fuck Facebook and every one of the individuals who work there and enable that.
I use FBnofb and it allows me to block all the garbage on Facebook by not having an account and never visiting that dump.
On a more serious note, I deleted my account several years ago. This after coming to the realization of how much I was passively scrolling past misinformation, polarizing political discussions and curated self-promotion from acquaintances I barely knew.
It was funny to have people who never talked to me complain to a mutual friend that they didn't know what was going on in my life because I was no longer on Facebook, but then never once reach out via any other channel to catch up.
There is some middle ground possible - I simply unfollowed everyone on Facebook, so my feed is always empty, and unfriend anyone with whom I wouldn't be willing to have a coffee.
I'm often invited to events that are organized via Facebook, and it's sometimes the only point of contact I have for people when they change their other contact info, so my approach seems, for me, more practical than deleting my account entirely.
It doesn't work anymore - it used to, back before they started dumping in sponsored posts, before the option of "subscribing" to selected friends was added and before they redesigned the privacy options. What I did notice is that after a few rounds of switching between the post sorting options, nothing else gets shown in "most recent", while "top stories" will puke out posts at random from random times up to about a week back. Maybe it's like that for everyone or maybe it's just me - I'm not sure; I visit the site about once per month to check if someone wanted something, so perhaps the algorithms have no data to display.
When you say 'discovered' do you mean you used the UI? That option is in my UI on my facebook. Although I realize that different people may have different UI from what I have heard.
To be fair, the kind of ads social media companies have been most criticized for (those masquerading as real posts) are exactly the sort adblockers don't affect.
I always use the chronological feed. That way when I see a story I recognize I know I can stop. It's much much faster than the default view where you have to scroll through a ton of content you've already seen to find something your friend just posted.
I've noticed recently that if I flag too many things I start getting ones that I can't reject. It's just unrelenting implacable pressure; you can report or hide things forever, but it never gets you anywhere and you start suspecting some subtle retaliation in the algorithms.
I don't know about Facebook specifically, but every time I see something I don't like on Google News (on my phone), I tell it I never want to see that news source again. Then it started giving me news stories that collect a large number of sources together, without the ability to get rid of them all at once.
Trying to curate this stuff is like sculpting fog.
Reading through the list of top news stories on Facebook scared me. It is clear that news can't be chosen based on engagement and that Facebook can't change.
Related: Wikipedia's current events portal [1] acts like a user sourced news feed which is a pretty good representation of reality. I wonder if you could commercialize a similarly moderated aggregator.
If you want a raw current-events feed, just read the Reuters or AP RSS. No need to establish a for-profit middleman to do the function of existing technologies already in place.
The problem I have with Reuters and AP is that they are for literal news events. Reporting of what happened now.
There's a need for longform journalism, synthesis of information, bias and persuasion. It's hard to consume just that without also consuming the daily noise.
Longer stories will look less engaging unless facebook also tracks how long you spend reading each site.
>There's a need for longform journalism, synthesis of information, bias and persuasion.
What? It's important that Facebook curates biased information and intentionally persuasive content for you because you're otherwise unwilling/unable to seek that out yourself?
I may have misread this entirely, but I'm pretty sure you can find RSS feeds for most news sources outside of AP/Reuters. Whatever content you'd prefer with any sort of political or ideological slant you'd like is freely available outside of Facebook by definition.
If RSS isn't available for Breitbart or Daily Beast or KCNA or Patribotics or whatever, surely there must be an easy way to consume that outside of the Facebook platform?
I absolutely said biased and opinionated journalism is important.
On one hand, you can have a story that says "3 people cut their grass today" and on the other hand you can have a story that says "3 people cut their grass today, their reason for doing so, and why you should join them in grass cutting unity." The latter is much more important to a functioning democracy than the former.
The crucial part you missed is:
>It's hard to consume just that without also consuming the daily noise.
RSS is filled with noise. It's filled with stories that are stories because space needed to be filled, that capture your attention for the sake of capturing your attention, not to educate you on things you "need to know."
The whole point of something like Facebook's algorithm is to identify a signal from noise. It is very hard to consume the news without consuming noise, and that is an opportunity for Facebook to actually do something morally right (as opposed to agnostic) and identify what is important for people to learn. That by definition is going to involve a level of bias. It's going to involve them going against everything they believe in, NOT using engagement data to rank content, something I don't think their organization is really capable of.
Drudge tells you "here's what's important to know for the day." Techmeme tells you "here's what's important to know for the day." Wikipedia Current Events tells you "here's what's important to know for the day." Reddit and Facebook tell you "here's what the most people with short attention spans clicked on today."
The world needs better aggregation of important stories.
Exactly, people don't visit Facebook to get unbiased news. They don't want unbiased news. People go there to see news that interests them (and talk to friends and whatnot). The only reason to be against Facebook having a feed based on engagement is if you believe people should not be able to view news that is inflammatory and biased. Perhaps we should force Vice, Buzzfeed, Fox, the Daily Wire, and CNN to shut down and only allow people to read Reuters and AP. That would be even better for the world than Facebook discontinuing the "biased, inflammatory" news algorithm, if one is against that sort of algorithm.
Oh come on. None of these are collecting user data in real-time and leading the user down engagement rabbit holes they never would have gone down otherwise.
I watched a couple of amateur WW2 history videos on YouTube and almost immediately started getting suggestions for some very questionable alt right kinda stuff.
If the algorithms are leading people to alt right sources (which I suspect is the real issue here: not that the Facebook news is inflammatory or biased, but that right-leaning content is more represented), it is leading people there because that's what they want to see. The algorithm sees that more people click on alt right videos than far left videos. Similar to how browsing reddit leads to rabbit holes leading to subreddits that joke about sending people to the Gulag. Unless you believe people are manipulating the algorithms deliberately to do this, but that sounds like a conspiracy. I use that word not to dismiss it, but because there is not any evidence of it.
Again, if one believes it is wrong to serve users exactly the content they want to see, then only allowing people to see completely un-selection-biased content is even more right.
So you're arguing that there's been a massive pent up demand for alt right content that has gone unmet until now? I'm not buying it. People are very easily influenced by what you put in front of them. Until recently there were some moderating influences on what that was.
There was far more alt right content on YouTube several years ago. YouTube has gotten much stricter for what content they allow. I remember seeing recommendations for a rapper called "moonman" who rapped about killing blacks that had hundreds of thousands of views and were more than a year old after I watched a vaporwave album by an artist called Saint Pepsi (they both used the same McDonald's mascot in the videos). I didn't find any such videos today with YouTube's search. What moderating influences were there back then that we don't have today? Ten years ago was the wild west for the internet; I also remember major controversies over subreddits called r/watchpeopledie and r/fatpeoplehate being banned, which were exactly as you would imagine. I don't mean this as an insult or to dismiss your views, but I can only imagine you haven't been on the internet very long if you believe it used to be more moderated and less radical.
I've been on the internet since it was called Gopher and there was no such thing as the WWW. It's far far more influential now than ever before and that influence seems much more negative now than ever.
>It is clear that news can't be chosen based on engagement and that Facebook can't change.
Why can't news be chosen on engagement? That's what people want to read. Must Facebook control what news people are reading?
Wikipedia's current event portal may have a more even representation of what events are happening, but Wikipedia also has users that are interested in a wider range of topics. Most people mainly care about sports and politics. Scientific discoveries, economics, and international relations are simply not as interesting to as many people. Why shouldn't Facebook just have news based on engagement and a separate list with equal representation of events from many different topics?
> Why can't news be chosen on engagement? That's what people want to read. Must Facebook control what news people are reading?
They can optimize only for engagement with no curation, but they shouldn't. It shouldn't be done that way because people won't understand that this is the mechanism for content selection. They will think that because content shows up in a news section it has factual merit, and that the content being presented is an accurate representation of the current world context. If you only prioritize engagement, what you will create is a propaganda section masquerading as a news section because the most inflammatory, baseless, bullshit content is what draws the most engagement. That is in no one's best interest long term and it undermines the credibility of news media in general. If they want to create this, it should be labeled as entertainment, not news.
And if you curate news, you likely create a propaganda section selected by individuals or a small group of individuals. It's not as if an unbiased AI hand-picks which stories are of factual merit and representative of the current world context. Is a story from Facebook's curated feed about protests over a children's TV show more significant and accurate than "NASCAR’s Bubba Wallace wants confederate flags banned from race tracks"? Is J.K. Rowling doubling down on a "transphobic manifesto" more significant and/or accurate than the sister of a slain police officer asking "Where’s the outrage for a fallen officer who happens to be African American?"? A group of curators selecting content does not necessarily make that content any more factual either. Curated news has a greater possibility of being propaganda because the news selection is in the hands of the few instead of the many.
Is a propaganda section a propaganda section if it contains propaganda from both sides? I suppose, in that it is a section that contains propaganda, but the section itself does not serve as an unequal biasing agent for viewers, unless your issue with the engagement-based recommendation is that you believe it is biased in a certain direction.
That seems like quite the assumption, that people will think content has factual merit just because it shows up in their news section. People should not be treated with kid gloves and herded in certain directions out of fear that we will harm ourselves. Facebook should clearly label news sections as "Selections Based on Engagement" and "Selections By Facebook Curators". People are capable of understanding that the selected news is biased.
Technically we can choose news by any metric we want depending on what we want, but I'd prefer the results reflect some combination of accuracy, reputability, importance, and interest. Whatever Facebook's current algorithm is doing, it's not that.
> Must Facebook control what news people are reading?
No, only what news they promote to their network. I don't care if FB manually curates the list or uses an algorithm; they're involved in the design and so they're making an editorial decision here.
The Facebook algorithm is not based on what people want to read.
It is based on what makes people the most upset. That’s what makes people stop what they were doing before, and click.
That’s what Facebook is built to do: elevate content that “stops the thumb”. This is a direct quote from a FB rep advising my employer on how to perform better on Facebook. Here’s another one: “don’t be reasonable, be extreme. You need to stand out.”
Imagine if you picked anything else that way. Only making friends with people who make you the most upset. Only eating foods that make you the most upset. Only dating people who make you the most upset. Is that a good life or bad life?
Social media works by curating content that you like and building a safe bubble around you, showing you the content that will positively reinforce your belief systems, thereby keeping you shielded from views that challenge or contradict them. This creates a positive reinforcement loop that makes you depend on it even more.
The reason FB won't do this is because there is so little to show on the friends and family front; it will do nothing but drive users away and solidify its market positioning as a political news reader.
To pull off a separate feed, FB needs to fix trust so people actually post friends-and-family content as much as before.
Anyone have references to this effect occurring?
My sense was that friends and family personal posts were down but I’ve never seen any research or articles on it specifically.
Your aunt might post a photo of her kid once a week that you take 5 seconds to see, and then you only spend 5 seconds on Facebook catching up on the new content for the week, rather than spend 5 hours reading your aunt's crazy shared right-wing news stories and debating with geriatrics in the comments section.
Users were able to subscribe to their friends posts based on post type (image, text, link, etc.)
Back then I was an active Facebook user and selected to subscribe only to image content.
It worked great for a while but was made obsolete both by Facebook removing those subscription options and (earlier) by users' emerging habit of adding pictures to every post to increase reach, no matter the content.
Or better yet take the family photo sharing out of Facebook. I don't consider it a safe place for that. I just use Facebook for groups and events now. Family stuff goes through vzg.me or shared albums. But I agree within Facebook it would be cool to have them classify post types and let you filter: Family, Flamewar, Humblebrag, Meta
Oooh, vzg.me looks promising, thanks for the tip. (Also got a tip about gath.io on HN a while ago, which has worked pretty well for a couple smallish, informal events I've hosted.)
Why would they do that? Thanks to network effects they have an effective monopoly & clearly they make more money from their current user hostile design.
Why on Earth should this be a surprise to anybody?
Social media news is really a polite way of saying unsubstantiated rumors, gossip, and bullshit. Tabloids are higher-quality stories. The new FB News is journalism, which is about as similar to social media news as TikTok videos are to Harvard Law textbooks.
The issue is if there's two sections, one "very engaging" and one "somewhat engaging," then we won't solve anything. "Very engaging" will far outpace "somewhat engaging" because, well, that's what people want to engage with.
I think you're right about the need to shift people, but in order to do so, engagement is key. They should maintain engagement as the primary KPI, and instead nudge the content towards less inflammatory. Engagement is, IMO, a primary measure of whether they're going to be able to reestablish trust.
I would be blown away if Facebook wasn't using an algorithm based on which publishers you used the most to show you news. Unless you can set up a 1,000-computer clean-room study with clean IPs and everything, and prove that they all get the same news no matter how much they go to one side first, there's no real way to know.
So basically what we know is this author likes NYT and the verge
“Engagement” seems to be the key word that doesn’t appear in OP but is mentioned in many sibling comments. This metric is great for driving ad revenue but is irremediably broken for constructive social discourse.
Does not that mean the mainstream media is no longer mainstream?
It is important to get info from multiple sources including left, right, because they all have their biases diverting from the truth and often reality. You cannot simply trust that one side is presenting all the relevant facts.
Well, it's what most FB users aren't reading. My theory/hope is that they visit normal news stories outside of Facebook, so when they see the same headline on FB, they think "ah I've read that already". Meanwhile when the user sees a click-baity title on FB, they haven't heard about that on mainstream sites, so they end up clicking on it...
People still use Facebook for their news? I’ve been off it for 1-2 years and started following some local/national agencies instead. Much more level headed coverage of everything
I think social media is bad in the sense that sensational news gets attention. This does not necessarily mean that news that is newsworthy gets attention.
This ignores the impact of advertising and FB's advertising algorithms, which promote stories from Fox etc. onto the feeds of users who otherwise wouldn't see them. Thus I don't think you can say this is an accurate representation of what people do "when left on their own."
Not a terribly strong argument against why hosting and platforming reactionary rightwing rags can lead to more radicalization. To quote: "Correlation doesn't imply causation, but it does waggle its eyebrows suggestively and gesture furtively while mouthing 'look over there'."
It's not an argument against or for anything, it's a simple logical fact. Two things lining up logically doesn't mean there is causation, as there can be other things that also line up that could be an equally valid explanation. Finding causation requires actually looking for it, and doing some statistical studies. Of which I'm betting many do exist!
White supremacist propaganda and neo-Nazi recruiting is what causes radicalization. Deplatforming those people is a good idea, yes.
Unlike what leftwing youtube and a few researchers from liberal universities would like you to believe though, Ben Shapiro, Jordan Peterson, Stefan Molyneux and conservative news sites are not radicalising people. There's no evidence for that.
It's likely responsible for radicalization of both sides. I really doubt it's a coincidence pressure for social movements intensified more between 2009-2020 than it has for any time prior.
More like radicalization in general. Left wing got similarly radicalized, if not more so. When everybody lives in their own bubble people get pushed to the extremes :-(
This doesn't survive challenging - the American, and for that matter global, right wing has been indoctrinated for a long time. In America, the left has only started coming online since the Trump admin.
America went through the founding of Fox News, creationism (a victory of religion over science), the birth of the Tea Party because the children born of slanted news reporting thought the Republicans weren't extreme enough, all the way down to the Trump election. Breitbart, Steve Bannon, Roger Ailes are avowed propagandists aimed at converting people.
For the right, science and reason themselves are a threat. They’ve been at war a lot longer than the left even realized how deep the rift was.
Around the world, the Fox News playbook has been copied and reason/science based approaches are being pushed back.
America’s left wing isn’t even that far left, the emergence of Bernie and like minded politicians is only recent. Antifa was just the past few years.
The left has always been reacting, weakly, to the mainstream radicalization and Grass roots organization on the right.
There are no communist activist violent groups, so no not to the same extent. At least none visibly active anywhere reported.
That argues that the radicalisation has been predominantly right wing, as there are many active violent fascist groups now, and weren't a few years ago.
^Also a racially motivated murder that people don't like to confront "Muhammad's goal in Phase One was to kill six white people a day for 30 days (180 per month).". The media is partially responsible for these attacks by stoking racial divisions in the US.
The media suddenly becomes super interested in racial divisiveness and pandering to racists who exclusively vote for Democrats around June of a Presidential election year.
In what way does anti-fascism imply anything other than anti-authoritarian, least of all pro-communism? This guy I linked below when trying to understand your comment even questions if it can be considered a "group" at all ( "I also question whether antifa can be considered to constitute a “group” at this point in time"), and I agree. Either way, equating far-right violence (which was responsible for every extremist death in 2018) and violence associated with antifa, for example against people like those at Charlottesville, Neo-Nazis, Neo-fascists, and white supremacists, just seems deeply disingenuous to the point of trolling.
You don't really understand what this is about. The original antifascist movement was about using communism to fight fascism (also known as national socialism, BTW). That is, communist totalitarianism vs capitalist totalitarianism. It was a Stalinist movement. The current "Antifa" pretty much quote The Internationale [1] as their position, without even knowing what it is, or that they are quoting it. This ignorance would be humorous, were it not so dangerous.
To quote wikipedia: Antifa (German: [ˈantifaː]), was a militant anti-fascist organisation in Weimar Republic started by members of the Communist Party of Germany (KPD) that existed from 1932 to 1933. They flew hammer and sickle over their headquarters too [2]. Many of the current members are commies also.
Can you point out where this purported capitalized Antifa posts their updates? Because right now it seems like you have so many "understandings" about capitalized Antifa that it would be easiest to just get the fundamental things like exactly who you're speaking of instead of asking you to necessarily have to explain it all here.
Also: Here is an account of anti-communist anti-authoritarians singing The Internationale in 1989 when in police custody. Odd they would sing that when you characterized the song as a singularly communist anthem. One possibility is that every anti-authoritarian member of Capitalized Antifa isn't a communist after all.
Yes, there are people who subscribe to it without understanding what it really means. I think I pointed this out a couple of posts upward. Such people are often called "useful idiots" by those who really understand what's going on and have an actual goal. The 1917 revolution in Russia was mostly carried out by useful idiots, tens of millions of whom died afterwards of famine and repression.
So only the people who "have an actual goal" is who Antifa consists of? Or just Antifa "leadership" I'm just not totally understanding why anti-X necessarily has to shoot past center to pro-Y.
"So" followed by a completely made up version of what you think I was saying is a tell for cognitive dissonance. It's pointless to engage after this point.
Certainly not, it's that if one follows your tenuous logic down its path, it falls apart badly without you being willing to clarify the things I asked. My guess is just that you're so deeply anti-communist and pro-capitalism that even anti-fascism is offensive to you. Just looks like indoctrination to me, but again I do welcome you to point me to wherever the capitalized Antifa violent communist group is posting. That would clear everything up.
That was very recent and people have been talking about antifa for years. Not to suggest it was the first false flag or whatever. But I distinctly remember people ranting about them around the time of the Virginia protests.
I vaguely remember a story a few years ago about a guy receiving a blow to the head from a bike lock and requiring therapy for potential brain damage from said blow.
Let's be clear, that sucks, but it's nowhere near the radicalization we've seen on the right. Boogaloos shooting cops, neo-nazis responsible for numerous synagogue / church / mosque shootings, driving cars into crowds, etc. aren't really comparable to a bike lock to the head.
Given the history of "antifa" as a movement, I would argue against the claim that they have been radicalized even so in recent times.
What I do think is that antifa has become more relevant because of the rise of radical right wing in social media as a bogeyman.
Not based on the list shown in the article. The closest to a radical left article is the lowest ranked; and it's about a petition to declare the KKK a terrorist group, which is hardly an exclusively left-wing position.
Show me the mass murders by leftists in this decade. Because that's what we're seeing on the right. If not mass murders, what's the substance behind your "if not more so"?
In this thread, the violence I'm seeing attributed to the left is a nonlethal assault, a dude who killed a bunch of cops but didn't demonstrate any specific political motivation, and some attempted firebombings.
The claim is "similar if not moreso." I'm not absolving or supporting any of this violence, but bothsideism doesn't appear to be supported by the data.
Seeing lotsa downvotes, no actual rebuttals. Where's this dangerously radical left that everybody is so afraid of?
OT-ish: There was just a political story on the FP, about an open letter from a college that starts with B. It disappeared, which makes sense considering that political stuff is strongly downrated on a tech site, so I went into /newest and went back many days looking for the [flagged] and/or [dead] item, and I don't see it. So I used the search interface, but it does not show flagged or dead stories. If logged in, is it possible to have the search interface include those too?
I discovered that the following link shows my FB feed in chronological order:
https://www.facebook.com/?sk=h_chr