@WillOremus | 42,279 followers
Facebook secretly weighted reaction emojis, including "angry," as 5x the value of "likes"--over the integrity team's warnings.

We wrote about the obscure, often arbitrary, human decisions that shape Facebook's algorithm and how we all interact online: washingtonpost.com/technology/202…
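To make the reported weighting concrete, here is a minimal, hypothetical sketch of a linear engagement score. It assumes only the 5x-versus-1x ratio described in the story; the function name, the variable names, and the idea that the signal is a simple weighted sum are illustrative assumptions, not Facebook's actual ranking code.

```python
# Minimal, hypothetical sketch of a weighted engagement score.
# Only the 5x ratio comes from the reporting; everything else here
# (names, the linear-sum form) is assumed for illustration.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5  # reported weight for reaction emojis such as "angry"

def engagement_score(likes: int, reaction_emojis: int) -> int:
    """Toy ranking signal: every reaction emoji counts five times a plain like."""
    return LIKE_WEIGHT * likes + REACTION_WEIGHT * reaction_emojis

# A post drawing 20 angry reactions outscores one with 80 plain likes,
# so a ranker like this would push the angrier post higher in the feed.
print(engagement_score(likes=80, reaction_emojis=0))   # 80
print(engagement_score(likes=0, reaction_emojis=20))   # 100
```

Under a rule like this, content that reliably provokes reactions, angry ones included, rises in the feed even if nobody "likes" it.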

Tweet Engagement Stats

Stats are based upon replies and quotes of this tweet


Replies and Quotes

Total of 38 replies and 193 quotes found
Two years before Facebook decided to assign more value to content & interactions that make people angry, we had been building a tool that rewarded & highlighted good, humane interactions on the platform, and the FB team killed the access we had been using to enable the feature.
 
Welp. Here it is. Facebook is *literally* fomenting hate and anger. Intentionally building algorithms that take advantage of negative emotions because doing so produced more "engagement".

Many of us have been saying this for a long time. But some of y'all needed the smoking gun.
 
Imagine if someone else—a media outlet, for example—were to prioritize outrage-inducing content because it drove more engagement.
 
In reply to @WillOremus
To me this is not a story of Facebook intentionally fanning anger for profit. It's a story of how arbitrary initial decisions, set by humans for business reasons, become reified as the status quo, even as evidence mounts that they're fueling harms. washingtonpost.com/technology/202… pic.twitter.com/jqsxgzEdZj
 
This clearly lays out a cycle at Facebook. Facebook rolls out a new product -> optimizes for quick adoption and wide use -> the integrity team raises alarms -> bureaucratic nightmare and inaction -> real-world harm -> integrity measures adopted
 
let the words ‘technology is neutral’ never be spoken again
 
“Favoring ‘controversial’ posts — including those that make users angry — could open ‘the door to more spam/abuse/clickbait inadvertently,’ a staffer, whose name was redacted, wrote in one of the internal documents. A colleague responded, ‘It’s possible.’” 😡
 
"Facebook’s own researchers were quick to suspect a critical flaw. Favoring 'controversial' posts could open 'the door to more spam/abuse/clickbait inadvertently,' a staffer, whose name was redacted, wrote in one of the internal documents. A colleague responded, 'It’s possible'.”
 
emojis, shemojis, what Facebook needs is a simple single step button to cancel and delete your own account forever
 
So Facebook has cynically used anger and hate to make billions? Sickening.
 
If you see disinformation going around on Facebook, don't share your anger with Facebook's algorithm: it will only amplify the disinformation.

Yes, because more anger means more addiction. More addiction means more attention. And more attention means more profit.
 
 
noo don’t overemphasize negative emotional responses to increase the time i spend on your app your so sexy aha
 
Interesting piece without a single mention, quote, interview or reference to academic work that has been done on this topic for the past decade. Academics who write about this: @tainab, Dave Beer, @anatbd, @NicholasAJohn, @DanKotliar, @silvertje, @bevskeggs, myself & many more.
 
This revelation makes you pretty angry, to speak the language Facebook likes.
 
 
 
Anger scholars - you should write about this! Why is anger so powerful and what does it do to engagement? @zeitzoff @Davin_Phoenix @stevenwwebster
 
this is just kinda the whole thing, ya know?
 
This fits with what we found in our research: negative posts about the out-group tended to receive a lot of angry reactions.

Yet, Facebook's algorithm rated "angry" reactions as 5x more valuable than likes.

pnas.org/content/118/26… pic.twitter.com/weTSZ8KtFh
 
$FB is probably the most evil company in existence: it has both dumbed down the nation and caused the most animosity/divisiveness.
 
They literally generated the perfect boomer fascist radicalization machine
 
This is pretty damning and exactly why we shouldn't trust Facebook to ever get this right. Instead of demanding they do MORE algorithmic manipulation ("just suppress all the bad stuff!") we should be fighting for policies that make it harder for them to do this at all.
 
Giving ⬇️ a 5x or even 2x weight relative to ⬆️ never even crossed my mind
 
 
What happens when billions of people around the globe connect themselves to an AI system designed to make them angry?

The global rise of fascism.

You can draw a straight line.
politico.com/news/2020/09/2… pic.twitter.com/2VLHU0mcAh
 
There is no difference between those who stay silent because of some professional commitment to a government that makes serious mistakes and those who stay silent because of some professional commitment to a company that makes serious mistakes.

Facebook MUST take responsibility for this mess.
 
 
I remember that when FB introduced reaction emojis years ago, I actually thought concerns about FB exploiting affect for profit through those dumb UI gadgets were a bit over the top. How naive can one be?
 
In reply to @WillOremus
Reading this just made me swoon like a Victorian Lady. This shit is relentless
 
 
I'm not great with predictions, but as this meltdown burns its way out of the containment chamber, Facebook may single-handedly end founder control of startups past a particular size.
 
More from the documents disclosed to the SEC and Congress by our client Frances Haugen.
 
So, posts on FB that got 'angry' reactions were prioritized in the algorithm. Sound about right, America. Can we get back to 'liking' each other?
 
Can we please stop this type of journalism?

What you could say was: "Facebook weighted reaction emojis, including 'love,' as 5x the value of likes".

And, of course, they did. People expressing love are more valuable than people who just click like.

There is no scandal here!
 
Petition to make all public-utility websites and applications 100% open source.
 
 
 
Stories like this show the danger of a Facebook that’s worried it has stopped growing. Parents, for the sake of our democracy, please urge your kids to spend 3 hrs/day clicking Facebook
 
Well damn. Here's a big old pin in the whole "people interacting with emotionally charged content is just human nature" bubble.

It is, but especially if you, as a platform, decide that that content is more valuable.
 
Burn Facebook to the ground and salt the earth.
 
In reply to @WillOremus
So you mean they did what every other company vying for your attention on tv, radio, print, theaters, billboards, and digital does 🤷‍♂️
 
the funny thing is this is EXACTLY what people predicted would happen when they added the extra emotes - we knew it was WHY they were adding those emotes - and everyone said we were crazy for it, lol
 
The architecture of information … just little structural rules like this can have such wildly disproportionate effects in online platforms.
 
In reply to @WillOremus
i hate facebook with a passion (haven't used since 2016) but this article's headline is clickbait. the reaction emojis include 'love' & 'funny' & 'thoughtful', etc. It wasn't just 'angry' vs. 'like'. All the emojis were counted. This issue doesn't need to be dumbed down.
 
In reply to @WillOremus
For a while I was in an FB group where everyone reacted to everything with a 😡 in an attempt to fuck up FB's algorithm... But even that started to feel sinister after a while
 
The 1/6 rioter I wrote about for @washingtonpost in February blamed social media for his descent into the rage that led him to the Capitol. At the time the way he described Facebook as an actively malevolent force sounded far-fetched. Turns out it was the actual business model. pic.twitter.com/oHumPpovW7
 
Here it is. This is why I always wonder what a social media landscape could have looked like if interactive features centered on curiosity — like “ask a question” or “oo I wanna learn more!” — instead of negative emotive reactions.
 
Glad to see my belief validated once again: giving corporations complete control over how content is distributed, with zero moral or social responsibility and zero oversight from society at large, was an incredible mistake.
 
We started tweeting several years ago that algorithms used by social media companies must be subject to regulatory approval by a professional regulator to ensure neutrality, objectivity & social harmony.
 
There is basically one way to guarantee Facebook gets the message that you don't like the content: shut down your account.
 
 
*rubs forehead and sighs*

Ok so they weighted all emoji reactions by 5X the weight they give regular likes in terms of pushing things in the algorithm. I wonder how that affected groups and moderation. I'm curious especially about male dominated groups. For a friend >_>
 
Nice example that things like this aren't always complicated "algorithms". It can be a pretty straightforward (technically) value choice that can (and should) be explained to data subjects.
 
 
HAHAHAHA this is like a cautionary tale in a data science text book JESUS CHRIST I hate it here!!!!
 
Holy *shit* what a perfect crystallization
 
Alright ima need y’all to start angry reacting my streams. Thanks
 
 
this is why today, you’re more convinced than ever that you’re right and everyone else is wrong
 
 
“Anger and hate is the easiest way to grow on Facebook,” Haugen told the British Parliament on Monday.
 
Facebook is horrendous and has always sucked. It has always made ethically terrible decisions, starting with its founding hack. The founder and his company 👏 are 👏unethical. They demonstrate it repeatedly and it is *stupid* to act like it is something more innocent than that.
 
Algorithms are at least as dangerous as the people who write them.
(I too will weigh all angry replies to this tweet more than the likes)
 
 
Those decisions that appear obscure and arbitrary are almost definitely not.

Facebook has psychology PhDs doing research on their users, assisting in the algorithms that determine engagement and display content, etc.

Facebook did not come to power by being arbitrary.
 
In reply to @guacamayan
what's new is the artillery division's worth of smoking guns, like this. it's not that rage is an emergent artifact of some complex algorithm. FB CONSCIOUSLY DECIDED THAT RAGE WAS WORTH 5X THE VALUE OF "LIKE" WHEN DECIDING WHAT TO SHOW PEOPLE
 
i’ve said it before, one of the most important questions we can ask of any tech is: what are we optimizing for?
 
"It's the economy stupid". The digital economy is about owning your time. Anger/outrage holds you hostage to the platform. Any of us who spent five minutes in the digital mines, have been screaming about this since the 2010's. Trump/Brexit etc. woke media up and I'm grateful.
 
Well, that explains why things that make you angry or disgusted spread more rapidly on social media than things that make you feel good.

It’s not just due to the human tendency to gossip about stuff that bothers us — it’s the #algorithms.

#disinformation #misinformation
 
This ($) article from @washingtonpost shows once again how much unregulated power lies in Facebook's hands.
The company experimented, for example, by showing certain friends more often in the feed and checking whether people then kept in closer contact with them afterward. pic.twitter.com/h242ymLhwP
 
Yet more evidence that “algorithms” and other forms of artificial “intelligence” are only given so much power in order to give plausible deniability to deliberately harmful human decisions.
 
Computers do nothing more than amplify human action. “The algorithm” has no intelligence, it’s just the actions of a surprisingly small group of people.
 
A massive global social experiment to make us all angrier, meaner, and more racist. Awesome.
 
Algorithms are written by people
 
 
I suspected this for years, but had little proof to back me up outside of my own experience with posting. Posts that elicited anger always did better than others.
 
The internet is accelerationist, individuals cannot compete against aggregator platforms, and surveillance capitalism is viciously damaging to community IRL.
 
Not surprised, considering fostering conflict for fun and profit has always been FB's priority.
 
The story notes that Facebook also counted a ♥️ as 5x the value of a mere Like and finally set the value of 😡 to zero in September. But still... as I've said many times before, Facebook's obsession with its precious goddamn engagement needs to be shot into the sun.
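The change described here is easy to picture as an edit to a per-reaction weight table. A minimal sketch, assuming a simple dictionary of weights: apart from the reported 5x values and the later zero for the angry reaction, the names and numbers below are illustrative, not Facebook's internal configuration.

```python
# Hypothetical per-reaction weight tables showing how zeroing out one
# weight changes the same post's score. Only the 5x weighting and the
# later zero for "angry" come from the reporting; the rest is assumed.

WEIGHTS_BEFORE = {"like": 1, "love": 5, "haha": 5, "wow": 5, "sad": 5, "angry": 5}
WEIGHTS_AFTER = {**WEIGHTS_BEFORE, "angry": 0}  # "angry" no longer boosts ranking

def score(reaction_counts: dict, weights: dict) -> int:
    """Sum each reaction count times its weight; unknown reactions count zero."""
    return sum(weights.get(name, 0) * count for name, count in reaction_counts.items())

post = {"like": 40, "angry": 30}
print(score(post, WEIGHTS_BEFORE))  # 190: angry reactions dominate the score
print(score(post, WEIGHTS_AFTER))   # 40: the same post loses its boost
```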
 
 
 
Shut this fucking thing down already before it kills all of us
 
Why? Simple. Negativity evokes discussion. Discussion increases time-on-site.

Don't fine facebook. Make their entire business model illegal.
 
In secret, and overriding its own team's warnings, Facebook made reactions (including 'angry') worth five times as much as a 'like'.
That's how extreme and false content gains ground.

Don't miss the new investigation by @WSJ
 
 
This thread makes me angry 😠
 
Vanity metrics were always the bane of any social media platform.
Carrot-and-stick motivation vs. intrinsic community building.

You always want people to share with you because they feel part of a larger movement.

Big numbers don't = better impact.
 
Over 143 replies and quotes not shown

Retweeters

1,529 retweeters not shown