This is how the Facebook extremism machine works
The social media company no longer has its own platform under control. Internal tests reveal that the road to extremism is frighteningly short for users.
“The content in this account has taken on disturbing and polarizing features in a very short time,” a Facebook researcher judged.
Photo: Facebook files
Carol Smith and Karen Jones have a lot in common. Both signed up for Facebook in the summer of 2019, both are 41 years old, and both are from Murphy, a small town in the southeastern United States. They even listed the same interests in their profiles: children, parenting, Christianity. But if they met in real life, Carol and Karen would probably have little to say to each other. Carol supports Donald Trump and loves memes that mock liberals. Karen is a fan of Bernie Sanders and can’t stand the former US president.
So began two experiments that strikingly demonstrate how Facebook has built an infernal machine it can no longer control. Carol and Karen are not real people but fake accounts, created in quick succession by a Facebook researcher to check what the platform serves up to new users. The researcher’s name is withheld in the documents, but her identity is known to this newspaper. She summarized the result as follows: “The content in this account (which basically followed our recommendation systems!) took on disturbing and polarizing features in a very short time.”