Yet another step toward a police state: Facebook Experiments with Manipulating Your Mind
By Alfredo Lopez
How does the news on the Internet make you feel?
What sounds like a frivolous question, the kind you might be asked in a bar after a few drinks, is actually a profound and powerful one. If the Internet's content can affect your feelings, the manipulation of that content can exert powerful social control.
So for a week in 2012, Facebook, in collaboration with Cornell University and the University of California at San Francisco, set out to explore that possibility. It altered the News Feeds of some 689,000 of its users, skewing the content toward positive news for some users and negative news for others, and then, without their knowledge, studied how their posts changed in response.
As a result, Facebook learned a lot. According to an abstract of the study, "for people who had positive content reduced in their News Feed, a larger percentage of words in people's status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred."
And when news about the study broke last week, Facebook faced immediate and powerful push-back from horrified activists and users (and now a couple of governments) who raised some significant questions. Does a company have the right to use its customers as test subjects without their knowledge? Is it ever ethical to change news feed content for any reason?
But the more important issue sits behind those questions. Facebook obviously thought this was okay; it does research on users all the time. And its hunch about the outcome proved correct. So what does it mean when one of the largest information companies on earth, the centerpiece of many people's information experience, practices programming people through lies?...
For the rest of this article by ALFREDO LOPEZ in ThisCantBeHappening!, the new uncompromising, four-time Project Censored Award-winning online newspaper, please go to: www.thiscantbehappening.net/