
Everything We Know About Facebook's Secret Mood Manipulation Experiment

Discussion in 'BlackHat Lounge' started by The Scarlet Pimp, Jul 4, 2014.

  1. The Scarlet Pimp

    The Scarlet Pimp Jr. VIP Premium Member

    Joined:
    Apr 2, 2008
    Messages:
    788
    Likes Received:
    3,120
    Occupation:
    Chair moistener.
    Location:
    Cyberspace
    Facebook's News Feed -- the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone -- is not a perfect mirror of the world.

    But few users expect that Facebook would change their News Feed in order to manipulate their emotional state.

    We now know that's exactly what happened two years ago. For one week in January 2012, Facebook's data scientists skewed what almost 700,000 users saw when they logged into the service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average.

    And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.

    This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine "emotional contagion," as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
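
    To make the mechanics concrete, here's a minimal sketch of how content can be scored as positive or negative by counting sentiment words and then partially filtered out of a feed. This is an illustration only, not Facebook's actual code: the word lists, function names, and drop rate are all assumptions, and the real study reportedly relied on a full sentiment dictionary rather than a handful of words.

    Code:
    import random

    # Toy word lists (assumed for illustration; the real study used a full dictionary).
    POSITIVE = {"happy", "great", "love", "awesome", "fun"}
    NEGATIVE = {"sad", "angry", "terrible", "hate", "lonely"}

    def score_post(text):
        """Label a post 'positive', 'negative', or 'neutral' by simple word counts."""
        words = [w.strip(".,!?").lower() for w in text.split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return "neutral"

    def filter_feed(posts, drop_label="negative", drop_rate=0.3, seed=0):
        """Return a feed with a random share of posts carrying one emotional label removed."""
        rng = random.Random(seed)
        return [p for p in posts
                if not (score_post(p) == drop_label and rng.random() < drop_rate)]

    feed = ["I love this, so happy!", "Feeling sad and lonely today", "Meeting at 10am"]
    print(filter_feed(feed, drop_label="negative", drop_rate=1.0))
    # -> ['I love this, so happy!', 'Meeting at 10am']

    Run in reverse (dropping posts scored as positive) you'd get the "sadder than average" condition; measuring the sentiment of what those users posted afterwards is the "emotional contagion" effect the paper reports.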

    The experiment is almost certainly legal. In the company's current terms of service, Facebook users relinquish the use of their data for "data analysis, testing, [and] research."

    Is it ethical, though?

    Since news of the study first emerged, I've seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.

    We're tracking the ethical, legal, and philosophical response to this Facebook experiment here. We've also asked the authors of the study for comment. Author Jamie Guillory replied and referred us to a Facebook spokesman. Early Sunday morning, a Facebook spokesman sent this comment in an email:

    This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.

    A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow.

    We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely.

    http://www.theatlantic.com/technolo...s-secret-mood-manipulation-experiment/373648/

    ---

    Controversy Over Facebook Emotions Experiment Continues

    When it came out earlier this week that Facebook was secretly experimenting with the emotions of users via their Timeline, many people were outraged. How could the site have done something so unethical? Why weren't any standards in place to prevent such a study?

    More information has continued to come out about the study this week, and none of it is good: it has been revealed that the Cornell University ethics board did not pre-approve the study before Cornell researchers participated in it with Facebook, and that Facebook's own data usage policy at the time of the experiment may not have "implied" user permission, as the site previously argued.

    Four months after the study was conducted, Facebook added a line to its data policy about how it could use user information "For internal operations, including troubleshooting, data analysis, testing, research and service improvement."

    This would suggest that Facebook conducted the study without user permission, though the site insisted otherwise in an interview with Forbes.

    "When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer," a Facebook spokesman said.

    "To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word 'research' or not."

    As much as Facebook wants this story to go away (and as many half-hearted apologies as it offers), it's clear this controversy won't be over anytime soon.

    http://facecrooks.com/Internet-Safe...-Facebook-Emotions-Experiment-Continues.html/

    ---

    Sheryl Sandberg Apologizes For Facebook Emotion Manipulation Study... Kind Of

    The Facebook emotion contagion study has finally reached the executive level. Facebook COO Sheryl Sandberg said the word "apologize" in reference to the study that involved nearly 700,000 Facebook users in January 2012, but she made it into one of those classic corporate "We're sorry if this offended you" apologies. Via the Wall Street Journal:

    "This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated," Sandberg, Facebook's chief operating officer, said while in New Delhi. "And for that communication we apologize. We never meant to upset you."

    Her wording is maybe not the best there. Part of why people are so upset is that Facebook did mean to upset some of them as part of the study. Translation: Facebook is not sorry for doing the emotion contagion study. It was done in the normal course of business.

    It is sorry that everyone is upset about the fact that it purposely made some users upset a couple years ago.

    http://www.forbes.com/sites/kashmir...-facebook-emotion-manipulation-study-kind-of/
     
    • Thanks x 2
  2. umerjutt00

    umerjutt00 Jr. VIP Premium Member

    Joined:
    Oct 28, 2011
    Messages:
    3,644
    Likes Received:
    1,904
    Occupation:
    Ninja
  3. slipperflip

    slipperflip Registered Member

    Joined:
    Feb 11, 2014
    Messages:
    84
    Likes Received:
    16
    Occupation:
    Developer
    Location:
    USA
    I pulled all my FB accounts several years ago. They never made me any money. In my experience, it was all about freebies. In fact, FB would distract people from my opt-ins, so I kept losing out to a platform that only rarely put my message in front of my customers. And I got tired of playing their little game.
     
  4. Cratos

    Cratos Jr. VIP Premium Member

    Joined:
    Aug 16, 2012
    Messages:
    1,756
    Likes Received:
    1,095
    Gender:
    Male
    Occupation:
    SEO
    Location:
    Where The Elite SEOs Dwell
    FB is full of Sh#t. That's why I deactivated my account and deleted all I could. Thanks for posting this. Very interesting read.
     
  5. rodvan

    rodvan Jr. VIP

    Joined:
    Jul 27, 2010
    Messages:
    1,226
    Likes Received:
    481
    Occupation:
    developer, marketing, automation, machine learning
    Location:
    Wizard of Bots
    Home Page:
    Well, then maybe I should think about building complete software for Facebook, since apart from milking us for money (PPC) they also use the collective user base for psychological experiments?
    There must be a good reason behind it, but I don't know it yet.
     
  6. pxoxrxn

    pxoxrxn Supreme Member

    Joined:
    Dec 21, 2011
    Messages:
    1,397
    Likes Received:
    2,066
    Who cares? It's obviously made some BHW members pretty moody. In the world of experiments, this is nothing. Back when psychology was just starting to become a recognized discipline, some dude locked a monkey in a cage on its own for years at a time...
     
    • Thanks x 1
  7. SeanAustin

    SeanAustin Power Member

    Joined:
    Mar 4, 2013
    Messages:
    740
    Likes Received:
    707
    Location:
    Rocky Mountains
    Ya, Facebook is full of shit. But manipulating some Facebook feeds for the sake of science doesn't seem that evil to me. Unless they somehow profited off of it...
     
  8. pxoxrxn

    pxoxrxn Supreme Member

    Joined:
    Dec 21, 2011
    Messages:
    1,397
    Likes Received:
    2,066
    They will find a way...

    I didn't read what the OP said, but I got the gist of what happened from the news. From what I could gather, it wasn't FB conducting the experiment; it was a university. In all honesty, I think it's pretty good. If we can somehow get a reduction in depression from these results, saving one kid's life or just stopping one person from living the kind of life that makes them want to end it, then it's all worth it.

    We are manipulated in a similar way our whole lives; it's estimated that Gen Y is exposed to about 22,000 calls to action each day. We are usually manipulated to part with our money, so at least this will hopefully produce something that actually benefits society.
     
  9. Cratos

    Cratos Jr. VIP Premium Member

    Joined:
    Aug 16, 2012
    Messages:
    1,756
    Likes Received:
    1,095
    Gender:
    Male
    Occupation:
    SEO
    Location:
    Where The Elite SEOs Dwell
    A monkey? That's supposed to be something? lol, now that is nothing. There are 1000x worse experiments going on at this very moment with CHILDREN. But no one will talk about that, will they?

    They'll just talk about the poor dogs that have no home and need a $2,000 donation for surgery because they look so sad :''((( Let's all donate to the cause (tear, tear).
     
  10. georgehappy

    georgehappy Newbie

    Joined:
    Jul 3, 2014
    Messages:
    19
    Likes Received:
    2
    people use Facebook these days more than email :confused:
     
  11. bartosimpsonio

    bartosimpsonio Jr. VIP Premium Member

    Joined:
    Mar 21, 2013
    Messages:
    8,844
    Likes Received:
    7,452
    Occupation:
    ZLinky2Buy SEO Services
    Location:
    ⇩⇩⇩⇩⇩⇩⇩⇩⇩⇩⇩⇩
    Home Page:
    Google probably conducts its own experiments using SERPs. The big difference is that Facebook ran this experiment inside a platform it built itself, while Google probably does this sort of thing on the public WWW, which the people should control, not Google. THAT is the big problem I see with Google: they are closing down the WWW and turning it into their own platform.

    A LOT of Matt Cutts' job is psyops. He's constantly testing SEOs, catching them off guard or misleading them into traps.