#1 Posted : Sunday, June 29, 2014 11:12:54 AM(UTC)
News

Rank: Member
Groups: Administrators, Registered
Joined: 9/23/2007(UTC)
Posts: 25,073
Was thanked: 3 time(s) in 3 post(s)
Here’s an interesting psychological factoid: Emotional states can be transferred to other people via text-based messages on social media, such as Facebook. That means that if, for instance, you view a bunch of sad posts, you’re more likely to pen a sad post yourself shortly thereafter, even though you don’t realize that the sad posts made you sad.

Here’s another even more interesting but more disconcerting factoid: Researchers figured that out by running experiments. On Facebook. Without your knowledge or consent.

Here’s a snippet from the “Significance” section of the paper, which was published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS):

We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.

That’s almost 700,000 people that Facebook experimented on. “In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed,” reads the article’s abstract. “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.”
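To make the design concrete, here is a minimal, purely illustrative sketch of that comparison. This is not Facebook's actual pipeline; the group labels, word lists, and posts below are invented, and the real study counted emotional words across posts from roughly 689,000 users. The sketch just shows the shape of the analysis: measure the rate of positive and negative words in posts written after exposure, and compare a manipulated group against a control group.

# Toy illustration of the experiment's analysis: compare the rate of
# positive and negative words in posts written by two groups of users.
# All data here is fabricated for illustration only.

POSITIVE_WORDS = {"happy", "great", "love", "fun", "awesome"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}

def emotion_rates(posts):
    """Return (positive_rate, negative_rate) as fractions of all words."""
    total = positive = negative = 0
    for post in posts:
        for word in post.lower().split():
            total += 1
            if word in POSITIVE_WORDS:
                positive += 1
            elif word in NEGATIVE_WORDS:
                negative += 1
    if total == 0:
        return 0.0, 0.0
    return positive / total, negative / total

# Posts written *after* exposure, by condition (fabricated examples).
control_posts = ["what a great day", "love this awesome weather", "feeling happy"]
positivity_reduced_posts = ["kind of a sad day", "this traffic is terrible", "meh"]

ctrl_pos, ctrl_neg = emotion_rates(control_posts)
exp_pos, exp_neg = emotion_rates(positivity_reduced_posts)

print(f"control:            {ctrl_pos:.2%} positive words, {ctrl_neg:.2%} negative words")
print(f"positivity reduced: {exp_pos:.2%} positive words, {exp_neg:.2%} negative words")
# The paper's finding, in these terms: exp_pos < ctrl_pos and exp_neg > ctrl_neg.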

Facebook HQ

The research itself is significant because, according to its authors, “emotional contagion” can happen not just in real-world interactions but also through social media interactions. Thus, a social network could be a vehicle for emotional contagion on a massive scale.

The results are fascinating, to be sure, but what about the research process by which Facebook (and it is Facebook here; the lead author is Facebook data scientist Adam D.I. Kramer) gathered this data? The social network purposely manipulated the News Feeds of hundreds of thousands of people. Shouldn’t Facebook have had to notify those users that it was doing... something?

Human-subjects research must be governed by an ethics board. University-based research is overseen by an Institutional Review Board (IRB), which must approve a research team’s methods and procedures before the work can proceed. This is to protect research subjects from abuse. (If you haven’t read about the Stanford Prison Experiment or the Milgram Experiment, do so now.)

Facebook would have had to adhere to certain research ethics to conduct this study, but it’s not clear which ethics, exactly. Because Facebook is not affiliated with a university, it would not have to bother with IRB approval; it’s possible that all Facebook had to do was tell PNAS that it adhered to its own internal research ethics. Thus, whoever approves internal research at Facebook apparently felt that the social network’s broad data use policy, which all users agree to, allowed it to conduct this experiment without asking users’ permission.

Of course, there’s always risk when researchers run tests. Ostensibly, research ethics boards ensure that test subjects are protected throughout. Did that happen in this case?
#2 Posted : Sunday, June 29, 2014 7:11:26 PM(UTC)
basroil

Rank: Member
Groups: Registered
Joined: 6/19/2013(UTC)
Posts: 84

"Because Facebook is not affiliated with a university, it would not have to bother with IRB approval; it’s possible that all Facebook had to do was tell the PNAS that it adhered to its own internal research ethics. "

Because it's not affiliated with a university, it falls under federal regulations set up in the 1960s and 1970s that require explicit consent (a ToS does not count) for any medical or psychiatric experiments.

#3 Posted : Monday, June 30, 2014 1:51:36 AM(UTC)
Versifier

Rank: Member
Groups: Registered
Joined: 4/11/2013(UTC)
Posts: 17
Location: Canada

Seems like an excellent way to get people to stop using your service. Well done, Facebook!

#4 Posted : Monday, June 30, 2014 3:55:36 AM(UTC)
GwenEMugliston

Rank: Member
Groups: Registered
Joined: 6/5/2014(UTC)
Posts: 2

That is correct. Now I want the dates of the experiment, from x to y, posted, and I want the families of people who used Facebook regularly during that window to get those dates, especially families whose members committed crimes, became extremely depressed, or harmed themselves and/or others. Facebook use can be documented, and if the behavior of those at-risk people can be correlated with Facebook's experiment, then I think a massive civil suit is in the making. Inexcusable behavior on Facebook's part.

#5 Posted : Monday, June 30, 2014 4:10:18 AM(UTC)
MattWindsor

Rank: Member
Groups: Registered
Joined: 6/30/2014(UTC)
Posts: 1

I would respond to this but I have to wait to receive guidance from Facebook. lol

#6 Posted : Monday, June 30, 2014 8:56:30 AM(UTC)
rapid1

Rank: Member
Groups: Registered
Joined: 2/29/2008(UTC)
Posts: 4,851
Location: United States

The things in play here are much wider than people realize. We now live in a media-driven society where people mistakenly feel they are "informed" when in reality they're driven. The extra part of this equation is that the "law" is now very subjective as well. So there may be a law, but that law's writers and enforcers are not subject to it or to the regulations it implies.

So since we are media driven, someone says we will protest, file a lawsuit, start a rebellion, etc. Someone else sees that, says oh good, someone is going to do something about it, moves on and forgets it completely, and then nothing gets done. It is pack mentality in all reality, and one of our basic instincts is to follow the pack for safety reasons.

This applies to everything in our society now and is easily viewable. Take talk radio as a general example. The individuals who listen to it automatically assume there are hundreds in the room and follow that pack mentality. The host, be it Kermit the Frog or Rush Limbaugh, delivers a message to the audience; that audience feels group-empowered by the general acceptance of being in said group; and the leader, better known as the host, delivers whatever message they have been paid and/or influenced to deliver. Then the audience, feeling the power of the group or pack, spreads said message, giving the orator that much more power and enlarging said pack.

The problem is that the rules are in no way universal, and you have been given as much info as you need and nothing more. Mark Zuckerberg, his business, and varying groups are part of the group to which the "rules" do not apply, as are all politicians in office and many after being so for some period.

So under our collective mass consciousness, where we think, falsely to an extent, that we're "informed," we as a mass follow our leaders and do what they say. I would go as far as to say slavery never ended; it just changed its methods of enforcement and reception. That is, unless someone on here is going to file a lawsuit, which I really doubt, as I am sure that will be forgotten now that it has been said, as I previously stated.

Next time we go to an election, we will also vote for one of the two choices given, neither of which will ever do anything they said or anything that will change the current M.O. (method of operation), as that would not benefit them in any way. We are, however, also completely defeated as a society until someone does something to change it, which, now that you have read this, you will do nothing about, because I am right?

#7 Posted : Wednesday, July 2, 2014 9:15:55 AM(UTC)
CharlesYu

Rank: Member
Groups: Registered
Joined: 3/27/2014(UTC)
Posts: 11

I think the experiment in itself was quite cool (please don't kill me), but I do agree that they should have asked permission. However, taking Facebook's side, asking permission might have altered the conditions of the experiment. I don't know which side to take.
