
Wednesday, October 22, 2014

Meet Facebook’s Mr. Nice

At Facebook, Creating Empathy Amid Cyberbullying


Of Facebook’s 7,185 employees, Arturo Bejar may have the most difficult job.
No, he is not responsible for increasing advertising revenue or keeping the website running 24 hours a day. Mr. Bejar has a far more intractable task: teaching the site’s 1.3 billion users, especially its tens of millions of teenagers, how to be nice and respectful to one another.
Respectful? Online? Ha! That’s never going to happen. Everyone knows that social media is an unwinnable game of who can be meaner. If Mr. Bejar thinks he can make Facebook users nice, he is — to borrow a popular Facebook comment — just stupid!
But that kind of response is exactly what Mr. Bejar is counting on. As the director of engineering for Facebook’s 80-person Protect and Care team, he believes that most users are not trying to be mean and that they will retract a comment (and even feel bad about it) if they realize it has caused someone harm.
“The way our brains work, we have evolved to understand each other by tone of voice or seeing facial expressions, but that gets lost through the devices we use to communicate,” Mr. Bejar said last week in an interview at Facebook’s New York offices.
In other words, Mr. Bejar is trying to create among Facebook users the kind of empathy that used to develop in real settings like the playground, through social cues like crying and laughter.
This may seem like a piffling side project to some. But I believe the success of social media largely depends on solving this problem and teaching users to be kinder and more empathetic. Most people I know who have quit services like Twitter and Instagram have done so because commenters were spiteful, insensitive or just plain nasty.
According to a report released this week by Pew Research’s Internet Project, 65 percent of young adults 18 to 29 in the United States said that they had been harassed online, and 92 percent had witnessed someone else being bullied.
But Facebook’s efforts to curb this may be working. The company told me that each week eight million Facebook members use tools that allow users to report a harmful post or photo. (The tools can be used by clicking on the little upside-down arrow in the upper right corner of a post or the options button at the bottom of photos.)
Mr. Bejar’s team designed these tools to let people know someone had hurt their feelings, and he said the system actually worked. (This is different from the News Feed experiment in June, when Facebook received criticism for tinkering with people’s emotions as part of a psychological study examining how emotions spread on social media.)
Creating empathy on Facebook has not been easy. Researchers have learned that a few letters can have a profound impact. For example, in the first iteration of these tools, Facebook gave users a short list of vague emotions — like “embarrassing” — to communicate why they wanted a post removed. At the time, 50 percent of users seeking to delete a post would use the tool, but when Facebook added the word “it’s” to create a complete sentence (“It’s embarrassing”), the interaction shot up to 78 percent.
Teenagers are a particular focus, not just because they are frequent targets of cyberbullying but because they sometimes lack the emotional maturity to handle negative posts.
Dr. Marc Brackett, director of the Yale Center for Emotional Intelligence, who is working with Facebook on this emotions project with the Protect and Care team, said their research found that teenagers need more pathways and options to voice their feelings.
On Facebook, teenagers are presented with more options than just “it’s embarrassing” when they want to remove a post. They are asked what’s happening in the post, how they feel about it and how sad they are. In addition, they are given a text box with a polite pre-written response that can be sent to the friend who hurt their feelings. (In early versions of this feature, only 20 percent of teenagers filled out the form. When Facebook added more descriptive language like “feelings” and “sadness,” the figure grew to 80 percent.)
“We’ve played around with having pre-populated messages versus no message at all,” Dr. Brackett said. “If kids are given a blank box, oftentimes they are going to say things that are not going to be helpful,” including cursing at their friends. When Facebook offered more developed responses like “This post is mean. It makes me feel sad and I don’t want it on Facebook,” 85 percent of teenagers who wanted a post removed sent a message.
“When kids let someone know they’ve hurt their feelings in a personal way, there’s a strong likelihood that the other kid will take that down,” Dr. Brackett said.
Interestingly, more often than not, the posts were not meant to hurt, but were jokes lost in digital translation. When Facebook asked people why they shared a post that hurt someone else, around 90 percent of respondents said they thought their friends would like the post or would think it was funny. Only 2 percent of users wanted to provoke or alarm someone else.
“Believe it or not, most of the time people do mean well,” said Dacher Keltner, a director of the Greater Good Science Center at the University of California, Berkeley, who is also working with Facebook’s empathy team.
Researchers are looking at other ways to help users be more empathetic on social networks. Last year, Facebook borrowed ideas from Charles Darwin’s 1872 book, “The Expression of the Emotions in Man and Animals,” to create stickers with facial expressions.
Next, Mr. Bejar said, his team is experimenting with sounds to help people convey how they feel. (Imagine sending someone the sound of a grunt, sigh or a giggle to communicate your feelings about a post.) But he acknowledged, “technology still has a lot of work to do to humanize each other.”
Maybe this idea isn’t that stupid after all.
