Thursday, July 12, 2012

Facebook adds emotion to its efforts against bullying

 Facebook is changing how offensive or bullying content is reported, the company announced Wednesday.
Communication on Facebook began with the "poke," a simple digital wave. Over the past eight years, many features have been added to the social network, expanding the ways people can interact with one another. You can chat and message with someone, tag them in a photo or a post, check them in at a place, write on their timeline or mention them in your own posts. A negative side effect of all these exchanges is that the potential for misunderstanding and conflict has also skyrocketed, from an adult who doesn't like how they look in a tagged photo to cyberbullying among teenagers.

To make it easier to catch and defuse volatile situations early, Facebook is changing the way content is reported, the company announced Wednesday. The idea is to give users tools to better communicate their feelings and manage conflicts themselves. The changes are the result of collaborations with Yale, Columbia and Berkeley that involved months of research and focus groups with children, teachers and clinical psychologists.

The first change is aimed specifically at 13- and 14-year-olds (you must be at least 13 to register for a Facebook account). If a kid in this age group wants to report a mean or threatening post, say a photo of a classmate put on Facebook, he can click "This post is a problem" (new wording chosen to replace the stiffer "Report") and go through a series of questions to determine what kind of issue it is and how serious the situation is. There's even a grid for classifying emotions. Once he has finished the questions, a list of suggested actions is generated based on the nature of the complaint. If the kid is more annoyed than frightened, he could choose to send the other person a pre-written message saying the post makes him uncomfortable. If he is afraid, he will be prompted to get help from a trusted friend or adult. There are links to catch anyone who may be suicidal and direct them to professionals and to Facebook's own suicide-hotline chat.

"We think it is important that Facebook provide encouragement for children to find their own support network," said Robin Stern, a psychoanalyst at Columbia University who worked on the project. "The children tell us they are spending hours online ... they live their lives with Facebook in the background."

Children are not the only ones who need a little help communicating their feelings on the Internet. Facebook looked at the photos that users of all ages report for removal, ostensibly for offenses such as being pornographic, containing hate speech or depicting drugs. When the team started digging in, it noticed the images were frequently being flagged for more personal reasons: someone did not like how they looked in the photo, was embarrassed that their boss would see them dancing on a table, or maybe just wanted to wipe away the evidence of an old romance. Usually, when a photo is reported for violating community standards, it goes to a Facebook employee who must decide what to do. That's a lot of requests. By expanding the options and directing people to ask the person who posted a photo to take it down, Facebook is putting its members in charge of their own problems, and saving itself some resources as a bonus.
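The article describes the teen reporting flow only in terms of its behavior: a questionnaire classifies the reporter's feeling and how serious the situation is, then produces suggested actions. A minimal sketch of that kind of triage logic, written in Python, might look like the following. The emotion labels, severity scale, and action names are all hypothetical, chosen here for illustration; they are not Facebook's actual categories or code.

from dataclasses import dataclass

@dataclass
class ProblemReport:
    emotion: str   # e.g. "annoyed", "embarrassed", "afraid" (hypothetical labels)
    severity: int  # 1 (mild) .. 5 (serious), derived from the follow-up questions

def suggest_actions(report: ProblemReport) -> list[str]:
    """Map a completed questionnaire to a list of suggested next steps."""
    actions: list[str] = []
    if report.emotion in ("annoyed", "embarrassed") and report.severity <= 2:
        # Low-stakes conflict: nudge the reporter to send the poster a
        # pre-written message saying the post makes them uncomfortable.
        actions.append("send_prewritten_message_to_poster")
    if report.emotion == "afraid" or report.severity >= 4:
        # More serious: point the reporter toward a trusted friend or adult.
        actions.append("ask_trusted_friend_or_adult_for_help")
    if report.severity == 5:
        # Highest severity: surface links to professionals and hotline chat.
        actions.append("show_suicide_prevention_resources")
    if not actions:
        actions.append("report_to_facebook_for_review")
    return actions

# Example: a mildly annoyed teenager reporting a classmate's photo.
print(suggest_actions(ProblemReport(emotion="annoyed", severity=2)))
# -> ['send_prewritten_message_to_poster']

The point of the design, as the researchers describe it, is that most reports never need to reach a Facebook employee at all; the flow steers mild conflicts toward direct, pre-written communication and reserves escalation for the serious cases.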
"How do you build emotion, this ancient part of human nature, into the Facebook site?" asked Dacher Keltner, director of the Social Interaction Laboratory at Berkeley.

Keltner's team worked with Facebook to add a little emotion to the process, customizing the takedown requests based on the reason for wanting the photo removed and how important it is to the offended party to have it taken down. The wording was polished, and the recipient was given pre-written replies to choose from as well, opening a dialogue between the two sides (see the sketch at the end of this post).

The changes are not the biggest Facebook has made, or even the most noticeable ones to roll out this week. But they show the company paying close attention to the subtle ways people communicate emotions on the social network. And by doing research, tweaking the wording and monitoring response rates, it is also learning how best to engage its users. The motive here was benevolent, resolving conflicts and helping children, but the method could also have practical applications for delivering paid content.

"Language and design really matter for this stuff," said Jake Brill, a Facebook product manager. "The slightest change can have a truly remarkable impact."

The changes have been available to many Facebook users as part of a trial run and are rolling out to all members in the United States this week. The statistics are promising: the rate of people who completed the questionnaire when untagging an image has jumped from 48% to 78%. The team hopes to expand the program to other languages and countries, but only after carefully recalibrating the wording for those cultures. The study results were presented Wednesday, Facebook's Compassion Research Day, in Menlo Park, California.

The company's anti-cyberbullying push began shortly after the suicide of Tyler Clementi, though Facebook says it was not inspired by any single event.
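Returning to the customized takedown requests described above: the article gives only the behavior, not any implementation, but the idea of picking a pre-written request from the stated reason, strengthening it when removal really matters, and offering the recipient canned replies could be sketched roughly as below. The reason codes, template wording and reply options are invented for illustration and are not Facebook's actual templates.

# Hypothetical sketch of a customized photo-takedown request with canned replies.
REQUEST_TEMPLATES = {
    "unflattering": "I don't like how I look in this photo. Would you mind taking it down?",
    "embarrassing": "This photo is embarrassing for me. Could you please remove it?",
    "old_memory": "This photo brings back something I'd rather not have online. Please take it down.",
}

RECIPIENT_REPLIES = [
    "Sure, I'll take it down.",
    "I'd rather keep it, but I can untag you.",
    "Can we talk about this first?",
]

def build_request(reason: str, importance: int) -> str:
    """Pick a pre-written request and strengthen it if removal really matters (importance 1-3)."""
    message = REQUEST_TEMPLATES.get(reason, "Could you please remove this photo?")
    if importance >= 3:
        message += " This really matters to me."
    return message

# Example: an embarrassed user for whom removal is very important.
print(build_request("embarrassing", importance=3))
for reply in RECIPIENT_REPLIES:  # replies the recipient can choose from
    print("-", reply)

The design choice the researchers emphasize is the same one the completion-rate numbers suggest: softer, more specific language and a ready-made way to answer make both sides more likely to finish the exchange rather than abandon it.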
