"Instagram helped kill my daughter": the severe complaint of the father of a 14-year-old girl who has committed suicide





Molly Russell took her own life in 2017. She was 14 years old

"Molly was the youngest of the three sisters, she was 14 years old and she was a normal teenager, she was enthusiastic, finished her homework and prepared the backpack to go to school that night. we woke up the next morning, she was dead. "

Ian Russell's voice breaks. He is the father of Molly, a 14-year-old English girl who took her own life in 2017.

"It's very sad, in an instant you realize that your life will never be the same again," he says.

"Molly left notes, we are lucky to have notes of her after her death because she tried to explain how she felt," she says.

"Some notes said," I am the problem of everyone's life, I love them all, be strong, I am proud of you. "


Her father says the content she saw on Instagram encouraged her to die

After her death, the family looked through Molly's social media accounts and found material about depression and self-harm.

"I do not doubt that

Instagram

He helped kill my daughter, "says Russell now.

According to World Health Organization data, more than 200,000 people aged 10 to 29 died by suicide worldwide in 2016.

"My daughter had a lot to offer, and everything is gone, we have to accept it, the hardest part is that everything is gone with the help of the Internet and social networks. "

The content on social networks

Her father explains that they decided to examine the social networks where Molly had accounts.

"I remember finding a drawing with a caption:" This world is very cruel, I do not want to see it anymore, "he says.

"There were stories of people who were depressed, self-destructive or suicidal, and Molly had access to a lot of similar content," she adds.

Ian says some of the material was positive: groups of people trying to help each other, stay positive and stop hurting themselves.

But he explains: "Another part of the content is shocking, encourages self-destruction and links self-harm to suicide."


Molly left notes explaining how she felt

The BBC viewed the content under the hashtag "selfharm" on Instagram and found very explicit images posted by users.

Hashtags also make it easier to find more of the same content, because users can subscribe to and follow posts that carry a particular tag.

The content tagged "depression" leads to distressing material, such as videos about suicide.

"The reports of these reports are usually in black and white, they are fatalistic, they leave no room for hope, it's like saying: join the group, you're depressed, me also, "explains Molly's father.

"We could not imagine that this type of content could be on a platform like Instagram, and it's still there, it's very easy to find, it's not hidden, it's available," adds there.

The role of algorithms


Ged Flynn is the director of Papyrus, a UK youth suicide prevention organization.

Ged Flynn is the director of Papyrus, a youth suicide prevention organization founded in the UK in 1997.

In an interview, the BBC showed Flynn the images it had found on Instagram.

"I was going to say that I'm not surprised, but I think suicide is not a hashtag, it's an unimaginable and devastating tragedy," Flynn said.

An added problem: Instagram's algorithms help users find associated content. If a user follows one account of this type, the social network suggests more.

"If a social network algorithm is programmed to offer more content of the same type as the one you are looking for, it may need to be more careful than when searching, for example, the term" flowers, "says Flynn.

He adds: "The laws on suicide are very clear: to encourage someone to end his life is illegal.

Internet

or in real life, with words or pictures, anyone who suggests you should do it is, at least, a potential accomplice. "

"Instagram should seriously consider changing its algorithms to save lives, and it must do it now," he said.

Instagram's response

The BBC sought an interview with Instagram officials, who declined but issued a statement.

"We do not allow content promoting or idealizing eating disorders, self harm or suicide, we eliminate them," they said in the article.

Instagram has a tool that displays a warning for certain search terms and offers help.

But users can simply refuse help and continue browsing.

Papyrus, which operates in the UK, provides information and practical advice to young people who have suicidal thoughts.


Molly was the youngest of three sisters

All members of the organization's board of directors have been personally affected by suicide. Many have lost a child to suicide.

"It's not ok for a child to access such explicit images", social networks can not continue to argue, for example, that they have a button that offers the # 1 39; help if the algorithm detects terms such as "self harm" and "suicide". "Three or four times," says Flynn.

"My message would be to take it seriously, suicide is the leading cause of death among young people in the UK, where does it take to get something done?", He adds.

The UK government has urged social networks to take more responsibility for content that depicts and encourages methods of suicide and self-harm.

"Devastating" and "complicated"


Steve Hatch, Facebook's director for Northern Europe

Following Instagram's statement, Steve Hatch, director for Northern Europe at Facebook, the company that owns Instagram, told the BBC in an exclusive interview that Molly Russell's death was a "devastating event".

Hatch told the BBC's Amol Rajan that he felt "deeply anguished" when he heard Molly's father's accusation that the social network was partly responsible for the girl's death.

"I can not even imagine how Molly's father and the rest of the family feel," he said.

When Rajan showed him self-harm images that appeared to violate Instagram's policy but were still available on the social network, the executive replied: "We must make sure we analyze these images and make sure we get rid of them."

Hatch also said that Instagram constantly reviews its policies on images related to depression and suicide.

"It's a very complicated business," he added.

"We work with experts who help to design policies regarding self-harm images, it's a very complex area."

"Experts tell us that when these images are published by people who are obviously going through a very difficult situation, they often do so because they are looking for support or help."

"In these cases, it can be very useful and very useful for these images to be available on the platform, which is why we allow them and offer support to those who can see them."

"What we do not allow is those who applaud or exalt (suicide)."

He declined, however, to answer whether he allows his own children to use Instagram, but said the social network "works hard" to remove these types of images and also offers "a lot of support" to users.

