Ian Russell pointed to images circulating on the platform, including hashtags that gather disturbing images of injuries or content related to depression and suicide.
This tragedy highlights the role of social networks in the spread of this type of content: should they regulate it, and how much responsibility do they bear? In an interview with The Telegraph, Adam Mosseri, the head of Instagram, said that in the future self-harm images will not be allowed on the platform.
He also explained that the site's search mechanisms will be modified so that pages and images with hashtags referring to self-injury are removed from the Explore tab and from search results.
Mosseri added that technology is being developed to blur this type of content and place it behind a sensitivity screen, so that users do not come across it easily.
In recent days, Mosseri met with the British Culture Secretary, Jeremy Wright, who plans to announce new measures to regulate social networks next month.
As part of this initiative, mechanisms are proposed to force platforms to remove illegal content related to violence and child abuse, as well as content that, even if not illegal, may be harmful to users, such as material related to bullying or self-harm images.
Ian Russell praised the intentions expressed by the Instagram chief and asked the company to implement the announced changes as quickly as possible.
"It is encouraging to see that decisive action is being taken to try to protect children from the disturbing content on Instagram, and I hope that the company will act quickly to implement these plans and fulfill their commitments," he concluded.