The technology sector has been urged to share data with public sector researchers to improve understanding of how its services affect mental health and psychosocial wellbeing, and to help fund the independent research needed over the next ten years.
The United Kingdom's Chief Medical Officers made the appeal in a document offering advice and guidance to the government on screen use by children and young people. They also called on the industry to agree a code of conduct.
The impact of digital technology on the mental health of minors and vulnerable young people is a growing concern in the UK.
Last year, the government committed to legislate on social media and safety. It plans to publish a white paper setting out the details of its proposals before the end of the winter, and there have been calls for platforms to be regulated as publishers, with a legal obligation imposed on them to protect underage users, although it is not yet known whether the government intends to go that far.
"The technology sector needs to anonymously share the data it holds with recognized and registered public sector researchers for ethical research in order to improve our scientific evidence base and our understanding," write now Chief Medical Officers of Health.
After reviewing the existing evidence, the CMOs said they had not been able to establish a clear link between screen-based activities and mental health problems.
"Scientific research is currently insufficiently conclusive to support the evidence-based guidance of the UK CMO on the optimal amounts of screen usage or online activities (such as the use of social media)," stress -they.
Last week, the UK Parliament's Science and Technology Committee made a similar call for high quality anonymized data to be made available to further public interest research into the impact of social media technologies.
Earlier this week we asked Facebook-owned Instagram whether it would agree to provide data on mental health and wellbeing to public sector researchers. At the time of writing we were still awaiting a response. We have also asked Facebook to respond to the CMOs' recommendations.
Update: A Facebook spokesman said:
We want the time young people spend online to be meaningful and, above all, safe. We welcome this valuable work and we wholeheartedly agree with the Chief Medical Officers on the need for industry to work closely with the government and society at large to ensure that young people get the guidance they need to make the most of the internet while staying safe.
Instagram boss Adam Mosseri is meeting the UK health secretary today to discuss concerns about underage users being exposed to disturbing content on the social media platform.
The meeting follows public outrage over the suicide of a schoolgirl whose family said she had been exposed to Instagram accounts sharing images of self-harm, including accounts they believe actively encouraged suicide. Ahead of the meeting, Instagram announced some policy changes, saying it would no longer recommend self-harm content to users and would begin filtering sensitive images, requiring users to click through to view them.
In the guidance document, the CMOs indicate that they support the government's decision to legislate "to set clear expectations for the technology sector". They also call on the technology industry to draw up a voluntary code of conduct on protecting children and young people who use their platforms, in consultation with civil society and independent experts.
Areas the CMOs flag for possible inclusion in such a code include "clear terms of use that children can understand", active enforcement of platforms' own terms and conditions, and "effective age verification" (they suggest working with the government on this).
They also suggest that platforms commit to "removing addictive capabilities" from the UX design of their services, a critique of so-called "persuasive" design.
They further suggest that platforms commit to serving only "age-appropriate advertising".
The code should ensure that "no normalization of harmful behaviors (such as bullying and self-harm) occurs", they suggest, while incorporating ongoing work on safety issues such as bullying and grooming.
In the advice for parents and guardians also included in the document, the CMOs encourage setting limits around device use, stating that children should not be allowed to take devices into their bedrooms at bedtime, to avoid disturbing their sleep.
Parents are also encouraged to keep mealtimes screen-free so families can "enjoy face-to-face conversation".
The CMOs also suggest that parents and guardians talk to children about device use to encourage meaningful social sharing, and emphasize that adults should never assume children are happy for their photo to be shared. "If in doubt, don't upload," they add.