Mental health apps may share your data without telling you




Free apps marketed to people with depression or people trying to quit smoking leak data to third parties such as Facebook and Google, but often do not admit it in their privacy policies, according to a new study. The study is the latest to highlight the potential risks of entrusting sensitive health information to our phones.

Although the majority of the top-ranked apps for depression and smoking cessation in the Android and iOS stores share data, only a fraction of them disclose this. The results add to a string of disturbing revelations about how apps handle the health information we entrust to them. For example, a Wall Street Journal investigation recently revealed that the period-tracking app Flo shared users' period dates and pregnancy plans with Facebook. And previous studies have flagged health apps with security holes or that share data with advertisers and analytics companies.

In the new study, published Friday in the journal JAMA Network Open, the researchers searched the app stores using the keywords "depression" and "smoking cessation." They then downloaded the apps and checked whether the data they handled was being shared, by intercepting the apps' network traffic. Much of the data the apps shared did not immediately identify users and was not even strictly medical. But 33 of the 36 apps shared information that could give advertisers or data analytics companies insight into users' digital behavior. And a few shared very sensitive information, such as health diary entries, self-reports about substance use, and usernames.
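The paper does not publish its interception tooling, but the general technique it describes, routing an app's traffic through a man-in-the-middle proxy and flagging requests bound for third-party hosts, can be sketched with mitmproxy. The following is a minimal illustration only: the TrackerFlagger addon and the TRACKER_DOMAINS list are hypothetical assumptions, not the study's actual code or its actual list of destinations.

```python
# tracker_flagger.py: a minimal mitmproxy addon sketch (hypothetical, not
# the study's tooling) that flags app traffic bound for common third-party
# analytics and advertising hosts. The domain list is illustrative only.
from mitmproxy import http

TRACKER_DOMAINS = (
    "graph.facebook.com",     # Facebook SDK / Graph API events
    "app-measurement.com",    # Google Analytics for Firebase
    "google-analytics.com",   # Google Analytics
    "doubleclick.net",        # Google advertising
)

class TrackerFlagger:
    def request(self, flow: http.HTTPFlow) -> None:
        host = flow.request.pretty_host
        if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
            # Log where the request is going and how large the payload is;
            # the request body is what would reveal identifiers or behavior.
            size = len(flow.request.content or b"")
            print(f"[third party] {flow.request.method} {host}"
                  f"{flow.request.path} ({size} bytes)")

addons = [TrackerFlagger()]
```

To run a check like this, you would start the proxy with `mitmproxy -s tracker_flagger.py`, point the test phone's Wi-Fi proxy settings at the machine running it, and install mitmproxy's CA certificate on the device so HTTPS traffic can be decrypted and inspected.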

Such details, along with the name or type of the app itself, could give third parties information about a person's mental health that the person might want to keep private. "Even knowing that a user has downloaded a mental health or smoking cessation app to their phone is a valuable piece of 'health-related' data," Quinn Grundy, an assistant professor at the University of Toronto who studies corporate influences on health and was not involved in the study, tells The Verge in an email.

John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and co-author of the new study, worries that users may not be aware of how their apps share their data. "It's very hard to make an informed decision about using an app if you don't even know who's going to have access to certain information about you," he says. That's why he and a team from the University of New South Wales in Sydney conducted the study. "It's important to trust but verify, to check where your health data is going," says Torous.

By intercepting the data transmissions, they discovered that 92 percent of the 36 apps shared data with at least one third party, mostly services run by Facebook and Google that support marketing, advertising, or data analytics. (Facebook and Google did not immediately respond to requests for comment.) But about half of those apps failed to disclose that third-party data sharing, for a few different reasons: nine apps had no privacy policy at all; five apps had one but did not say the data would be shared this way; and three apps actively stated that this kind of data sharing would not occur. "They're basically lying," says Steven Chan, a physician at Veterans Affairs Palo Alto who has collaborated with Torous in the past but was not involved in the new study.

The researchers could not tell what these third parties were doing with the user data. "We live in a time where, with enough breadcrumbs, it's possible to re-identify people," Torous says. It's also possible that the breadcrumbs just sit there unused, he says, but for now, they don't know. "What happens to this digital data is something of a mystery." Chan worries about the potential, unseen risks. "Advertisers could potentially use this to compromise users' privacy and influence their treatment decisions," he says. For example, what happens if an advertiser learns that someone is trying to quit smoking? "Say someone is interested in quitting smoking: could they be marketed e-cigarettes?" Chan says. "Or could they potentially be introduced to other related products, such as alcohol?"

The study's authors write that part of the problem lies in the business model of free apps: since insurers may not pay for an app that helps users quit smoking, for example, the only way for a free app's developer to stay afloat may be to sell data. And if the app is billed as a wellness tool, developers can skirt the laws designed to keep medical information confidential.

So Torous recommends caution before sharing sensitive information with an app. The potential of mental health apps to help people is exciting, he says. "But I think it means you want to pause twice and say, 'Do I trust the people who made the app, and do I understand where this data is going?'" A few quick checks can help: verify that the app has a privacy policy, that it has been updated recently, and that it comes from a trustworthy source, such as a medical center or a government agency. "None of these will guarantee you a good outcome, but they'll probably help you filter," he says.

In the long term, one way to protect people who want to use health and wellness apps would be to form a body that can certify responsible mental health apps, Chan says. "A bit like having FDA approval, or the FAA certifying the safety of a particular aircraft," he says. But for now, the burden of vetting apps falls on users. "When there are no such institutions, or the institutions themselves aren't doing a good job, it means we need to invest more in this as a public good."
