Clubhouse security and privacy lag behind its explosive growth




Clubhouse still has a long way to go to assure its users that its privacy and security policies are fully developed.

Carsten Koall | Getty Images

In recent months, the audio-based social media app Clubhouse has become Silicon Valley's latest disruptive darling. The format feels familiar: part Twitter, part Facebook Live, part talking on the phone. But as Clubhouse continues to grow, its security and privacy gaps have come under increased scrutiny, leaving the company scrambling to correct issues and manage expectations.

Clubhouse, still in beta and available only on iOS, offers its users "rooms" that are essentially group audio chats. Rooms can also be set up as public addresses or roundtables, where some users are "speakers" and the rest are audience members. The platform is reported to have more than 10 million users and is valued at $1 billion. Since launching last year it has been an invite-only haven for Silicon Valley elites and celebrities, including an appearance by Elon Musk earlier this month. But the company has struggled with both concrete security vulnerabilities and more nebulous questions about how much privacy its users should expect.

"With smaller, newer social media platforms we should be on guard about our data anyway, and when one grows this quickly it puts a lot of controls to the test," says security researcher Robert Potter. "Things you could get away with when there were just 100,000 people on the platform: multiply those numbers tenfold and the level of exposure goes up, the threat goes up, the number of people probing your platform goes up."

Clubhouse's recent security issues run the gamut, from vulnerabilities to questions about the app's underlying infrastructure. Just over a week ago, researchers at the Stanford Internet Observatory put the platform on the defensive when they found that the app was transmitting users' Clubhouse identifiers and chat-room ID numbers unencrypted, meaning a third party could potentially have tracked your actions in the app. The researchers further pointed out that some of Clubhouse's infrastructure is run by a Shanghai-based firm, and that the app's data appeared to travel through China at least some of the time, potentially exposing users to targeted or even mass surveillance by the Chinese government. Then on Sunday, Bloomberg confirmed that a third-party website was scraping and compiling audio from Clubhouse chats. Early Monday, further revelations followed that Clubhouse chats were also being scraped for an unaffiliated Android app, letting users of that operating system listen in in real time.

Potter, one of the researchers who investigated Clubhouse's various data-scraping projects, explains that these apps and websites did not appear to be malicious; they just wanted to make Clubhouse content available to more people. But the developers were only able to do so because Clubhouse had no anti-scraping mechanisms that could have stopped them. Clubhouse didn't limit how many rooms a single account could stream from at once, for example, so anyone could use the app's programming interface to stream every public channel at the same time.
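The control Potter describes as missing is, at its core, a per-account cap on concurrent streams. As a rough illustration only (all names and the limit of three are hypothetical, not Clubhouse's actual implementation), a server-side check of that kind might look like this:

```python
from collections import defaultdict

# Hypothetical sketch of a per-account concurrent-room cap, the kind of
# anti-scraping control the article says Clubhouse lacked.
MAX_CONCURRENT_ROOMS = 3  # assumed limit, chosen for illustration


class StreamLimiter:
    def __init__(self, max_rooms=MAX_CONCURRENT_ROOMS):
        self.max_rooms = max_rooms
        self.active = defaultdict(set)  # account_id -> set of room_ids

    def try_join(self, account_id, room_id):
        """Allow the join only if the account is under its room cap."""
        rooms = self.active[account_id]
        if room_id in rooms:
            return True  # already streaming this room
        if len(rooms) >= self.max_rooms:
            return False  # reject: account streams too many rooms at once
        rooms.add(room_id)
        return True

    def leave(self, account_id, room_id):
        self.active[account_id].discard(room_id)


# A scraper account hits the cap instead of streaming every public room:
limiter = StreamLimiter()
assert limiter.try_join("scraper", "room1")
assert limiter.try_join("scraper", "room2")
assert limiter.try_join("scraper", "room3")
assert not limiter.try_join("scraper", "room4")  # blocked at the cap
limiter.leave("scraper", "room1")
assert limiter.try_join("scraper", "room4")  # a slot freed up
```

With a check like this in place, an account that tried to rebroadcast every public channel simultaneously would be cut off after a handful of rooms, which is exactly what made the mass-rebroadcast apps possible in its absence.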

More mature social networks like Facebook have more developed mechanisms for locking down their data, both to prevent breaches of user privacy and to defend the data they hold as an asset. But even they can still be exposed by creative scraping techniques.

Clubhouse has also come under scrutiny for its aggressive collection of users' contact lists. The app strongly encourages all users to share their address book data so Clubhouse can help connect them with people they already know on the platform. It also requires you to share your contact list in order to invite others, since Clubhouse is still invitation-only, which contributes to a sense of exclusivity and privacy. Many users have pointed out, though, that when you go to invite someone, the app also makes suggestions based on which phone numbers in your contacts appear in the contact lists of the most other Clubhouse users. In other words, if you and your local friends all use the same florist, doctor, or drug dealer, that person could very well show up on your list of suggested people to invite.

Clubhouse did not respond to a request for comment from WIRED by press time about its recent security stumbles. In a statement to the Stanford Internet Observatory researchers, Clubhouse detailed specific changes it planned to make to strengthen its security, including cutting off pings to servers in China and tightening its encryption. The company also said it would work with a third-party data security firm to help see the changes through. In response to the unauthorized website that was rebroadcasting Clubhouse discussions, the company said it had permanently banned the user behind it and would add additional "safeguards" to keep the situation from happening again.

While Clubhouse appears to be taking the researchers' feedback seriously, the company has not been specific about all of the security improvements it has implemented or plans to add. Additionally, given that the app doesn't appear to offer end-to-end encryption to its users, researchers say there is still a sense that Clubhouse hasn't given adequate thought to its security posture. And that's before you even get to some of the fundamental privacy questions the app raises.

When you start a new Clubhouse room, you can choose from three settings: an "open" room can be joined by any user of the platform, a "social" room admits only the people you follow, and a "closed" room limits access to invitees. Each comes with its own implied level of privacy, which Clubhouse could make more explicit.

"I think for public rooms, Clubhouse should give users the expectation that public means public to all users, since anyone can join and record, take notes, and so on," says David Thiel, chief technology officer of the Stanford Internet Observatory. "For private rooms, they can convey that, as with any communication mechanism, an authorized member can record contents and identities, so make sure you both set expectations and trust the participants."

Like any major social network, Clubhouse has also struggled to deal with abuse on the platform. The app's terms of service have banned hate speech, racism, and harassment since November, and the platform offers some moderation features, like the ability to block users or flag a room as potentially abusive. But one of Clubhouse's main features is also an anti-abuse challenge: people can use the platform without worrying that their contributions will be automatically saved as posts. That may embolden some users to make abusive or derogatory remarks, thinking they won't be recorded and won't face consequences.

Stanford's Thiel says Clubhouse currently stores recordings of chats temporarily so it can review them if abuse complaints come in. If the company implemented end-to-end encryption for security, though, it would have an even harder time staying on top of abuse, because it couldn't make those recordings so easily. Every social media platform faces some version of this tension, but security experts agree that, where applicable, the benefits of adding end-to-end encryption are worth the added challenge of developing more nuanced and creative anti-abuse solutions.

Even end-to-end encryption wouldn't eliminate the additional possibility that any Clubhouse user could externally record the conversation they're in. That's not something Clubhouse can easily fix. But it can at least set expectations accordingly, no matter how friendly and informal a conversation feels. "Clubhouse should just be clear about what it's going to contribute to your privacy," Potter says, "so you can decide what you're going to talk about accordingly."

This story originally appeared on wired.com.
