The people trying to make internet recommendations less toxic

The internet is an ocean of algorithms trying to tell you what to do. The videos YouTube and Netflix queue up for you are calculated. Facebook and Twitter filter and reorder the posts from your connections, partly in your interest, but also in their own.

New York entrepreneur Brian Whitman has helped build such systems. He sold a music analysis startup called The Echo Nest to Spotify in 2014, strengthening the streaming service's ability to recommend new songs based on what users listen to. Whitman says he saw clear evidence of the algorithms' value at Spotify. But he founded his current company, Canopy, after he began to worry about their downsides.

"Traditional recommendation systems involve gathering all the possible data about me, then putting it in a black box," says Whitman. "I do not know if the recommendations it makes are optimized for me, to increase revenues or if they are manipulated by a state actor." Canopy wants to publish an app later this year that offers to read material and podcasts without centralized data. collection, and without pushing people to spend time, they regret it later.

Whitman is part of a movement trying to develop more ethical recommendation systems. Technology companies have long pitched algorithmic suggestions as giving users what they want, but there are clear drawbacks, even beyond the hours lost online. Researchers have found evidence that the recommendation algorithms used by YouTube and Amazon can amplify conspiracy theories and pseudoscience.

Guillaume Chaslot, who previously worked on recommendations at YouTube and now works to document their flaws, says these problems stem from companies designing systems primarily to maximize the time users spend on their services. It works: YouTube has said that more than 70 percent of viewing time comes from recommendations. But the results are not always pretty. "Artificial intelligence is optimized to find clickbait," he says.
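
Chaslot's point is about the objective the ranker is told to maximize. A toy sketch, with invented titles and numbers rather than any company's real model, shows how the same candidates rank differently when the objective changes:

```python
# Toy illustration (not YouTube's actual system): if the objective is
# predicted watch time alone, attention-grabbing items win the ranking.
candidates = [
    {"title": "Calm explainer",       "pred_watch_min": 4.0,  "pred_satisfaction": 0.9},
    {"title": "Outrage clickbait",    "pred_watch_min": 9.0,  "pred_satisfaction": 0.3},
    {"title": "Conspiracy deep-dive", "pred_watch_min": 12.0, "pred_satisfaction": 0.2},
]

# Objective A: maximize predicted watch time only.
by_watch_time = sorted(candidates, key=lambda c: c["pred_watch_min"], reverse=True)

# Objective B: weight watch time by an estimate of satisfaction, so items
# people later regret are demoted even if they hold attention.
by_satisfaction = sorted(
    candidates,
    key=lambda c: c["pred_watch_min"] * c["pred_satisfaction"],
    reverse=True,
)

print([c["title"] for c in by_watch_time])    # clickbait and conspiracy first
print([c["title"] for c in by_satisfaction])  # the calm explainer first
```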

Analyzing this problem and trying to build alternatives is becoming its own academic niche. In 2017, RecSys, the leading recommender-systems research conference, which has long drawn attendees and sponsorship from technology companies, gained a companion workshop dedicated to "responsible recommendation."

At the 2018 event, presentations included a method for recommending Twitter accounts that would expose people to different points of view, and a proposal from BBC engineers for embedding public-service values into personalization systems. "There is a new understanding that recommendations driven by narrow interests don't necessarily meet everyone's needs, in both public and commercial contexts," says Ben Fields, a BBC data scientist.

Xavier Amatriain, who previously worked on recommendation systems at Netflix and Quora, says that understanding is gaining ground in industry, too. "I think we realize that these systems really work. The problem is that they do what you tell them to do," he says.

The broader reassessment of how technology companies like Facebook operate, acknowledged to a degree by the companies themselves, is helping. Whitman says he has had no trouble recruiting engineers who could have their pick of big-tech jobs. Canopy's staff includes engineers who worked on personalization at Twitter and Instagram.

The app they are building is designed to recommend each user a small number of articles or podcasts to read or listen to every day. Whitman says its recommendation software is designed to look for signals of quality, so that it doesn't simply maximize the time users spend, and that the company will share more details closer to launch. To protect privacy, the app runs its recommendation algorithms on a person's device and shares only anonymized usage data with the company's servers. "We can't even tell you directly how many people use our app," he says.
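
Canopy hasn't published its design, but the on-device pattern Whitman describes can be sketched minimally, with invented names and data: the interest profile and the scoring never leave the phone, and only a coarse, anonymized counter is reported back.

```python
import random

# Hypothetical sketch of on-device recommendation; not Canopy's actual code.
local_profile = {"climate": 0.8, "music": 0.5, "celebrity": 0.1}  # stays on device

def score(article_topics, profile):
    """Score an article locally against the on-device interest profile."""
    return sum(profile.get(t, 0.0) for t in article_topics)

candidates = [
    {"id": "a1", "topics": ["climate", "policy"]},
    {"id": "a2", "topics": ["celebrity"]},
    {"id": "a3", "topics": ["music", "climate"]},
]

# Rank locally and keep a small daily selection.
daily_picks = sorted(candidates, key=lambda a: score(a["topics"], local_profile),
                     reverse=True)[:2]

# Only an anonymized, noised aggregate leaves the device: a rough count of
# articles read, with no user identifier attached.
report = {"articles_read": len(daily_picks) + random.choice([-1, 0, 1])}
print([a["id"] for a in daily_picks], report)
```

Under this kind of design, the server sees only noisy aggregates, which is consistent with Whitman's claim that the company can't directly count its own users.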

Others are studying how to give users more control over the recommendations served to them. Researchers from Cornell and CUNY worked with the podcast app Himalaya to test a version that asked users which content categories they wanted to hear and adjusted its recommendations accordingly.

In experiments with more than 100 volunteers, users were more satisfied when they could steer their recommendations, and they consumed 30 percent more content. "We are just beginning to understand how to reconcile commercial interests with helping users as individuals," says Longqi Yang, a Cornell researcher who worked on the project. Himalaya is exploring integrating similar functionality into its production app.
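
The study's code isn't reproduced here, but the idea of user steering can be sketched in a few lines, with invented episode names and scores: categories the listener explicitly opts into simply re-weight the model's base predictions.

```python
# Hypothetical sketch of user-steered recommendations; names and numbers
# are invented, not the Cornell/CUNY study's actual implementation.
base_scores = {            # what a model might predict the user will play
    "true-crime-ep-12": 0.9,
    "history-ep-3": 0.7,
    "science-ep-7": 0.65,
}
episode_category = {
    "true-crime-ep-12": "true crime",
    "history-ep-3": "history",
    "science-ep-7": "science",
}

# The user explicitly picks these categories in the app's settings.
user_categories = {"history", "science"}
BOOST = 1.5  # arbitrary weight chosen for the sketch

def steered_score(ep):
    """Boost episodes in categories the user asked for."""
    boost = BOOST if episode_category[ep] in user_categories else 1.0
    return base_scores[ep] * boost

ranking = sorted(base_scores, key=steered_score, reverse=True)
print(ranking)  # the history and science episodes now outrank true crime
```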

Late last year, Google researchers published results from experiments with an algorithm designed to diversify YouTube's recommendations. In January, the company said it had updated YouTube's recommendation system to "focus on viewer satisfaction rather than views" and to make recommendations less repetitive.
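
Google hasn't detailed the production algorithm, but a standard way to diversify a ranked list is maximal marginal relevance (MMR): greedily pick the next item by trading relevance against similarity to what has already been picked. A generic sketch, with invented relevance scores and topic vectors:

```python
# Generic MMR re-ranking, a standard diversification technique; this is not
# the algorithm Google actually deployed on YouTube.

def similarity(a, b):
    """Cosine similarity between two equal-length topic vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def mmr(candidates, k=3, lam=0.7):
    """Greedily select k items, balancing relevance against redundancy."""
    selected, pool = [], list(candidates)
    while pool and len(selected) < k:
        def mmr_score(c):
            redundancy = max((similarity(c["vec"], s["vec"]) for s in selected),
                             default=0.0)
            return lam * c["rel"] - (1 - lam) * redundancy
        best = max(pool, key=mmr_score)
        selected.append(best)
        pool.remove(best)
    return selected

videos = [
    {"id": "v1", "rel": 0.95, "vec": [1.0, 0.0]},    # same topic as v2
    {"id": "v2", "rel": 0.93, "vec": [0.98, 0.05]},  # near-duplicate of v1
    {"id": "v3", "rel": 0.70, "vec": [0.0, 1.0]},    # different topic
]
print([v["id"] for v in mmr(videos)])  # v3 takes the second slot over the near-duplicate v2
```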

Chaslot is pleased to see growing attention to recommendation algorithms and their effects, especially from technology companies. But he remains uncertain how quickly this new field will produce real change. Large companies are too constrained by their culture and business models to change course radically, he says. After leaving Google, Chaslot spent more than a year building recommendation technology designed to avoid serving harmful content, but concluded it couldn't be made profitable. "I think we need to raise awareness before alternative companies have a chance," he says.

Whitman, of Canopy, is more optimistic. He believes enough people are now wary of the big internet companies to make new kinds of products viable. "We still feel a bit alone," he says, "but it's the kind of revolution that's just beginning."
