YouTube has found a new metric and is moving away from its old business model



YouTube has recently introduced two new metrics but has not yet worked out how to apply one of them (Source: Reuters)

YouTube is changing the way it measures success on the world's largest video site following a series of scandals. The only problem: the company is still deciding how the new approach should work.

In the last two years, the Google division has introduced two new internal statistics to measure the quality of videos, according to people familiar with the company's plans. One tracks the total time users spend on YouTube, including time spent writing and reading comments, not just watching clips. The other is a measure called "quality watch time," a loftier statistic with an ambitious goal: to identify content that achieves something more constructive than simply keeping users glued to their screens.

The changes are meant to reward videos that are more palatable to advertisers and the general public, and to help YouTube beat back criticism that its service is addictive and socially corrosive. Getting the measure of success right could help marginalize videos that are objectionable or popular only among small but active communities with extreme views. It could also help YouTube make up for its past failures in combating the spread of toxic content.

Like other parts of Alphabet Inc.'s Google, YouTube uses these statistics as targets for most technical and business decisions: how staff are paid and how critical software, such as its recommendation system, is built. However, the company has yet to determine exactly how the "quality watch time" metric works, or to explain how the new metrics will affect the millions of "creators" who upload videos to the site.

Starting in 2012, YouTube rebuilt its service and business model around "watch time," a measure of how much time users spend watching videos. A spokeswoman said the change was made to curb deceptive "clickbait" clips. Critics inside and outside the company have said that the focus on watch time rewarded outlandish and offensive videos.

YouTube declined to comment on the new metrics, but a spokeswoman said, "We use many metrics to measure success." The company also would not say whether it has abandoned watch time. But its leaders have repeatedly said that they are tackling the content problem, and that they want to do more than punish people who upload or share objectionable videos. Executives have recently begun talking about rewarding content based on a notion of responsibility. "We found that the bad actions of a few people can have a negative impact on the entire creator ecosystem, which is why we are putting more emphasis on responsible growth," Susan Wojcicki, YouTube's chief executive, wrote in a blog post in February.

To date, most of YouTube's efforts have focused on its recommendation engine, which promotes videos based on their content and on viewer behavior. A spokeswoman said YouTube changed the system's algorithm in 2016 to "focus on satisfaction." The company rates satisfaction based on "many factors," including audience surveys, how often people share clips, and the "like" and "dislike" buttons.
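To make that concrete, here is a minimal sketch of how such "satisfaction" signals could be blended into a single score. The VideoSignals fields, the weights, and the 1% share-rate cap are illustrative assumptions; YouTube has not disclosed how it actually combines surveys, shares, and thumb votes.

```python
from dataclasses import dataclass

@dataclass
class VideoSignals:
    """Hypothetical per-video feedback signals of the kind described above."""
    survey_score: float  # average satisfaction-survey response, scaled to 0.0-1.0
    shares: int          # number of times viewers shared the clip
    views: int           # total views, used to normalize shares
    likes: int
    dislikes: int

def satisfaction_score(v: VideoSignals,
                       w_survey: float = 0.5,
                       w_share: float = 0.25,
                       w_rating: float = 0.25) -> float:
    """Blend the three signal families into one 0-1 score.

    The weights and the formula are illustrative assumptions,
    not YouTube's undisclosed algorithm.
    """
    share_rate = v.shares / v.views if v.views else 0.0
    # Cap the share rate at 1% of views so one viral clip cannot dominate.
    share_component = min(share_rate / 0.01, 1.0)
    total_votes = v.likes + v.dislikes
    like_ratio = v.likes / total_votes if total_votes else 0.5
    return w_survey * v.survey_score + w_share * share_component + w_rating * like_ratio

# Example: a clip with strong survey feedback but middling thumb votes.
example = VideoSignals(survey_score=0.8, shares=1_200, views=250_000,
                       likes=9_000, dislikes=3_000)
print(round(satisfaction_score(example), 3))
```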

The two new metrics, tracking total time on the site and "quality watch time," influence far more than YouTube's recommendations, according to people familiar with the plans, who asked not to be identified because the matter is private. The metrics also help determine how YouTube surfaces videos in search results, shows ads, and pays the creators who make them.

Deciding what counts as a "quality" video or a "responsible" clip is difficult even for humans. YouTube is attempting the feat with a combination of software and staff, which makes the task even harder. It is also a risky change. The video service generates most of its revenue from advertising, and the business works best when as many people as possible spend as much time as possible on YouTube; hence executives' obsession with engagement statistics. Adding fuzzier measures to the mix could dampen the growth of advertising revenue.

It is particularly difficult to identify "responsible" videos. Some popular YouTube stars, such as Logan Paul, upload clips that advertisers and critics find disturbing, yet Paul's fans spend millions of hours watching them. Loyal viewers typically click "like" and respond enthusiastically to surveys, especially for celebrities. "At a certain point, people are preaching to the choir," said Becca Lewis, a researcher at Stanford University who studies YouTube.

The YouTube channel Red Ice TV broadcasts political videos with a "pro-European perspective," which critics say promotes white supremacy. According to Lewis, its last five videos have drawn more than 30 times as many "likes" as "dislikes." "If the 'responsibility' score relies solely on viewer-feedback measures, it assumes that extremist content will be received negatively by its audience, which is very far from reality," she said.
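Lewis's objection can be illustrated with a back-of-the-envelope calculation. Only the 30-to-1 like-to-dislike ratio comes from the article; the mainstream channel's numbers below are invented for comparison, but they show why a score built purely on viewer feedback cannot distinguish a channel that satisfies a small, approving audience from one that satisfies a broad one.

```python
def like_ratio(likes: int, dislikes: int) -> float:
    """Share of thumb votes that are positive: a naive 'audience approval' proxy."""
    total = likes + dislikes
    return likes / total if total else 0.5

# Hypothetical vote counts; the 30:1 ratio mirrors the figure Lewis cites.
niche = like_ratio(likes=30_000, dislikes=1_000)         # ~0.97
mainstream = like_ratio(likes=900_000, dislikes=60_000)  # ~0.94

print(f"niche channel: {niche:.2f}, mainstream channel: {mainstream:.2f}")
# A feedback-only score rates the niche channel at least as highly, which is
# the failure mode Lewis describes: loyal audiences approve of extreme content.
```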

Sometimes the opposite effect occurs, when viewers pile on "dislikes" and submit negative survey responses as part of coordinated sabotage efforts. YouTube felt this pain acutely after the release of its own year-in-review video in December, which legions of viewers disliked to express frustration with the company's policies toward creators.

YouTube declined to give details on how it uses the metrics to rank and recommend videos. In a blog post published in January, the company said it was hiring human reviewers to train its software, based on guidelines that Google's search business has used for years.

Changes to YouTube's internal statistics also have long-term implications for creators. Some channels lost millions of dollars in ad sales after YouTube pulled ads from videos it deemed questionable, a response to an advertiser boycott that began in March 2017. Producers of those channels have complained that YouTube was too opaque about the changes and about which videos were penalized.

And the software YouTube uses to analyze videos for these new statistics may not be up to the task, despite Google's feats in artificial intelligence techniques such as computer vision and natural language processing. AI systems have not progressed enough to identify the intent of a video based solely on its footage, said Reza Zadeh of Matroid, a company that sells video analysis software.

Current video analysis software can find all the videos that show or discuss the moon landing. But it struggles to quickly determine whether a video questions whether the landing happened, or spreads other falsehoods, according to Zadeh, who worked on Google's machine translation service as a research intern more than a decade ago.

"In general, we are very good at detecting names using computer vision," Zadeh said, "but we are void against verbs and adjectives."

A YouTube spokeswoman said the company currently relies on people to review these nuances, but she declined to say how many workers are dedicated to the task.

