YouTube is changing the way it judges success at the world's biggest video site following a series of scandals. There's just one problem: The company is still deciding how this new approach works.
The Google division introduced two new internal metrics in the past two years for gauging how well it is performing, according to people familiar with the company's plans. One tracks the total time people spend on YouTube, including time spent on comments, not just the clips they watch. The other is a measurement called "quality watch time," a squishier statistic with a noble goal: to spot content that achieves something more constructive than just keeping users glued to their phones.
The changes are supposed to make YouTube more palatable to advertisers and to the widest possible audience, and to help the company counter criticism that the site is addictive and socially corrosive. Creating the right metric for success could also help YouTube make up for past failures to curb the spread of toxic content.
YouTube, like other parts of Alphabet Inc.'s Google, uses these corporate metrics as goal posts for most business and technical decisions – how it pays staff and how it builds critical software like its recommendation system. But the company is still deciding how the "quality watch time" metric works, and how the new measure will impact the millions of "creators" who upload videos to the site.
YouTube built its business model around "watch time," a measure of how much time people spend viewing videos. A spokeswoman said that focus was adopted to reduce deceptive "clickbait" clips. Critics inside and outside the company said the emphasis on "watch time" rewarded outlandish and offensive videos.
YouTube declined to comment on the new metrics, but a spokeswoman said that "there are many metrics that we use to measure success." The company has not done away with "watch time" in the process. Executives have stressed that they want to do more than punish people who upload or spread nasty videos; they have discussed rewarding content based on a rubric for responsibility. The company "saw how the bad actions of a few individuals can negatively impact the entire creator ecosystem, and that's why we put even more focus on responsible growth," Susan Wojcicki, YouTube's chief executive officer, wrote in a February blog post.

To date, most of YouTube's efforts have gone into its recommendation engine, which promotes videos based on their content and viewer behavior. A spokeswoman said YouTube changed the algorithm for that system in 2016 to "focus on satisfaction." The company measures this with "many factors," including viewer surveys, how often people share clips, and the "like" and "dislike" buttons on videos.
The two new metrics – tracking time on site and "quality watch time" – influence a lot more than internal goals. The measurements also help dictate how YouTube ranks videos in search results, runs ads and pays the creators who make videos.
Deciding what is a "quality" video, or which clips are "responsible," is difficult even for humans. YouTube is trying to pull off this feat with a combination of software and employees, making the task even harder. It's a risky change. The video service makes its money selling ads against viewers' attention, hence executives' obsession with engagement stats. Adding murkier metrics to the mix could crimp ad revenue growth.
Crowd-sourcing the identification of "responsible" videos is particularly tricky. Some popular YouTube stars, such as Logan Paul, upload clips that advertisers and critics see as troubling. But Paul's fans spend millions of hours watching them. Loyal viewers typically click "like" and give glowing survey responses, particularly for stars. "At a certain point, people are preaching to the choir," said Becca Lewis, a researcher at Stanford University who studies YouTube.
The YouTube channel Red Ice TV broadcasts political videos with a "pro-European perspective," which critics label as promoting white supremacy. According to Lewis, its most recent videos have more than 30 times as many "likes" as "dislikes." "If 'responsibility' is based purely on viewer feedback metrics, it is making an assumption that extremist content will be poorly received by its audience, which is very far from the reality," she said.
Sometimes the opposite happens: viewers pile on "dislikes" and submit negative survey responses in coordinated sabotage efforts. YouTube felt this pain after the release of its own year-in-review video in December. Legions of viewers hit dislike on YouTube's self-produced clip to register frustration with the company's policies for creators.
In a January blog post, the company said it was hiring human reviewers whose judgments would be used to train its software.
Changes to YouTube's internal metrics also have long-lasting impacts on creators. Some channels lost millions of dollars in ad sales after YouTube deemed their content questionable, a response to advertiser boycotts that began in March 2017. The producers of these channels complained that YouTube was too opaque about the changes, and that it punished videos with no problematic content.
And the software YouTube uses to analyze these new metrics may not be good enough – in spite of Google's prowess in artificial intelligence techniques such as computer vision and natural language processing. These systems have not progressed enough to identify the subject of a video based on footage alone, said Reza Zadeh of Matroid, a company that sells software for analyzing video.
Current video analysis software can find every video that shows or discusses the moon landing. But it struggles to immediately decipher whether a video is questioning the landing or spouting other untruths, according to Zadeh, who previously worked on Google's automated translation service.
"In general, we're very good at detecting nouns using computer vision," said Zadeh. "But we suck at verbs and adjectives."
A YouTube spokeswoman said the company now uses human reviewers to catch these nuances, but declined to say how many workers are dedicated to the task.
– Bloomberg News