YouTube releases new Violative View Rate (VVR) metric


    Posted: 06 Apr 2021

We will release a new metric called Violative View Rate (VVR) as part of our Community Guidelines Enforcement Report. The report will have a separate section called ‘Views’ which will lay out historical VVR data and the Q4 (Oct-Dec 2020) figures, along with details on the methodology. Going forward, we will update this data quarterly. VVR helps us estimate the percentage of views on YouTube that come from violative content.

In 2018, we introduced our Community Guidelines Enforcement Report to increase transparency and accountability around our efforts to protect viewers. It was a first-of-its-kind look at the content we removed from YouTube for violating our policies, and included the number of videos removed, how that violative content was first identified, reasons for removal, and more. Over the years, we’ve continued to share additional metrics, such as the number of content appeals and subsequent reinstatements. Since launching this report, we’ve removed over 83 million videos and 7 billion comments for violating our Community Guidelines. Just as important, our report has been tracking the impact of the deep investments in machine learning technology we began making in 2017, measuring how well we catch violative content. For example, we’re now able to detect 94% of all violative content on YouTube by automated flagging, with 75% removed before receiving even 10 views. Today, we’re releasing a new data point in our report that will provide even more transparency around the effectiveness of our systems: the Violative View Rate.



Put simply, the Violative View Rate (VVR) helps us determine what percentage of views on YouTube comes from content that violates our policies. Our teams started tracking this back in 2017, and across the company it’s the primary metric used to measure our responsibility work. As we’ve expanded our investment in people and technology, we’ve seen the VVR fall. The most recent VVR is 0.16-0.18%, which means that out of every 10,000 views on YouTube, 16-18 come from violative content. This is down by over 70% when compared to the same quarter of 2017, in large part thanks to our investments in machine learning. Going forward, we will update the VVR quarterly in our Community Guidelines Enforcement Report.

VVR data gives critical context around how we're protecting our community. Other metrics, like the turnaround time to remove a violative video, are important, but they don't fully capture the actual impact of violative content on the viewer. For example, compare a violative video that got 100 views but stayed on our platform for more than 24 hours with content that reached thousands of views in the first few hours before removal. Which ultimately has more impact? We believe the VVR is the best way for us to understand how harmful content impacts viewers, and to identify where we need to make improvements.

We calculate VVR by taking a sample of videos on YouTube and sending it to our content reviewers, who tell us which videos violate our policies and which do not. By sampling, we gain a more comprehensive view of the violative content we might not be catching with our systems. However, the VVR will fluctuate, both up and down. For example, immediately after we update a policy, you might see this number temporarily go up as our systems ramp up to catch content that is newly classified as violative.
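The sampling approach described above can be sketched as a simple estimator. This is an illustrative reconstruction, not YouTube's published methodology: the function name, the toy sample, and the normal-approximation confidence interval are all assumptions made for the sketch.

```python
import math

def estimate_vvr(review_labels, views_per_video=None):
    """Estimate the Violative View Rate from a reviewed sample.

    review_labels: list of booleans, True where reviewers judged the
    video violative. If views_per_video is given, each video is
    weighted by its view count so the estimate reflects views, not
    just videos.
    """
    if views_per_video is None:
        views_per_video = [1] * len(review_labels)
    total_views = sum(views_per_video)
    violative_views = sum(
        views for label, views in zip(review_labels, views_per_video) if label
    )
    rate = violative_views / total_views
    # Rough 95% normal-approximation interval on the estimated rate.
    se = math.sqrt(rate * (1 - rate) / total_views)
    return rate, (max(0.0, rate - 1.96 * se), rate + 1.96 * se)

# Toy sample: 17 violative views out of 10,000 sampled views.
rate, ci = estimate_vvr([True] * 17 + [False] * 9983)
```

With 17 violative views in a 10,000-view sample, the point estimate is 0.17%, the middle of the 0.16-0.18% range the report cites; the interval around it shows why a quarterly VVR can fluctuate even when the underlying rate is stable.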



    Our ongoing goal is for the YouTube community to thrive as we continue to live up to our responsibility. The Community Guidelines Enforcement Report documents the clear progress made since 2017, but we also recognize our work isn't done. It’s critical that our teams continually review and update our policies, work with experts, and remain transparent about the improvements in our enforcement work. We’re committed to these changes because they are good for our viewers, and good for our business—violative content has no place on YouTube. We invest significantly in keeping it off, and the VVR holds us accountable and helps us better understand the progress we’ve made in protecting people from harmful content on YouTube.


Source: Violative View Rate
Posted By: Brink, 06 Apr 2021


#1

    I think the most important part of the article is this.

    For example, we’re now able to detect 94% of all violative content on YouTube by automated flagging, with 75% removed before receiving even 10 views.
Their artificial intelligence for moderation is very effective.

    Facebook says that 90% of content that gets deleted from Facebook is flagged by artificial intelligence before another person sees it.

They can detect what topic you are talking about, and use sentiment analysis to detect whether you are talking positively or negatively about it. Not only that, they can also extract each individual assertion you make, and then cross-reference it against news articles to declare it true or false.

    So if you say something like
"hydroxychloroquine treats coronavirus"
    "trump drinks diet coke"
    "hillary clinton has a seizure disorder"

    The AI will be able to detect each claim, and match it to news articles to prove it true or false.
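The claim-matching step described above can be illustrated with a toy retrieval sketch: take a claim, then find the most lexically similar statement in a small corpus of fact-check verdicts. This is a deliberately simplified illustration, not how any real system works (production systems use learned embeddings and stance detection); the mini-corpus, verdicts, and function names here are invented for the example.

```python
def tokenize(text):
    """Lowercase and split a sentence into a set of word tokens."""
    return set(text.lower().split())

def match_claim(claim, fact_checks):
    """Return the fact-check statement most similar to the claim
    (by Jaccard overlap of word tokens), along with its verdict."""
    claim_tokens = tokenize(claim)

    def similarity(entry):
        statement_tokens = tokenize(entry["statement"])
        union = claim_tokens | statement_tokens
        return len(claim_tokens & statement_tokens) / len(union) if union else 0.0

    best = max(fact_checks, key=similarity)
    return best["statement"], best["verdict"]

# Invented mini-corpus of previously fact-checked statements.
corpus = [
    {"statement": "hydroxychloroquine treats coronavirus", "verdict": "false"},
    {"statement": "trump drinks diet coke", "verdict": "true"},
    {"statement": "hillary clinton has a seizure disorder", "verdict": "false"},
]

statement, verdict = match_claim("does hydroxychloroquine treat coronavirus", corpus)
```

Token overlap is enough here because the claim shares distinctive words ("hydroxychloroquine", "coronavirus") with exactly one corpus entry; real claim matching has to handle paraphrase, which is where embedding models come in.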

    They're even using artificial intelligence for fact checkers. This article is an interesting read.

    The AI-powered fact checker that investigates QAnon influencers shares its secret weapon

    AI could help scientists fact-check covid claims amid a deluge of research


#2

    I think it's ludicrous that Facebook/Twitter/YouTube now think it's their mission to determine what's true or false. That in effect makes them the arbiter of what's true. Given their reach that is enormous power and influence. Given their obvious bias, that is dangerous.


#3

So all the flat earth videos have been fact-checked and found to be truthful?

    That's scary...


#4

    It is not Big Brother we have to worry about but Big Brothers. It is apparent that Google is a member of that family.


 

Windows 10 Forums