YouTube Will Ask Users If They Want to Rethink Posting Offensive Comments
Written by SOURCE on December 4, 2020
YouTube will prompt users to reconsider whether offensive comments are “something [they] really want to share” before posting.
The site announced the feature on Thursday, saying the update aims to make the platform more inclusive. Android users will see the pop-up first, and it will give the “commenter the option to reflect before posting,” the company wrote in its statement.
The prompt will also include a link to YouTube’s community guidelines, an edit button, and an option to share the original comment anyway. It won’t accompany every offensive comment; the platform says its system is “continuously learning,” and the prompt may also appear on comments that don’t actually break the community guidelines. Conversely, comments that don’t trigger a prompt can still be taken down if they do violate the guidelines.
“Our system learns from content that has been repeatedly reported by users. We hope to learn more about what comments may be considered offensive as we continue to develop it,” the company said.
YouTube employed similar methods at the outset of the pandemic, when it began flagging videos that might break its policies. The company let its algorithms “cast a wider net” and remove more videos than normal because it “accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible.”
The platform has also improved its comment filtering system for creators, which will set aside potentially inappropriate and harmful comments that have been automatically flagged for review. Creators can then choose to remove those comments without reading them.
YouTube has also strengthened the technology it uses to detect hateful comments: “Since early 2019, we’ve increased the number of daily hate speech comment removals by 46x. And in the last quarter, of the more than 1.8 million channels we terminated for violating our policies, more than 54,000 terminations were for hate speech,” the company said.
The site is also trying to better understand its various communities by asking creators to voluntarily share their gender, sexual orientation, race, and ethnicity, information YouTube said won’t be used for advertising. The move could help make the platform a more welcoming space for marginalized communities.
“We’ll then look closely at how content from different communities is treated in our search and discovery and monetization systems. We’ll also be looking for possible patterns of hate, harassment, and discrimination that may affect some communities more than others,” the company said.