
Pornhub Responds to Allegations of Allowing Child Abuse Videos

Written on December 7, 2020


Pornhub has decried recent allegations that child abuse videos have been featured on its site, saying over the weekend that any such accusations are “flagrantly untrue.”

The statement comes amid reports that both Visa and Mastercard plan to investigate their respective business relationships with the wildly popular porn service. These investigations, per MarketWatch, come in response to a recent piece from New York Times columnist Nicholas Kristof published last Friday.

A Visa rep said Sunday that the company was “aware of the allegations” and was now “actively engaging” with financial institutions and Pornhub’s parent company, MindGeek. In a similar statement, a Mastercard rep said the company was “working with MindGeek’s bank to understand this situation.”

In the Times piece, Kristof states that Pornhub “is like YouTube” in the sense that it allows the general public to share their own work. He goes on to say:

“A great majority of the 6.8 million new videos posted on the site each year probably involve consenting adults, but many depict child abuse and nonconsensual violence. Because it’s impossible to be sure whether a youth in a video is 14 or 18, neither Pornhub nor anyone else has a clear idea of how much content is illegal.”

Kristof also includes mention of a 15-year-old girl who went missing in Florida and was later found by her mother, according to the piece, “in 58 sex videos” on the site.

In a statement to Complex on Monday, Pornhub called ridding the internet of child sexual abuse material (CSAM) “one of the most crucial issues” facing platforms today. 

“Any assertion that we allow CSAM is irresponsible and flagrantly untrue,” a Pornhub rep said. “We have zero tolerance for CSAM. Pornhub is unequivocally committed to combating CSAM, and has instituted an industry-leading trust and safety policy to identify and eradicate illegal material from our community.”

The statement also pointed to a comparison of CSAM-removal statistics across other platforms, including Facebook, Twitter, and Instagram.

“Pornhub has actively worked to employ extensive measures to protect the platform from such content,” the rep said.

Read the full Pornhub statement below:

Eliminating illegal content and ridding the internet of child sexual abuse material is one of the most crucial issues facing online platforms today, and it requires the unwavering commitment and collective action of all parties.


Due to the nature of our industry, people’s preconceived notions of Pornhub’s values and processes often differ from reality – but it is counterproductive to ignore the facts regarding a subject as serious as CSAM. Any assertion that we allow CSAM is irresponsible and flagrantly untrue. We have zero tolerance for CSAM. Pornhub is unequivocally committed to combating CSAM, and has instituted an industry-leading trust and safety policy to identify and eradicate illegal material from our community.


According to leading non-profits, advocates and third-party analyses, Pornhub’s safeguards and technologies have proven effective: while platforms intended to be family friendly like Facebook reported that it removed 84,100,000 incidents of CSAM over two and a half years, Instagram reported that it removed 4,452,000 incidents of CSAM over one and a half years, and Twitter reported that it suspended 1,466,398 unique accounts for CSAM over two years, the Internet Watch Foundation, the leading independent authority on CSAM, reported 118 incidents of CSAM on Pornhub in a three year period.


Pornhub has actively worked to employ extensive measures to protect the platform from such content. These measures include a vast team of human moderators dedicated to manually reviewing every single upload, a thorough system for flagging, reviewing and removing illegal material, robust parental controls, and a variety of automated detection technologies. These technologies include:


·   CSAI Match, YouTube’s proprietary technology for combating Child Sexual Abuse Imagery online


·   Content Safety API, Google’s artificial intelligence tool that helps detect illegal imagery 


·   PhotoDNA, Microsoft’s technology that aids in finding and removing known images of child exploitation


·   Vobile, a fingerprinting software that scans any new uploads for potential matches to unauthorized materials to protect against banned videos being re-uploaded to the platform


