
A Powerful Weapon: AI Helps YouTube Stop the Spread of Inappropriate Videos

David Meyer | May 3, 2018
According to YouTube, more than 83% of the videos removed after review were flagged by machines rather than by human reviewers.

(Fortune China)

Translated by: Pessy

Reviewed by: Xia Lin


YouTube has for the first time revealed a report detailing how many videos it takes down due to violations of the platform’s policies—and it’s a really big number.

The Alphabet-owned site removed more than 8 million videos during the last quarter of 2017. But how did it decide to take them down? Machine learning technology played a big role.

According to YouTube, machines rather than humans flagged up more than 83% of the now-deleted videos for review. And more than three quarters of those videos were taken down before they got any views. The majority were spam or porn.

Machine learning—or AI, as the tech industry often likes to call it—involves training algorithms on data so that they become able to spot patterns and take actions by themselves, without human intervention. In this case, YouTube uses the technology to automatically spot objectionable content.
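The idea of training on labeled examples so a system can flag new content on its own can be illustrated with a toy Naive Bayes text classifier. This is purely a sketch of the general technique the article describes; YouTube's actual models and features are not public, and all function names and training data below are invented for illustration.

```python
from collections import Counter
import math

def tokenize(text):
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs.
    Returns per-label word counts and per-label example counts (priors)."""
    counts = {}          # label -> Counter of word frequencies
    totals = Counter()   # label -> number of training examples
    for text, label in examples:
        counts.setdefault(label, Counter()).update(tokenize(text))
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Return the label with the highest log-posterior,
    using add-one (Laplace) smoothing for unseen words."""
    vocab = {w for c in counts.values() for w in c}
    best_label, best_score = None, float("-inf")
    for label, word_counts in counts.items():
        n = sum(word_counts.values())
        # log prior: fraction of training examples with this label
        score = math.log(totals[label] / sum(totals.values()))
        for w in tokenize(text):
            score += math.log((word_counts[w] + 1) / (n + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented toy training data, standing in for human-labeled examples.
training_data = [
    ("win free money click now", "spam"),
    ("free prize click here now", "spam"),
    ("cat plays piano cover song", "ok"),
    ("travel vlog day one in tokyo", "ok"),
]
counts, totals = train(training_data)
print(classify("claim your free money now", counts, totals))  # prints "spam"
```

Once trained, the classifier scores any new text without further human input, which is the property the article highlights: patterns learned from labeled data drive automatic flagging, with humans reviewing the results.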

In a blog post, the YouTube team said the use of the technique had a big effect.

Regarding videos containing “violent extremism,” which is banned on the platform, only 8% of such videos were flagged and removed in early 2017 before 10 views had taken place. After YouTube started using machine learning for flagging in the middle of the year, “more than half of the videos we remove for violent extremism have fewer than 10 views,” the team said.

However, the use of machine learning does raise serious questions about content being taken down that should stay up—some depictions of violent extremism, for example, may be satire or just reportage.

Several news organizations, such as Middle East Eye and Bellingcat, found late last year that YouTube was taking down videos they had shared, depicting war crimes in Syria. Bellingcat, which played a key citizen-journalist role in investigating the downing of Malaysia Airlines Flight 17 over Ukraine in 2014, found its entire channel suspended.

“With the massive volume of videos on our site, sometimes we make the wrong call. When it’s brought to our attention that a video or channel has been removed mistakenly, we act quickly to reinstate it,” YouTube said at the time.

In its Monday blog post, YouTube said its machine learning systems still require humans to review potential content policy violations, and the number of videos being flagged up using the technology has actually increased staffing requirements.

“Last year we committed to bringing the total number of people working to address violative content to 10,000 across Google by the end of 2018,” the team said. “At YouTube, we’ve staffed the majority of additional roles needed to reach our contribution to meeting that goal. We’ve also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we’ve expanded regional expert teams.”
