Episode 2560: Meta Ends Third-party Fact-Checking System (1)
Description
Facebook parent company Meta recently announced changes to the way it tries to identify misinformation and harmful material published on its social media services.
Meta chief Mark Zuckerberg explained in a video that the company had decided to make the changes because the old system had produced “too many mistakes and too much censorship.”
Zuckerberg said the moderation system Meta had built needed to be “complex” to examine huge amounts of content in search of material that violated company policies.
However, he noted that the problem with such systems is that they can make a lot of errors. The Meta chief added, “Even if they accidentally censor just one percent of posts, that’s millions of people.”
So, he said the company had decided to move to a new system centered on “reducing mistakes, simplifying our policies, and restoring free expression.”
The new method turns over content moderation duties to a “Community Notes” system. The company said this system aims to “empower the community” to decide whether content is acceptable or needs further examination.
The changes will be effective for Meta’s Facebook, Instagram and Threads services. Meta said