How the internet can rebuild trust

Algorithms and generative AI models that decide what billions of users see should be transparent
The writer is co-founder of Wikipedia and author of ‘The Seven Rules of Trust’

When I founded Wikipedia in 2001, pioneers of the internet were excited by its promise to give the world access to truth and connection.

Two decades later, that optimism has curdled into cynicism. We scroll through feeds serving up news we no longer believe, interact with bots we cannot identify and brace for the next synthetic scandal created by fake images from artificial intelligence.

Before the web can move forward, it must remember how it earned trust in the first place.

The defining difference between web 1.0 and the platforms that dominate today is not technological sophistication but moral architecture. Early online communities were transparent about process and purpose. They exposed how information was created, corrected and shared. That visibility generated accountability. People could see how the system worked and participate in fixing its mistakes. Trust emerged not from perfection (there was still plenty of online trolling, flame wars and toxicity), but from openness.

Today’s digital landscape reverses that logic. Recommendation algorithms and generative AI models decide what billions of users see, yet their workings remain opaque. When platforms insist their systems are too complex to explain, users are asked to substitute faith for understanding.

AI intensifies the problem. Large language models can produce fluent paragraphs and convincing deepfakes. The tools that promised to democratise knowledge now threaten to make knowledge unrecognisable. If everything can be fabricated, the distinction between truth and illusion becomes a matter of persuasion.

Re-establishing trust in this environment requires more than fact-checking or content moderation. It requires structural transparency. Every platform that mediates information should make provenance visible: where data originated, how it was processed, and what uncertainty surrounds it. Think of it as nutritional labelling for information. Without it, citizens cannot make informed judgments and democracies cannot function.

Equally important is independence. As AI companies fight for dominance, the temptation to embed bias — commercial, political or cultural — into training data will be immense. Guardrails must ensure the entities curating public knowledge are accountable to the public, not just investors.

And we must revive civility too. Some of the best early online spaces relied on norms that valued reasoned argument over insult. They were imperfect but self-correcting because participants felt a duty to the collective project. Today’s social platforms monetise outrage. Restoring trust means designing systems that reward good-faith discourse — through visibility algorithms, community-based moderation, or friction that forces reflection before reposting.

Governments have a role to play but regulation alone cannot rebuild trust. It has to be observed in practice. Platforms should disclose not only how their algorithms work but also when they fail. AI developers should publish dataset sources and error rates.

The challenge of our time is not that information is scarce but that authenticity is. Important aspects of the early internet succeeded because people could trace what they read to another human being, even if that person was operating behind a pseudonym. The new internet must restore that chain of custody.

We are entering an era when machines can mimic any voice and invent any image. If we want truth to survive that onslaught, we must embed transparency, independence and empathy into the digital architecture itself. The early days of the web showed it could be done. The question is whether we still have the will to do it again.

Copyright notice: The copyright of this article belongs to FT中文网. Without permission, no organisation or individual may reprint, reproduce, or otherwise use all or part of this article; infringement will be pursued.
