SemiVoice

  • Google's Former CEO Warns of AI's Potential for 'Bin Laden'-Style Attacks

    tweaktown

    02/13/2025, 12:37 PM UTC

➀ Former Google CEO Eric Schmidt has voiced concern that artificial intelligence could be used for terrorism, stressing the risk of the technology being misused by terrorists or rogue states.

➁ Schmidt noted that AI could be used to create biological weapons, mount cyberattacks, or inflict other forms of mass destruction.

➂ Despite these concerns, Schmidt believes that over-regulation could stifle innovation in the AI field.

    Google's former CEO, Eric Schmidt, has recently expressed his concerns about the potential misuse of artificial intelligence (AI) for acts of terror. In an interview with the BBC, Schmidt emphasized the risks associated with AI falling into the hands of terrorists or 'rogue states' that could misuse the technology to cause harm.

Schmidt, who served as Google's CEO from 2001 to 2011 and remained executive chairman of Google and later Alphabet until 2017, has been a strong advocate for responsible AI development. Even as Google invests in artificial general intelligence (AGI) through projects like Google Gemini, Schmidt has insisted that safety be prioritized in the development process.

    His primary concern is the possibility of AI being used by nations like North Korea, Iran, or Russia, which have 'some evil goal.' He fears that these countries could adopt AI technology quickly enough to misuse it and cause real harm, including the creation of biological weapons, cyberattacks, or other forms of mass destruction.

Schmidt also raised the 'Osama bin Laden' scenario, in which a truly evil person takes over some aspect of modern life to harm innocent people. At the same time, he acknowledges that over-regulation could stifle innovation in the AI sector. He expressed disagreement with the Trump Administration's decision not to sign a global agreement setting standards for the safe and ethical development of AI, an accord signed by many countries, including France, China, and India.

Schmidt believes that the AI revolution, which he considers the most important revolution since electricity, should not be stifled by excessive regulation, a risk he sees as especially acute in Europe.

    ---

This article was generated by a large language model (LLM) and is intended to give readers extended context on semiconductor news (Beta).
