Hassett reveals the White House plans to establish a review mechanism for AI models


Gold Financial reports that on May 6th, Hassett, Director of the U.S. National Economic Council, revealed on Wednesday that the White House is studying an executive order that would establish a review mechanism for next-generation AI models, such as Anthropic’s Mythos, to protect corporate and government networks from AI-related cybersecurity risks.

Hassett stated, “We are studying, and may even issue, an executive order to lay out a clear roadmap for everyone: AI systems that could create security vulnerabilities in the future should go through a process and be released to the public only after they are proven safe, much like FDA approval for drugs.”

The drafting of the order comes just weeks after Anthropic disclosed its groundbreaking model Mythos. Anthropic claims the model is adept at discovering network vulnerabilities and could pose a global cybersecurity risk. The company currently grants access only to a select few large technology and financial firms, while the Trump administration has been pushing for federal agencies to gain access to Mythos in order to test government systems. Hassett said, “We have mobilized the entire government and the private sector to work together to ensure comprehensive testing before the model is truly made available to the public, so that it does not harm American businesses or the U.S. government.”

It remains unclear whether the executive order would make such testing mandatory. If it does, it would mark a shift in Trump’s stance on AI, as he has previously emphasized a “hands-off” approach to AI regulation.
