OpenAI's robotics team has undergone a major shakeup: Caitlin Kalinowski, who was leading the team, has resigned from the company over its AI collaboration agreement with the Pentagon.

Kalinowski says the decision is not personal but a matter of principle. She clarified on social media that although AI can play a significant role in national security, sensitive areas such as domestic surveillance and autonomous weapon systems demand greater caution and transparent oversight, and she feels OpenAI did not put sufficient safety measures in place before announcing the agreement. Kalinowski joined OpenAI last November; before that, she worked on an augmented reality glasses project at Meta.

OpenAI sees it differently. The company says the Pentagon collaboration is an effort to strengthen national security through responsible AI use, and that clear boundaries have been set: no use for domestic surveillance, and no development of autonomous weapons. For her part, Kalinowski says she retains respect for CEO Sam Altman and the team.

The controversy is resonating across the industry, however. When the news broke, ChatGPT uninstallations reportedly jumped by over 295%, and Anthropic's Claude app climbed to the top of the free-apps chart on the US App Store, suggesting that users are uneasy about such government collaborations.

It is a telling moment, as questions mount over how advanced AI systems should be used. At issue is whether AI partnerships between companies and governments come with sufficient transparency and ethical safeguards. Kalinowski's resignation puts that ongoing debate in sharp relief.