Microsoft's AI team accidentally leaked a large data cache containing more than 30,000 internal messages

巴比特 (8BTC)

According to a Jinshi report on September 19, recent research from cloud security company Wiz shows that Microsoft's (MSFT.O) artificial intelligence research team accidentally exposed a large cache of private data on the software development platform GitHub. While the team was publishing open-source training data, a misconfigured link caused the leak: repository users were only supposed to be able to download AI models from a cloud storage link, but the link's permissions were set incorrectly, allowing others to delete and overwrite existing files as well. Wiz said the leaked data includes backups of Microsoft employees' personal computers, which contained passwords and keys for Microsoft services, along with more than 30,000 internal Microsoft Teams messages from 359 Microsoft employees.
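The core of the incident described above is a share link that granted far broader permissions than intended. A minimal toy model of that mistake is sketched below; all names here (`SharedLink`, `allows`, the permission strings) are illustrative only and do not correspond to any real cloud provider's SDK:

```python
# Toy model of a cloud-storage share link's permission set.
# The intended link should permit downloads only; the misconfigured
# link also grants write/delete, as in the leak described above.

INTENDED_PERMISSIONS = {"read"}                          # download only
MISCONFIGURED_PERMISSIONS = {"read", "write", "delete"}  # far too broad

class SharedLink:
    """Hypothetical share link carrying a set of allowed actions."""
    def __init__(self, permissions):
        self.permissions = set(permissions)

    def allows(self, action):
        return action in self.permissions

download_only = SharedLink(INTENDED_PERMISSIONS)
leaky = SharedLink(MISCONFIGURED_PERMISSIONS)

print(download_only.allows("delete"))  # → False: read-only, as intended
print(leaky.allows("delete"))          # → True: anyone with the URL can destroy files
```

The safeguard implied by the report is the principle of least privilege: a link published for public downloads should carry only the `read` permission, so that even if the URL leaks, outsiders cannot modify or delete the underlying storage.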
