Microsoft AI Researchers Accidentally Leak 38TB Of Internal Data Through Azure Storage
Misconfigured cloud storage seems to be a recurring problem for Microsoft, which just last year had customer information leak through an exposed Azure Blob Storage server. Now the Redmond-based company has exposed 38TB of private internal data thanks to a bucket of open-source AI training data linked on GitHub.
Wiz, a cloud security company, published a blog post outlining how Microsoft leaked 38 terabytes of private data. The entry point is a GitHub repository called “robust-models-transfer,” owned and operated by Microsoft’s AI research team. The repo contained a link to download open-source AI models from an Azure Storage URL. However, Wiz found that the URL “allowed access to more than just open-source models” and “was configured to grant permissions on the entire storage account, exposing additional private data by mistake.”
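Download links like this are typically built from Azure SAS (Shared Access Signature) tokens, which can be scoped anywhere from a single blob to an entire storage account. As an illustration of the difference (not the exact configuration involved in the incident), the sketch below assumes the azure-storage-blob Python SDK; the account name, key, container, and blob names are placeholders.

```python
# Hypothetical sketch: generating a narrowly scoped SAS link for one blob,
# rather than an account-wide token that would expose everything in the
# storage account. All identifiers below are placeholders.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

ACCOUNT_NAME = "exampleresearchstore"   # placeholder account
ACCOUNT_KEY = "<storage-account-key>"   # placeholder; never publish a real key
CONTAINER = "public-models"             # placeholder container
BLOB = "robust-models.tar"              # placeholder blob

# Read-only token, limited to a single blob, with a short expiry window.
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    blob_name=BLOB,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)

# The resulting URL grants access to this one file only; a token issued at
# the account level with broad permissions would instead open up the whole
# storage account, which is the failure mode described above.
download_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas_token}"
)
print(download_url)
```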
