Microsoft AI Researchers Expose 38TB of Highly Sensitive Data
hackread.com: Microsoft AI researchers accidentally exposed 38 terabytes of private data, including disk backups of two employees’ workstations, while publishing open-source training data on GitHub.
- The backup includes secrets, private keys, passwords, and over 30,000 internal Microsoft Teams messages.
- The data was exposed through an overly permissive Shared Access Signature (SAS) token that granted access to the entire storage account rather than to the intended files.
- SAS tokens can be a security risk if not used properly, as they can grant high levels of access to Azure Storage data.
- Organizations should carefully consider their security needs before using SAS tokens, and scope any tokens they do issue as narrowly as possible (see the sketch after this list).
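For illustration, here is a minimal sketch of issuing a narrowly scoped SAS token with the azure-storage-blob Python SDK. The account, container, and blob names are hypothetical, and a real deployment would also handle key storage and token auditing:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Hypothetical account details for illustration only.
ACCOUNT_NAME = "examplestorage"
ACCOUNT_KEY = "<account-key>"  # never hard-code or commit a real key

# Scope the token to a single blob, read-only, expiring in one hour,
# instead of an account-wide token with broad permissions.
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="public-models",
    blob_name="model.ckpt",
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# The shareable URL embeds the token as a query string.
url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/"
    f"public-models/model.ckpt?{sas_token}"
)
print(url)
```

Restricting the token to a single read-only blob with a short expiry is precisely what guards against the class of over-exposure described in this incident, where one token reportedly opened up far more than the files it was meant to share.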
As part of their ongoing work on accidental exposure of cloud-hosted data, the Wiz Research Team scanned the internet for misconfigured storage containers. In the process, they found a GitHub repository under the Microsoft organization named robust-models-transfer. The repository belongs to Microsoft’s AI research division and exists to provide open-source code and AI models for image recognition.
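The article does not describe Wiz’s scanning method, but one way such a check might look is a probe of Azure Blob Storage’s public REST listing endpoint. The sketch below, with hypothetical account and container names, tests whether a container permits anonymous enumeration, one common signal of misconfiguration:

```python
import requests

# Hypothetical names; a 200 response with an XML EnumerationResults
# body means the container allows anonymous blob listing.
ACCOUNT = "examplestorage"
CONTAINER = "public-models"

url = (
    f"https://{ACCOUNT}.blob.core.windows.net/"
    f"{CONTAINER}?restype=container&comp=list"
)
resp = requests.get(url, timeout=10)

if resp.status_code == 200 and "<EnumerationResults" in resp.text:
    print(f"{CONTAINER} is publicly listable")
else:
    print(f"{CONTAINER} returned HTTP {resp.status_code}; "
          "likely not publicly listable")
```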
After ...
Copyright of this story solely belongs to hackread.com.