AI Models Retain Sensitive Data, Risking Exposure


BigID CEO Dimitri Sirota on Risks of Training AI Models on Proprietary Data
Michael Novinson (MichaelNovinson) • May 6, 2025

Companies are increasingly relying on proprietary data to train and fine-tune artificial intelligence models, but few realize the lasting implications. Dimitri Sirota, CEO and co-founder of BigID, said using sensitive data - while necessary for AI to understand a business - can pose serious risks if not handled properly. Organizations face an urgent dilemma: how to maximize model performance without compromising data security.

The issue lies in the way AI models retain data. Foundation models, whether open source or commercial, become a liability once they are exposed to sensitive data during fine-tuning. That training data often includes personally identifiable information or confidential business records, and if it is mishandled, companies face breach risk and regulatory penalties.

"You don't have to be a bad ...


Copyright of this story belongs solely to bankinfosecurity.