Why GenAI Cannot Be Separated from Data Governance

Key Takeaways

  • GenAI adoption always intersects with data governance.
  • Broad access in tools like Copilot can expose sensitive data.
  • Secure adoption requires phased frameworks, not blanket bans.

GenAI is reshaping work across industries, and the Women’s Tennis Association (WTA) is no exception. In this edition of Oversharing Perspective, I spoke with Paul Sheth, CISO and Head of Technology and Engineering at WTA, about what it takes to secure generative AI at scale. While WTA supports tournaments and athletes worldwide, the lessons apply to any enterprise navigating GenAI.

GenAI’s Promise, and the Pressure

At WTA, early adoption began with everyday use cases such as rephrasing emails and summarizing documents. These quick productivity wins were real, but the risks grew just as quickly.

The biggest issue? Oversharing.

When AI tools such as Microsoft Copilot connect to SharePoint or OneDrive, employees can unintentionally expose sensitive files: personal data, financial forecasts, or even unreleased rankings. These exposures rarely come from bad intent. They happen through misconfigurations, missing controls, or default settings that favor ease of use over security.
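
To make "broad access" concrete, here is a minimal sketch of the kind of sharing audit a team might run against a SharePoint or OneDrive drive. It assumes an Azure AD app registration with Files.Read.All permission and the Microsoft Graph API; the access token, drive ID, and the choice to scan only the drive root are illustrative placeholders, not Opsin's or WTA's actual tooling.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token from your Azure AD app registration>"  # placeholder
DRIVE_ID = "<SharePoint or OneDrive drive id to audit>"       # placeholder

HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}


def broadly_shared_items(drive_id):
    """Yield (file name, link scope) for root-level items whose sharing
    links reach beyond individually named users."""
    items = requests.get(
        f"{GRAPH}/drives/{drive_id}/root/children", headers=HEADERS
    ).json().get("value", [])

    for item in items:
        perms = requests.get(
            f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
            headers=HEADERS,
        ).json().get("value", [])
        for perm in perms:
            # Sharing-link permissions expose a scope such as
            # "anonymous", "organization", or "users".
            scope = perm.get("link", {}).get("scope")
            if scope in ("anonymous", "organization"):
                yield item["name"], scope


if __name__ == "__main__":
    for name, scope in broadly_shared_items(DRIVE_ID):
        print(f"Review sharing on '{name}': link scope is '{scope}'")
```

Anything reachable through links like these is also reachable through a Copilot prompt, which is why default sharing settings deserve as much scrutiny as the AI tool itself.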

From Excitement to Frameworks

To address these challenges, Paul described a phased approach grounded in four core pillars:

  • Collaboration: Legal, compliance, security, and business teams aligned on expectations.
  • Policy: Clear guardrails on which tools and data are approved.
  • Governance: Ongoing controls to monitor for and detect misuse, even when it is accidental.
  • Guardrails: Technical layers like auditing, input sanitization, and trust orchestration (see the sanitization sketch after this list).
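
As one illustration of what an input-sanitization guardrail can look like, here is a minimal, hypothetical sketch that strips obvious identifiers from a prompt before it reaches a model. The patterns and function name are illustrative assumptions, not Paul's or Opsin's actual implementation; a production guardrail would lean on proper data-classification or DLP tooling rather than a handful of regexes.

```python
import re

# Illustrative patterns only; a real guardrail would use a proper
# DLP or classification service rather than a short list of regexes.
REDACTION_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}


def sanitize_prompt(prompt: str) -> str:
    """Replace likely-sensitive substrings with typed placeholders
    before the prompt is sent to a GenAI tool."""
    for label, pattern in REDACTION_PATTERNS.items():
        prompt = pattern.sub(f"[REDACTED_{label}]", prompt)
    return prompt


if __name__ == "__main__":
    raw = "Summarize the contract for jane.doe@example.com, SSN 123-45-6789."
    print(sanitize_prompt(raw))
    # Summarize the contract for [REDACTED_EMAIL], SSN [REDACTED_SSN].
```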

A CISO’s Balancing Act

For Paul, the answer is not to ban GenAI. Employees will find workarounds if tools are blocked. Instead, he advocates for composable architectures and shared accountability, giving teams the freedom to innovate while keeping data safe.

“If leaders do not provide a secure pathway, employees will make their own, and that path may not be secure.”
Paul Sheth, CISO & Head of Technology and Engineering, WTA

Final Takeaway: Guardrails Over Gaps

At Opsin, this is the challenge we focus on every day: building the controls that let enterprises embrace GenAI without exposing what matters most. With the right frameworks in place, organizations can innovate quickly and securely, at scale.

About the Author

James Pham is the Co-Founder and CEO of Opsin, with a background in machine learning, data security, and product development. He previously led ML-driven security products at Abnormal Security and holds an MBA from MIT, where he focused on data analytics and AI.
