GenAI is reshaping work across industries, and the Women’s Tennis Association (WTA) is no exception. In this edition of Oversharing Perspective, I spoke with Paul Sheth, CISO and Head of Technology and Engineering at WTA, about what it takes to secure generative AI at scale. While WTA supports tournaments and athletes worldwide, the lessons apply to any enterprise navigating GenAI.
At WTA, early adoption began with everyday use cases like rephrasing emails, summarizing documents, and boosting productivity. These quick wins were real. But the risks grew just as quickly.
The biggest issue? Oversharing.
When AI tools such as Microsoft Copilot connect to SharePoint or OneDrive, employees can unintentionally expose sensitive files. That can include personal data, financial forecasts, or even unreleased rankings. These exposures rarely come from bad intent. They happen through misconfigurations, missing controls, or default settings that favor ease of use over security.
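To make that failure mode concrete, here is a minimal sketch, not WTA's actual tooling, that uses the Microsoft Graph API to flag files shared through anonymous or organization-wide links, the kind of broad defaults Copilot can surface in its answers. It assumes you already have a Graph access token with Files.Read.All permission; the token and drive ID are placeholders.

```python
"""Minimal sketch: flag OneDrive/SharePoint items with overly broad sharing links.
Assumes a Microsoft Graph access token (Files.Read.All) obtained elsewhere,
e.g. via MSAL; TOKEN and DRIVE_ID below are placeholders."""
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"   # assumption: acquired separately
DRIVE_ID = "<drive-id>"    # assumption: the document library to audit
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Sharing-link scopes that risk oversharing when an AI assistant indexes the drive.
BROAD_SCOPES = {"anonymous", "organization"}


def list_items(drive_id: str):
    """Yield items in the drive's root folder (paging omitted for brevity)."""
    resp = requests.get(f"{GRAPH}/drives/{drive_id}/root/children", headers=HEADERS)
    resp.raise_for_status()
    yield from resp.json().get("value", [])


def broad_permissions(drive_id: str, item_id: str):
    """Return sharing-link permissions scoped to 'anyone' or the whole organization."""
    resp = requests.get(
        f"{GRAPH}/drives/{drive_id}/items/{item_id}/permissions", headers=HEADERS
    )
    resp.raise_for_status()
    return [
        p for p in resp.json().get("value", [])
        if p.get("link", {}).get("scope") in BROAD_SCOPES
    ]


if __name__ == "__main__":
    for item in list_items(DRIVE_ID):
        flagged = broad_permissions(DRIVE_ID, item["id"])
        if flagged:
            scopes = {p["link"]["scope"] for p in flagged}
            print(f"Review sharing on '{item['name']}': scopes {scopes}")
```

A sweep like this surfaces the quiet misconfigurations Paul describes: nothing malicious, just permissive defaults that become visible the moment a GenAI assistant starts reading the same drives employees do.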
To address these challenges, Paul described a phased approach grounded in four core pillars.
For Paul, the answer is not to ban GenAI. Employees will find workarounds if tools are blocked. Instead, he advocates for composable architectures and shared accountability, giving teams the freedom to innovate while keeping data safe.
At Opsin, this is the challenge we focus on every day: building the controls that let enterprises embrace GenAI without exposing what matters most. With the right frameworks in place, organizations can innovate quickly and securely, at scale.