It's rare to join a meeting these days without at least one AI assistant in the room. AI note-takers and transcription tools have become standard — they join calls automatically, record conversations, generate summaries, and sync everything to your CRM or project management tool.
Most people click "admit" without a second thought. But what happens to everything that's recorded?
We reviewed the privacy policies of six widely used AI meeting and note-taking tools to find out: Otter.ai, Fathom, Fellow.ai, Read.ai, Microsoft Copilot, and Google Gemini. Some of what we found was expected. Some of it wasn't.
The Six Biggest Surprises
Rather than walking through each tool one by one, here are the findings that stood out most — the things that would change how most people think about using these tools.
1. One Tool Openly Admits to Selling Your Personal Data to Advertisers
Read.ai explicitly acknowledges in its privacy policy that it may "sell" identifiers — including hashed email addresses, cookie IDs, and employment information — to marketing and advertising partners. It uses the word "sell" because certain U.S. state privacy laws (like California's CCPA) require that level of honesty. Read.ai is the only tool of the six to make this admission openly.
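Why does a "hashed" email still count as a sold identifier? Because hashing is deterministic: any party that normalizes and hashes the same address the same way gets the same value, so the hash works as a stable cross-platform match key even though the raw address never changes hands. Here's a minimal illustration — the lowercasing and SHA-256 choice mirror common ad-tech matching practice in general, not anything Read.ai specifically discloses:

```python
import hashlib

def hashed_email_id(email: str) -> str:
    """Normalize and hash an email the way ad-tech matching typically works.

    Hashing is deterministic: anyone hashing the same address the same way
    gets the same value, so the hash still functions as a persistent
    identifier even though the raw address isn't shared.
    """
    normalized = email.strip().lower()  # canonical form first
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Two independent parties derive the same identifier from the same address:
assert hashed_email_id(" Jane.Doe@example.com") == hashed_email_id("jane.doe@example.com")
```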
It also performs facial expression analysis on all meeting participants — including people who don't have a Read.ai account — to generate a behavioral "Read Score" measuring sentiment and engagement. EU and UK users are exempt from the facial analysis component, but the behavioral profiling still applies.
2. Your Recordings Are Being Sent to Unnamed Companies for AI Training
Otter.ai shares audio recordings and transcripts with what it describes as "data labeling service providers" — third-party companies that annotate your meeting content to build AI training datasets. These companies are not named in the privacy policy, and standard users have no opt-out for this transfer.
Otter also works with Facebook and other advertising partners for personalized ad targeting, and its OtterPilot feature takes automatic screenshots during virtual meetings without requiring per-meeting consent. An August 2025 class action lawsuit (Brewer v. Otter.ai) alleges inadequate disclosure of these practices.
3. Google Keeps Your Gemini Conversations for 3 Years — Even After You Delete Them
When you use Google Gemini, a subset of your conversations is reviewed by humans (including third-party service providers) to improve the AI. Once a conversation has been selected for review, it is retained for up to three years — and this happens regardless of whether you delete your activity. The reviewed conversations are stored separately from your Google Account, and users have no way to delete them.
AI model training is also enabled by default through Gemini's "Keep Activity" setting. And with Connected Apps turned on, Gemini's reach extends well beyond meetings — into Gmail, Google Drive, Docs, Chat, YouTube watch history, and Search history.
Gemini Live adds another layer: it records audio during real-time conversations, and will capture video and screenshares when those features are active. It can also activate accidentally from sounds that resemble "Hey Google" — and when Keep Activity is on (the default), those accidental recordings are stored and treated the same way as intentional ones.
4. Microsoft Copilot's Biggest Risk Isn't What It Collects — It's What It Can Access
To its credit, Microsoft Copilot has some of the strongest data protection commitments of any tool we reviewed. Prompts and responses are explicitly not used to train foundation language models. Data stays within the Microsoft 365 service boundary. Enterprise protections are real.
But the primary risk is different: overpermissioning. Copilot can surface any data that a user already has access to across the entire Microsoft 365 environment — emails, files, chats, SharePoint sites, OneDrive documents. Research from Concentric AI found that approximately 16% of business-critical files are overshared within organizations. In practice, this means Copilot can accidentally surface sensitive documents that were technically accessible but never meant to be found. The U.S. House of Representatives banned Copilot for congressional staff in March 2024, citing exactly this data security risk.
There's also a newer consideration: as of January 2026, Anthropic is a subprocessor for Microsoft 365 Copilot. Anthropic's models are explicitly out of scope for the EU Data Boundary, meaning EU customer data may leave European boundaries when Anthropic models are used.
5. Every AI Note-Taker Captures Data on People Who Never Agreed to Anything
This applies to all four purpose-built meeting tools — Otter.ai, Fathom, Fellow.ai, and Read.ai. When these tools join a meeting, they record and process the audio, video, names, and email addresses of every participant, including external guests who have never created an account, never seen a privacy policy, and never given consent to the tool itself.
The legal responsibility for obtaining consent falls on the meeting host — but in practice, most hosts don't inform their attendees about what the AI assistant is doing with their data. This is an area of growing legal exposure, particularly in jurisdictions with strict consent requirements.
We explored the broader dynamics of hidden data collection in our piece on the hidden data trail your business apps leave behind.
6. One Tool Collects Your Voice Biometrics (But Handles It Better Than You'd Expect)
Fellow.ai collects biometric voice samples to improve speaker identification across meetings. What makes this stand out — in a good way — is how it's handled: administrators enable the feature at the workspace level, but individual users can opt out via their settings at any time. Users can also delete their voice samples whenever they choose, voice data is stored encrypted, and workspace administrators can disable voice matching entirely. Voice samples are never collected from third-party attendees.
That said, any biometric data collection carries legal implications in jurisdictions with biometric privacy laws, such as Illinois (BIPA), Texas (CUBI), and Washington state.
How the Six Tools Compare at a Glance
Here's how we'd summarize the overall privacy risk of each tool based on our review:
| Tool | Risk Level | Summary |
|---|---|---|
| Otter.ai | High | Unnamed data labeling companies annotate your recordings for AI training. Ad partner data sharing. Auto-screenshots. |
| Fathom | Medium | Strongest protections of the dedicated note-takers. Explicitly blocks third-party AI training. Calendar connection mandatory. |
| Fellow.ai | Medium | Biometric voice samples (admin-enabled, user opt-out). Canadian company, PIPEDA compliant. Admin controls are strong. Clear deletion timelines. |
| Read.ai | High | Openly sells identifiers to ad partners. Facial and behavioral profiling of all participants. Extensive Google ecosystem access. |
| MS Copilot | Medium | Strong data protections. Main risk is overpermissioning — Copilot surfaces anything a user can access. Anthropic subprocessor outside EU boundary. |
| Google Gemini | High | AI training on by default. Reviewed chats kept 3 years and can't be deleted. Gemini Live records accidental activations. Broad Connected App access. |
What Businesses Should Do
None of this means you should stop using AI meeting tools. They provide real productivity benefits. But it does mean that choosing which tool to use — and how to configure it — deserves more thought than most organizations give it.
Review Privacy Policies Before Deploying
This sounds obvious, but it rarely happens. Before rolling out any AI meeting assistant to your team, someone should actually read the privacy policy and understand where your meeting content goes, who has access to it, and whether it's used for AI training. The differences between tools are significant.
Audit Your Permissions
For Microsoft Copilot users especially, the overpermissioning risk is real. Review file-sharing permissions across your Microsoft 365 environment. Restrict broad "everyone" shares on sensitive documents. Copilot doesn't create access problems — it surfaces existing ones that were previously hidden.
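As a starting point, you can enumerate sharing links programmatically. The sketch below uses two real Microsoft Graph endpoints to flag OneDrive items carrying organization-wide or anonymous sharing links — exactly the shares Copilot can traverse on a user's behalf. It's a minimal illustration, assuming you already have a Graph access token with a suitable scope (such as Files.Read.All); a production audit would page through every site and drive in the tenant, not just the signed-in user's OneDrive root.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"  # assumption: obtained through your own auth flow
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def broad_links(item_id: str) -> list[dict]:
    """Return sharing links on a drive item that reach beyond named users."""
    url = f"{GRAPH}/me/drive/items/{item_id}/permissions"
    perms = requests.get(url, headers=HEADERS, timeout=30).json().get("value", [])
    # A permission with a 'link' facet scoped to 'organization' or 'anonymous'
    # exposes the file to anyone in the tenant (or anyone at all).
    return [p for p in perms
            if p.get("link", {}).get("scope") in ("organization", "anonymous")]

def audit_onedrive_root() -> None:
    """Flag items in the OneDrive root that carry overly broad sharing links."""
    url = f"{GRAPH}/me/drive/root/children"
    items = requests.get(url, headers=HEADERS, timeout=30).json().get("value", [])
    for item in items:
        links = broad_links(item["id"])
        if links:
            scopes = ", ".join(p["link"]["scope"] for p in links)
            print(f"{item['name']}: shared via {scopes} link")

if __name__ == "__main__":
    audit_onedrive_root()
```

Anything this flags is a candidate for tightening before Copilot makes it trivially discoverable through a chat prompt.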
Inform Your Meeting Participants
If you use an AI note-taker, your external meeting participants deserve to know. Tell them which tool is joining and what it records, and give them a genuine opportunity to opt out. This is both a courtesy and — depending on your jurisdiction — a legal requirement.
Configure the Settings That Matter
Many of the worst behaviors we identified can be turned off — but the protective settings are rarely on by default:
- Google Gemini: Turn off "Keep Activity" if you don't want your chats used for AI training
- Fathom: Use the opt-out in account settings to prevent de-identified data from improving their models
- Fellow.ai: Have your workspace admin review the AI data sharing toggle
- Microsoft Copilot: Review Purview retention policies and ensure they align with your data governance requirements
Build an AI Usage Policy
Your team is already using these tools — the question is whether they're doing so with guidance or without it. A clear AI usage policy should address which tools are approved, what data can be discussed in AI-recorded meetings, and how to handle external participants. We covered this in detail in our article on why every business needs an AI usage policy.
Understand Your Jurisdiction
Privacy laws vary significantly. Canada's PIPEDA, the EU's GDPR, California's CCPA, and Illinois's BIPA all impose different requirements around consent, biometric data, and cross-border transfers. What's compliant in one jurisdiction may not be in another. If your business operates across borders — or simply has meetings with people in other jurisdictions — this matters. Our overview of Canada's privacy landscape for small businesses is a good starting point for Canadian organizations.
The Bigger Picture
AI meeting tools represent a broader pattern: useful technology that collects significantly more data than most users realize. The gap between what people think these tools do ("take notes in my meeting") and what they actually do ("analyze facial expressions, share data with ad networks, send recordings to third-party annotation companies, and retain conversations for years") is wide enough to be a genuine business risk.
The tools that handle privacy well — Fathom's explicit third-party training restrictions, Fellow.ai's admin controls and deletion timelines, Microsoft's commitment to keeping enterprise data out of model training — show that strong privacy and useful AI aren't mutually exclusive. But the defaults across the industry lean heavily toward collection, not protection.
Knowing what you're agreeing to is the first step. Configuring your tools accordingly is the second. And having a conversation with your team about which tools belong in your meetings — and which don't — is the third.
If you're unsure where your organization stands, our free cybersecurity assessment covers data privacy, access controls, and more. It takes a few minutes and gives you a concrete starting point.
Detailed Privacy Comparison: Reference Guide
The following section provides a detailed breakdown of how each tool handles third-party sharing, permissions, data storage, data types, and deletion rights. This is based on official privacy policies reviewed in early 2026.
Third-Party Data Sharing
AI Service Providers
- Otter.ai (High): Shares audio and transcripts with unnamed AI service providers and data labeling companies that annotate recordings for AI training. No opt-out for standard users.
- Fathom (Medium): Limited sharing with service providers. Explicitly states OpenAI, Anthropic, and Google are not authorized to train their models on your content.
- Fellow.ai (Medium): Shares with AI sub-processors for transcription and summarization. Workspace admins can disable all AI data transfers via settings.
- Read.ai (High): Shares with AI tools. Google Workspace data may be transferred to third-party AI tools. Personalized model training is permitted.
- MS Copilot (Low): Prompts and responses not used to train foundation LLMs. Anthropic is a subprocessor under the Data Protection Addendum. No data shared with OpenAI.
- Google Gemini (High): Human reviewers (including third-party service providers) review a subset of chats. Reviewed chats retained up to 3 years. Training enabled by default.
Advertising and Marketing Partners
- Otter.ai (High): Works with Facebook and other ad partners for personalized advertising. Uses mobile tracking providers including AppsFlyer.
- Fathom (Low): Does not sell or share personal info for direct marketing. Ad targeting limited to device information only.
- Fellow.ai (Medium): Uses Facebook, LinkedIn, and other social platforms for interest-based ads. Uses Google Analytics. Opt-out links provided.
- Read.ai (High): Explicitly acknowledges it may "sell" identifiers to marketing partners under state privacy laws. Shares internet activity and inferences with ad partners.
- MS Copilot (Low): Copilot interaction data is not used for ad targeting. Consumer Copilot personalization can be opted out.
- Google Gemini (Low): Gemini chats are not used to show ads. Google states this policy could change with advance notice.
Required Permissions and Access
Calendar Access
- Otter.ai: Integrates with Google Calendar, iCal, Outlook. Reads event titles, times, attendees, and join URLs.
- Fathom: Mandatory — the app cannot function without linking a Google or Outlook calendar.
- Fellow.ai: Required — reads organizer, title, attendees, dates, times, duration, meeting rooms, and recurrence.
- Read.ai: Integrates with Google Calendar and Outlook. Also reads Gmail, Google Docs, Google Drive, and Google Chat if Google account is connected.
- MS Copilot: Accesses emails, calendar, meetings, chats, OneDrive files, and frequently visited SharePoint sites.
- Google Gemini: With Connected Apps enabled, accesses Google Calendar, Gmail, Docs, Drive, Photos, YouTube history, Search history, and Chrome page context.
Audio and Video Recording
- Otter.ai: Records audio, generates transcripts. OtterPilot takes automatic screenshots during virtual meetings.
- Fathom: Records audio and video. Collects speaker identification from all participants.
- Fellow.ai: Records and transcribes meetings. Collects biometric voice samples for speaker labeling (admin-enabled, user opt-out available, deletable).
- Read.ai: Records audio and video. Analyzes facial expressions and infers sentiment and engagement for all participants.
- MS Copilot: Does not independently record — accesses transcripts and recordings made through Teams.
- Google Gemini: Gemini Live captures audio recordings and transcripts. Video and screenshare content is captured when actively shared. May activate accidentally on "Hey Google" sounds.
Broader Ecosystem Access
- Otter.ai: Limited to calendar events and audio recordings.
- Fathom: Limited to calendar, meeting recordings, and CRM integration data.
- Fellow.ai: Calendar events, Slack and Teams metadata. Does not access DMs or private channel content.
- Read.ai: Full Google Workspace integration — Gmail messages, Google Drive files, Docs content, and Chat data.
- MS Copilot: Full Microsoft Graph access — emails, chats, meetings, documents, calendar, contacts, SharePoint.
- Google Gemini: With Connected Apps: Search history, YouTube history, Gmail, Docs, Drive, Photos, Chrome page context, and Android screen content.
Data Storage and Retention
- Otter.ai: Stored on AWS in the United States. Retained "as long as necessary" with no specific time limit.
- Fathom: Stored in the United States. Retained until actively deleted. Deletion within 30 days of request.
- Fellow.ai: Stored on AWS. Based in Ottawa, Canada. Individual data deleted within 15 days. Backups purged within 30 additional days.
- Read.ai: Stored in the United States. Audio and video retained up to 2 years. No specific limit on other data types.
- MS Copilot: Stored within the Microsoft 365 service boundary. Enterprise retention governed by Microsoft Purview policies. EU Data Boundary compliant (except Anthropic subprocessor).
- Google Gemini: Stored on Google's global infrastructure. Default auto-delete at 18 months (configurable). Reviewed chats retained up to 3 years regardless of user deletion.
AI Model Training
- Otter.ai (High): Trains on de-identified audio and transcripts by default. Manual review is opt-in. No clear global opt-out.
- Fathom (Medium): Uses de-identified data for in-house AI. Users can opt out in account settings. Third-party AI training explicitly blocked.
- Fellow.ai (Medium): Admins can toggle AI data sharing on or off for the workspace.
- Read.ai (High): Customer Experience Program is opt-in. Some data processing for model improvement may still occur within the service.
- MS Copilot (Low): Prompts and responses explicitly not used to train foundation models. Consumer Copilot training can be opted out.
- Google Gemini (High): Training is on by default when Keep Activity is enabled. Even submitting feedback triggers data use for training.
Data Removal and User Rights
- Otter.ai: Email support@otter.ai. Enterprise users must go through their employer. De-identified training data may persist.
- Fathom: Delete via account settings or email privacy@fathom.video. Processed within 30 days. Business account users must contact their employer.
- Fellow.ai: Email privacy@fellow.app. Deleted within 15 days. Backups purged within 30 additional days.
- Read.ai: Email privacy@read.ai. May retain data "for legitimate business purposes" after request.
- MS Copilot: Enterprise deletion subject to Purview retention policies. Individual employees may not be able to self-delete if a compliance hold is active.
- Google Gemini: Delete via Gemini Apps Activity page. Chats already reviewed by service providers are not deleted — they are retained up to 3 years.
Business and Enterprise Account Complications
Across all six tools, enterprise and business account users have limited independent data rights. The employer is typically the data controller, which means individual employees must go through their organization to exercise access or deletion rights. For Microsoft Copilot, this is particularly significant — IT admins control retention policies, eDiscovery holds, and audit logs, and employees cannot delete data that is under a legal hold.
Privacy policies reviewed: Otter.ai (September 2024), Fathom (October 2025), Fellow.ai (2026), Read.ai (February 2026), Microsoft Copilot (March 2026), Google Gemini (February 2026).
This article is intended for informational purposes only and does not constitute professional security, legal, or compliance advice. Organizations should consult with qualified professionals to assess their specific circumstances and develop appropriate protective measures.