
AI Privacy Policy: What You Must Disclose When Using ChatGPT, Gemini, or Claude


Quick answer: Yes, if your app, website, or SaaS uses AI (ChatGPT API, Gemini, Claude, or any machine learning model), you need a privacy policy that specifically discloses AI data processing. Under both GDPR and the EU AI Act, you must inform users that AI is being used, what data it processes, and how decisions are made.

Why AI Changes Your Privacy Requirements

Using AI APIs in your product introduces new data processing activities that most standard privacy policies don't cover. When a user submits a query that gets sent to OpenAI, Google, or Anthropic, you are transferring personal data to a third-party sub-processor — often across borders.

  • Input data: User prompts may contain personal data (names, emails, health info)
  • Output data: AI-generated responses may constitute profiling or automated decision-making
  • Training data: Some AI providers may use your API calls for model training (opt-out required)
  • Logging: AI providers typically log requests for 30 days for abuse detection
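Because user prompts can carry personal data, a practical safeguard is to redact obvious identifiers before the prompt leaves your infrastructure for a sub-processor. A minimal sketch (the patterns and function name are illustrative; real PII detection needs a dedicated tool, as regexes only catch obvious formats):

```python
import re

# Illustrative patterns only -- not an exhaustive PII detector.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact_pii(prompt: str) -> str:
    """Replace obvious personal data with placeholders before the
    prompt is sent to a third-party AI provider."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt

# The email address never reaches the provider's request logs.
safe = redact_pii("Summarise the complaint from jane.doe@example.com")
print(safe)  # Summarise the complaint from [EMAIL REDACTED]
```

Redaction does not remove your disclosure obligations, but it shrinks the amount of personal data subject to the provider's retention window.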

What Your AI Privacy Policy Must Include

1. Disclosure of AI Usage

Under the EU AI Act (2024) and GDPR Articles 13 and 14, you must tell users:

  • That AI is being used in your product
  • Which AI provider(s) you use (OpenAI, Google, Anthropic, etc.)
  • What data is sent to the AI provider
  • The purpose of the AI processing

Example disclosure:

"Our product uses OpenAI's GPT-4 API to generate [descriptions/analysis/recommendations]. When you use [feature name], your input text is sent to OpenAI for processing. OpenAI processes this data under our Data Processing Agreement and does not use it for model training."

2. Automated Decision-Making (GDPR Art. 22)

If your AI makes or assists decisions that significantly affect users (hiring, credit scoring, content moderation, insurance), GDPR Article 22 gives users the right to:

  • Know that automated decision-making is happening
  • Understand the logic involved
  • Contest the decision and request human review

3. Sub-Processor Disclosure

AI API providers are sub-processors under GDPR. Your privacy policy and DPA must list them:

| Provider | API | Data Location | Training on API Data? | DPA Available? |
|---|---|---|---|---|
| OpenAI | GPT-4, GPT-4o | US (DPF certified) | No (API data excluded by default) | Yes |
| Google | Gemini API | US / EU option | No (paid API excluded) | Yes |
| Anthropic | Claude API | US (DPF certified) | No (API data not used) | Yes |
| Meta | LLaMA (self-hosted) | Your infrastructure | N/A (self-hosted) | N/A |
| Mistral | Mistral API | EU (France) | No (API excluded) | Yes |

4. Data Retention for AI Processing

Most AI providers retain API request logs temporarily for abuse detection:

  • OpenAI: 30 days (zero data retention available for enterprise)
  • Google Gemini API: Logs retained per Google Cloud terms
  • Anthropic: 30 days, then deleted

Your privacy policy must disclose these retention periods.

5. EU AI Act Transparency Requirements

The EU AI Act (in force since August 2024, with obligations phasing in through 2026) introduces additional requirements:

  • AI-generated content labelling: Users must know when content is AI-generated
  • High-risk AI systems: Additional documentation, human oversight, and conformity assessments
  • General-purpose AI: Providers must publish training data summaries and comply with copyright rules
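The labelling requirement means the "AI-generated" disclosure should travel with the content itself, not live only on a policy page. One way to sketch this is a wrapper that keeps the label attached wherever the output is rendered (the class and field names here are illustrative, not prescribed by the Act):

```python
from dataclasses import dataclass

@dataclass
class LabelledOutput:
    """Keeps the AI-generated disclosure attached to the content
    wherever it is displayed. Field names are illustrative."""
    text: str
    provider: str
    ai_generated: bool = True

    def render(self) -> str:
        label = f"[AI-generated via {self.provider}] " if self.ai_generated else ""
        return label + self.text

out = LabelledOutput("Your claim appears to be covered.", provider="OpenAI GPT-4")
print(out.render())  # [AI-generated via OpenAI GPT-4] Your claim appears to be covered.
```

How the label is presented (banner, watermark, metadata) is a design choice; the point is that the flag is set where the content is produced, so no rendering path can silently drop it.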

Privacy Policy Template Sections for AI

Add these sections to your existing privacy policy:

  1. "Use of Artificial Intelligence" — Describe which AI services you use and why
  2. "AI Data Processing" — What data is sent, how it's processed, and retention
  3. "Automated Decision-Making" — If applicable, explain the logic and user rights
  4. "AI Sub-Processors" — List AI providers with links to their DPAs
  5. "Your Rights Regarding AI" — Right to opt-out, right to explanation, right to human review

Frequently Asked Questions

Does using ChatGPT's API require a privacy policy update?

Yes. Any integration with OpenAI's API constitutes a data transfer to a US-based sub-processor. You must disclose this in your privacy policy, sign a DPA with OpenAI, and ensure you have a legal basis for the data transfer (the EU-US DPF covers this for OpenAI).

Can I use AI to process health or financial data?

Yes, but with extra safeguards. Health data and financial data are "special categories" under GDPR requiring explicit consent or another strong legal basis. You should also conduct a Data Protection Impact Assessment (DPIA) and consider using zero-data-retention options from your AI provider.

How do I check if my website's privacy policy covers AI?

PrivacyChecker scans your website for privacy policy completeness, including AI-related disclosures. It checks for third-party connections to AI providers and flags missing transparency requirements.

Check your website now — free

Run a complete privacy audit in under 60 seconds. Get your score, find issues, and learn how to fix them.

Start Free Audit