12 min read

Acceptable Use Policy Template for SaaS & API Platforms

Your Terms of Service defines the contract. Your Acceptable Use Policy defines the rules. Here’s how to write an AUP that actually protects your platform, your users, and your business.

If you run a SaaS application or expose an API to external users, you need an Acceptable Use Policy (AUP). It’s the document that draws the line between legitimate use and abuse — and without one, you have no clear legal basis for suspending bad actors, preventing platform misuse, or protecting the experience of your other customers.

Yet many founders skip the AUP entirely. They assume their Terms of Service covers everything, or they copy a vague paragraph from a competitor’s website and call it done. That approach works fine until someone uses your API to send a million spam emails, scrapes your entire database, or trains a competing AI model on your outputs. Then you’re scrambling.

This guide covers everything you need to write a comprehensive, enforceable Acceptable Use Policy for a SaaS or API platform in 2026. We’ll walk through what an AUP is, how it differs from your Terms of Service, the essential sections every AUP must include, API-specific and AI-specific clauses, enforcement mechanisms, and common mistakes to avoid.

What Is an Acceptable Use Policy?

An Acceptable Use Policy is a document that defines the rules and restrictions governing how users may interact with your platform. It specifies what behaviours are permitted, what activities are prohibited, and what consequences follow if someone violates those rules.

Think of it this way: your Terms of Service is the overarching contract between your business and your users. Your AUP is the rulebook that sits inside (or alongside) that contract, focusing specifically on how users can and cannot use your product.

Why SaaS and API Platforms Need an AUP

SaaS and API platforms face unique risks that a standard Terms of Service alone doesn’t adequately address:

  • Shared infrastructure: Unlike installed software, your users share your servers, bandwidth, and compute resources. One bad actor can degrade the experience for everyone.
  • Programmatic access: APIs enable automated, high-volume interactions that can overwhelm your systems or extract data at scale.
  • Third-party integrations: Your users may build products on top of your platform and expose your service to their users, creating cascading liability.
  • Regulatory exposure: If users process personal data through your platform, their misuse can create GDPR, CCPA, or other regulatory liability for you.
  • Reputational risk: If your platform is used for illegal, harmful, or unethical purposes, your brand suffers even if you didn’t authorise the activity.
  • AI and generative features: If your product includes AI capabilities, users may attempt to generate harmful, misleading, or infringing content at scale.

An AUP gives you the legal foundation to act swiftly when these situations arise. Without one, suspending a user or revoking API access becomes a contractual grey area that could expose you to legal challenge.

AUP vs Terms of Service: Key Differences

Many founders conflate their Acceptable Use Policy with their Terms of Service. While they’re related, they serve different purposes:

  • Terms of Service (ToS): The master agreement covering the entire legal relationship — payment terms, intellectual property, limitation of liability, dispute resolution, data handling, warranties, and termination. It’s the contract.
  • Acceptable Use Policy (AUP): A focused set of rules about permitted and prohibited behaviour on the platform. It’s the rulebook.

Some businesses embed their AUP as a section within their Terms of Service. Others publish it as a separate, standalone document that the ToS incorporates by reference. Both approaches are legally valid — the standalone approach tends to work better for platforms with complex usage rules, because it’s easier to update the AUP independently without revising the entire ToS.

Best practice: Publish your AUP as a standalone page on your website, and include a clause in your Terms of Service that states: “Your use of the Service is subject to our Acceptable Use Policy at [URL], which is incorporated into these Terms by reference.”

Essential Sections Every AUP Must Include

A comprehensive AUP for a SaaS or API platform should cover the following areas. Miss any of these, and you’ll have gaps that bad actors will exploit.

1. Prohibited Activities

This is the core of your AUP. List every category of activity that users are not permitted to engage in. Be specific — vague prohibitions like “don’t do anything bad” are unenforceable. Common prohibited activities include:

  • Illegal activity: Using the platform to violate any applicable law, regulation, or legal obligation.
  • Intellectual property infringement: Uploading, distributing, or transmitting content that infringes copyrights, trademarks, patents, or trade secrets.
  • Harmful or offensive content: Distributing content that is defamatory, obscene, threatening, discriminatory, or promotes violence.
  • Spam and unsolicited communications: Using the platform to send bulk unsolicited emails, messages, or advertisements in violation of CAN-SPAM, PECR, or similar anti-spam laws.
  • Malware and security threats: Uploading viruses, trojans, ransomware, or any malicious code, or using the platform to launch cyberattacks (DDoS, phishing, credential stuffing).
  • Fraud and deception: Using the platform to impersonate others, conduct phishing, or engage in financial fraud.
  • Harassment and abuse: Using the platform to stalk, harass, bully, or threaten any individual.
  • Child safety violations: Any use involving child sexual abuse material (CSAM) or exploitation of minors.
  • Sanctions and export violations: Using the platform from or on behalf of sanctioned countries, entities, or individuals in violation of UK, EU, or US sanctions and export control laws.

2. Usage Limits and Fair Use

Even legitimate users can harm your platform by consuming excessive resources. Your AUP should establish clear boundaries:

  • Resource consumption: Do not consume excessive storage, bandwidth, or compute resources that degrade the service for other users.
  • Fair use thresholds: If your plan includes “unlimited” features, define what “reasonable use” means. Courts have consistently held that “unlimited” does not mean “infinite.”
  • Multi-account abuse: Do not create multiple accounts to circumvent usage limits, free tier restrictions, or enforcement actions.
  • Automated abuse: Do not use bots, scripts, or automated tools to interact with the platform in ways that exceed normal human usage patterns, unless explicitly permitted via the API.

3. Data Scraping and Extraction

Data scraping is one of the most common forms of platform abuse. Your AUP should explicitly address it:

  • No unauthorised scraping: Do not use automated tools, crawlers, or scripts to extract data from the platform except through officially provided APIs and within documented rate limits.
  • No database recreation: Do not systematically download or cache platform data to create a competing database or service.
  • Respect robots.txt: Any automated access must comply with robots.txt directives and published crawling guidelines (see the sketch after this list).
  • API data restrictions: Data retrieved through the API may only be used for the purposes documented in the API terms and must not be resold, redistributed, or used to train machine learning models without express written permission.
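
For integrators who are permitted to automate access, the robots.txt requirement above is straightforward to honour programmatically. Here is a minimal sketch using Python's standard-library parser; the domain, path, and user-agent string are placeholders, not real endpoints.

    from urllib.robotparser import RobotFileParser

    # Hypothetical crawler identity and target path; replace with your own values.
    USER_AGENT = "example-integration-bot"
    TARGET_URL = "https://platform.example.com/public/listings"

    parser = RobotFileParser("https://platform.example.com/robots.txt")
    parser.read()  # fetch and parse the platform's robots.txt

    if parser.can_fetch(USER_AGENT, TARGET_URL):
        print("robots.txt permits crawling this path")
    else:
        print("robots.txt disallows this path; do not crawl it")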

4. Reverse Engineering and Security Testing

Protect your proprietary technology while leaving room for legitimate security research:

  • No reverse engineering: Do not decompile, disassemble, reverse engineer, or attempt to derive the source code, algorithms, or data structures of the platform.
  • No circumvention: Do not bypass, disable, or interfere with any security, authentication, or access control mechanisms.
  • No vulnerability exploitation: Do not exploit any vulnerability in the platform. If you discover a security issue, report it through our responsible disclosure programme at [email/URL].
  • Responsible disclosure: If you operate a bug bounty or responsible disclosure programme, reference it here and explain how security researchers can participate legitimately.

5. Account Security and Access

Users must take responsibility for securing their own accounts:

  • Credential security: Keep your login credentials, API keys, and access tokens confidential. Do not share them publicly or embed them in client-side code.
  • Unauthorised access: Do not access accounts, systems, or data that you are not authorised to access.
  • Prompt reporting: Immediately report any suspected unauthorised access to your account or any security breach you become aware of.

6. Resale and Sublicensing

Unless your business model specifically allows resale, prohibit it:

  • No resale: Do not resell, sublicense, lease, or otherwise transfer access to the platform without our prior written consent.
  • No white-labelling: Do not rebrand or white-label the platform as your own product without an explicit white-label agreement.
  • No competitive use: Do not use the platform to build, improve, or benchmark a competing product or service.

API-Specific AUP Requirements

If your platform exposes an API, your AUP needs additional provisions that address the unique risks of programmatic access.

Rate Limiting and Quotas

Clearly document your rate limits and what happens when they’re exceeded:

  • Published rate limits: State your current rate limits (e.g., “100 requests per minute per API key on the Starter plan”) or link to your API documentation where limits are published.
  • Burst handling: Explain whether you allow short bursts above the limit or enforce strict throttling.
  • Consequences of exceeding limits: State that requests exceeding rate limits will receive HTTP 429 responses and that sustained abuse may result in temporary or permanent key revocation (a client-side backoff sketch follows this list).
  • Plan-based quotas: If different pricing tiers have different quotas, reference your pricing page and state that exceeding plan limits may require an upgrade.
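
It also helps to show API consumers what compliant behaviour looks like when they hit a limit. The sketch below assumes a hypothetical endpoint and bearer-token authentication; it backs off whenever the platform responds with HTTP 429, honouring a Retry-After header if one is sent.

    import time
    import requests

    # Hypothetical endpoint and key; substitute your platform's real values.
    API_URL = "https://api.example.com/v1/items"
    API_KEY = "sk_live_example"

    def get_with_backoff(url, max_retries=5):
        """Call the API, backing off whenever it responds with HTTP 429."""
        delay = 1.0
        for _ in range(max_retries):
            response = requests.get(url, headers={"Authorization": f"Bearer {API_KEY}"})
            if response.status_code != 429:
                return response
            # Honour Retry-After if the platform sends it, otherwise back off exponentially.
            retry_after = response.headers.get("Retry-After")
            time.sleep(float(retry_after) if retry_after else delay)
            delay *= 2
        raise RuntimeError("Rate limit still exceeded after retries; reduce request volume")

    response = get_with_backoff(API_URL)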

Authentication and API Key Management

API keys are credentials that grant access to your platform. Your AUP must set clear rules:

  • One key per project/application: Do not share a single API key across multiple unrelated applications unless your plan allows it.
  • Server-side only: API keys must be used server-side. Do not expose API keys in client-side code, mobile applications, or public repositories (a minimal proxy sketch follows this list).
  • Key rotation: Rotate API keys periodically and immediately if you suspect compromise.
  • Revocation: We reserve the right to revoke any API key immediately if it is associated with abusive behaviour or a security incident.
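
A simple way to satisfy the server-side-only rule is to proxy upstream calls through your own backend so the key never reaches the browser. The sketch below uses Flask purely as an illustration; the upstream URL and environment variable name are assumptions, not part of any particular platform's API.

    import os
    import requests
    from flask import Flask, jsonify

    app = Flask(__name__)

    # The platform API key lives in a server-side environment variable,
    # never in browser JavaScript, mobile builds, or public repositories.
    PLATFORM_API_KEY = os.environ["PLATFORM_API_KEY"]

    @app.route("/api/summary")
    def summary():
        """Proxy the upstream call so the key never reaches the client."""
        upstream = requests.get(
            "https://api.example.com/v1/summary",  # hypothetical upstream endpoint
            headers={"Authorization": f"Bearer {PLATFORM_API_KEY}"},
            timeout=10,
        )
        return jsonify(upstream.json()), upstream.status_code

Any server-side language works equally well; the point is that the secret stays on infrastructure you control.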

Data Handling and Storage

When users retrieve data from your API, you need rules about what they can do with it:

  • Caching: Specify whether and for how long users may cache API responses. Many platforms allow caching for a defined period (e.g., “you may cache API responses for up to 24 hours”); a simple TTL cache sketch follows this list.
  • Storage limitations: Define whether users may store API data permanently or only use it in real-time.
  • Data deletion: If a user’s API access is revoked or their account is terminated, require them to delete all data obtained through the API within a specified timeframe (e.g., 30 days).
  • Personal data compliance: If API responses include personal data, users must process that data in compliance with GDPR, CCPA, and other applicable data protection laws.
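
If your AUP allows time-limited caching, integrators need some mechanism that actually expires the data. The following is a minimal sketch assuming a 24-hour allowance; adjust the TTL to whatever your policy permits.

    import time

    # Assumed allowance of 24 hours; match whatever your AUP actually permits.
    CACHE_TTL_SECONDS = 24 * 60 * 60
    _cache = {}  # maps cache key -> (stored_at_timestamp, payload)

    def store(key, payload):
        _cache[key] = (time.time(), payload)

    def get_cached(key):
        """Return a cached API response only while it is within the allowed window."""
        entry = _cache.get(key)
        if entry is None:
            return None
        stored_at, payload = entry
        if time.time() - stored_at > CACHE_TTL_SECONDS:
            del _cache[key]  # expired: refetch rather than serve stale data
            return None
        return payload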

Attribution and Branding

If your API requires attribution, specify the requirements:

  • Mandatory attribution: If applicable, state that applications using your API must display attribution (e.g., “Powered by [Your Platform]”) in a visible location.
  • Trademark usage: Define how your brand name, logo, and trademarks may be used in connection with integrations. Typically, you’ll allow use of your name to accurately describe the integration but prohibit implying endorsement or partnership.

AI-Specific Clauses

If your platform includes AI or machine learning features — whether that’s generative AI, predictive analytics, content moderation, or any form of algorithmic processing — you need AUP provisions that address the unique risks of AI misuse. This is increasingly important as regulators worldwide introduce AI-specific legislation.

Prohibited AI Uses

Specify activities that are prohibited when using your AI features:

  • Harmful content generation: Do not use AI features to generate content that is illegal, defamatory, threatening, harassing, sexually explicit, or promotes violence or self-harm.
  • Deepfakes and impersonation: Do not use AI features to create realistic but fake images, audio, or video of real people without their consent.
  • Misinformation at scale: Do not use AI features to generate false news articles, fake reviews, misleading product claims, or other deceptive content intended to mislead the public.
  • Automated decision-making: Do not use AI outputs as the sole basis for decisions that have significant legal, financial, or personal impact on individuals (e.g., credit decisions, hiring, medical diagnosis) without appropriate human oversight.
  • Circumventing safety measures: Do not attempt to bypass, manipulate, or “jailbreak” content filters, safety guardrails, or moderation systems.
  • Training competing models: Do not use AI-generated outputs to train, fine-tune, or improve competing AI models or services without express written permission.

AI Output Disclaimers

Protect your business by clarifying the nature and limitations of AI outputs:

  • No guarantee of accuracy: AI-generated outputs may contain errors, omissions, or inaccuracies. Users are responsible for reviewing and verifying all AI-generated content before relying on it.
  • No professional advice: AI outputs do not constitute legal, financial, medical, or other professional advice. Users should consult qualified professionals for specific guidance.
  • Content ownership: Clearly state your position on who owns AI-generated content (typically the user, subject to your IP rights in the underlying model).

Data Usage for AI Training

Be transparent about whether user data or inputs are used to improve your AI models:

  • Training data opt-out: If you use user inputs to improve your models, state this clearly and provide an opt-out mechanism.
  • Data anonymisation: If inputs are used for training, explain how they are anonymised and aggregated before use.
  • Enterprise data isolation: If applicable, state that enterprise or paid-tier customers’ data is never used for model training.

Enforcement Mechanisms

An AUP without enforcement is just a wish list. Your policy must clearly state what happens when someone violates it. Graduated enforcement gives you flexibility while demonstrating proportionality.

Graduated Response

Define a tiered enforcement approach (a sketch of how the ladder might be encoded follows the list):

  1. Warning: For first-time or minor violations, issue a written warning explaining the violation and requesting the user to cease the prohibited activity.
  2. Temporary restriction: For repeated or moderate violations, temporarily restrict the user’s access to specific features, reduce their rate limits, or suspend their account for a defined period.
  3. Permanent suspension: For severe or repeated violations, permanently terminate the user’s account and revoke all access to the platform.
  4. Legal action: For violations that cause material harm to the platform, its users, or third parties, pursue appropriate legal remedies including injunctive relief and damages.
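
Internally, the tiers above can be encoded so that enforcement decisions are applied consistently rather than ad hoc. The sketch below is one hypothetical mapping from violation severity and prior history to an action; your own severity labels and thresholds will differ.

    from enum import Enum

    class Action(Enum):
        WARNING = "warning"
        TEMPORARY_RESTRICTION = "temporary_restriction"
        PERMANENT_SUSPENSION = "permanent_suspension"

    def enforcement_action(severity, prior_violations):
        """Map a violation to a tier; severity labels and thresholds are illustrative."""
        if severity == "severe":
            # e.g. CSAM or active attacks: skip the ladder entirely
            return Action.PERMANENT_SUSPENSION
        if prior_violations == 0:
            return Action.WARNING
        if prior_violations < 3:
            return Action.TEMPORARY_RESTRICTION
        return Action.PERMANENT_SUSPENSION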

Immediate Suspension Rights

Reserve the right to act immediately in serious cases:

  • Imminent harm: If we reasonably believe your use poses an imminent risk of harm to the platform, other users, or third parties, we may immediately suspend your access without prior notice.
  • Legal compliance: If we receive a valid legal order or regulatory directive requiring us to restrict your access, we will comply immediately.
  • Security incidents: If your account is compromised or involved in a security incident, we may suspend access to contain the threat.

Appeals Process

A fair appeals process protects both you and your users:

  • Right to appeal: Users may appeal enforcement actions by contacting [email] within 30 days of the action.
  • Review process: Appeals will be reviewed by a member of our team not involved in the original decision.
  • Response timeline: We aim to respond to appeals within 10 business days.
  • Final decision: Our decision on appeal is final and binding.

Liability for Violations

Make clear that users bear responsibility for their violations:

  • Indemnification: Users agree to indemnify and hold you harmless from any claims, losses, damages, or expenses arising from their violation of the AUP.
  • No refunds on termination for cause: If we terminate your account for AUP violations, you are not entitled to a refund of any prepaid fees.
  • Reporting to authorities: We reserve the right to report illegal activity to relevant law enforcement agencies and cooperate with any resulting investigation.

How to Implement and Communicate Your AUP

Writing a great AUP is only half the battle. You also need to ensure that users actually see it, understand it, and can be held to it. Here’s how:

Make It Accessible

  • Dedicated URL: Publish your AUP at a permanent, easy-to-find URL (e.g., yourplatform.com/acceptable-use).
  • Footer link: Include a link to your AUP in your website footer alongside your Terms of Service and Privacy Policy.
  • API documentation: Reference your AUP prominently in your API documentation, ideally in the “Getting Started” section.
  • Developer portal: If you have a developer portal, include the AUP in the onboarding flow.

Obtain Explicit Acceptance

For your AUP to be enforceable, users need to agree to it:

  • Sign-up flow: Require users to accept the AUP (either directly or via your ToS, which incorporates it) during account registration. A checkbox with a link to the document is the standard approach; record each acceptance, as sketched after this list.
  • API key issuance: When users generate an API key, display a reminder that API usage is subject to the AUP.
  • Material changes: When you update the AUP in significant ways, notify existing users via email and require re-acceptance before continued use.
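
To make acceptance provable later, store a record of which AUP version each user agreed to and when. The sketch below is a hypothetical data model, not a prescribed schema; persist the equivalent in whatever database you already use.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class AupAcceptance:
        user_id: str
        aup_version: str   # e.g. the "Last updated" date of the text the user accepted
        accepted_at: datetime
        source: str        # e.g. "signup_checkbox" or "reacceptance_banner"

    def record_acceptance(user_id, aup_version, source):
        """Build the record to persist; storage is left to your own database layer."""
        return AupAcceptance(
            user_id=user_id,
            aup_version=aup_version,
            accepted_at=datetime.now(timezone.utc),
            source=source,
        )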

Keep It Updated

  • Version dating: Always display the “Last updated” date on your AUP.
  • Change notifications: Notify users of material changes at least 30 days before they take effect.
  • Regular review: Review your AUP at least annually, or whenever you launch new features, enter new markets, or become subject to new regulations.
  • Changelog: Consider maintaining a public changelog of AUP revisions for transparency.

Monitor and Enforce Consistently

An AUP that you never enforce is worse than not having one at all, because it creates the impression that the rules don’t apply:

  • Automated monitoring: Implement rate limiting, anomaly detection, and automated flagging to catch violations early (see the sketch after this list).
  • Manual review: Have a human review process for flagged accounts before taking enforcement action.
  • Consistent application: Apply rules consistently across all users. Selective enforcement undermines the legitimacy of your AUP and can create legal exposure.
  • Documentation: Keep records of all violations and enforcement actions. You’ll need this if a terminated user challenges your decision.
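
As a starting point for automated flagging, a rolling-window request counter that escalates to a human reviewer (rather than auto-suspending) keeps enforcement proportionate. The threshold and window below are illustrative assumptions only.

    import time
    from collections import defaultdict, deque

    # Illustrative assumptions: flag any account making over 10,000 requests
    # in a rolling hour for human review; do not auto-suspend on this signal alone.
    WINDOW_SECONDS = 3600
    FLAG_THRESHOLD = 10_000

    _request_log = defaultdict(deque)  # account_id -> timestamps of recent requests
    review_queue = []                  # accounts awaiting manual review

    def record_request(account_id):
        now = time.time()
        timestamps = _request_log[account_id]
        timestamps.append(now)
        # Drop timestamps that have fallen outside the rolling window.
        while timestamps and now - timestamps[0] > WINDOW_SECONDS:
            timestamps.popleft()
        if len(timestamps) > FLAG_THRESHOLD and account_id not in review_queue:
            review_queue.append(account_id)  # escalate to a human reviewer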

Common Mistakes to Avoid

Even well-intentioned AUPs often fail because of these common mistakes:

1. Being Too Vague

Saying “do not misuse the platform” tells users nothing actionable. Specify what constitutes misuse. Instead of “excessive use,” define thresholds. Instead of “inappropriate content,” list the categories of content that are prohibited. Vague policies are both difficult to enforce and easy for bad actors to argue around.

2. Copying a Competitor’s AUP Verbatim

Your competitor’s AUP was written for their platform, their users, their risk profile, and their legal jurisdiction. Copying it wholesale means you’ll likely have provisions that don’t apply to you, miss provisions that do, and potentially inherit errors or outdated clauses. Use competitor AUPs as inspiration, but tailor yours to your specific business.

3. Burying the AUP in Your Terms of Service

If your AUP is paragraph 47 of a 12,000-word Terms of Service document, nobody will read it and you’ll have a harder time arguing that users were aware of the rules. A standalone AUP with a dedicated URL is easier to reference, easier to update, and more enforceable.

4. No Enforcement Procedures

An AUP that lists prohibited activities but says nothing about consequences is a set of suggestions, not a policy. Define what happens when rules are broken — graduated responses, timelines, appeals processes. This protects you legally and demonstrates fairness if you ever need to justify an enforcement action.

5. Forgetting API-Specific Rules

If your platform has an API, a generic AUP written for web application users is insufficient. API users interact with your platform differently — they need rules about rate limits, key management, data caching, and programmatic access patterns. Without these, you’ll struggle to enforce boundaries on automated usage.

6. Ignoring AI and Machine Learning

If your platform incorporates AI features, you need AI-specific provisions. The landscape of AI regulation is evolving rapidly — the EU AI Act, the UK’s AI regulatory framework, and various sector-specific rules all create new obligations. An AUP that doesn’t address AI use is already outdated.

7. Not Updating After Product Changes

Your AUP should evolve with your product. Launch a new feature? Add rules about how it can be used. Expand into a new market? Ensure your AUP covers local regulatory requirements. Discover a new type of abuse? Add it to the prohibited activities list. A stale AUP creates gaps that bad actors will find.

8. Overly Restrictive Policies

While it’s tempting to prohibit everything that could possibly go wrong, an overly restrictive AUP can drive away legitimate users and hinder adoption. Strike a balance: prohibit genuinely harmful activities, but don’t make normal, expected use feel like walking through a legal minefield.

Legal Considerations by Jurisdiction

Your AUP’s enforceability depends partly on where your users are located. Key considerations:

  • UK: AUP terms must be fair and transparent under the Consumer Rights Act 2015 and the Unfair Contract Terms Act 1977. Disproportionate termination provisions may be challenged as unfair terms.
  • EU: The Digital Services Act (DSA) imposes obligations on platforms regarding content moderation, transparency of enforcement actions, and user rights to appeal. Your AUP and enforcement processes must align with DSA requirements if you serve EU users.
  • US: State consumer protection laws, Section 230 of the Communications Decency Act (for user-generated content platforms), and the FTC Act all influence what you can include in your AUP and how you enforce it.
  • GDPR/UK GDPR: If your AUP enforcement involves processing personal data (e.g., monitoring user activity, logging IP addresses), you need a lawful basis under data protection law. Reference your privacy policy and explain the data processing involved in AUP enforcement.

Putting It All Together: Your AUP Checklist

Use this checklist to ensure your Acceptable Use Policy is comprehensive:

  1. Prohibited activities list (illegal activity, IP infringement, harmful content, spam, malware, fraud, harassment)
  2. Usage limits and fair use definitions
  3. Data scraping and extraction rules
  4. Reverse engineering and security testing provisions
  5. Account security requirements
  6. Resale and sublicensing restrictions
  7. API-specific rules (rate limits, key management, data handling, attribution)
  8. AI-specific clauses (prohibited uses, output disclaimers, training data policy)
  9. Enforcement mechanisms (graduated response, immediate suspension rights, appeals)
  10. Liability and indemnification
  11. Implementation plan (accessibility, acceptance, monitoring)
  12. Effective date and change notification process

Create Your AUP Today

Drafting a comprehensive Acceptable Use Policy is essential for any SaaS or API platform, but it doesn’t have to take weeks or cost thousands in legal fees. The key is to be specific, proportionate, and clear — so your users know exactly what’s expected of them and you have the legal basis to act when those expectations are violated.

You have three options for creating your AUP:

  • Hire a solicitor: Most thorough but expensive (£1,500–£5,000+ for a bespoke policy). Recommended for complex platforms with significant regulatory exposure.
  • DIY from a generic template: Free but risky. Generic templates miss platform-specific, API-specific, and AI-specific provisions that SaaS businesses need.
  • Use LegalForge: Generate a customised AUP along with your Terms of Service, Privacy Policy, and Cookie Policy — all tailored to your specific platform, features, and jurisdiction for just £19. Answer a few questions about your business, and receive professionally drafted, compliant legal documents in under 60 seconds.

Don’t wait until a bad actor exploits a gap in your policies. Protect your platform, your users, and your business with a comprehensive Acceptable Use Policy today.

Generate Your Legal Documents in 60 Seconds

LegalForge creates a customised Acceptable Use Policy, Terms of Service, Privacy Policy, and Cookie Policy tailored to your SaaS or API platform — all for a one-time payment.

Generate Your Policy Now — £19