Case Study · March 2026 · 6 min read

$12,000 in 8 hours: what happens when your OpenAI key leaks

In January 2026, a developer shipped a Next.js app built with Cursor that called the OpenAI API directly from the browser. The API key was embedded in the client-side JavaScript bundle. Within 8 hours of deployment, the key had been extracted and used to run up $12,000 in API charges.

How it happened

The developer asked Cursor to "add a chatbot that uses GPT-4o." Cursor generated a React component that called the OpenAI API directly using fetch. The API key was stored in an environment variable prefixed with NEXT_PUBLIC_, which Next.js bundles into client-side code by design.

// This was in the client-side bundle
const response = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${process.env.NEXT_PUBLIC_OPENAI_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ model: 'gpt-4o', messages }),
});

The NEXT_PUBLIC_ prefix is the key detail. Next.js inlines any environment variable starting with this prefix into the browser bundle at build time, by design, which means it is readable by anyone who opens the page's JavaScript. The developer did not realize this distinction.
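To make the distinction concrete, here is a hypothetical `.env.local` for a Next.js project (the variable names are illustrative, not from the incident):

```
# .env.local — illustrative example

# Inlined into the browser bundle at build time: visible to anyone
NEXT_PUBLIC_APP_NAME=my-chatbot

# No NEXT_PUBLIC_ prefix: available only to server-side code
OPENAI_KEY=sk-...
```

Only the first variable ever reaches the browser; the second stays on the server, which is exactly where an API key belongs.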

The timeline

  • Hour 0: App deployed to Vercel
  • Hour 1: Bots scanning JavaScript bundles for API keys found the OpenAI key
  • Hour 2: First unauthorized API calls began, using GPT-4o for bulk content generation
  • Hour 4: Usage spiked to 500+ requests per minute across multiple IP addresses
  • Hour 6: OpenAI sent a billing alert email, but the developer was asleep
  • Hour 8: Developer woke up to $12,000 in charges and immediately revoked the key

Why bots find keys so fast

Automated scanners continuously crawl the web looking for exposed API keys. They check JavaScript bundles, public GitHub repos, and even error messages. Common patterns they search for include:

  • OpenAI keys: sk- prefix
  • Stripe keys: sk_live_ or sk_test_ prefix
  • AWS keys: AKIA prefix
  • Supabase service role keys: eyJ (JWT) in combination with Supabase URLs

These bots are fast, persistent, and operate 24/7. You cannot rely on obscurity.
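To show how cheap this scanning is, here is a minimal sketch of a pattern matcher for the prefixes above. The function name and exact regexes are ours for illustration; real scanners use far broader pattern sets plus entropy checks.

```typescript
// Minimal sketch: match common secret-key shapes in a blob of text,
// e.g. a downloaded JavaScript bundle.
const KEY_PATTERNS: RegExp[] = [
  /sk-[A-Za-z0-9_-]{20,}/g,                               // OpenAI-style keys
  /sk_(live|test)_[A-Za-z0-9]{10,}/g,                     // Stripe keys
  /AKIA[0-9A-Z]{16}/g,                                    // AWS access key IDs
  /eyJ[A-Za-z0-9_-]+\.[A-Za-z0-9_-]+\.[A-Za-z0-9_-]+/g,   // JWTs (e.g. Supabase)
];

function findKeyPatterns(text: string): string[] {
  return KEY_PATTERNS.flatMap((re) => text.match(re) ?? []);
}
```

A scanner only has to fetch your bundle and run a handful of regexes over it, which is why exposed keys are found in hours, not weeks.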

How to prevent this

The fix is architectural: never call third-party APIs directly from the browser. Instead, create a server-side API route that proxies the request:

// app/api/chat/route.ts (server-side, key stays on server)
import { NextResponse } from 'next/server';

export async function POST(request: Request) {
  const { messages } = await request.json();

  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.OPENAI_KEY}`, // No NEXT_PUBLIC_ prefix
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ model: 'gpt-4o', messages }),
  });

  const data = await response.json();
  return NextResponse.json(data);
}

Notice the environment variable is OPENAI_KEY, not NEXT_PUBLIC_OPENAI_KEY. Without the prefix, Next.js keeps it server-side only.
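On the client, the component now talks only to your own route. A minimal sketch (the `/api/chat` path matches the route above; `buildChatRequest` and the `Message` type are names we made up) shows that the browser request carries no secret at all:

```typescript
// Sketch of the client side of the proxy pattern.
// The browser sends messages to our own route; no API key appears anywhere.
type Message = { role: string; content: string };
type ChatRequestInit = {
  method: string;
  headers: Record<string, string>;
  body: string;
};

function buildChatRequest(messages: Message[]): ChatRequestInit {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' }, // no Authorization header
    body: JSON.stringify({ messages }),
  };
}

// Usage in a component:
// const res = await fetch('/api/chat', buildChatRequest(messages));
```

Even if a bot downloads and inspects every byte of this bundle, there is nothing to steal.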

Additional safeguards

  • Set billing limits: OpenAI, Anthropic, and most API providers let you set hard spending caps
  • Use restricted keys: Create API keys with the minimum permissions needed
  • Monitor usage: Set up alerts for unusual spending patterns
  • Scan before shipping: Run a Sekrd scan to detect exposed API keys and secrets in your client-side bundles before they go live
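Even with the key server-side, an unauthenticated proxy route can still be abused to burn your quota. Rate limiting the route is one cheap safeguard; this in-memory fixed-window sketch (names and limits are ours) would sit at the top of the handler:

```typescript
// Minimal in-memory fixed-window rate limiter, keyed by client IP.
// Fine for a single server process; use Redis or similar across instances.
const WINDOW_MS = 60_000;  // 1-minute window (illustrative)
const MAX_REQUESTS = 20;   // per IP per window (illustrative)

const windows = new Map<string, { start: number; count: number }>();

function allowRequest(ip: string, now: number = Date.now()): boolean {
  const w = windows.get(ip);
  if (!w || now - w.start >= WINDOW_MS) {
    windows.set(ip, { start: now, count: 1 }); // new window for this IP
    return true;
  }
  w.count += 1;
  return w.count <= MAX_REQUESTS;
}

// In the route handler:
// if (!allowRequest(ip)) return new Response('Too many requests', { status: 429 });
```

This would not have stopped the key from leaking, but it caps how fast anyone, bot or human, can spend your money through the route.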

Don't ship until you're sekrd

Run a free scan to find the vulnerabilities your AI missed.

Scan Your App Free