
I Built an App Without a Backend — Just Edge Functions


I was building BibleFlix — a Bible reading app with AI-powered study. The app is React Native with Expo, works offline for normal reading, and when a user requests a deep study of a passage, it calls Gemini via OpenRouter to generate a full theological analysis.

The problem showed up fast: I needed an API key to call OpenRouter. And that key was hardcoded in the app source code.

// AIService.ts — what should NEVER exist in production
function resolveApiKey(): string {
  return (
    fromExpo ||
    fromNode ||
    'sk-or-v1-51219c47c368090c2f7...'  // this right here
  );
}

Anyone with the APK can extract that string in 30 seconds with strings bibleflix.apk | grep sk-or. Worse: if someone uses that key, I pay the bill.

The backend dev reflex: "I need a server"

My first instinct was the usual one — spin up an API. Laravel, Express, FastAPI, whatever. A /api/study endpoint that receives the request, calls OpenRouter with the key stored in .env, and returns the result.

Does it work? Sure. But for an app that needs a single endpoint, that means:

  • A server running 24/7 (cost)
  • Nginx + PHP-FPM or Node.js (maintenance)
  • SSL, firewall, security updates (responsibility)
  • Monitoring, logs, backups (ops overhead)

I already manage 5 projects in production on a single server. I didn't need one more.

The insight: I don't need a backend — I need a proxy

What the app actually needs is simple:

  1. Receive a request from the app
  2. Attach the API key (which the app must never know)
  3. Forward to OpenRouter
  4. Return the response
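
The four steps above fit in a handful of lines. Here's a minimal sketch (the function name and injectable fetcher are illustrative; a real worker adds CORS and method checks):

```typescript
// Minimal proxy sketch of the four steps. The injectable fetcher lets
// you test the logic without hitting the network.
type Fetcher = (url: string, init?: RequestInit) => Promise<Response>;

const UPSTREAM = "https://openrouter.ai/api/v1/chat/completions";

async function proxyStudyRequest(
  body: string,              // 1. the request body received from the app
  apiKey: string,            // 2. the key the app must never know
  doFetch: Fetcher = fetch,
): Promise<Response> {
  return doFetch(UPSTREAM, { // 3. forward to OpenRouter
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body,
  });                        // 4. return the response as-is
}
```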

That's a proxy. And for a proxy, there's a perfect solution: edge functions.

Cloudflare Workers: the 3-file backend

A Cloudflare Worker is a JavaScript function that runs on Cloudflare's edge — across 300+ datacenters worldwide. A request from a user in São Paulo gets processed in the São Paulo datacenter. Minimal latency.

The structure is dead simple:

bibleflix-worker/
├── wrangler.toml
├── src/
│   └── index.ts
└── package.json

The wrangler.toml configures the project:

name = "bibleflix-ai-proxy"
main = "src/index.ts"
compatibility_date = "2024-12-01"

[vars]
ALLOWED_ORIGIN = "https://bibleflix.app"

And the worker itself is under 50 lines:

// src/index.ts
interface Env {
  ALLOWED_ORIGIN: string;
  OPENROUTER_API_KEY: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // CORS
    if (request.method === "OPTIONS") {
      return new Response(null, {
        headers: {
          "Access-Control-Allow-Origin": env.ALLOWED_ORIGIN,
          "Access-Control-Allow-Methods": "POST",
          "Access-Control-Allow-Headers": "Content-Type",
        },
      });
    }

    if (request.method !== "POST") {
      return new Response("Method not allowed", { status: 405 });
    }

    const body = await request.json();

    // Forward to OpenRouter with the secret key
    const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Authorization": `Bearer ${env.OPENROUTER_API_KEY}`,
        "HTTP-Referer": "https://bibleflix.app",
        "X-Title": "BibleFlix",
      },
      body: JSON.stringify(body),
    });

    return new Response(response.body, {
      status: response.status,
      headers: {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": env.ALLOWED_ORIGIN,
      },
    });
  },
};
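
A side benefit of the fetch-handler shape: you can unit-test it locally without deploying, by constructing a Request and calling the handler directly. Here's a trimmed, self-contained copy of the CORS and method branches (the forwarding step is stubbed out, so this sketch runs anywhere with the Fetch API):

```typescript
// Trimmed copy of the handler's CORS and method branches, callable
// directly in tests. The forwarding logic is replaced by a stub.
interface Env {
  ALLOWED_ORIGIN: string;
}

async function handleRequest(request: Request, env: Env): Promise<Response> {
  if (request.method === "OPTIONS") {
    // CORS preflight: allow only the configured origin
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": env.ALLOWED_ORIGIN,
        "Access-Control-Allow-Methods": "POST",
        "Access-Control-Allow-Headers": "Content-Type",
      },
    });
  }
  if (request.method !== "POST") {
    return new Response("Method not allowed", { status: 405 });
  }
  return new Response("ok"); // stub for the forwarding step
}
```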

The API key is stored as a secret in Cloudflare:

npx wrangler secret put OPENROUTER_API_KEY
# paste the key; it's encrypted and never appears in logs

Deploy:

npx wrangler deploy
# Done. URL: https://bibleflix-ai-proxy.billy.workers.dev

On the app side, the change is surgical — swap the URL and remove the key:

// Before (insecure)
const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${API_KEY}`,  // key exposed
  },
  body: JSON.stringify(payload),
});

// After (secure)
const response = await fetch("https://bibleflix-ai-proxy.billy.workers.dev", {
  method: "POST",
  headers: { "Content-Type": "application/json" },  // no key, no auth header
  body: JSON.stringify(payload),
});
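
In practice I'd wrap that call in a small helper that surfaces proxy errors (a sketch; requestStudy, the model slug, and the injectable fetcher are illustrative, not BibleFlix's actual code):

```typescript
// Hypothetical app-side helper. Since the Worker passes OpenRouter's
// status code through, one check catches failures on either side.
const PROXY_URL = "https://bibleflix-ai-proxy.billy.workers.dev";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function requestStudy(
  messages: ChatMessage[],
  doFetch: typeof fetch = fetch,
): Promise<string> {
  const response = await doFetch(PROXY_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "google/gemini-flash-1.5", messages }),
  });
  if (!response.ok) {
    throw new Error(`Study request failed: HTTP ${response.status}`);
  }
  const data: any = await response.json();
  return data.choices[0].message.content;
}
```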

The app never sees the API key again. The Worker is the only one that knows it.

Why Cloudflare Workers and not something else?

I researched the alternatives. Here's what I found:

| Platform | Free Tier | Latency | Secrets | Best for |
|---|---|---|---|---|
| Cloudflare Workers | 100k req/day | ~1ms (edge) | Encrypted | Proxy, lightweight API |
| Vercel Functions | 100 GB-hrs/mo | ~50ms (serverless) | Env vars | Next.js apps |
| AWS Lambda | 1M req/mo | ~100ms (cold start) | Env vars / Secrets Manager | Full backend |
| Supabase Edge | 500k req/mo | ~5ms (Deno) | Vault | Apps with Supabase DB |
| Deno Deploy | 1M req/mo | ~5ms | Env vars | Deno/TS APIs |
| Firebase Functions | 125k req/mo | ~200ms (cold start) | Firebase config | Google ecosystem |
| Netlify Functions | 125k req/mo | ~50ms | Env vars | Netlify sites |

Cloudflare won for three reasons:

  1. 100,000 requests per day for free. BibleFlix won't hit that anytime soon.
  2. Zero cold start. Workers run on V8 isolates, not containers. There's no 200ms+ delay on the first call.
  3. I already use Cloudflare. DNS, R2, and the domain are already there. One wrangler deploy and done.

When this pattern works (and when it doesn't)

This "app + edge function as proxy" pattern works perfectly when:

  • Your app needs to call external APIs with secret keys
  • You don't need your own database (or use something like Supabase/Firebase)
  • Your request volume fits the free tier
  • You want minimal latency without managing a server

It doesn't work when:

  • You need complex business logic (authentication, queues, cron jobs)
  • You need a relational database with elaborate queries
  • You have background workers or long-running processes
  • You need a full ecosystem (auth + DB + storage + functions)

For that last case, Firebase or Supabase make more sense as a complete BaaS. But if all you need is a secure proxy between your app and an AI API, a 50-line Worker does the job.

What I learned

The backend dev tendency is to solve everything with a server. It's what we know. But for many apps — especially ones that just consume AI APIs — you're paying for hosting, configuring Nginx, and maintaining SSL for something a serverless function does better, faster, and for free.

BibleFlix has zero backend. Bible data is local JSON bundled with the app. AI-powered study goes through a 50-line Worker. The entire app runs at zero infrastructure cost.

If you're building something that calls AI APIs and you're thinking about spinning up a server just for that, stop. Take a look at Cloudflare Workers. In 10 minutes you'll have a secure proxy running.

