Vercel offers two function runtimes, and the choice matters for performance, cost, and capability. Most developers reach for Serverless Functions because that's what Next.js API routes use by default — but Edge Functions are often the better choice for AI applications and many API patterns.
## The Core Difference
| | Serverless Functions | Edge Functions |
|---|---|---|
| Runtime | Node.js (full) | V8 isolates (Cloudflare Workers model) |
| Cold start | 100–800ms | 0–5ms |
| Max memory | Up to 3 GB | 128 MB |
| Max duration | 60s (Pro) / 800s (Enterprise) | 30 seconds |
| Available APIs | Full Node.js + npm ecosystem | Web APIs only — no fs, no native modules |
| Streaming | Yes (with workarounds) | Native, first-class |
| Deployment region | Single region (or multi with config) | All edge regions globally |
| Cost | GB-seconds + invocations | Invocations only (much cheaper per call) |
## When to Use Edge Functions

### 1. AI Streaming Chat
Edge Functions were built for streaming. No cold start means the first token reaches the user in milliseconds. Use the Edge runtime for any route that streams an LLM response.
```ts
// app/api/chat/route.ts — Edge runtime
import OpenAI from 'openai';

export const runtime = 'edge';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export async function POST(req: Request) {
  // Native streaming — works perfectly on Edge
  const stream = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: await req.json(), // client posts the messages array directly
    stream: true,
  });
  return new Response(stream.toReadableStream());
}
```

### 2. Auth Middleware
Validating a JWT or session cookie on every request is cheap and fast on the Edge. No cold start means no added latency to your auth check.
```ts
// middleware.ts — runs on Edge automatically
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';

export function middleware(request: NextRequest) {
  const token = request.cookies.get('session')?.value;
  if (!token) {
    return NextResponse.redirect(new URL('/login', request.url));
  }
  // Validate JWT using Web Crypto API (available on Edge)
  return NextResponse.next();
}

export const config = {
  matcher: ['/dashboard/:path*', '/api/protected/:path*'],
};
```

### 3. Geolocation and A/B Testing
Edge Functions have access to the request's geolocation headers. Use them for region-specific content, currency localisation, or A/B test routing — all at zero-cold-start speed.
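As a sketch, an A/B helper might read Vercel's `x-vercel-ip-country` geolocation header. The routing logic here is illustrative (the `eu-checkout` experiment and country list are made up):

```ts
// Illustrative Edge A/B helper. Vercel injects geolocation headers
// such as `x-vercel-ip-country` on every incoming request.
function pickVariant(headers: Headers): 'eu-checkout' | 'default' {
  const country = headers.get('x-vercel-ip-country') ?? 'US';
  // Hypothetical experiment: these EU visitors see a localised checkout flow
  const euPilot = new Set(['DE', 'FR', 'NL', 'ES', 'IT']);
  return euPilot.has(country) ? 'eu-checkout' : 'default';
}
```

Because this runs at the edge in every region, the variant decision adds effectively no latency.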
## When to Use Serverless Functions

### 1. Database Queries with Connection Pooling
Most database drivers use TCP connections, which aren't available in V8 isolates. Use Serverless Functions (Node.js runtime) for direct database access, or use an HTTP-based database client (Neon, PlanetScale, Supabase) on the Edge.
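For the Edge path, queries go over HTTP instead of TCP. A minimal sketch assuming Neon's `@neondatabase/serverless` driver (swap in your provider's equivalent; the table name matches the example below):

```ts
// app/api/documents/route.ts — Edge-compatible: queries run over fetch(), not TCP
import { neon } from '@neondatabase/serverless';

export const runtime = 'edge';

const sql = neon(process.env.DATABASE_URL!);

export async function GET() {
  const docs = await sql`select * from documents`;
  return Response.json(docs);
}
```

For direct access with a standard driver, keep the route on the Node.js runtime: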
```ts
// app/api/documents/route.ts — Node.js runtime (default)
// DO NOT add export const runtime = 'edge' here
import { db } from '@/lib/db'; // Drizzle + standard pg driver
import { documents } from '@/lib/schema'; // table definition (schema path assumed)

export async function GET() {
  const docs = await db.select().from(documents);
  return Response.json(docs);
}
```

### 2. Heavy npm Packages
PDF processing, image manipulation, or any package using native Node.js modules requires the Serverless runtime. Edge Functions only support pure JavaScript/TypeScript.
### 3. Long-Running Operations
If a task might exceed 30 seconds, you need Serverless (up to 60s on Pro) or — better — a background job service like Trigger.dev or Inngest.
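For work that outgrows both limits, enqueue it rather than awaiting it in the route. A hedged sketch assuming Inngest (the app id, event name, and `docId` payload are made up for illustration):

```ts
// lib/inngest.ts — shared client
import { Inngest } from 'inngest';

export const inngest = new Inngest({ id: 'my-app' });

// app/api/documents/route.ts — the route returns immediately;
// an Inngest function subscribed to this event does the slow work
// outside the function duration limits.
export async function POST(req: Request) {
  const { docId } = await req.json();
  await inngest.send({ name: 'doc/process.requested', data: { docId } });
  return Response.json({ queued: true });
}
```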
## The Hybrid Pattern for AI Apps
Most production AI applications use both runtimes:
- Edge: streaming chat, auth middleware, A/B routing, geolocation
- Serverless: database reads/writes, file uploads, heavy processing, webhook handlers
```ts
// Edge: streaming chat endpoint
// app/api/chat/route.ts
export const runtime = 'edge';
```

```ts
// Serverless: save conversation to database
// app/api/conversations/route.ts
// (no runtime export = Node.js serverless)
```

You can mix runtimes across routes in the same Next.js project. Set `export const runtime = 'edge'` only on the specific routes that benefit from it.

| Metadata | Value |
|---|---|
| Title | Vercel Edge Functions vs Serverless Functions: Which to Use and When |
| Tool | Vercel |
| Primary SEO keyword | vercel edge functions vs serverless |
| Secondary keywords | vercel edge runtime, next.js edge runtime, vercel function runtime comparison |
| Estimated read time | 7 minutes |
| Research date | 2026-04-14 |