# I Built an AI Text Summarizer in One Night with Claude + Next.js
It was a Monday. I had an Anthropic API key provisioned, Next.js scaffolded, and one self-imposed deadline: ship a working demo on the public internet before midnight. A few hours later, https://ai-summarizer-next.vercel.app/ went live. You paste a block of text, pick a style (concise, bullet points, executive summary, or ELI5), and Claude Sonnet 4.6 hands back a clean summary in about a second. Here's how I did it, what the code looks like, and a few things that surprised me.

## Why a summarizer?

Every day we skim more words than we read: articles, meeting notes, policy documents, contract clauses, research papers. The wall of text never stops. Existing summarization tools are either buried inside other products (Notion AI, Gmail) or too generic, just "tl;dr" with no control over tone or length. I wanted something trivially fast, with a few quality knobs, and completely free for the user.

## The stack

- **Next.js 14 (App Router)**: one framework for UI and API in one repo
- **TypeScript**: catches the silly mistakes I always make at 10 PM
- **Tailwind CSS**: styling without leaving the TSX file
- **Anthropic SDK**: Claude Sonnet 4.6 as the brain
- **Vercel**: one `git push` and the world has access

Total cost: $0. Anthropic gives generous free credits to new accounts, Vercel's hobby tier is free, and everything else is open source.

## The prompt

The entire "AI" logic is one parameterised system prompt:

```
You are a summarization expert. Summarize in a ${style} style. Return ONLY the summary.
```

That's it. I spent more time tuning the UI than the prompt. Claude is smart enough that you don't need to bludgeon it with instructions; you need to be specific about what you want and quiet about everything else. The line `Return ONLY the summary` is what keeps the model from prefacing every response with "Here's a summary:".
## The backend: one route, one call

`app/api/summarize/route.ts`:

```typescript
import Anthropic from '@anthropic-ai/sdk';
import { NextRequest, NextResponse } from 'next/server';

const client = new Anthropic();

export async function POST(req: NextRequest) {
  try {
    const { text, style } = await req.json();
    const msg = await client.messages.create({
      model: 'claude-sonnet-4-6',
      max_tokens: 1024,
      system: `You are a summarization expert. Summarize in a ${style || 'concise'} style. Return ONLY the summary.`,
      messages: [{ role: 'user', content: text }],
    });
    return NextResponse.json({ summary: (msg.content[0] as any).text });
  } catch (err: any) {
    return NextResponse.json({ error: err.message }, { status: 500 });
  }
}
```

Thirty lines. One Claude call. That's the whole backend.

## The frontend

`app/page.tsx` is a textarea, a dropdown, and a button. A feature doesn't have to look complicated to be useful:

```tsx
'use client';
import { useState } from 'react';

export default function Home() {
  const [text, setText] = useState('');
  const [style, setStyle] = useState('concise');
  const [summary, setSummary] = useState('');
  const [loading, setLoading] = useState(false);

  async function summarize() {
    setLoading(true);
    const r = await fetch('/api/summarize', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ text, style }),
    });
    const d = await r.json();
    setSummary(d.summary || d.error);
    setLoading(false);
  }

  return (
    <main>
      <h1>AI Summarizer</h1>
      <p>Powered by Claude Sonnet 4.6</p>
      <textarea
        value={text}
        onChange={(e) => setText(e.target.value)}
        placeholder="Paste text to summarize..."
      />
      <select value={style} onChange={(e) => setStyle(e.target.value)}>
        <option value="concise">Concise</option>
        <option value="bullet points">Bullet Points</option>
        <option value="executive summary">Executive Summary</option>
        <option value="ELI5">ELI5</option>
      </select>
      <button onClick={summarize} disabled={loading}>
        {loading ? 'Summarizing...' : 'Summarize'}
      </button>
      {summary && <div>{summary}</div>}
    </main>
  );
}
```

## Deploying

Three shell commands and one web click:

```bash
git init && git add . && git commit -m "AI Summarizer on Next.js"
git remote add origin https://github.com/YOUR_USERNAME/ai-summarizer-next.git
git push -u origin main
```

Then on vercel.com/new:

1. Import the repo
2. Add the environment variable `ANTHROPIC_API_KEY` with your key
3. Click **Deploy**

About 60 seconds later the URL is live.
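One rough edge in the route above: `(msg.content[0] as any).text` casts away type safety. The SDK returns content as a list of blocks where text blocks have the shape `{ type: 'text', text: string }`, so a type-narrowing extractor avoids the cast. A sketch under that assumption; `extractText` is my name, not an SDK function:

```typescript
// Text blocks in a Messages API response look like { type: 'text', text: string };
// other block types (e.g. tool use) carry different fields. extractText is a
// hypothetical helper that narrows on the discriminant instead of casting.
type ContentBlock =
  | { type: 'text'; text: string }
  | { type: string; [key: string]: unknown };

export function extractText(blocks: ContentBlock[]): string {
  return blocks
    .filter((b): b is { type: 'text'; text: string } => b.type === 'text')
    .map((b) => b.text)
    .join('\n');
}
```

The route's response line then becomes `NextResponse.json({ summary: extractText(msg.content) })`, with no `as any` in sight.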
## The one gotcha

The single most important thing to get right: the environment variable must be set **before** you click Deploy. If you forget, the build succeeds but every request hits a 500. Add the env var first.

## What surprised me

- **How boring the code got.** I expected LLM-powered apps to need careful error handling, streaming, and complicated state, but Claude is quick enough on a 1k-token summary that a plain request/response flow feels instant.
- **How clean the Anthropic SDK is.** `new Anthropic()` just works if the env var is named `ANTHROPIC_API_KEY`. No config object, no initialisation dance.
- **How much the UI matters.** I spent more time on textarea padding and the button's disabled state than on the API integration. That felt wrong, but users judge trust from design, not from clever prompts.

## Try it

- Live demo: https://ai-summarizer-next.vercel.app/
- GitHub: https://github.com/yramstech/ai-summarizer-next/

If you've been putting off your first Claude API project: it's genuinely a one-sitting, weeknight build. Pick a narrow problem, resist the urge to add features, and hit deploy before midnight.
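A postscript on the env-var gotcha: a fail-fast guard turns the silent-500 failure mode into an error you can't miss. A minimal sketch; `requireEnv` is a hypothetical helper, not part of the project or the SDK:

```typescript
// Hypothetical guard (not in the repo): call requireEnv('ANTHROPIC_API_KEY')
// at the top of the route file so a missing key fails loudly with a clear
// message instead of every request returning an opaque 500.
export function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```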
