# EcoAgents — I built an AI agent that knows your carbon footprint and never lets you forget it
*This is a submission for the Weekend Challenge: Earth Day Edition.*

EcoAgents is a personal AI agent that analyses your carbon footprint, builds a personalised action plan, and remembers you across sessions — so it can follow up, check in, and give advice specific to your situation.

Most carbon footprint tools are one-shot: answer questions, get a number, close the tab, forget everything. EcoAgents is different because it's built around an agent architecture — it has identity, memory, reasoning, and the ability to take action over time.

🔗 Live demo: https://ecoagents.vercel.app/
GitHub: https://github.com/Navin1-11-04/Ecoagents.git

## The problem

Climate action at the individual level suffers from an awareness gap — people don't know their actual footprint, don't know which actions matter most, and have no system to hold them accountable over time. Existing tools either:

- give you a generic number with no follow-up,
- show global averages with no connection to your personal situation, or
- forget you the moment you close the browser.

I wanted to build something that felt more like a personal sustainability coach than a calculator. That's where the "agent" framing came in.

## Architecture

```
User → Auth0 (identity) → Onboarding wizard
         ↓
Gemini 2.5 Flash (analysis)
         ↓
Backboard (memory storage)
         ↓
Dashboard + Chat agent + Weekly email
```

## Tech stack

| Layer | Technology | Why |
| --- | --- | --- |
| Framework | Next.js 16 (App Router) | Server components + streaming |
| Auth | Auth0 for Agents v4 | Agent identity + secure sessions |
| AI | Google Gemini 2.5 Flash | Analysis, chat, email content |
| Memory | Backboard | Persistent agent memory across sessions |
| Email | Resend | Weekly agentic check-ins |
| Charts | Recharts | CO₂ breakdown + global comparison |
| OG images | Next.js `ImageResponse` | Shareable score cards |
| Deploy | Vercel | Edge functions + auto-deploy |
| Dev assist | GitHub Copilot | Used throughout |

## Auth0 for Agents v4

I used Auth0's brand-new v4 SDK, which completely changed the API from v3.
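For context, here is a minimal sketch of the environment variables a v4 project reads. The values are placeholders, and the file layout is just the conventional one, not necessarily this project's:

```shell
# .env.local (placeholder values)
AUTH0_DOMAIN=your-tenant.us.auth0.com    # note: no https:// prefix in v4
AUTH0_CLIENT_ID=your_client_id
AUTH0_CLIENT_SECRET=your_client_secret
AUTH0_SECRET=long_random_value           # e.g. from `openssl rand -hex 32`
APP_BASE_URL=http://localhost:3000       # the v4 name for the app's base URL
```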
Key differences that caught me out:

- Auth routes moved from `/api/auth/*` to `/auth/*`
- `middleware.ts` became `proxy.ts` in Next.js 16
- `AUTH0_BASE_URL` was renamed to `APP_BASE_URL`
- `AUTH0_ISSUER_BASE_URL` was renamed to `AUTH0_DOMAIN` (without the `https://`)

The v4 middleware pattern for route protection:

```ts
// proxy.ts
import { NextRequest, NextResponse } from 'next/server';
import { auth0 } from './lib/auth0'; // the app's Auth0 client instance

export async function proxy(request: NextRequest) {
  const authRes = await auth0.middleware(request);

  // Let auth routes and the public landing page through untouched
  if (request.nextUrl.pathname.startsWith('/auth')) return authRes;
  if (request.nextUrl.pathname === '/') return authRes;

  // Everything else requires a session
  const session = await auth0.getSession(request);
  if (!session) {
    return NextResponse.redirect(new URL('/auth/login', request.url));
  }
  return authRes;
}
```

## Gemini: structured analysis and streaming chat

The core analysis prompt asks Gemini to return a typed JSON object with CO₂ estimates per category, six ranked actions, and a personalised agent message:

````ts
const model = genAI.getGenerativeModel({ model: 'gemini-2.5-flash' });

const prompt = `
Return JSON only (no markdown). Structure:
{
  "totalTonnesCO2PerYear": number,
  "breakdown": { transport, energy, diet, shopping },
  "actions": [{ id, title, description, impact, difficulty, category }],
  "agentMessage": "warm 2-sentence message"
}
`;

// Always strip code fences before parsing — Gemini sometimes wraps
// the JSON in ```json fences even when told not to
const clean = text.replace(/```json|```/g, '').trim();
const analysis = JSON.parse(clean);
````

I also used Gemini for streaming chat responses — the agent's replies appear word by word using `generateContentStream`:

```ts
const result = await model.generateContentStream(prompt);

const stream = new ReadableStream({
  async start(controller) {
    for await (const chunk of result.stream) {
      controller.enqueue(encoder.encode(chunk.text()));
    }
    controller.close();
  },
});

return new Response(stream, {
  headers: { 'Content-Type': 'text/plain; charset=utf-8' },
});
```

And on the client, the chat panel reads the stream chunk by chunk:

```ts
const reader = res.body.getReader();

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  const chunk = decoder.decode(value, { stream: true });

  // Append to the last message progressively
  setMessages(prev => {
    const updated = [...prev];
    updated[updated.length - 1].text += chunk;
    return updated;
  });
}
```

## Backboard: persistent memory

After analysis, I create a per-user Backboard assistant with `memory: 'Auto'` — this means Backboard automatically extracts facts and preferences from messages without me having to manage what gets stored:

```ts
const assistant = await backboardPost('/assistants', {
  name: `EcoAgent_${userId}`,
  system_prompt: 'You are EcoAgent, a personal sustainability coach...',
  llm_provider: 'google',
  llm_model_name: 'gemini-2.5-flash',
  memory: 'Auto',
});

// Store the profile as a memory-enabled message
await backboardPost(`/threads/${thread.thread_id}/messages`, {
  content: `User carbon profile: ${JSON.stringify(analysis)}`,
  memory: 'Auto',
  send_to_llm: false,
});
```

On return visits, the agent has context about what the user previously committed to — enabling genuine continuity.

## Weekly agentic email

The most "agentic" behaviour: a Gemini-generated email personalised to each user's specific uncompleted actions, sent via Resend:

```ts
const prompt = `
Write a 150-word weekly check-in for ${name}.
Their footprint: ${total}t CO2/year
Completed this week: ${completedActions.join(', ') || 'none yet'}
Their next best action: ${nextAction.title} (saves ${nextAction.impact}t)
Be warm, specific, and encouraging.
`;
```

## Shareable score cards

Shareable 1200×630 OG images are generated on the edge — no external dependencies:

```tsx
export const runtime = 'edge';

export async function GET(req: NextRequest) {
  // scoreCardElement is the card's JSX, elided here for brevity
  return new ImageResponse(scoreCardElement, { width: 1200, height: 630 });
}
```

## Challenges

**Auth0 v4 callback URL mismatch** — The SDK now uses `/auth/callback`, not `/api/auth/callback`. I spent 20 minutes debugging a "callback URL mismatch" error that turned out to be a single path-segment difference in the Auth0 dashboard setting.

**Gemini quota exhaustion** — `gemini-2.0-flash` hit the free-tier daily limit immediately (it turns out the model was deprecated in February 2026). Switching to `gemini-2.5-flash` fixed it and improved quality.
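One defensive pattern for quota or deprecation failures like this is to try a list of models in order. This is a sketch under assumptions, not code from the project; `generateWithFallback` and its error matching are hypothetical:

```typescript
// Hypothetical helper (not from the project): try models in order and
// fall back only when a call fails with a quota- or deprecation-style error.
async function generateWithFallback(
  models: string[],
  call: (model: string) => Promise<string>,
): Promise<string> {
  let lastError: unknown = new Error('no models configured');
  for (const model of models) {
    try {
      return await call(model);
    } catch (err) {
      lastError = err;
      const msg = err instanceof Error ? err.message : String(err);
      // Anything other than quota/deprecation failures should surface immediately
      if (!/429|404|quota|deprecated/i.test(msg)) throw err;
    }
  }
  throw lastError;
}
```

In a route handler this could wrap the Gemini call, e.g. `generateWithFallback(['gemini-2.5-flash', 'gemini-2.5-pro'], m => callGemini(m, prompt))` (where `callGemini` is a hypothetical wrapper), so a rate-limited or retired model degrades to an alternative instead of failing the request.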
**Backboard API discovery** — The correct base URL (`https://app.backboard.io/api`) wasn't obvious from the docs. I found it by reading their open-source benchmark repo.

**`window` on the server** — Using `window.innerWidth` inside a React component for responsive layout crashes Next.js SSR. The fix was simple: use Tailwind's `sm:` breakpoint classes instead.

**Gemini streaming in Next.js** — `ReadableStream` works differently than expected with Next.js route handlers. The key was returning a plain `Response` with the stream rather than a `NextResponse`.

## Prize categories

**Best use of Auth0 for Agents** — The v4 SDK with `proxy.ts` middleware for route protection, `auth0.getSession()` for server-component auth checks, and the agent's Auth0 `sub` ID as the persistent user identifier across Backboard memory.

**Best use of Google Gemini** — Used in four distinct places: (1) structured JSON footprint analysis, (2) the streaming chat agent with user-specific context, (3) weekly check-in email content generation, and (4) action-plan ranking and descriptions.

**Best use of Backboard** — A per-user persistent assistant with `memory: 'Auto'` that retains carbon profiles, committed actions, and conversation context across sessions. The agent genuinely remembers returning users.

**Best use of GitHub Copilot** — Used throughout development for TypeScript type inference, Tailwind class suggestions, and boilerplate for the Auth0 v4 migration. Particularly helpful for the Recharts configuration.

## Demo walkthrough

1. **Sign in with Auth0** — takes 5 seconds.
2. **4-step onboarding** — transport, energy, diet, shopping. Animated step transitions, custom sliders with live fill, option cards with context labels.
3. **Gemini analysis** — ~15 seconds. Returns your CO₂ total, category breakdown, six ranked actions, and a personalised agent message.
4. **Dashboard** — donut-chart breakdown, a global comparison bar chart (your score vs the India, world, China, and USA averages), and an action checklist with CO₂ savings per item.
5. **EcoAgent chat** — floating chat panel, streaming responses, pre-seeded prompts. It knows your full profile, so advice is specific.
6. **Share score card** — a 1200×630 OG image generated on the edge, shareable anywhere.
7. **Weekly email check-in** — Gemini writes a personalised email referencing your specific uncompleted actions and sends it via Resend.

## Reflections

The most interesting technical insight was how much the "agent" framing changes the architecture compared to a regular web app. A calculator is stateless — render, compute, display. An agent needs:

- **Identity (Auth0)** — who is this person across sessions?
- **Memory (Backboard)** — what have they told me before?
- **Reasoning (Gemini)** — what should I say, given everything I know?
- **Action (Resend)** — what can I do proactively on their behalf?

Each piece is straightforward. The interesting work is in how they connect.

Earth Day was the right theme for this kind of project. Climate change is one of those problems where individual behaviour genuinely matters at scale — and the gap between knowing you should do something and actually doing it is exactly where a persistent agent can help.

Built over one weekend with Next.js, Auth0, Gemini, Backboard, and too much coffee. Happy Earth Day 🌍
