24K visitors, 430K requests, $0 spent. I built a live election results dashboard in 2 days with Claude.

DEV Community
Karthikeyan Gopal

On May 4, 2026, while Tamil Nadu's election results were being counted, a dashboard I built from scratch was serving live data to thousands of people across 24 countries. By end of day:

- 24k+ visits from India, the US, the UK, Germany, Singapore, the UAE, and 18 more countries
- 430k+ requests handled
- 8.7 GB of bandwidth served
- 60+ commits pushed on counting day alone
- Total infrastructure cost: $0

No server. No database. No paid tier of anything.

The site is still live if you want to explore while you read: tinyurl.com/tn-2026

This is the story of how I built it, the architectural decisions that made it possible, and what I learned about using AI as a real engineering partner.

When Tamil Nadu's assembly election date was announced, I saw an opportunity. Election result sites are universally terrible: cluttered with ads, slow to load, impossible to search, and never mobile-friendly. I wanted to build something better. Something clean, fast, and actually useful.

But I also had a second goal: I wanted to test what it actually feels like to build a production system end to end with AI as a coding partner. Not toy projects. Not "make me a to-do app." A real product, with real users, under real load, and a hard deadline I couldn't move.

Counting day was May 4. I started building on May 2. Two days. No extensions.

Before writing a single line of code, I set three rules:

1. Zero cost. No paid hosting, no domain purchase. If it can't run on free tiers, find a different approach.
2. Must handle thousands of concurrent users. Election results attract massive spikes. If it falls over at peak, the whole thing is pointless.
3. Ship in 2 days. Not "MVP in 2 days, polish later." Live, production-ready, serving real data in 2 days.

These constraints sound impossible together. Free tier + thousands of users + 2 days? But constraints are where good architecture comes from.

What I ended up with is a single-page dashboard that gives you everything: no clicking around, no page loads.
- A real-time alliance tracker with a stacked seat bar and a majority line at 118, so you watch the race unfold live.
- An interactive Tamil Nadu map with all 234 constituencies color-coded by leading party. Click any one for a full candidate-wise vote breakdown, margins, and round-by-round progress.
- Leader cards for the four key faces (Stalin, Vijay, EPS, and Seeman), each showing their personal constituency result alongside their alliance's overall tally.
- A Key Races section that surfaces the closest contests automatically, sorted by margin. Tiruppattur sat at #1 all day, its margin bouncing between 0 and 5. It ended at 1. One vote.
- A full constituency table with search, sort, and filters by district, party, and status. Type any candidate's name and find them instantly.
- A counting progress widget showing rounds completed, votes counted in lakhs and crores, and an estimate of what's left.
- Dark theme. No ads. No login. Mobile-first, with a dedicated Leaders tab. Auto-refreshes silently every 30 seconds.

Vanilla JS, 242 ms page load. Chart.js for charts, Leaflet for the map, Vite for the build. No React. No Next.js. No framework overhead.

All of this in 2 days. Now let me tell you how. Here's the entire system:

    [My Laptop] → Python scraper (every 2 min) → POST → [Cloudflare Worker]
                                                              ↓
                                                        [Workers KV]
                                                              ↓
                                                     [Edge Cache (120s)]
                                                              ↓
                                                  [24k+ visitors worldwide]

That's it. Five components. Let me explain why each choice was deliberate.

ECI doesn't have an API. Each constituency's results are a separate HTML page. 234 constituencies, 234 individual web pages. To get a complete picture, I need to hit all 234, parse the HTML tables, extract candidate-wise votes, run the computations, and stitch everything into one JSON for the page to render.

The obvious move: deploy this to AWS Lambda or a cron job on a VPS. But that costs money, and then I'm monitoring infrastructure instead of building features.

So I ran it on my laptop.
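In sketch form, that laptop job is just a thread pool plus a pure stitching step. This is a hypothetical skeleton, not the real script: the ECI HTML parsing is omitted, and `fetch_one` is injected so the fetching strategy (and any test stub) can vary.

```python
from concurrent.futures import ThreadPoolExecutor

def scrape_all(codes, fetch_one, workers=32):
    """Fetch every constituency page in parallel.

    fetch_one(code) -> dict is injected: the real job would wrap an
    HTTP GET plus HTML-table parsing; a test can pass a stub instead.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch_one, codes))

def stitch(per_seat):
    """Collapse per-constituency candidate lists into one payload:
    the number of seats each party currently leads (or has won)."""
    leading = {}
    for seat in per_seat:
        top = max(seat["candidates"], key=lambda c: c["votes"])
        leading[top["party"]] = leading.get(top["party"], 0) + 1
    return {"total_seats": len(per_seat), "leading": leading}
```

The real script would then POST the stitched JSON to the Worker endpoint every two minutes.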
A Python script scrapes all 234 pages in parallel every 2 minutes, computes everything, and POSTs the result to my Cloudflare Worker. Total writes for the entire election day: 997. Fewer than a thousand writes to serve 24k+ visitors.

KV is a key-value store that replicates to Cloudflare's 300+ edge locations globally. When someone in Chennai, Singapore, or London hits my API, they're reading from a datacenter close to them, not waiting on a round trip to a single origin. The free tier gives you 100,000 reads/day and 1,000 writes/day. I used 89,830 reads and 997 writes. Three writes to spare. I ran an entire election night within the free tier limits.

Election data doesn't need to be real-time to feel real-time. If results update every 2 minutes from my scraper, and I cache the response at the CDN edge for 120 seconds, the worst case is someone seeing data that's a couple of minutes old. For an election where counting takes 8 hours, nobody notices. It feels instant.

This one decision meant that 24k+ visitors hitting the same URL every 30 seconds translated to roughly one KV read every 2 minutes per edge location. Instead of millions of reads, I used 89K. The CDN absorbed the thundering herd.

WebSockets seem perfect for "live" data, but they're terrible on a free tier:

- Each connection holds a resource on the server.
- Responses can't be edge-cached.
- Reconnection logic is complex.

Instead, the browser fetches /api/results every 30 seconds with a simple setInterval. Every request is a normal HTTP GET that hits the CDN edge cache. The server doesn't know or care how many users are connected. The server also sends an X-Poll-Interval header telling the client how often to refresh: 30 seconds during active counting, and once everything is declared, stop polling entirely. Adaptive polling, zero configuration.

Once all 234 seats were declared, I embedded the final JSON directly into the Worker code. Zero KV reads. Zero compute. The site now runs forever at zero cost. It will stay live until the internet shuts down.
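The free-tier arithmetic above is worth making explicit. A back-of-envelope sketch, with my own illustrative constants (worst case: every serving edge location refills its cache at every TTL expiry, all day):

```python
SECONDS_PER_DAY = 24 * 60 * 60
EDGE_TTL = 120   # seconds a response lives in the edge cache

# With a 120 s TTL, one edge location needs at most one KV read per
# TTL window, no matter how many visitors it serves from cache.
max_reads_per_location = SECONDS_PER_DAY // EDGE_TTL   # 720 per day

# The observed 89,830 reads therefore imply at least ~125 edge
# locations were actively serving traffic (each capped at 720/day).
implied_locations = 89_830 / max_reads_per_location
```

Writes, meanwhile, scale with the scrape interval rather than with traffic, which is why 997 writes could serve 24k+ visitors.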
One more thing I didn't pay for: the URL. Cloudflare Pages gives you a free *.pages.dev subdomain, and I pointed a free TinyURL shortlink at it. A custom .in or .com domain would have cost money and needed DNS setup. The pages.dev URL worked perfectly. Sometimes the free option is the right option.

I used Claude as my partner for the entire project. Here's what that looked like in practice. Claude:

- Wrote the Python scraper (parallel ECI HTML parsing, candidate extraction, alliance computation)
- Built the entire frontend (vanilla JS, Chart.js, Leaflet maps, responsive CSS)
- Created the Cloudflare Worker API with its caching logic
- Generated mock data for testing before election day
- Iterated on UI changes in minutes

What stayed with me:

- Every architectural decision: scraper-on-laptop vs. cloud, KV vs. database, the edge caching strategy, polling vs. WebSockets, the static switch when counting ends.
- Every product decision: what features matter, what to skip, when "good enough" ships.
- Every production judgment call: Is this safe to deploy to thousands of concurrent users right now? Will this caching change cause stale data? Should I test this first or just ship it?
- Real-time debugging under load. Bugs surfaced live with thousands watching; I triaged, Claude fixed, we deployed in minutes. More on this below.

On election day alone, I pushed 60+ commits. Here's a sample of the timeline; this is what it actually looked like.

1:22 AM — Someone said the close races section was hard to scan. Ten minutes later, I shipped a full sortable Key Races table: the top 50 closest contests, clickable rows, a party filter, alliance color-coded borders. Not a tweak. A brand-new tab.

1:37 AM — "How many votes are even counted so far?" Fair question. Shipped a counting progress widget: rounds completed, total votes counted in lakhs and crores, a percentage bar showing how much is left. Five minutes, idea to production.

8:14 AM — The vote percentage was showing over 100%. Postal votes. ECI counts them separately, and my denominator didn't include them.
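In code, the fix amounts to widening the denominator. A minimal sketch with the day's round figures (4.88 crore EVM votes, 5 lakh postal ballots); the function name and flag are illustrative, not the real code:

```python
LAKH = 10**5
CRORE = 10**7

EVM_TOTAL = 4.88 * CRORE   # votes polled on EVMs
POSTAL_TOTAL = 5 * LAKH    # postal ballots, counted separately by ECI

def pct_counted(votes_counted, include_postal=True):
    """Share of votes counted so far. The bug: votes_counted included
    postal ballots while the denominator did not, so the figure could
    pass 100% once postal counting finished."""
    denominator = EVM_TOTAL + (POSTAL_TOTAL if include_postal else 0)
    return 100 * votes_counted / denominator
```

With everything counted, the buggy version reports roughly 101%; the fixed one tops out at 100%.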
Users spotted it. I fixed the math to account for 5 lakh postal ballots on top of 4.88 crore EVM votes, and deployed before the next auto-refresh.

8:42 AM — "Can I see only the seats still being counted?" Shipped a Declared/Counting status filter on the Key Races table. The person who asked saw it live within 3 minutes of their message.

And many more features, all shipped while the site was live and people were actively using it. No staging environment. No PR review. No deployment pipeline. Just me telling Claude exactly what to build, verifying it made sense, and pushing.

This is the part AI skeptics miss. The speed didn't come from Claude writing code fast. It came from me knowing instantly what the user actually needs (not what they asked for), how it fits into the existing architecture, what could break, and whether it's safe to ship without tests to thousands of concurrent users. That's not a prompt. That's a decade of engineering judgment running on instinct.

1. Constraints breed creativity. The $0 budget forced me into an architecture that was actually better than what I'd have built with unlimited resources. No server to maintain. No database to scale. No bills to pay. Ever.

2. AI is powerful, but engineering judgment is the real multiplier. Claude can write any code you ask for, but it can't tell you what code to write. It can't tell you that WebSockets are overkill here, that 120 seconds of cache staleness is acceptable for election data, or that a Python script on your laptop is better than a Lambda function. Those calls come from experience.

3. Ship early, fix live. Mock data on day one. Real scraper on day two. Continuous features on election day. Never "done." Always shipping.

4. People share useful things. I never paid for promotion. I shared the link on WhatsApp, Telegram, Reddit, one Slack group, and LinkedIn. Users shared it with their friends and families. 24k+ visits from organic sharing alone.

5. Free tiers are production-ready. Cloudflare's free tier served 8.7 GB to 24 countries. The infrastructure didn't blink. If you're waiting to have a budget before building, you're waiting for nothing.

The site is still live with all final results: tinyurl.com/tn-2026

Full source code: github.com/csekeyan/tn-elections-2026

What would you build differently? Drop a comment.

Built by Karthikeyan Gopal, Sr. SDE at Amazon. If you're building with AI and want to exchange ideas, connect on LinkedIn.