From a simple JSON formatter to a 400+ tool developer platform - the complete engineering journey, including the decisions that didn't pan out.
I needed a JSON formatter that worked offline. Not another web app that phones home, not another tool that requires a backend - just a simple, fast, client-side utility.
Three months later, that single tool became API Dev Utils: 400+ tools across 10 categories. All running 100% client-side with zero backend dependencies.
This is not a success story post. It is a breakdown of the architecture decisions that worked, the ones that needed rework, and the constraints that ended up being the best forcing functions I had.
The Origin Story
The problem that started it was straightforward. I was working on a REST API and needed to format some JSON responses. The existing online tools either:
- Required internet connectivity (useless on flights)
- Had privacy concerns (sending sensitive data to third parties)
- Were bloated with ads and tracking
- Had terrible mobile experiences
I built a formatter. Then I needed a base64 encoder. Then a UUID generator. At some point I noticed I was reaching for the same small set of utilities repeatedly - and that the pattern was consistent enough to systematize.
That insight is what turned a collection of scripts into a platform: the tools were not isolated utilities. They were a workflow. Developers move data between formats, validate it, encode it, and debug it in sequences. A platform that understands that is more useful than a folder of bookmarks.
Today, API Dev Utils spans 10 categories:
- Convert & Transform (24 tools) - Format conversions, encoding, data transformation
- Generate (32 tools) - UUID generation, data generators, content creation
- Format & Clean (22 tools) - Formatting, validation, code beautification
- Debug & Validate (31 tools) - JSON validation, testing, debugging utilities
- APIs & Networking (27 tools) - cURL conversion, API testing, request building
- Images & Media (40 tools) - Image manipulation, media conversion, optimization
- Documents & Files (27 tools) - PDF handling, document processing, file utilities
- Writing & Content (40 tools) - Text processing, content generation, markdown tools
- Calculators (18 tools) - Math tools, unit converters, calculations
- Utilities (142 tools) - Security, hashing, encoding, and miscellaneous tools
Choosing the Right Tech Stack
Astro was the right call. Here is why I chose it.
Astro: The Content-First Framework
Astro's island architecture was the critical differentiator. Unlike traditional SPA frameworks, it lets me:
- Ship zero JavaScript by default - Only interactive components load JS
- Build static sites - Perfect for a tool-based content model
- Use any UI framework - I stick to vanilla JS for maximum performance
- Optimize automatically - Astro handles code splitting and bundling
// astro.config.mjs
import { defineConfig } from 'astro/config';
import tailwind from '@astrojs/tailwind';

export default defineConfig({
  integrations: [tailwind()],
  output: 'static',
  build: {
    format: 'directory'
  },
  site: 'https://apidevutils.com',
  trailingSlash: 'never'
});
TypeScript: Type Safety at Scale
With 400+ tools, type safety is not optional. TypeScript is what makes the registry system workable - without it, every refactor across tool interfaces becomes a hunt for runtime errors rather than a compiler check. It gives me:
- Bugs caught before deployment
- Consistent interfaces across tools
- Better IDE support and autocomplete
- Confident refactoring when adding new features
Tailwind CSS: Rapid UI Development
Tailwind's utility-first approach is the practical choice for a platform of this size. Design consistency across 400+ tools is not a design problem - it is an engineering problem. Tailwind solves it by making inconsistency structurally harder than consistency:
- Consistent design across 400+ tools without a custom design system
- Responsive layouts built quickly
- Dark mode handled at the framework level
- CSS bundle sizes kept minimal through purging
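Handling dark mode at the framework level comes down to a one-line Tailwind setting. A minimal config sketch — class-based toggling is my assumption here, not something the post confirms:

```javascript
// tailwind.config.mjs (sketch; assumed setup, not the project's actual config)
export default {
  // Toggle dark mode by adding/removing the `dark` class on <html>,
  // rather than following the OS preference automatically
  darkMode: 'class',
  content: ['./src/**/*.{astro,html,js,ts}'],
  theme: { extend: {} },
};
```

With `darkMode: 'class'`, every `dark:` utility in the markup responds to a single class flip, so a theme switcher is a few lines of JavaScript rather than a parallel stylesheet.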
The Zero-Backend Philosophy
Running 100% client-side is not the obvious choice. You give up server-side rendering, dynamic personalization, easy file upload pipelines, and anything that requires auth. The question I had to answer honestly was whether those tradeoffs were worth it for this specific product. They were - but only because of what you gain:
The Case For It
- Zero infrastructure costs - No servers, no databases, no scaling worries
- Instant performance - No network latency for tool operations
- Real privacy - User data never leaves their browser
- Offline capability - Tools work without internet connection
- Effortless scalability - Static sites scale without architectural changes
What I Had to Solve
1. Large File Processing
The naive approach breaks on large JSON files. Processing the parsed data in chunks keeps the main thread responsive:
// Chunked processor: split a large array into batches
// so each batch can be handled between UI updates
function processLargeJSON(data, chunkSize = 1000) {
  const chunks = [];
  for (let i = 0; i < data.length; i += chunkSize) {
    chunks.push(data.slice(i, i + chunkSize));
  }
  return chunks;
}
2. Cross-Tool Data Sharing
LocalStorage-based session management. Simple, reliable, and works across tabs without a server.
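The session layer can be sketched as a thin namespace over localStorage. The storage backend is injectable here so the sketch runs outside a browser; the key prefix and function names are illustrative, not the project's actual ones:

```javascript
// Cross-tool session store: one namespaced key per shared payload.
// `storage` defaults to window.localStorage in the browser; any object
// with getItem/setItem works (e.g. a Map-backed shim in tests).
function createSession(storage = globalThis.localStorage) {
  const key = (name) => `apidevutils:session:${name}`;
  return {
    put(name, value) {
      storage.setItem(key(name), JSON.stringify(value));
    },
    get(name) {
      const raw = storage.getItem(key(name));
      return raw === null ? undefined : JSON.parse(raw);
    },
  };
}
```

One tool calls `put('payload', …)` on completion and the next tool reads it on load. Because localStorage is shared per origin, the same mechanism covers the cross-tab case with no server involved.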
3. Complex Computations
Web Workers for CPU-intensive tasks. The main thread stays responsive while heavy processing runs in the background.
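The main-thread side of that pattern is a Promise wrapper around postMessage. This sketch accepts any Worker-like object (postMessage plus onmessage/onerror hooks) so the wiring is visible without a browser; the real code would pass an actual Worker instance, and the single-round-trip shape is my simplification:

```javascript
// Wrap one request/response round trip with a Worker-like object in a
// Promise, so heavy work reads like an ordinary async function call.
function runInWorker(worker, payload) {
  return new Promise((resolve, reject) => {
    worker.onmessage = (event) => resolve(event.data);
    worker.onerror = (err) => reject(err);
    worker.postMessage(payload);
  });
}
```

In the browser this would be used as `await runInWorker(new Worker('/scripts/workers/hash.js'), input)` (the worker path is hypothetical), keeping the UI thread free while the worker crunches.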
Project Structure That Scales
I got the project structure wrong twice before landing on something that actually scales. The first version was a flat directory. The second added categories but no registry. The third - which is what runs today - treats the registry as the source of truth and derives everything else from it.
apidevutils/
├─ src/
│ ├─ components/ # Reusable components
│ │ ├─ tools/ # Tool-specific components
│ │ ├─ ToolLayout.astro
│ │ ├─ Header.astro
│ │ └─ Footer.astro
│ │
│ ├─ pages/ # Page routes
│ │ ├─ tools/ # Tool pages
│ │ │ ├─ json/
│ │ │ ├─ encoding/
│ │ │ ├─ security/
│ │ │ └─ api/
│ │ └─ index.astro
│ │
│ ├─ layouts/ # Page layouts
│ └─ data/ # Tool metadata
│ └─ tools.ts # Central tool registry
│
├─ public/ # Static assets
└─ tests/ # Test suites
The key structural insight: tools are data, not just pages. Once you treat them that way, navigation, search, sitemaps, and related links all become derived outputs from a single source. Changing that mental model was what made the platform maintainable at 400+ tools.
The Tools Registry System
Without a registry, you end up with 400 tools and no programmatic way to generate navigation, sitemaps, or related links. Every time you add a tool, you update five files manually. That is not a system - it is a liability.
The registry is a single source of truth for all tool metadata:
// src/data/tools.ts
export interface Tool {
title: string;
description: string;
href: string;
icon: string;
category: string;
keywords: string[];
relatedTools: string[];
}
export const tools: Tool[] = [
{
title: 'JSON Formatter',
description: 'Format and beautify JSON data with syntax highlighting',
href: '/tools/json/formatter',
icon: '🎨',
category: 'Format & Clean',
keywords: ['json', 'format', 'beautify', 'pretty'],
relatedTools: ['json-minifier', 'json-validator', 'json-diff']
},
// ... 390+ more tools
];
Everything else is derived from this registry:
- Automatic navigation generation
- Related tool suggestions
- Search functionality
- Sitemap generation
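Navigation generation, for example, is a plain reduce over the registry. A sketch — the grouping function is mine; only the Tool shape comes from the registry above:

```javascript
// Group registry entries into { category: [tools...] } for the sidebar.
// Input is the flat `tools` array from src/data/tools.ts.
function groupByCategory(tools) {
  return tools.reduce((nav, tool) => {
    (nav[tool.category] ??= []).push(tool);
    return nav;
  }, {});
}
```

Sitemaps and related-tool lists fall out the same way: each is a pure function of the registry, computed at build time, so adding a tool is a one-entry change.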
The Tool Layout Pattern
The single most valuable architectural decision was the ToolLayout component. Every tool using the same layout means I can update the sidebar, breadcrumbs, or related tools logic once and it propagates to 400+ pages instantly. Without this pattern, any cross-cutting UI change would require touching hundreds of files individually.
---
import BaseLayout from '../../../layouts/BaseLayout.astro';
import ToolLayout from '../../../components/ToolLayout.astro';
const relatedTools = [
{ title: 'JSON Minifier', href: '/tools/json/minifier' },
{ title: 'JSON Validator', href: '/tools/json/validator' },
{ title: 'JSON to XML', href: '/tools/json/to-xml' }
];
---
<BaseLayout title="JSON Formatter" description="...">
<ToolLayout title="JSON Formatter" relatedTools={relatedTools}>
<textarea id="input" placeholder="Paste your JSON here..."></textarea>
<pre id="output"></pre>
</ToolLayout>
</BaseLayout>
Component Architecture
BaseLayout.astro
Handles global layout, SEO meta tags, and theme system. Centralizing SEO here means changing a meta tag pattern is a one-file change, not a 400-file search-and-replace.
ToolLayout.astro
Standardized tool layout with breadcrumbs, related tools, and consistent styling. This is where the UI consistency guarantee lives - any tool that uses ToolLayout inherits every UX improvement automatically.
CollapsibleSidebar.astro
Advanced navigation with search, categories, and responsive behavior. The sidebar needed to handle 400+ tools without becoming unusable - the collapsible category structure and real-time search are what make it navigable at that scale.
Performance Optimization
Static sites sound simple until you have 400+ pages with different JavaScript bundles. Code splitting stops being a nice-to-have and becomes a structural requirement - without it, every tool page loads JS it does not need.
1. Code Splitting by Tool
Each tool loads only its required JavaScript:
// Map each tool to its own script bundle
const toolScripts = new Map();
toolScripts.set('json-formatter', '/scripts/tools/json-formatter.js');
toolScripts.set('base64-encoder', '/scripts/tools/base64-encoder.js');

// Lazily load only the script the current tool needs
const loadToolScript = (id) => import(toolScripts.get(id));
2. Critical CSS Inlining
Above-the-fold CSS is inlined for instant rendering. On a 3G connection this shaves measurable time from First Contentful Paint.
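In Astro this is a build option rather than a manual extraction step; whether the project relies on exactly this setting is my assumption:

```javascript
// astro.config.mjs (fragment; assumed configuration)
import { defineConfig } from 'astro/config';

export default defineConfig({
  build: {
    // 'auto' inlines stylesheets smaller than Vite's assetsInlineLimit
    // directly into the page, avoiding a render-blocking request
    inlineStylesheets: 'auto',
  },
});
```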
3. Font Optimization
<link rel="preload" href="/fonts/jetbrains-mono-v24.woff2" as="font" type="font/woff2" crossorigin>
4. Image Optimization
- WebP format with fallbacks
- Responsive images with srcset
- Deferred loading for below-fold images
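All three of those bullets map onto a single element. A representative snippet — the file names and sizes are illustrative, not taken from the site:

```html
<picture>
  <!-- WebP for browsers that support it, JPEG fallback otherwise -->
  <source type="image/webp"
          srcset="/img/example-480.webp 480w, /img/example-960.webp 960w">
  <img src="/img/example-960.jpg"
       srcset="/img/example-480.jpg 480w, /img/example-960.jpg 960w"
       sizes="(max-width: 600px) 480px, 960px"
       alt="Tool screenshot"
       loading="lazy">
</picture>
```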
The Navigation System
Navigating 400+ tools requires more than a long list. The navigation system needs to answer two questions fast: "what tools do you have?" and "where is the specific tool I need right now?"
Features
- Real-time Search - Filters 400+ tools instantly
- Category Organization - 10 logical groupings
- Smart Suggestions - Related tools based on usage
- Keyboard Navigation - Full keyboard accessibility
- Mobile Responsive - Collapses to hamburger menu
The Search Algorithm
The search is deliberately simple: title, description, keywords. No fuzzy matching, no Fuse.js. It runs client-side in under 5ms on the full 400-tool dataset because it does not need to be clever - the keyword index does the work:
function searchTools(query) {
const normalized = query.toLowerCase();
return tools.filter(tool =>
tool.title.toLowerCase().includes(normalized) ||
tool.description.toLowerCase().includes(normalized) ||
tool.keywords.some(keyword => keyword.includes(normalized))
);
}
Material Design Implementation
I chose Material Design 3 specifically because accessibility and dark mode came with it. For a platform shipping 400 tool UIs, designing a full component system from scratch would have taken months and introduced inconsistency at every edge. MD3 gave me a proven token system I could implement once and build on:
:root {
--md-sys-color-primary: #6750A4;
--md-sys-color-on-primary: #FFFFFF;
--md-sys-color-secondary: #625B71;
--md-sys-color-surface: #FFFBFE;
}
Testing Strategy
With 400 tools, anything short of end-to-end coverage on the critical paths is theatre. Unit tests catch logic bugs. E2E tests catch integration failures. Visual regression tests catch the kind of layout breakage that only shows up in a browser.
1. Unit Testing with Vitest
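The `formatJSON` under test is not shown in the post; for the assertions to make sense, assume something like this minimal implementation:

```javascript
// Minimal formatter sketch: parse, then re-serialize with 2-space
// indentation. Throws on invalid input, which a validator tool would
// surface separately.
function formatJSON(input) {
  return JSON.stringify(JSON.parse(input), null, 2);
}
```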
describe('JSON Formatter', () => {
it('should format valid JSON', () => {
const input = '{"name":"John","age":30}';
const output = formatJSON(input);
expect(output).toContain(' "name": "John"');
});
});
2. E2E Testing with Playwright
test('JSON formatter works end-to-end', async ({ page }) => {
await page.goto('/tools/json/formatter');
await page.fill('#input', '{"test":"data"}');
await page.click('button:has-text("Format JSON")');
const output = await page.textContent('#output');
expect(output).toContain('"test": "data"');
});
3. Visual Regression Testing
- Automated screenshots for all tools
- Cross-browser testing (Chrome, Firefox, Safari)
- Mobile viewport testing
SEO at Scale
400 tool pages with hand-written meta descriptions would take weeks and drift out of sync immediately. Automated generation from the registry means every new tool gets correct SEO on the first deploy, and any structural SEO change propagates across all tools automatically.
1. Automated Meta Generation
function generateMetaTags(tool: Tool) {
return {
title: `${tool.title} - Free Online Tool | API Dev Utils`,
description: tool.description,
keywords: [...tool.keywords, 'online tool', 'free'].join(', '),
};
}
2. Internal Linking Strategy
- Each tool links to 5 related tools
- Category pages link to all tools in that category
- Homepage links to all categories
- Breadcrumb navigation for context
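Each of those link types is derived from the registry. Related links, for example, resolve `relatedTools` slugs back to full entries. A sketch — the post's Tool interface does not show a slug field, so that part is my assumption:

```javascript
// Resolve a tool's relatedTools slugs to full registry entries.
// Assumes each registry entry carries a unique `slug` (illustrative;
// not shown in the post's Tool interface). Unknown slugs are dropped.
function resolveRelated(tool, tools) {
  const bySlug = new Map(tools.map((t) => [t.slug, t]));
  return tool.relatedTools.map((slug) => bySlug.get(slug)).filter(Boolean);
}
```

Because the links are computed at build time, a renamed or removed tool cannot leave a dangling related link behind.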
Deployment and Scaling
Static sites on Cloudflare Pages are effectively free to operate at any scale: zero infrastructure cost regardless of traffic, where a server-backed alternative would run hundreds of dollars per month. The deployment pipeline is minimal by design:
# Build command
npm run build
# Output directory
dist/
# Automatic deployment on git push
# Cloudflare handles CDN, SSL, and scaling
Core Web Vitals
- First Contentful Paint: <1.2s
- Largest Contentful Paint: <2.5s
- Cumulative Layout Shift: <0.1
- First Input Delay: <100ms
Lessons Learned
1. Add structure one step before you need it
Version 1 had 5 tools and a flat directory. I resisted adding categories until tool 30. That delay made the eventual migration painful. The right moment to add structural complexity is when you can see it becoming necessary - not after it already is.
2. Consistency beats features
The most common feedback from users is not about specific tools. It is that the platform feels coherent. Every tool following the same layout, input/output pattern, and error handling makes the whole thing easier to use than any individual feature would.
3. Performance is structural, not cosmetic
Fast tools keep users coming back. But performance at this scale is not about micro-optimizations - it is about architectural choices made early. Code splitting, registry-driven SEO, critical CSS inlining: these are structural decisions, not polish.
4. Privacy is a real competitive advantage
Being client-side is not just a technical property. It is the reason developers trust the platform with sensitive data. Users will paste API keys, credentials, and production data into a tool they trust not to phone home. That trust is hard to earn and easy to lose.
5. Automate before you feel the pain
With 400+ tools, any manual process in the development or deployment pipeline eventually becomes a bottleneck. Testing, deployment, sitemap generation, and SEO are all automated not because 400 tools demanded it - but because I could see 50 tools making it painful and got ahead of it.
What I Am Building Next
The most interesting engineering problems ahead are not about adding more tools - they are about making the existing ones more useful together. Tool chaining is the hard one: letting users pipe the output of one tool directly into another without copy-pasting. The state management model for that is more complex than anything in the current architecture.
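At its simplest, chaining is function composition over a shared value; the hard parts (typed payloads, error recovery, UI state) sit on top of something like this sketch, where the tool functions are hypothetical stand-ins:

```javascript
// Pipe a value through a sequence of tool functions, left to right.
const chain = (...steps) => (input) =>
  steps.reduce((value, step) => step(value), input);

// Hypothetical tool functions for illustration:
const parseJSON = (s) => JSON.parse(s);
const pluckNames = (arr) => arr.map((item) => item.name);
const toCSV = (arr) => arr.join(',');

const namesAsCSV = chain(parseJSON, pluckNames, toCSV);
```

The open question is not the composition itself but what happens when a middle step fails or produces a type the next step cannot accept - that is where the state management gets harder than anything in the current architecture.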
The VS Code extension is a distribution problem more than a technical one. The tools work fine. The question is how to surface them at the moment a developer needs them inside their editor, without requiring context switching.
AI-assisted discovery is the most experimental. The keyword search works well for users who know what they are looking for. It fails when they do not. Natural language queries - "I have a JWT and I want to decode it" - require a different kind of matching than keyword inclusion.
Conclusion
The zero-backend constraint was not a limitation. It was a forcing function.
Every architectural decision that made the platform good came from defending that constraint: the registry system exists because you need a programmatic way to manage 400 tools without a database. The ToolLayout pattern exists because consistency at scale requires it to be structural, not cultural. The automated SEO pipeline exists because manual meta descriptions do not survive beyond 50 tools.
Constraints are better design drivers than requirements documents. The best architecture decisions I made were responses to specific pressures - not upfront design choices made in the abstract.
Resources & Links
- Live Site: apidevutils.com
- Tech Stack: Astro + TypeScript + Tailwind CSS
- Deployment: Cloudflare Pages
Key Metrics
- 400+ - Total developer tools
- 10 - Tool categories
- 95+ - Lighthouse performance score
- 0 - Backend servers required
Working through the challenges in this post? I help engineering leaders and CTOs navigate complex technical decisions and scale high-performing teams. Schedule a consultation →