60 days ago, the pages driving most of our clients' SEO traffic didn't exist. Today they do — and we're sharing the exact system behind it.
No template spam. No city-name swaps. No thin AI content.
Instead, we built a system where content isn't written page by page — it's built like software.
13,000+ useful pages generated in under 3 hours. And the curve is still going up.
At some point it clicked: this isn't traditional programmatic SEO. This looks like the future of AI-powered programmatic SEO. We're calling it Programmatic SEO 2.0.
Most people hear "programmatic SEO" and think of comparison pages and city-name swaps.
But we went much wider. Our system generated 13,000+ pages across 6 different content categories.
Comparison pages are actually the smallest category (only 1%). Most pSEO practitioners start here — making these "obvious" plays the smallest opportunity.
Resource pages make up the bulk: 34 different content types (idea lists, checklists, calendars, guides, and templates) across 309 niches. That's how one system produces 7,600+ pages.
Free tools are interesting because they're not just text pages — they're actual working tools, each with niche-specific examples and context. All created programmatically.
If you take just one thing from this article, make it this:
For programmatic pages, we never ask AI to write freeform content. We ask it to fill a strict JSON schema.
AI generates the data. The front end handles the presentation. These two layers never mix.
This matters because freeform AI generation breaks at scale: lengths drift, sections mutate, and no two pages come back in quite the same shape.
Schemas solve that — every page follows the exact same structure, which means every page can be validated, rendered, and updated the same way.
When you're generating 13,000 pages, structure isn't a limitation. It's the key.
// Simplified — real schemas are longer
interface ResourceArticle {
  meta: {
    content_type: string;
    niche: string;
  };
  seo: {
    title: string; // templated, not AI-generated
    description: string;
    keywords: string[];
  };
  content: {
    intro: string;
    sections: {
      heading: string;
      items: { // exactly 15-20 per section
        title: string;
        description: string;
        difficulty?: 'beginner' | 'intermediate' | 'advanced';
        potential?: 'high' | 'medium' | 'standard';
      }[];
    }[];
    pro_tips: string[]; // exactly 5
  };
}

That constraint forces consistent, usable output. Without it, you might get 8 items on one page and 40 on the next. With schemas, every page stays structured.
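One practical consequence: because the schema pins down counts, every generation can be machine-checked before it ever becomes a page. A minimal sketch of such a check, assuming the trimmed schema above (the helper name and error format are illustrative, not from a real codebase):

```typescript
// Hypothetical post-generation check: reject any AI output that
// violates the schema's count constraints before a page is built.
interface Section {
  heading: string;
  items: { title: string; description: string }[];
}

interface ResourceContent {
  intro: string;
  sections: Section[];
  pro_tips: string[];
}

function validateContent(content: ResourceContent): string[] {
  const errors: string[] = [];
  for (const section of content.sections) {
    const n = section.items.length;
    if (n < 15 || n > 20) {
      errors.push(`"${section.heading}": expected 15-20 items, got ${n}`);
    }
  }
  if (content.pro_tips.length !== 5) {
    errors.push(`expected exactly 5 pro_tips, got ${content.pro_tips.length}`);
  }
  return errors; // empty array means the output is usable
}
```

A generation that fails a check like this can simply be retried, so nothing malformed reaches the front end.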
Another benefit of this architecture: content and presentation are fully decoupled. That means we can redesign the entire website without regenerating any content. We've already updated page layouts multiple times — not a single content file changed.
The real scale comes from the niche taxonomy.
We built structured context for 309 different niches, each including its audience, pain points, monetization paths, content formats that work, and key subtopics.
This is the most important part of the entire system — and the part most teams would underinvest in.
When the system generates something like "SEO Checklist for Travel Bloggers," it doesn't just swap the word "travel" into a generic checklist. The model receives structured context about that niche:
{
  "slug": "travel",
  "name": "Travel",
  "context": {
    "audience": "Armchair travelers, digital nomads, family vacation planners",
    "pain_points": "Seasonal traffic swings, high competition for destination keywords",
    "monetization": "Affiliate (booking, gear), display ads, sponsored trips",
    "content_that_works": "Itineraries, cost breakdowns, off-the-beaten-path guides",
    "subtopics": ["budget travel", "luxury travel", "adventure travel", "solo travel"]
  }
}

So instead of producing generic output, the system produces niche-specific content.
A health blogger's checklist focuses on E-E-A-T, authority signals, and YMYL compliance. A travel blogger's checklist focuses on seasonal keyword planning and destination competition. Same schema — completely different substance.
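A sketch of how a context object like the one above might be folded into the generation prompt — the prompt wording here is illustrative, not a production prompt:

```typescript
// Illustrative only: injects a niche's structured context into the
// prompt so the model writes for that audience, not a generic one.
interface NicheContext {
  audience: string;
  pain_points: string;
  monetization: string;
  content_that_works: string;
  subtopics: string[];
}

interface Niche {
  slug: string;
  name: string;
  context: NicheContext;
}

function buildPrompt(contentType: string, niche: Niche): string {
  const c = niche.context;
  return [
    `Generate a "${contentType}" for ${niche.name} bloggers.`,
    `Audience: ${c.audience}`,
    `Pain points: ${c.pain_points}`,
    `Monetization: ${c.monetization}`,
    `Formats that work: ${c.content_that_works}`,
    `Cover subtopics where relevant: ${c.subtopics.join(", ")}`,
  ].join("\n");
}
```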
The generation system itself is surprisingly simple.
We used Gemini Flash to produce the content. At this scale, the most important factor isn't peak model quality — it's the cost-to-quality ratio. Gemini Flash supports native structured JSON output, which means the model returns valid JSON directly rather than wrapping responses in text or markdown. This eliminates a whole category of parsing issues.
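With the Gemini REST API, structured output is requested through the generation config. Roughly like this — the field names follow the public API, but the schema here is a trimmed, hypothetical stand-in for a real one:

```typescript
// Sketch of a generateContent request body asking Gemini for raw JSON
// matching a schema. The type values follow the API's Type enum; the
// schema itself is a trimmed, hypothetical example.
const requestBody = {
  contents: [
    { role: "user", parts: [{ text: "Generate an SEO checklist for travel bloggers." }] },
  ],
  generationConfig: {
    responseMimeType: "application/json", // model must return raw JSON, no markdown wrapper
    responseSchema: {
      type: "OBJECT",
      properties: {
        intro: { type: "STRING" },
        pro_tips: { type: "ARRAY", items: { type: "STRING" } },
      },
      required: ["intro", "pro_tips"],
    },
  },
};
```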
The system runs 100 concurrent workers. Most AI APIs can handle far more parallel requests than teams expect. At this level of concurrency, 13,000+ pages generate in under 3 hours. The main bottleneck isn't the model — it's API rate limits.
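A bounded worker pool needs no framework. A minimal sketch, with the real API call replaced by whatever async `worker` you pass in, so the example stays self-contained:

```typescript
// Minimal bounded-concurrency runner: N workers pull tasks from a
// shared queue until it is empty. In production the worker would be
// an API call; here it can be anything async.
async function runPool<T, R>(
  tasks: T[],
  worker: (task: T) => Promise<R>,
  concurrency: number
): Promise<R[]> {
  const results: R[] = new Array(tasks.length);
  let next = 0;
  async function drain(): Promise<void> {
    // Safe without locks: JS is single-threaded, and there is no
    // await between reading and incrementing `next`.
    while (next < tasks.length) {
      const i = next++;
      results[i] = await worker(tasks[i]);
    }
  }
  const workers = Array.from(
    { length: Math.min(concurrency, tasks.length) },
    () => drain()
  );
  await Promise.all(workers);
  return results;
}
```

At 100 workers, 13,000 pages means roughly 130 sequential generations per worker — which fits the sub-3-hour figure if each generation takes on the order of a minute.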
One important detail: titles are not generated by AI. Instead, we build them from deterministic templates — a fixed pattern per content type, filled in with the niche name.
A well-designed template produces better titles than AI. Consistent. Predictable. Optimised for search.
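A deterministic title builder can be this small — the exact pattern below is an assumption for illustration; the real templates vary per content type:

```typescript
// Hypothetical title template: fully deterministic, no model call.
// One fixed pattern per content type keeps titles consistent and
// predictable across thousands of pages.
function buildTitle(count: number, contentType: string, niche: string): string {
  return `${count} ${contentType} for ${niche} Bloggers`;
}
```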
We rolled pages out progressively over several weeks and monitored indexing and traffic as we went. Here's where things landed 60 days later:
Weekly clicks went from 971 → 5,500 in 60 days. With each batch indexed, pages rank for long-tail keywords within days and add incremental traffic.
The immediate reaction is always the same: "Isn't that exactly what Google is cracking down on?"
That would be a fair assumption — but this system works differently from traditional programmatic SEO.
Most pSEO fails because pages are thin — template substitutions serving the same content with different words swapped in. Our pages are structured and functional.
A page like "100 Blog Post Ideas for Finance Bloggers" includes structured sections, filtering by category and difficulty, and copy-to-clipboard functionality. The page can actually be used.
Every content type has its own purpose-built React component with filtering and search, structured tables, proper UX, schema markup, breadcrumbs, and FAQ schema. These aren't markdown pages dumped into a generic template.
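One way to wire that up is a registry mapping each content type to its component. Sketched here with placeholder render functions standing in for real React components, so the example is self-contained:

```typescript
// Sketch: each content type renders through its own purpose-built
// component. The registry keys and component names are hypothetical.
type PageContent = { content_type: string; niche: string };
type Renderer = (content: PageContent) => string;

const registry: Record<string, Renderer> = {
  checklist: (c) => `<ChecklistPage niche="${c.niche}" />`,
  idea_list: (c) => `<IdeaListPage niche="${c.niche}" />`,
  free_tool: (c) => `<ToolPage niche="${c.niche}" />`,
};

function renderPage(content: PageContent): string {
  const render = registry[content.content_type];
  if (!render) throw new Error(`No component for "${content.content_type}"`);
  return render(content);
}
```

New content types plug in by adding one entry to the registry — the generated JSON never needs to know how it will be displayed.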
For every page, we ask one question: would a real person actually use this? For most of these pages, the answer is yes.
Not just text generated at scale. Actually useful pages generated at scale.
We're still early. Only ~50% of the pages are indexed. The niches covered are still relatively broad — there's a whole layer of deeper, more specific content we haven't generated yet. The system is already built to handle that scale with no architectural changes required.
13,000+ pages sounds impressive. But that's not the real advantage. The real gain is in the feedback loop.
Every week we learn which niches perform best, which content types attract traffic, and where the long tail actually lives. This data feeds back into the taxonomy, which improves the next generation run. The system improves as it scales.
AI works best when it operates inside constraints. Not writing freeform content — but filling structured systems designed by humans. AI content should be built, not written.

DigitalWhale builds pSEO 2.0 systems for businesses that want to own their niche. Get a free audit and see what's possible.
Get a free audit →