AI & Automation

How I Used NLP AI Tools to Build 20,000+ SEO Pages (And Why Keyword Research is Dead)


Personas: SaaS & Startup

Time to ROI: Medium-term (3-6 months)

OK, so here's something that might shock you: I haven't used traditional keyword research tools like Ahrefs or SEMrush for content creation in over 6 months. Instead, I've been using NLP AI tools for semantic keyword enrichment, and the results speak for themselves.

Last year, I worked with a B2C Shopify client facing a massive challenge: over 3,000 products across 8 languages with virtually no SEO foundation. We went from fewer than 500 monthly visitors to over 5,000 in just 3 months using an AI-native approach that completely changed how I think about keyword optimization.

The problem with traditional keyword research? It's backwards. Most SEO professionals are still obsessing over exact-match keywords and search volumes that are usually wrong anyway. Meanwhile, search engines have evolved far beyond simple keyword matching to understand semantic relationships, user intent, and contextual meaning.

Here's what you'll learn from my experience:

  • Why traditional keyword tools are becoming obsolete for content creation

  • How NLP AI tools can identify semantic relationships traditional tools miss

  • The exact workflow I used to generate 20,000+ indexed pages using semantic enrichment

  • Real metrics from implementing this approach across multiple client projects

  • When this approach works (and when to stick with traditional methods)

This isn't about replacing human expertise - it's about amplifying it with AI that understands language the way search engines actually do in 2025. Let me show you exactly how I did it.

Industry Reality

What every SEO expert is still doing wrong

Walk into any SEO agency today, and you'll see the same workflow that's been around for a decade. Start with Ahrefs or SEMrush, export thousands of keywords, filter by search volume and difficulty, then create content around exact-match phrases. It's mechanical, predictable, and increasingly ineffective.

The industry loves this approach because it feels scientific. You can show clients spreadsheets with search volumes, create content calendars based on keyword difficulty scores, and track rankings for specific terms. It looks professional, measurable, and strategic.

Here's the conventional wisdom most SEO professionals still follow:

  1. Start with seed keywords - Begin with broad terms related to your niche

  2. Use keyword research tools - Export hundreds of related keywords with search volumes

  3. Filter by metrics - Focus on high volume, low competition keywords

  4. Create one-to-one content - Write one piece of content per target keyword

  5. Track exact-match rankings - Monitor how each page ranks for its target keyword

This approach exists because it mirrors how search engines worked 10 years ago, when exact keyword matching was the primary ranking factor. Tools like Ahrefs and SEMrush built their entire business models around this methodology, and most SEO education still teaches it as gospel.

But here's where it falls short: search engines don't work this way anymore. Google's algorithms understand semantic relationships, user intent, and contextual meaning. A page optimized for "email marketing software" can rank for "automated email platforms," "newsletter tools," and dozens of related terms without ever mentioning them explicitly.
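To make "semantic relationships" concrete, here's a minimal sketch (not my client tooling) of how an off-the-shelf embedding model scores those phrases as related. It assumes the sentence-transformers package and its all-MiniLM-L6-v2 model; the phrases come from the example above.

```python
# Minimal sketch: score how semantically close phrases are, the way
# modern retrieval systems do. Assumes sentence-transformers is installed.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

target = "email marketing software"
variants = ["automated email platforms", "newsletter tools", "spreadsheet software"]

# Encode everything into dense vectors, then compare with cosine similarity.
embeddings = model.encode([target] + variants, convert_to_tensor=True)
scores = util.cos_sim(embeddings[0], embeddings[1:])

for phrase, score in zip(variants, scores[0]):
    # Related phrases score noticeably higher than unrelated ones.
    print(f"{phrase}: {score.item():.2f}")
```

Exact numbers vary by model, but the related phrases land close to the target while the unrelated one doesn't - which is exactly the relationship exact-match keyword tools never surface.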

Meanwhile, those search volume numbers everyone obsesses over? They're estimates based on limited data, often completely wrong, and don't account for the hundreds of long-tail variations that actually drive traffic. I've seen keywords with "0 searches" according to tools drive 100+ monthly visitors consistently.

The transition to semantic SEO requires a completely different mindset - one that focuses on topics, intent, and comprehensive coverage rather than individual keyword optimization.

Who am I

Consider me your business accomplice.

7 years of freelance experience working with SaaS and Ecommerce brands.

The realization hit me during a project with a B2C Shopify store that had over 3,000 products. They needed SEO content across 8 different languages, which meant potentially 24,000+ pages of unique, optimized content. The traditional approach would have taken months just for keyword research, let alone content creation.

My first instinct was to follow the standard playbook. I fired up SEMrush and started pulling keyword data for their product categories. After hours of work, I had spreadsheets with thousands of keywords, search volumes, and difficulty scores. It looked impressive, but I quickly realized three major problems.

First, the keyword data was completely generic. These tools showed me what everyone else was targeting, not the specific semantic relationships relevant to this client's unique product mix. Second, the multi-language aspect made traditional keyword research almost impossible - most tools have limited data for non-English keywords, especially for niche product categories.

Third, and most importantly, I was thinking about this backwards. Instead of asking "what keywords should I target," I should have been asking "how can I comprehensively cover the semantic space around each product category in a way that matches how people actually search?"

The client had quality products and a solid catalog structure, but their content was essentially invisible to search engines. They needed breadth and depth of semantic coverage, not just individual keyword targeting. Traditional keyword research would have given me a fraction of the semantic relationships that actually mattered for their business.

That's when I decided to experiment with something different. Instead of starting with keyword tools, I started with the products themselves and used NLP AI tools to understand the semantic relationships, user intent patterns, and contextual variations that traditional tools completely miss.

The goal wasn't to replace human expertise - it was to use AI that actually understands language the way modern search engines do. This meant moving from exact-match keyword optimization to semantic topic coverage that could capture the full spectrum of how people search for these products.

My experiments

Here's my playbook

What I ended up doing and the results.

Here's the exact workflow I developed that generated over 20,000 indexed pages and transformed this client's organic visibility. This isn't theoretical - it's the step-by-step process I use for semantic keyword enrichment.

Step 1: Product-Based Semantic Mapping

Instead of starting with keyword tools, I began with the client's actual product catalog. I exported all product data into CSV format and used this as the foundation for semantic analysis. For each product category, I used AI to identify the core semantic concepts, related terminology, and contextual variations that people might use when searching.

The key insight here: products have inherent semantic relationships that keyword tools don't capture. A "vintage leather bag" relates to dozens of search terms like "retro handbag," "classic purse," "antique satchel," and "timeless tote" - connections that become obvious when you analyze language semantically rather than through search volume data.
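Here's roughly what that step can look like in code. To be clear, this is an illustrative sketch, not my exact pipeline: the products.csv layout with name and category columns, the prompt wording, and the gpt-4o-mini model are all assumptions.

```python
# Sketch of product-based semantic mapping: feed each catalog row to an
# LLM and ask for the semantic space around it. Assumes the official
# openai client and an OPENAI_API_KEY in the environment.
import csv
from openai import OpenAI

client = OpenAI()

def semantic_concepts(name: str, category: str) -> str:
    prompt = (
        "List the core semantic concepts, related terminology, and "
        "contextual search variations a shopper might use for this product.\n"
        f"Product: {name}\nCategory: {category}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

with open("products.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):  # assumes 'name' and 'category' columns
        print(row["name"], "->", semantic_concepts(row["name"], row["category"]))
```

The point isn't the specific model - it's that the product data drives the analysis instead of a keyword export.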

Step 2: NLP-Powered Intent Analysis

Using natural language processing tools, I analyzed the different ways people express intent around each product category. This went far beyond traditional keyword variations to understand the actual questions, problems, and contexts that drive searches.

For example, someone searching for a laptop bag might use terms like "professional briefcase," "computer carrier," "work tote," or even "meeting bag" depending on their context and background. Traditional keyword tools would treat these as separate, unrelated terms. NLP analysis reveals them as semantic variations of the same core intent.
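A simple way to see this intent grouping in action is to cluster phrasings by embedding similarity. This sketch uses the community_detection helper from sentence-transformers; the queries and the 0.5 threshold are illustrative assumptions.

```python
# Sketch of NLP intent analysis: group query phrasings that express the
# same underlying intent, regardless of shared keywords.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

queries = [
    "professional briefcase", "computer carrier", "work tote", "meeting bag",
    "retro handbag", "classic purse", "antique satchel",
]

embeddings = model.encode(queries, convert_to_tensor=True)

# Phrasings whose vectors sit close together get grouped as one intent.
clusters = util.community_detection(embeddings, threshold=0.5, min_community_size=2)
for cluster in clusters:
    print([queries[i] for i in cluster])
```

A traditional tool sees seven unrelated keywords here; an embedding model groups them into a couple of intents (work bags versus vintage bags), depending on the threshold you choose.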

Step 3: Contextual Content Architecture

Rather than creating individual pages for specific keywords, I developed a content architecture based on semantic clusters. Each product page became the center of a semantic web that could naturally rank for dozens of related terms without keyword stuffing or awkward optimization.

The content structure focused on comprehensive topic coverage: what the product is, how it's used, who it's for, what problems it solves, how it compares to alternatives, and what contexts it fits into. This semantic richness allowed pages to rank for search queries I never explicitly optimized for.
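If it helps to picture the architecture, here's one way to model it with a plain dataclass. The field names are my own shorthand for the coverage areas above, not a framework:

```python
# Sketch of a cluster-centric page model: each product page carries its
# whole semantic web instead of one target keyword.
from dataclasses import dataclass, field

@dataclass
class SemanticClusterPage:
    product: str
    core_topic: str
    semantic_variants: list[str] = field(default_factory=list)  # related phrasings
    use_cases: list[str] = field(default_factory=list)          # how it's used
    audiences: list[str] = field(default_factory=list)          # who it's for
    problems_solved: list[str] = field(default_factory=list)    # what it solves
    comparisons: list[str] = field(default_factory=list)        # alternatives

page = SemanticClusterPage(
    product="Vintage leather bag",
    core_topic="vintage leather bags",
    semantic_variants=["retro handbag", "classic purse", "antique satchel"],
    use_cases=["daily commute", "weekend travel"],
    audiences=["professionals", "vintage collectors"],
    problems_solved=["durable everyday carry"],
    comparisons=["canvas totes", "modern laptop bags"],
)
```

Each field maps to a content section, and filling all of them is what produces the comprehensive topic coverage that ranks for queries you never explicitly targeted.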

Step 4: Multi-Language Semantic Consistency

The real challenge was maintaining semantic consistency across 8 languages. Traditional translation often loses semantic nuance, but NLP tools helped identify equivalent semantic relationships in each language. This wasn't just translation - it was semantic localization that preserved the intent patterns and contextual relationships in each market.
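In practice, this can be as simple as prompting for equivalent intent patterns per market instead of a literal translation. A sketch, with the caveat that the prompt, the model, and the language codes are illustrative (I'm not listing the client's actual 8 markets):

```python
# Sketch of semantic localization: ask for the native phrasings and
# intent patterns per market rather than a word-for-word translation.
from openai import OpenAI

client = OpenAI()
LANGUAGES = ["de", "fr", "es", "it", "nl", "pt", "pl", "sv"]  # illustrative codes

def localize_semantics(concepts: str, language: str) -> str:
    prompt = (
        f"Do not translate literally. For the language '{language}', list the "
        "equivalent semantic concepts and the phrasings native shoppers "
        f"actually use for:\n{concepts}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

for lang in LANGUAGES:
    print(lang, "->", localize_semantics("vintage leather bag, retro handbag", lang))
```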

Step 5: Automated Semantic Enrichment

I built custom AI workflows that could automatically enrich product descriptions with semantically related terms, contextual variations, and intent-based content. This wasn't about keyword density - it was about creating comprehensive semantic coverage that matched how people actually think and search.

The workflow analyzed each product's attributes, identified semantic gaps in the existing content, and suggested enrichments that would improve topic coverage without compromising readability or user experience.
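The gap-analysis piece can be sketched with the same embedding trick from earlier: compare the existing description against the semantic variants and flag whatever isn't covered yet. The 0.4 cutoff is an assumption you'd tune per catalog.

```python
# Sketch of the enrichment workflow's gap analysis: find semantic variants
# the current description fails to cover, so the AI workflow knows what
# to add without stuffing keywords.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

description = "A handcrafted vintage leather bag with brass buckles."
variants = ["retro handbag", "work tote for professionals", "weekend travel bag"]

desc_emb = model.encode(description, convert_to_tensor=True)
var_embs = model.encode(variants, convert_to_tensor=True)

scores = util.cos_sim(desc_emb, var_embs)[0]
gaps = [v for v, s in zip(variants, scores) if s.item() < 0.4]
print("Semantic gaps to enrich:", gaps)
```

Anything flagged becomes an enrichment candidate; anything already covered stays untouched, which keeps the copy readable.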

Semantic Mapping

Building comprehensive topic coverage from product attributes rather than keyword lists

NLP Intent Analysis

Understanding search intent patterns through natural language processing instead of search volume data

Content Architecture

Creating semantic clusters that naturally rank for multiple related terms without keyword stuffing

Automated Enrichment

Using AI workflows to systematically expand semantic coverage across thousands of pages

The results completely validated this semantic approach over traditional keyword research. Within 3 months, the site went from under 500 monthly organic visitors to over 5,000 - a 10x increase that came from semantic coverage rather than individual keyword targeting.

More importantly, the traffic quality was significantly higher. Instead of ranking for random keywords that didn't convert, the site attracted visitors with clear commercial intent who were actually looking for the products being sold. The semantic approach naturally aligned content with purchase-ready search queries.

Here's what really surprised me: Google indexed over 20,000 of the generated pages, and many started ranking for search terms I never explicitly optimized for. The semantic richness created natural ranking opportunities that traditional keyword research would have missed entirely.

The multi-language implementation worked even better than expected. Each language market showed similar growth patterns, proving that semantic relationships transcend specific keyword translations. This approach scaled across markets in ways that traditional keyword research simply couldn't match.

From a time perspective, this workflow reduced content creation time by roughly 80% compared to traditional keyword research and manual content creation. Instead of spending weeks researching keywords and months creating individual pieces of content, I could generate comprehensive semantic coverage in days.

Learnings

What I've learned and the mistakes I've made.

Sharing so you don't make them.

  1. Search engines understand topics, not just keywords - Semantic coverage consistently outperformed exact-match optimization

  2. Product attributes contain semantic goldmines - The best keyword insights come from understanding what you're actually selling, not what tools suggest

  3. Context matters more than search volume - Low-volume terms with high commercial intent outperform high-volume informational keywords

  4. AI excels at pattern recognition - NLP tools identify semantic relationships that humans miss and traditional tools can't measure

  5. Scale changes everything - This approach works best when you need to optimize hundreds or thousands of pages, not individual pieces of content

  6. Quality still matters - Semantic enrichment enhances good content but can't save fundamentally weak products or poor user experience

  7. International SEO is different - Semantic relationships vary by culture and language in ways that direct translation can't capture

The biggest lesson? Stop optimizing for search engines and start optimizing for the way people actually think and search. When you align content with natural language patterns and semantic relationships, rankings follow naturally.

How you can adapt this to your business

My playbook, condensed for your use case.

For your SaaS / Startup

For SaaS startups, focus on semantic enrichment around problem-solution relationships. Use NLP tools to identify how prospects describe their pain points and map content to those semantic patterns. Explore more SaaS growth strategies.

For your Ecommerce store

For ecommerce, apply semantic enrichment to product attributes, use cases, and customer contexts. Build comprehensive semantic coverage around each product category rather than optimizing for individual keywords. Discover more ecommerce optimization tactics.

Get more playbooks like this one in my weekly newsletter