AI & Automation
Personas: SaaS & Startup
Time to ROI: Medium-term (3-6 months)
Last year, I watched something weird happen with one of my e-commerce clients. We'd built solid traditional SEO foundations - keyword-optimized pages, technical optimization, the whole nine yards. But then something unexpected started showing up in our analytics: mentions in ChatGPT and Claude responses, despite being in a niche where most people weren't even using AI tools yet.
This discovery sent me down a rabbit hole that changed how I think about search entirely. While everyone's debating whether "SEO is dead" because of ChatGPT, I've been quietly testing what actually happens when you optimize for both traditional search engines and AI systems.
The reality? It's not about choosing sides. It's about understanding that we're in a transition period where both systems matter, and the businesses that figure out this hybrid approach first will dominate their niches.
Here's what you'll learn from my real-world experiments:
Why SEO fundamentals became more important with AI, not less
The specific content structure changes that get you mentioned in AI responses
How we adapted our AI content strategy for both Google and ChatGPT
Real metrics from transitioning to GEO (Generative Engine Optimization)
The framework I now use for all client projects in 2025
Industry Reality
What the SEO community is saying about AI search
Walk into any SEO conference or forum today, and you'll hear two extreme camps shouting at each other. Camp one insists that "SEO is completely dead" because ChatGPT will replace Google search entirely. Camp two argues that "nothing has changed" and we should keep optimizing exactly like we did in 2020.
Both camps are missing the point entirely.
The traditional SEO community typically recommends doubling down on what's always worked:
Technical SEO perfection - faster page speeds, better Core Web Vitals
Traditional keyword targeting - keep building pages around search volume data
Link building campaigns - chase domain authority like it's 2015
Content volume - publish more articles targeting long-tail keywords
SERP feature optimization - compete for featured snippets and knowledge panels
Meanwhile, the AI-first crowd preaches the opposite extreme - abandon SEO entirely and focus solely on training datasets and LLM optimization techniques that most businesses can't even implement.
Here's why both approaches fail: we're not in an either/or situation. We're in a transition period where both traditional search engines and AI systems are sourcing information simultaneously. The businesses that recognize this and adapt their strategy accordingly are the ones capturing traffic from both channels.
The problem with sticking to traditional SEO? You're optimizing for yesterday's search behavior. The problem with going AI-only? You're betting on tomorrow's technology while ignoring today's revenue opportunities.
Consider me your business accomplice.
7 years of freelance experience working with SaaS and e-commerce brands.
The breakthrough came when I was working with an e-commerce Shopify client last year. We needed a complete SEO overhaul - this was a traditional optimization project, nothing fancy. Standard technical audit, content optimization, the usual playbook I'd run dozens of times before.
But something strange started happening about three months into the project. Even though we were in a pretty niche industry where you wouldn't expect heavy AI tool usage, we began tracking mentions in ChatGPT and Claude responses. Not a ton - maybe a couple dozen per month - but consistent enough to notice.
This was completely accidental. We weren't optimizing for AI at all. These mentions were happening as a natural byproduct of solid, comprehensive content that followed traditional SEO best practices.
That's when I realized everyone was asking the wrong question. Instead of "Is SEO dead because of AI?" the real question was: "What content structure works for both traditional search engines AND AI systems?"
I started researching this overlap and discovered that teams at AI-first startups like Profound and Athena were dealing with the same puzzle. Everyone was figuring this out in real-time because there was no established playbook yet.
The key insight hit me during a conversation with one of these teams: LLMs don't consume pages the same way traditional search engines do. Google crawls and indexes whole pages, then ranks them against queries. AI systems break content into chunks, synthesize information from multiple sources, and reconstruct an answer.
This means the content structure that works for both systems isn't just "good SEO content" - it's content architected to be useful at the chunk level while maintaining overall page authority for traditional ranking factors.
Here's my playbook
What I ended up doing and the results.
Instead of choosing between SEO and AI optimization, I developed what I call the "SEO+GEO Hybrid Framework" - treating Generative Engine Optimization as a layer on top of solid SEO fundamentals, not a replacement for them.
Here's exactly how I restructured our approach:
Foundation Layer: SEO Fundamentals (Non-Negotiable)
Before touching any AI optimization, we ensured bulletproof traditional SEO foundations. This includes technical optimization, proper site architecture, and content that targets actual search queries. Why? Because the crawlers that feed AI systems still discover and index your content through the same traditional methods.
The difference is in execution. Instead of optimizing individual pages in isolation, we started thinking about topic clusters that could serve both traditional queries and AI synthesis.
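One practical consequence: if robots.txt accidentally blocks AI crawlers, nothing else in this playbook matters. Here's a minimal sketch of that sanity check, assuming a hypothetical domain; the user-agent names are the publicly documented ones at the time of writing, so verify them against each vendor's current docs.

```python
from urllib import robotparser

# Quick sanity check that the traditional crawl path is open to AI crawlers,
# not just Googlebot. The domain below is a placeholder.
SITE = "https://www.example.com"  # placeholder domain
CRAWLERS = ["Googlebot", "GPTBot", "ClaudeBot", "PerplexityBot"]

parser = robotparser.RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()

for agent in CRAWLERS:
    status = "allowed" if parser.can_fetch(agent, f"{SITE}/") else "blocked"
    print(f"{agent}: {status} on {SITE}/")
```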
Content Architecture: Chunk-Level Thinking
This was the game-changer. We restructured content so each section could stand alone as valuable information while contributing to the overall page topic. Every paragraph became a potential "snippet" that could be extracted and synthesized by AI systems.
Practically, this meant:
Self-contained sections - Each heading-and-paragraph combo answers a specific sub-question on its own (see the sketch after this list)
Clear logical flow - Information follows a structure that makes sense when extracted partially
Factual accuracy - Every claim is verifiable because AI systems prioritize authoritative sources
Multiple angles - Comprehensive coverage of topics rather than thin, keyword-stuffed pages
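To show what chunk-level thinking looks like in practice, here's a minimal sketch of a QA pass over a Markdown draft: it splits the page on subheadings and flags sections that are probably too thin to stand alone. The file name, the heading levels, and the 40-word threshold are assumptions for illustration, not the exact tooling we used.

```python
import re

MIN_WORDS_PER_CHUNK = 40  # arbitrary threshold, adjust to taste


def split_into_chunks(markdown: str) -> list[dict]:
    """Split a Markdown draft on ## / ### headings into heading + body chunks."""
    sections = re.split(r"\n(?=#{2,3} )", markdown)
    chunks = []
    for section in sections:
        lines = section.strip().splitlines()
        if not lines:
            continue
        heading = lines[0].lstrip("# ").strip()
        body = " ".join(line.strip() for line in lines[1:]).strip()
        chunks.append({"heading": heading, "body": body, "words": len(body.split())})
    return chunks


def flag_thin_chunks(chunks: list[dict]) -> list[str]:
    """Headings whose body is probably too thin to work as a standalone answer."""
    return [c["heading"] for c in chunks if c["words"] < MIN_WORDS_PER_CHUNK]


if __name__ == "__main__":
    with open("draft.md", encoding="utf-8") as f:  # placeholder file name
        chunks = split_into_chunks(f.read())
    for heading in flag_thin_chunks(chunks):
        print(f"Too thin to stand alone: {heading}")
```

A flagged section isn't automatically bad; it's just a prompt to ask whether that chunk would make sense if an AI system quoted it with no surrounding context.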
Implementation: The Five-Layer Optimization
Working with my e-commerce client, we implemented five specific optimizations that I now use for all projects:
1. Chunk-Level Retrieval: Made each content section valuable in isolation. If an AI system extracted just one paragraph, it would still provide complete, useful information.
2. Answer Synthesis Readiness: Structured information in logical hierarchies that AI systems could easily parse and reconstruct for user queries.
3. Citation-Worthiness: Focused on factual accuracy and clear attribution, knowing AI systems favor authoritative, verifiable information.
4. Topical Breadth and Depth: Instead of thin pages targeting individual keywords, we created comprehensive resources covering all facets of topics.
5. Multi-Modal Integration: Included charts, tables, and structured data that both Google and AI systems could parse effectively (a structured-data sketch follows below).
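As an example of what layer 5 can look like on an e-commerce page, here's a sketch of a schema.org Product object. The product, specs, and prices are invented for illustration; the point is that the same structured data Google parses for rich results is also clean, factual, chunk-sized input for AI systems.

```python
import json

# Hypothetical product: every name, spec, and price here is made up.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Standing Desk 120",
    "description": "Height-adjustable standing desk with a 120 x 60 cm top.",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "additionalProperty": [
        {"@type": "PropertyValue", "name": "Height range", "value": "72-120 cm"},
        {"@type": "PropertyValue", "name": "Max load", "value": "80 kg"},
    ],
    "offers": {
        "@type": "Offer",
        "price": "499.00",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the output in the page inside a <script type="application/ld+json"> tag.
print(json.dumps(product_jsonld, indent=2))
```

The markup itself isn't a GEO silver bullet; its real value is forcing you to state the facts once, cleanly, which is what both channels reward.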
The key insight: good GEO optimization IS good SEO optimization, just structured differently. The content that AI systems extract and cite is typically the same high-quality, comprehensive content that ranks well in traditional search.
Foundation First
SEO fundamentals became MORE critical with AI, not less. AI systems still reach your content through traditional crawling and indexing.
Chunk Structure
Each content section must work independently while contributing to overall page authority and topical coverage.
Testing Protocol
We tracked both traditional rankings AND AI mentions using manual monitoring across ChatGPT, Claude, and Perplexity (a minimal logging sketch follows below).
Hybrid Measurement
Success metrics include traditional SEO KPIs plus AI mention frequency and context quality across multiple platforms.
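For anyone who wants to replicate the testing protocol, here's a minimal sketch of the mention log, assuming you run the prompts by hand in ChatGPT, Claude, and Perplexity. Nothing here calls any platform's API; the prompts, context note, and file name are placeholders.

```python
import csv
from datetime import date

# Placeholder prompts - use the real questions your customers ask.
PROMPTS = [
    "What are the best tools for X?",
    "How do I choose a provider for Y?",
]


def log_check(platform: str, prompt: str, mentioned: bool, context: str,
              path: str = "ai_mentions.csv") -> None:
    """Append one manual observation (mentioned or not, and in what context) to a CSV."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow([
            date.today().isoformat(),
            platform,
            prompt,
            "yes" if mentioned else "no",
            context,
        ])


# Example: recording one observation after running a prompt by hand in ChatGPT.
log_check("ChatGPT", PROMPTS[0], True, "Recommended alongside two competitors")
```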
The results validated our hybrid approach. Within six months of implementing the SEO+GEO framework:
Traditional SEO metrics stayed strong: Organic traffic continued growing through traditional Google searches. The comprehensive, well-structured content actually performed better in SERPs because it provided more complete answers to user queries.
AI mention frequency increased: From those initial couple dozen monthly mentions, we tracked consistent citations across ChatGPT, Claude, and Perplexity. More importantly, the context of these mentions was positive and accurate.
Cross-channel reinforcement: Users who discovered the brand through AI recommendations would often search for it directly on Google, creating a positive feedback loop between channels.
The most surprising result? The content structured for AI extraction also improved traditional conversion rates. When information is organized clearly enough for AI systems to parse and synthesize, it's also easier for human visitors to scan and understand.
This isn't about massive traffic shifts or revolutionary changes. It's about positioning for a gradual transition while maintaining current performance. The businesses doing this now will have a significant advantage as AI adoption accelerates.
What I've learned and the mistakes I've made.
Sharing so you don't make them.
After implementing this hybrid approach across multiple client projects, here are the seven key lessons that changed how I think about search optimization:
1. Quality compounds across channels. Content good enough for AI citation is typically excellent for traditional SEO. The standards reinforce each other rather than compete.
2. Structure beats volume. A few comprehensive, well-structured pages outperform dozens of thin, keyword-targeted pages in both traditional and AI search contexts.
3. The transition is gradual, not sudden. We're not seeing overnight shifts from Google to ChatGPT. Both channels matter and will continue to matter for the foreseeable future.
4. Manual monitoring is essential. Unlike traditional SEO where you can track rankings automatically, AI mention tracking still requires manual monitoring across multiple platforms.
5. Accuracy trumps optimization. AI systems heavily penalize inaccurate information. Better to be conservative and factual than optimized and wrong.
6. Topical authority transfers. Brands that build strong traditional SEO authority are more likely to be cited by AI systems. The reputation compounds.
7. Early adoption advantage is real. While competition for AI mentions is still low, establishing presence now creates advantages as adoption increases.
The biggest mistake I see? Treating this as an either/or decision. The winning strategy isn't choosing between SEO and GEO - it's building a foundation that works for both, then optimizing for the specific requirements of each channel.
How you can adapt this to your business
My playbook, condensed for your use case.
For your SaaS / Startup
For SaaS companies adapting to AI-driven search:
Focus on use-case content that answers specific "how to" questions
Structure feature pages with clear, self-contained benefit explanations
Create integration guides that work as standalone references
Optimize case studies for chunk-level extraction and synthesis
For your E-commerce store
For e-commerce stores transitioning to hybrid optimization:
Enhance product descriptions with comprehensive, factual specifications
Create buying guides structured for both search and AI synthesis
Optimize category pages with self-contained educational content
Build comparison content that works at the individual chunk level