Many PPC advertisers obsess over click-through rates, using them as a quick measure of ad performance.
But CTR alone doesn’t tell the whole story – what matters most is what happens after the click. That’s where many campaigns go wrong.
The problem with chasing high CTRs
Most advertisers assume the ad with the highest CTR is the best one: it earns a strong Quality Score and attracts plenty of clicks.
In practice, however, lower-CTR ads often outperform higher-CTR ads in total conversions and revenue.
If all I cared about was CTR, then I could write an ad:
“Free money.”
“Claim your free money today.”
“No strings attached.”
That ad would get an impressive CTR for many keywords, and I’d go out of business pretty quickly, giving away free money.
When creating ads, we must consider:
The type of searchers we want to attract.
Whether those searchers are qualified.
The expectations the ad sets for the landing page.
I can take my free money ad and refine it:
“Claim your free money.”
“Explore college scholarships.”
“Download your free guide.”
I’ve now:
Told searchers they can get free money for college through scholarships if they download a guide.
Narrowed down my audience to people who are willing to apply for scholarships and willing to download a guide, presumably in exchange for some information.
If you focus solely on CTR and don’t consider attracting the right audience, your advertising will suffer.
While this sentiment applies to both B2C and B2B companies, B2B companies must be exceptionally aware of how their ads appear to consumers versus business searchers.
B2B companies must pre-qualify searchers
If you are advertising for a B2B company, you’ll often notice that CTR and conversion rates have an inverse relationship. As CTR increases, conversion rates decrease.
The most common reason for this phenomenon is that many B2B keywords are also searched by consumers.
B2B companies must try to show that their products are for businesses, not consumers.
For instance, “safety gates” is a common search term.
The majority of people looking to buy a safety gate are consumers who want to keep pets or babies out of rooms or away from stairs.
However, safety gates and railings are important for businesses with factories, plants, or industrial sites.
These two ads are both for companies that sell safety gates. The first ad’s headlines for Uline could be for a consumer or a business.
It’s not until you look at the description that you realize this is for mezzanines and catwalks, which is something consumers don’t have in their homes.
As many searchers do not read descriptions, this ad will attract both B2B and B2C searchers.
The second ad mentions Industrial in the headline and follows that up with a mention of OSHA compliance in the description and the sitelinks.
While both ads promote similar products, the second one will achieve a better conversion rate because it speaks to a single audience.
We have a client who specializes in factory parts, and when we graph their conversion rates by Quality Score, we can see that as their Quality Score increases, their conversion rates decrease.
They review their keywords and ads whenever a term that both businesses and consumers search reaches a Quality Score of 5 or higher.
The same logic does not apply to B2B-only search terms.
Searchers looking for B2B services and products often include jargon or other qualifying language in those queries.
For those terms, B2B advertisers don’t have to spend ad characters weeding out consumers and can focus their ads solely on B2B searchers.
How to balance CTR and conversion rates
As you are testing various ads to find your best pre-qualifying statements, it can be tricky to examine the metrics. Which one of these would be your best ad?
15% CTR, 3% conversion rate.
10% CTR, 7% conversion rate.
5% CTR, 11% conversion rate.
When examining mixed metrics like CTR and conversion rate, we can use additional metrics to identify our best ads. My two favorites are:
Conversions per impression (CPI): a simple formula that divides your conversions by your impressions (conversions/impressions).
Revenue per impression (RPI): if your checkout amounts vary, you can instead divide your revenue by your impressions (revenue/impressions) to decide which ads are best.
You can also multiply the results by 1,000 to make the numbers easier to digest instead of working with many decimal points. So, we might write:
CPI = (conversions/impressions) x 1,000
By using impression-based metrics, you can see how much opportunity each ad captures from a given set of impressions.
| CTR | Conversion rate | Impressions | Clicks | Conversions | CPI |
|-----|-----------------|-------------|--------|-------------|-----|
| 15% | 3% | 5,000 | 750 | 22.5 | 4.5 |
| 10% | 7% | 4,000 | 400 | 28 | 7 |
| 5% | 11% | 4,500 | 225 | 24.75 | 5.5 |
By doing some simple math, we can see that option 2, with a 10% CTR and a 7% conversion rate, gives us the most total conversions.
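As a quick sanity check, here is a minimal Python sketch that reproduces the numbers above for the three hypothetical ad options:

```python
# Compare ad options by conversions per impression (CPI).
# Each option: (CTR, conversion rate, impressions), taken from the table above.
options = {
    "Option 1": (0.15, 0.03, 5_000),
    "Option 2": (0.10, 0.07, 4_000),
    "Option 3": (0.05, 0.11, 4_500),
}

for name, (ctr, conv_rate, impressions) in options.items():
    clicks = impressions * ctr
    conversions = clicks * conv_rate
    cpi = conversions / impressions * 1_000  # conversions per 1,000 impressions
    print(f"{name}: {clicks:.0f} clicks, {conversions:.2f} conversions, CPI {cpi:.2f}")

# Output:
# Option 1: 750 clicks, 22.50 conversions, CPI 4.50
# Option 2: 400 clicks, 28.00 conversions, CPI 7.00
# Option 3: 225 clicks, 24.75 conversions, CPI 5.50
```

Swap conversions for revenue in the same loop and you get RPI instead.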
Before you publish an ad, ask yourself three questions:
How do you dissuade users who don’t fit your audience from clicking on your ads?
How do you attract your qualified audience?
Are your ads setting proper landing page expectations?
By considering each of these questions as you create ads, you can find ads that speak to the type of users you want to attract to your site.
These ads rarely have your highest CTRs. Instead, they balance the appeal of a high CTR with pre-qualifying statements that ensure the clicks you receive have the potential to turn into your next customer.
The web’s purpose is shifting. Once a link graph – a network of pages for users and crawlers to navigate – it’s rapidly becoming a queryable knowledge graph.
For technical SEOs, that means the goal has evolved from optimizing for clicks to optimizing for visibility and even direct machine interaction.
Enter NLWeb – Microsoft’s open-source bridge to the agentic web
At the forefront of this evolution is NLWeb (Natural Language Web), an open-source project developed by Microsoft.
NLWeb simplifies the creation of natural language interfaces for any website, allowing publishers to transform existing sites into AI-powered applications where users and intelligent agents can query content conversationally – much like interacting with an AI assistant.
Its developers suggest NLWeb could play a role similar to the one HTML plays in the emerging agentic web.
Its open-source, standards-based design makes it technology-agnostic, ensuring compatibility across vendors and large language models (LLMs).
This positions NLWeb as a foundational framework for long-term digital visibility.
Schema.org is your knowledge API: Why data quality is the NLWeb foundation
NLWeb proves that structured data isn’t just an SEO best practice for rich results – it’s the foundation of AI readiness.
Its architecture is designed to convert a site’s existing structured data into a semantic, actionable interface for AI systems.
In the age of NLWeb, a website is no longer just a destination. It’s a source of information that AI agents can query programmatically.
The NLWeb data pipeline
The technical requirements make it clear that a high-quality schema.org implementation is the price of entry.
Data ingestion and format
The NLWeb toolkit begins by crawling the site and extracting the schema markup.
The schema.org JSON-LD format is the preferred and most effective input for the system.
This means the protocol consumes every detail, relationship, and property defined in your schema, from product types to organization entities.
For any data not in JSON-LD, such as RSS feeds, NLWeb is engineered to convert it into schema.org types for effective use.
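NLWeb’s actual ingestion code lives in the open-source project, but the general idea behind this first step (crawl a page, pull out its schema.org JSON-LD) can be sketched in a few lines of Python. The requests and BeautifulSoup libraries here are my own illustrative choices, not NLWeb components:

```python
import json

import requests
from bs4 import BeautifulSoup

def extract_json_ld(url: str) -> list:
    """Fetch a page and return every schema.org JSON-LD block it declares."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    blocks = []
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            blocks.append(json.loads(tag.get_text()))
        except json.JSONDecodeError:
            continue  # skip malformed markup rather than failing the whole crawl
    return blocks

# Hypothetical usage:
# for block in extract_json_ld("https://example.com/some-article"):
#     print(block.get("@type"))
```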
Semantic storage
Once collected, this structured data is stored in a vector database. This element is critical because it moves the interaction beyond traditional keyword matching.
Vector databases represent text as mathematical vectors, allowing the AI to search based on semantic similarity and meaning.
For example, the system can understand that a query using the term “structured data” is conceptually the same as content marked up with “schema markup.”
This capacity for conceptual understanding is absolutely essential for enabling authentic conversational functionality.
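To make that concrete, here is a toy Python sketch using an off-the-shelf embedding model (sentence-transformers is my illustrative choice; NLWeb itself is model-agnostic). It shows that “structured data” and “schema markup” sit much closer together in vector space than an unrelated phrase:

```python
import numpy as np
from sentence_transformers import SentenceTransformer  # illustrative embedding model

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose model

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: 1.0 means identical direction, near 0 means unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query, doc, unrelated = model.encode(
    ["structured data", "schema markup", "chocolate cake recipe"]
)

print(cosine(query, doc))        # relatively high: near-synonyms in this space
print(cosine(query, unrelated))  # much lower: conceptually unrelated
```

The exact scores depend on the model, but the gap between those two pairs is what a vector store exploits when it answers conversational queries.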
Every NLWeb instance operates as a server for the Model Context Protocol (MCP), an emerging standard for packaging and consistently exchanging data between various AI systems and agents.
MCP is currently the most promising path forward for ensuring interoperability in the highly fragmented AI ecosystem.
The ultimate test of schema quality
Since NLWeb relies entirely on crawling and extracting schema markup, the precision, completeness, and interconnectedness of your site’s content knowledge graph determine success.
The key challenge for SEO teams is addressing technical debt.
Custom, in-house solutions to manage AI ingestion are often high-cost, slow to adopt, and create systems that are difficult to scale or incompatible with future standards like MCP.
NLWeb addresses the protocol’s complexity, but it cannot fix faulty data.
If your structured data is poorly maintained, inaccurate, or missing critical entity relationships, the resulting vector database will store flawed semantic information.
This leads inevitably to suboptimal outputs, potentially resulting in inaccurate conversational responses or “hallucinations” by the AI interface.
Robust, entity-first schema optimization is no longer just a way to win a rich result; it is the fundamental barrier to entry for the agentic web.
By leveraging the structured data you already have, NLWeb allows you to unlock new value without starting from scratch, thereby future-proofing your digital strategy.
NLWeb vs. llms.txt: Protocol for action vs. static guidance
The need for AI crawlers to process web content efficiently has led to multiple proposed standards.
A comparison between NLWeb and the proposed llms.txt file illustrates a clear divergence between dynamic interaction and passive guidance.
The llms.txt file is a proposed static standard designed to improve the efficiency of AI crawlers by:
Providing a curated, prioritized list of a website’s most important content – typically formatted in markdown.
Attempting to solve the legitimate technical problems of complex, JavaScript-loaded websites and the inherent limitations of an LLM’s context window.
In sharp contrast, NLWeb is a dynamic protocol that establishes a conversational API endpoint.
Its purpose is not just to point to content, but to actively receive natural language queries, process the site’s knowledge graph, and return structured JSON responses using schema.org.
NLWeb fundamentally changes the relationship from “AI reads the site” to “AI queries the site.”
| Attribute | NLWeb | llms.txt |
|-----------|-------|----------|
| Primary goal | Enables dynamic, conversational interaction and structured data output | Improves crawler efficiency and guides static content ingestion |
| Operational model | API/Protocol (active endpoint) | Static text file (passive guidance) |
| Data format used | Schema.org JSON-LD | Markdown |
| Adoption status | Open project; connectors available for major LLMs, including Gemini, OpenAI, and Anthropic | Proposed standard; not adopted by Google, OpenAI, or other major LLMs |
| Strategic advantage | Unlocks existing schema investment for transactional AI uses, future-proofing content | Reduces computational cost for LLM training/crawling |
The market’s preference for dynamic utility is clear. Despite addressing a real technical challenge for crawlers, llms.txt has failed to gain traction so far.
NLWeb’s functional superiority stems from its ability to enable richer, transactional AI interactions.
It allows AI agents to dynamically reason about and execute complex data queries using structured schema output.
The strategic imperative: Mandating a high-quality schema audit
While NLWeb is still an emerging open standard, its value is clear.
It maximizes the utility and discoverability of specialized content that often sits deep in archives or databases.
This value is realized through operational efficiency and stronger brand authority, rather than immediate traffic metrics.
Several organizations are already exploring how NLWeb could let users ask complex questions and receive intelligent answers that synthesize information from multiple resources – something traditional search struggles to deliver.
The ROI comes from reducing user friction and reinforcing the brand as an authoritative, queryable knowledge source.
For website owners and digital marketing professionals, the path forward is undeniable: mandate an entity-first schema audit.
Because NLWeb depends on schema markup, technical SEO teams must prioritize auditing existing JSON-LD for integrity, completeness, and interconnectedness.
Publishers should ensure their schema accurately reflects the relationships among all entities, products, services, locations, and personnel to provide the context necessary for precise semantic querying.
The transition to the agentic web is already underway, and NLWeb offers the most viable open-source path to long-term visibility and utility.
It’s a strategic necessity to ensure your organization can communicate effectively as AI agents and LLMs begin integrating conversational protocols for third-party content interaction.
We are excited to announce Query groups, a powerful Search Console Insights feature that groups similar search queries. One of the challenges when analyzing search performance data is that there are many different ways to write the same query: you might see a dozen different variations for a single user question – including common misspellings, slightly different phrasing, and different languages. Query groups solve this problem by grouping similar queries.
Nearly 90% of businesses are worried about losing organic visibility as AI transforms how people find information, according to a new survey by Ann Smarty.
Why we care. The shift from search results to AI-generated answers seems to be happening faster than many expected, threatening the foundation of how companies are found online and drive sales. AI is changing the customer journey and forcing an SEO evolution.
By the numbers. Most prefer to keep the “SEO” label – with “SEO for AI” (49%) and “GEO” (41%) emerging as leading terms for this new discipline.
87.8% of businesses said they’re worried about their online findability in the AI era.
85.7% are already investing or plan to invest in AI/LLM optimization.
61.2% plan to increase their SEO budgets due to AI.
Brand over clicks. Three in four businesses (75.5%) said their top priority is brand visibility in AI-generated answers – even when there’s no link back to their site.
Just 14.3% prioritize being cited as a source (which could drive traffic).
A small group said they need both.
Top concerns. “Not being able to get my business found online” ranked as the biggest fear, followed by the total loss of organic search and loss of traffic attribution.
About the survey. Smarty surveyed 300+ in-house marketers and business owners, mostly from medium and enterprise companies, with nearly half representing ecommerce brands.
It’s easy to fall into doom and gloom about AI replacing content marketers. In reality, it’s replacing outdated workflows.
Over 90 percent of large marketing teams now use AI to generate content. They’re moving faster, publishing more, and rethinking production from the ground up. But speed alone won’t make content perform.
Audiences tune out shallow, generic material. Human creativity still drives differentiation. Strategy, originality, and clear brand perspective separate useful content from noise.
The teams that win combine AI’s efficiency with human insight. That requires knowing where automation fits and where it doesn’t. If you haven’t defined how to use AI for content creation inside your workflow, now’s the time.
This piece explores what effective AI vs human content looks like today and how to build it without losing your edge.
Key Takeaways
Most companies have already integrated AI into their content workflows, but don’t fall into the trap of treating AI tools as shortcuts rather than systems.
Content that earns visibility today is structured, specific, and backed by human perspective, not just keyword targeting.
Strategic AI use supports ideation, formatting, optimization, and repurposing, but quality control stays human.
Personalization, brand voice, and original data continue to drive trust and engagement.
Success comes from balancing scale with clarity. The best content performs because it’s relevant, not frequent.
Managing The AI Flood
AI-generated content has reshaped digital publishing. Brands produce more blog posts, email copy, and landing pages than ever. But volume brings saturation and diminishing returns.
Not all AI content is low quality, but much of it reads identically. Teams optimize for speed without strategy. The result? More output, less substance.
Content that still works doesn’t feel mass-produced. It stands out by doing one or more of these things:
Offers a clear point of view or original framework
Goes deeper than surface-level summaries
Reflects genuine understanding of the audience
Adds context, nuance, or experience AI can’t fake
Search engines are adapting to this shift. Platforms like Google and Perplexity favor content with structure, specificity, and trust signals over keyword stuffing and sheer volume. AI tools are more likely to cite content that demonstrates expertise and clarity.
The opportunity isn’t to publish more. It’s to build better systems for quality and relevance at scale. Winning teams won’t lean on AI to fill gaps; they’ll use it to reinforce strengths.
Human guidance makes the difference. Without it, content becomes another drop in the flood.
Rebuilding The Content Workflow
AI accelerates content production. It also forces teams to rethink how work gets done.
Instead of replacing content professionals, AI shifts where their time and value go. Manual tasks like keyword clustering, formatting, or metadata writing now run through automation. What remains critical is work AI can’t do well: aligning content to business goals, telling compelling stories, and capturing audience nuance.
How does this work in practice? Writers, strategists, and editors move upstream. They spend more time setting direction, defining tone, and curating inputs. Downstream, AI helps turn those inputs into faster iterations, formatted assets, and scalable deliverables.
This shift creates a more responsive content engine. One that reaches insight faster. One that makes room for testing and repurposing without burning out your team.
The result? More consistent output, more flexibility, fewer bottlenecks.
To get there, rebuild the workflow around what your team does best, not just what AI does quickly.
The sections below break down how to apply this shift at each stage, from ideation to optimization, so you can create a system that scales without sacrificing value.
Ideation
Strong content starts with strong ideas. That’s still a human job.
AI makes the early stages faster. Instead of starting from scratch, marketers use AI to scan top-performing content, surface related questions, and generate keyword clusters in seconds. Tools like ChatGPT, Ubersuggest, and BuzzSumo help teams quickly identify gaps, trends, and angles worth exploring.
But ideation is only useful when it’s aligned with strategy. AI should support the process, not drive it. You need that human point of view as a starting point.
Real-Time Performance Feedback
AI doubles as a smart editor.
Tools like Clearscope, MarketMuse, and Surfer SEO give real-time scoring on keyword coverage, topic depth, readability, and search intent. You can spot weak sections, catch missing subtopics, and verify your draft aligns with how people actually search.
Instead of waiting for performance to drop before making updates, fix issues before content even publishes. That means fewer rewrites and better outcomes from day one.
Brand Voice Support
One of the biggest risks with AI content? Sounding like everyone else. Brand voice systems help.
Feed AI tools with examples of your tone, preferred phrases, and messaging guardrails to guide outputs toward consistent brand reflection. Prompt libraries, templates, and style frameworks give AI clearer direction and reduce heavy editing later.
But it’s not set-and-forget. Someone still needs to review and fine-tune. AI can help scale your voice, but it won’t define it for you.
Content Repurposing
Most content teams don’t need more ideas. They need more mileage from content they already have.
AI makes it easier to break down webinars, blog posts, or whitepapers into new formats. With the right content repurposing plan, you can turn a single piece into multiple social posts, email sequences, video scripts, or short-form summaries in minutes.
This approach saves time and extends the reach of your core ideas. The key is setting rules around tone and structure so AI keeps output aligned with your original intent.
Graphics
Visual content used to slow down many content workflows. Not anymore.
AI-powered design tools like Canva, Midjourney, and Runway help marketers produce branded graphics, thumbnails, and motion assets much faster. Instead of waiting days for design resources, teams create visuals in parallel with written content without sacrificing quality.
This means faster turnarounds on social content, better visual support for blog posts, and more consistency across formats. As with writing, human review remains necessary, but AI handles much of the heavy lifting.
SEO Formatting
Formatting for SEO used to eat up hours, particularly at scale. AI tools now handle much of that backend work.
From writing meta descriptions and alt text to adding schema markup and internal links, automation streamlines the technical side of publishing. Tools like SEO.ai and Surfer can also suggest keyword tweaks and intent matches based on real-time SERP data.
This doesn’t replace SEO strategy, but it cuts down the grunt work. Teams can focus more on aligning content with search intent, not just checking boxes.
The New Age of AI-Optimized Content: What Does It Look Like?
The rise of AI hasn’t lowered the bar for content quality. It’s raised it.
With machine-generated content flooding every channel, visibility now depends on value, not volume. Search engines and users reward content that brings clarity, trust, and depth.
Your content strategy needs to shift focus. Specificity, structure, and perspective matter more than keyword counts and content frequency.
AI-optimized content that performs well today typically checks a few key boxes:
Built around real expertise, often supported by proprietary data or firsthand experience
Clearly structured, using headings, bullets, and schema markup to improve readability and search parsing
Leads with utility, helping readers solve problems, take action, or understand something faster
Reflects your brand’s voice and positioning, not a generic blend of scraped internet copy
Human content professionals have leverage here. AI can get a draft to 70 percent, but that last 30 percent (the part that connects, converts, or earns backlinks) still requires human input.
One of the most overlooked opportunities right now? Simply tightening your structure. Clear formatting helps search engines surface your content and makes it easier for generative tools like ChatGPT and Perplexity to cite and summarize it correctly.
AI can help get content out the door faster. But if you want that content to show up, earn trust, and drive results, human oversight isn’t optional. It’s the differentiator.
Multimedia Integration
A well-placed visual can do more than dress up a page. It boosts visibility, extends engagement, and increases the odds of being cited by generative search engines.
Search engines also reward content that blends formats. Multimedia helps break up long blocks of text, reinforces key takeaways, and signals structure that AI engines can easily parse.
To make it work, start planning visuals alongside your copy, not after the fact. That upfront alignment leads to stronger storytelling and assets that actually support performance, not just polish the page.
AI’s Impact on Content Distribution
Content doesn’t drive results if no one sees it. That’s always been true. What’s changed is how distribution works and who you’re optimizing for.
Today, your audience includes both people and machines. The rise of generative search and large language models (LLMs) means your content isn’t just being read by humans. It’s being crawled, summarized, and cited by AI systems that prioritize structure, metadata, and clarity.
To stay visible, your distribution strategy needs to reflect that.
Start with metadata. Schema markup, structured tags, and optimized alt text all help AI tools understand and surface your content across search, snippets, and summaries. This isn’t just a technical checkbox. It’s the infrastructure that supports discoverability.
Then think about format. Repurpose long-form assets into LinkedIn posts, email sequences, YouTube Shorts, or Reddit threads. Tailor messaging by platform. Adjust tone for different audiences. A one-size-fits-all approach wastes reach.
Finally, use automation to your advantage. Tools like Buffer, Zapier, and Hootsuite can help schedule, adapt, and push updates across multiple channels at once. That frees your team from repetitive tasks and ensures consistency wherever your audience finds you.
Distribution used to be about checking the promotion box. Now it’s a system with humans on one end and AI on the other.
Done well, distribution doesn’t just get more eyes on your content. It makes sure the right people and the right algorithms see it in the right place, at the right time.
Staying Ahead of the Content Curve
Predictability used to be a strength in content planning. But with AI constantly changing how content is created, distributed, and discovered, agility matters just as much.
Keeping your edge means paying attention to two things: where AI is going, and how your audience is reacting right now.
Start by tracking signals. Tools like Exploding Topics, Glimpse, and SparkToro help identify early trends and shifts in search behavior before they hit the mainstream. Combined with real-time performance data from platforms like GA4 or social analytics, you can spot what’s resonating and what’s falling flat while there’s still time to act.
Adaptability is key. A/B testing thumbnails, headlines, or messaging lets you make micro-adjustments without overhauling your entire campaign. And monitoring where and how AI engines cite your content can highlight gaps worth closing or opportunities to double down on.
Future-proofing doesn’t mean locking in a rigid plan. It means building a system that can flex with your audience and the algorithms that serve them.
FAQs
Can AI-generated content rank in search engines?
Yes, but only if it’s high quality. Google doesn’t penalize AI content specifically. What matters is whether the content provides value, demonstrates expertise, and meets user intent. AI-assisted content that’s edited and enhanced by humans typically performs better than purely AI-generated material.
How do I balance AI vs human-generated content in my strategy?
Use AI for tasks like ideation, outlining, formatting, and repurposing. Keep humans involved in strategy, editing, brand voice, and final review. A good rule: AI can get you to 70 percent, but humans should handle the final 30 percent that makes content distinctive and valuable.
What are the risks of using too much AI in content creation?
Over-reliance on AI leads to generic, samey content that doesn’t stand out. Other risks include factual errors, lack of brand voice, and content that sounds robotic. Users and search engines increasingly favor content with clear human expertise and originality.
How is human vs AI content different in terms of engagement?
Human-created or human-edited content typically generates higher engagement because it includes personal experiences, emotional resonance, and authentic storytelling. AI content often lacks nuance and personality, which can reduce trust and engagement rates.
Conclusion
The shift to AI-assisted content isn’t slowing down. But speed and automation aren’t enough to drive results on their own. The real differentiator is how well your system blends efficiency with insight.
Human-led strategy still drives the most meaningful outcomes, whether that’s developing a content plan built around real audience data or shaping assets to align with how search and generative engines work today.
If you haven’t revisited your content approach recently, now’s the time. You can start by refining your SEO content strategy or building smarter processes around AI content optimization.
In a space full of content, only the most useful, intentional, and well-structured will rise to the top.
What Is AI Optimization (And Why You Should Care)?
AI optimization is the process of making your website accessible and understandable to AI-powered search tools. Like ChatGPT, Claude, Gemini, Perplexity, Google AI Overview, and Bing Copilot.
Some call it “AI search optimization.” Others “AI content optimization.”
The terminology varies, but it all comes down to the same thing:
Make your site easy for large language models (LLMs) to find, understand, and reference in their answers.
It’s not a brand-new strategy. It’s built on the core SEO principles.
Only now, you’re optimizing for tools that pull, summarize, and use your information — not just rank.
But why is AI optimization so important now?
AI tools are expected to drive more traffic than traditional search engines by 2028.
And here’s the kicker:
This traffic pool is only getting bigger.
Over 700 million people use ChatGPT every week. Millions more use Perplexity, Gemini, and other AI platforms.
Google’s AI Mode already has more than 100 million monthly active users. And that’s just in the US and India.
As it rolls out globally, adoption will only grow.
AI search optimization helps you be visible to these users.
It ensures your site appears in AI-powered search results, increasing your chances of getting referral traffic and finding new customers.
How AI Search Works
LLMs find relevant content across the web based on users’ prompts, then combine it into one comprehensive answer with source links.
There are three broad steps:
1. Understanding Your Prompt
First, AI interprets what you’re asking.
Some platforms (and specific models) may even expand or tweak your query for better results.
For instance, if I search “best sneakers,” ChatGPT’s o3 model searches for more specific phrases like “best running shoes 2025.”
2. Retrieval
Next, the AI platform searches for information in real time.
Different platforms use different sources (Google’s index, Bing, curated databases, etc.). But they all work the same way.
They gather relevant content from across the web for your expanded query.
3. Synthesis
Finally, AI decides which sources to include.
How?
The exact criteria aren’t public. But these factors seem to matter the most:
Authority: Recognized brands (entities it knows) and established experts
Structure: Clear, scannable content with direct answers
Context: Content that covers topics semantically (related concepts, not just keyword matches)
The most relevant sources get cited. The rest get ignored.
Which means ranking well isn’t enough. Your content also needs to be properly structured for AI systems.
I Analyzed 10 Queries Across Multiple AI Search Platforms: Here’s What I Found
Before we move forward to discuss how to optimize for AI search, I wanted to understand three things:
Do different AI platforms cite different types of content?
Which domains consistently appear across platforms?
Does multi-platform presence actually matter for AI visibility?
So I ran a simple experiment.
I searched 10 queries across ChatGPT 5, Claude Sonnet 4, Perplexity (Sonar model), Gemini 2.5 Flash, and Google’s AI Mode — a mix of commercial, informational, local, and trending topics.
And I found some interesting insights.
How Each Platform Chooses Sources
| Platforms | Citation behavior |
|-----------|-------------------|
| ChatGPT | Acts like a community aggregator. Mixes Reddit discussions with Wikipedia and review sites. |
| Claude | Prefers recent, authoritative sources. Zero Reddit citations. Focuses on 2024-2025 content. |
| Perplexity | Most diverse. Balances buying guides, YouTube reviews, and some Reddit. |
| Gemini | Relies mostly on training data. And since there’s no option to turn on web search, you can’t get it to cite sources for most of your queries. |
| Google AI Mode | Pulls from beyond top search results. 50% of citations weren’t on page one of Google. |
The “Citation Core” Effect
Certain domains have achieved what we call the “citation core” status.
Citation core (n.): A small group of sites and brands that every major AI search tool trusts, cites, and uses as default sources.
Wikipedia showed up 16 times. Mayo Clinic owned health queries. RTINGS controlled electronics reviews.
These sites have become AI’s default sources.
What This Means for Brand Sites
One pattern jumped out: Official brand websites were underrepresented.
In my test, they made up only around 10% of all citations.
But that doesn’t mean your site doesn’t matter for informational or educational queries.
It means most sites aren’t yet AI-friendly. And that’s the opportunity.
When your site is structured, detailed, and optimized, it becomes one of the few brand-owned sources AI can actually cite for product specs, features, case studies, and stats. Information third-party sites can’t provide.
Think of it like this: Your website gives you the authoritative base layer. Off-site presence just amplifies it.
These findings aren’t surprising. But they reinforce what we’ve suspected all along.
In fact, a lot of what we do here at Backlinko aligns with these patterns:
Google’s guidelines say good SEO is good AI optimization.
Their official guidelines mostly rehash standard SEO practices, with a few AI-specific points. Like using preview controls and ensuring structured data matches visible content.
But the foundation to make your site AI search-ready starts with three teams working in sync:
Developers: They make your site technically accessible to AI crawlers
SEOs: They structure content so AI can extract and understand it
Content teams: They create information worth extracting
Most companies treat these as separate projects.
That’s a mistake.
Leigh McKenzie, Head of SEO at Backlinko, explains why:
“Ranking in Google doesn’t guarantee you’ll show up in AI tools. SEO is still table stakes. But generative engines don’t just lift the top results. They scan at a semantic level, fan queries out into dozens of variants, and stitch together answers from multiple sources.”
You’ll need a coordinated effort to execute.
Let’s look at exactly what each team needs to do for effective AI search optimization.
Note: Most traditional SEO practices work for AI optimization too.
I’m not covering the basics here, like using sitemaps and including metadata. You should already be doing those.
Instead, I’m focusing on factors that specifically impact AI search visibility. These are insights based on my own experience, analyzing what’s working across different sites, and comparing notes with other SEOs.
Want the complete list?
I’ve created an AI Search Engine Optimization Checklist that covers everything — the well-known tactics, the experimental ones, and the “can’t hurt to try” optimizations that might give you an edge.
Developer Tasks
Understanding how to optimize for AI search starts with your developers. Because they control whether AI can actually access and understand your content.
No access means no citations.
Here’s what they need to check:
1. Make Your Site Accessible to AI Crawlers
AI crawlers need permission to access your site through your robots.txt file.
If you block them, your content won’t appear in AI search results.
Here are the main AI crawlers:
GPTBot (OpenAI/ChatGPT)
Google-Extended (Google’s AI Overview)
Claude-Web (Anthropic/Claude)
PerplexityBot (Perplexity)
To check if you’re blocking them, go to yoursite.com/robots.txt.
Look for any lines that say “Disallow” next to these crawler names.
If you find them blocked (or want to make sure they’re allowed), add these lines to your robots.txt:
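```
User-agent: GPTBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Claude-Web
Allow: /

User-agent: PerplexityBot
Allow: /
```

Treat this as a starting point rather than a definitive list: vendors rename and add crawlers over time, so confirm the current user-agent strings in each platform’s documentation before relying on it.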
Your developers handled the technical requirements. AI can now access your site.
But access doesn’t guarantee visibility in AI results.
Your SEO team controls how AI discovers, understands, and prioritizes your content.
Here’s what they need to control in your AI SEO strategy:
7. Structure Pages for Fragment-Friendly Indexing
AI pulls specific fragments from your pages — sentences and paragraphs it can use in responses.
Your page structure affects how easily AI can extract these fragments.
Start with a clean heading hierarchy.
Proper H2s and H3s help AI (and your readers) understand where one idea ends and another begins.
Go a step further by breaking big topics into unique subsections.
Instead of one giant guide to “healthy recipes,” create separate sections for “healthy breakfast recipes,” “healthy lunch recipes,” and “healthy dinner recipes.”
That way, you match the variations people actually search for.
Pro tip: Don’t bury your best insights in long paragraphs.
Use callouts (like this one)
Add short lists and bullets
Drop quick tables for comparisons
That’s how you turn raw text into structured fragments AI can actually use.
When your content is structured this way, every section becomes a potential answer.
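One way to check how fragment-friendly a page is: split it the way a crawler might, by its heading hierarchy. This is a rough illustration in Python (BeautifulSoup is assumed), not how any particular AI system actually chunks pages:

```python
from bs4 import BeautifulSoup

def split_into_fragments(html: str) -> dict:
    """Group page text under its nearest H2/H3 heading."""
    soup = BeautifulSoup(html, "html.parser")
    fragments = {"intro": ""}
    current = "intro"
    for el in soup.find_all(["h2", "h3", "p", "li"]):
        if el.name in ("h2", "h3"):
            current = el.get_text(strip=True)
            fragments.setdefault(current, "")
        else:
            fragments[current] = (fragments[current] + " " + el.get_text(strip=True)).strip()
    return fragments

# If a fragment can't stand on its own, that section probably needs a clearer
# heading or a more direct answer near the top.
```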
8. Build Topic Clusters That Signal Full Coverage
Internal linking creates topical connections across your site.
When you link related pages together, you’re building topic clusters that show comprehensive coverage.
This is standard SEO practice that also helps AI discovery.
Create pillar pages for your main topics. These are comprehensive guides that link out to all related content.
For “project management,” your pillar would link to:
Task automation guide
Team collaboration tools
Workflow optimization
Resource planning
Each supporting page links back to the pillar and to other relevant pages in the cluster.
This helps both users and AI understand page relationships.
The cluster structure accomplishes two things:
First, it improves crawl efficiency. AI finds your hub and immediately discovers all related content through the links.
Second, it demonstrates topical depth. Organized clusters show comprehensive coverage better than scattered pages.
This structural approach helps organize your site architecture to showcase expertise through strategic internal linking.
9. Add Schema Markup to Label Your Content
When AI crawls your page, it sees text.
But it doesn’t know (without natural language processing) if that text is a recipe, a review, or a how-to guide.
Schema explicitly labels each element of the page.
It makes data more structured and easier to understand.
There are several types of schema markup.
I’ve found the FAQ schema particularly effective for AI search visibility.
Here’s how it looks:
```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is churn rate?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Churn rate is the percentage of customers who cancel during a specific period."
    }
  }]
}
```
This markup tells AI exactly where to find questions and answers on your page.
The Q&A format matches how AI structures many of its responses, making your content easy to process.
Depending on the content management system (CMS) you’re using, you can add schema using plugins, add-ons, or manually.
For instance, WordPress has several good plugins.
After implementation, you can test it at validator.schema.org to ensure it’s working properly.
Note: Schema is just one type of metadata. Others include title tags, meta descriptions, and Open Graph tags.
Keeping them accurate and consistent may help AI platforms interpret your content correctly.
You can check your metadata using browser dev tools or SEO extensions, like SEO META in 1 CLICK.
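If you prefer to script that check rather than click through dev tools, a small sketch like this pulls the key tags for a page so you can compare them against the visible content. Python with requests and BeautifulSoup is my own tooling choice here, not a requirement:

```python
import requests
from bs4 import BeautifulSoup

def page_metadata(url: str) -> dict:
    """Return the title, meta description, and Open Graph tags for a URL."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    def meta(**attrs):
        tag = soup.find("meta", attrs=attrs)
        return tag.get("content") if tag else None

    return {
        "title": soup.title.get_text(strip=True) if soup.title else None,
        "description": meta(name="description"),
        "og:title": meta(property="og:title"),
        "og:description": meta(property="og:description"),
    }

# Hypothetical usage:
# print(page_metadata("https://example.com/blog/churn-rate"))
```

A quick scan of the output makes mismatches, such as an og:title that no longer matches the visible headline, easy to spot.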
10. Add Detailed Content to Category and Product Pages
Most category pages are just product grids. That’s a missed opportunity for AI search optimization.
The same goes for individual product pages with just specs and a buy button.
These pages get tons of commercial searches.
But they lack substantial content.
So, AI has limited information to work with when answering product queries.
You want to add buyer-focused information directly on these pages, like this:
They can cover:
Feature comparison tables
Common questions with clear answers
Use cases and industry applications
Technical specifications that matter
For product pages, go beyond basic descriptions.
Include materials, dimensions, compatibility, warranties, reviews — whatever matters to your buyers.
For example, GlassesUSA.com includes far more on its product pages than just product specifications.
They include information that AI can use when answering specific questions.
Similarly, for category pages, add content that helps buyers choose.
What’s the difference between options? What should they consider? Which product fits which need?
Eyewear retailer Frames Direct does this well.
It has detailed content at the end of its category pages.
The key is putting this information directly on the page. Not hiding it behind tabs or “read more” buttons.
When someone asks AI about products in your category, you want substance it can quote. Not just a grid of images it can’t interpret.
11. Track Where AI Mentions Your Brand (and Where It Doesn’t)
You need to know where AI is mentioning your brand and where it isn’t.
Because if competitors appear in AI results and you don’t, they’re capturing the traffic you should be getting.
You can try checking this manually.
Run your target queries (e.g., “nutrition tracking app 2025”) across different AI platforms.
Scan the answers. And see if your brand shows up.
But that’s slow. And you’ll only catch a small slice of what’s happening.
That’s where a tool like the AI SEO Toolkit comes in. It tracks how often your brand appears in AI-generated answers across platforms like ChatGPT, Google AI Mode, and Google AI Overview. (In the “Visibility Overview” report.)
You can see exactly which topics and prompts your brand appears for.
And which prompts your competitors appear for, but you don’t. (In the “Competitor Research” report.)
For instance, if you find that AI cites competitors for “Cats and Feline Care” but skips your brand, that’s a clear signal to create or optimize a page targeting that exact query.
You also get strategic recommendations. So you can spot gaps, fix weak content, and double down where you’re already winning. (In the “Brand Performance” reports.)
With a tool like AI SEO Toolkit, you’re not guessing about your AI search visibility.
You’re improving based on real AI visibility data.
12. Optimize for Natural Language Prompts, Not Just Keywords
People might type “winter jacket” into a search box. But they ask AI, “What’s the warmest jacket for Chicago winters under $300?”
Your content needs to match these natural language patterns.
Start by identifying how people actually phrase questions in your industry.
Use the AI SEO Toolkit to find high-value prompts in your industry.
Go to the “Narrative Drivers” report.
And scroll down to the “All Questions” section to see which prompts mention your brand and where competitors appear instead.
Document these prompt patterns.
Share them with your content team to create pages that answer these specific questions — not just target the base keyword.
The goal isn’t abandoning keywords.
It’s expanding from “winter jacket” pages to content that answers “warmest jacket for Chicago winters under $300.”
Content Tasks
Your site is technically ready. Your SEO is taken care of.
Now your content team needs to create valuable information and build presence across the web.
Here’s how to optimize content for AI search:
13. Publish Original Content with Data, Examples, and Insights
Generic blog posts restating common knowledge rarely perform well in AI search results.
But content with fresh angles and concrete examples does.
At Backlinko, we focus on publishing content that provides unique value through examples, original research, and exclusive insights.
Like this article:
And even if we’re talking about a common topic (e.g., organic traffic), we add fresh examples.
So how do you make your content stand out?
Run small studies or polls to produce original data. Even simple numbers can set your content apart.
Use screenshots, case studies, and workflows from real projects.
Back up your points with stats and cite credible sources.
Add expert quotes to strengthen authority.
Test tools or strategies yourself, and share the actual results.
AI systems look for concrete details they can pull into answers.
The more unique evidence, examples, and voices you add, the better.
14. Embed Your Brand Name in Context-Inclusive Copy
Context-inclusive copy means writing sentences that make sense on their own.
Each line should carry enough detail that an AI system understands it without needing the surrounding text.
But take that a step further.
Don’t just make your sentences self-contained.
Embed your brand name inside them so when AI reuses a fragment, your company is part of the answer.
Instead of: “Our tool helped increase conversions by 25%”
Write: “[Product] helped [client] increase checkout completions by 25%”
The second version keeps your brand attached to the insight when AI extracts it.
So how do you do this in practice?
With data: Tie your brand name directly to research findings or surveys you publish
With comparisons: Mention your brand alongside alternatives, so it’s always part of the conversation
With tutorials: Show steps using your product or service in real workflows
With results: Attach your brand name to case studies and examples
Here’s an example from Semrush, using their brand name vs. “we”:
The goal is simple:
Every quotable fragment should carry both context and your brand name.
That way, when AI pulls it into an answer, your company is mentioned too.
15. Create Pages for Every Use Case, Feature, and Integration
Specific pages are more likely to appear in AI responses than generic ones.
So, don’t bundle all features on one page.
Create dedicated pages for each major feature, use case, and integration.
Here’s an example of JustCall doing it right with unique pages for each of its main features and use cases:
The strategy is simple: match how people actually search.
For instance, someone looking for “Slack integration” wants a page about that specific integration. Not a features page where Slack is item #12 in a list.
Structure these pages to answer real questions, like:
What problem does this solve?
Who typically uses it?
How does it actually work?
What specific outcomes can they expect?
Get granular with your targeting. Instead of broad topics, focus on specific scenarios.
For example:
→ Ecommerce sites can create pages for each product application
→ Service businesses can detail each service variation
→ Publishers can target specific reader scenarios
The depth of coverage signals expertise while giving AI exact matches for detailed queries.
This specificity is what makes AI content optimization work. You’re creating exactly what AI systems need to cite.
16. Expand Your Reach Through Non-Owned Channels
AI engines lean heavily on third-party sources. Which means your brand needs to show up in places you don’t fully control.
This goes beyond your on-site efforts.
But it’s still part of the bigger AI visibility play. And your content team can drive it by publishing externally and fueling PR.
Take this example: when I search “best duffel bags for men 2025” in Claude, it references an Outdoor Gear Lab roundup of top bags.
If you sell duffels, you’d want to be in that article.
There are two ways to expand your presence on non-owned channels.
One is publishing on other sites yourself — guest posts, bylined articles, or original research placed on authority blogs and industry outlets.
These extend your reach, position you as an expert, and increase your AI search visibility.
You’ll find guest post opportunities on several well-known sites, like Fast Company, which has an authority score of 67.
The other way to build visibility is getting featured by others.
Think reviews, roundups, and product comparisons that highlight your solution.
This usually involves working closely with your PR team.
But the content team fuels those opportunities with the data, case studies, and assets that make the pitch worth covering.
Either way, the goal of this AI content strategy is the same: substantive coverage.
A one-line mention usually isn’t enough. You need full features, detailed reviews, or exclusive insights that stand out.
Because the more credible coverage you earn (whether you wrote it or someone else did), the more evidence AI has to pull into its answers.
The message from this month’s SEO Update is clear: AI and data accuracy are reshaping how we plan, optimize, and measure SEO. This is not just a slate of updates, but a signal to rethink impressions, content creation, and tooling so you stay effective. Chris Scott, Yoast’s Senior Marketing Manager, hosted the session. Alex Moss and Carolyn Shelby shared deep dives on AI trends, Google updates, and Yoast product news.
Data and rankings in flux
A key shift centers on data. Google removed the num=100 parameter, which changed how much ranking data shows up per page in Google Search Console. The result isn’t a sudden performance drop; it’s a correction. Impressions can look lower because the data is being cleaned up, and that matters more than the raw numbers. Paid search data stays solid, since ads rely on precise counting for financial reasons.
AI content and media: use it, don’t rely on it
Sora 2 can generate short videos from text prompts, providing handy visuals to accompany blog posts. Use AI visuals to complement your core messaging, not to replace it. In e-commerce, Walmart, WooCommerce, and Shopify are testing AI-enabled shopping features. Don’t rush a full switch before major buying events.
Local SEO and engines beyond Google
Bing’s Business Manager now has a refreshed UI focused on local listings, signaling a push into local search. Diversifying beyond Google can reveal new AI-powered opportunities. It’s about testing where AI-driven search and shopping perform best, not moving budgets blindly.
AI mode and how people behave
Research into AI-dominant sessions shows a distinct pattern: users linger 50 to 80 seconds on AI-generated text, and clicks tend to be transactional. Intent patterns shift, too. Now, comparisons lead to review sites, decisive purchases land on product pages, and local tasks point to maps and assets.
Meta descriptions and AI generation
Google tested AI-generated descriptions for threads lacking meta content, but meta descriptions aren’t obsolete. Best practice is to lean on Yoast’s default meta templates (like %excerpt%) as a reliable fallback. Write with an inverted pyramid in mind, which puts key information first, so AI can extract it cleanly. Keep a fallback description in Yoast SEO so automation stays under your control.
AI in everyday workflows
ChatGPT updates push toward more human-to-human interactions, and tools like Slack can summarize threads and search discussions by meaning, not just keywords. Growth in AI usage feels steadier now; some younger users opt for other AI tools.
Insights from Microsoft and Google
The core rules haven’t changed: concise, unique, value-packed content wins. Shorter, focused writing works best for AI synthesis; trim fluff and sharpen clarity. The message is simple because clarity beats complexity, especially as AI becomes more central to how content is consumed.
Yoast product updates to watch
The Yoast SEO AI+ bundle adds AI Brand Insights to track mentions and citations in AI outputs, and pronoun support has been added to schema markup for inclusivity. If you’re tracking AI relevance beyond traditional signals, this bundle can be a smart addition.
Yelp just unveiled its 2025 Fall Product Release, a sweeping AI-driven update that turns the local discovery platform into a more conversational, visual, and intelligent experience.
Driving the news: Yelp’s rollout includes over 35 new AI-powered features, headlined by:
Yelp Assistant, an upgraded chatbot that instantly answers customer questions about restaurants, shops, or attractions—citing reviews and photos.
Menu Vision, which lets users scan menus to see photos, reviews, and dish details in real time.
Yelp Host and Yelp Receptionist, AI-powered call solutions that handle reservations, collect leads, and answer questions with natural, customizable voices.
Natural language and voice search, allowing users to search conversationally (“best vegan sushi near me”) for smarter, more relevant results.
Popular Offerings, which highlights a business’s most-mentioned services, products, or experiences.
Why we care. Yelp’s new AI tools make it easier to capture and convert high-intent customers at the moment of discovery. With features like Yelp Assistant, AI-powered call handling, and natural language search, businesses can respond instantly, stay visible in smarter search results, and never miss a lead. The update turns Yelp from a review site into an always-on customer engagement platform—giving advertisers more efficient ways to connect, communicate, and close.
What’s next. Yelp plans to make its AI assistant the primary interface for discovery and transactions in 2026, merging instant answers, booking, and customer messaging into one seamless experience.
The bottom line. Yelp’s latest AI release gives brands smarter tools to engage customers in real time—turning everyday search and service interactions into instant connections.
OpenAI announced the launch of its first web browser, which it named ChatGPT Atlas. Atlas is currently available on Mac only and has all the features you would expect from an AI browser. But the most surprising part is that its built-in search features seem to be powered by Google rather than Microsoft Bing, even though Microsoft is OpenAI’s early partner and one of its largest investors.
How to download Atlas. If you are on a Mac, you can download ChatGPT Atlas at chatgpt.com/atlas. From there, the web browser will download to your computer, you double click on the installer and then drag the application to your application folder.
What Atlas does. It is a web browser, first and foremost. You can go directly to web pages and browse them, but as you do, ChatGPT is available in a sidebar, like in other AI-powered web browsers. You can ask ChatGPT questions, have it rewrite your content in Gmail and other tabs, and use its personalization and memory, plus it will help you complete tasks, write code, and even shop using agentic features.
Search in Atlas. The interesting thing is that when you search in ChatGPT Atlas, it gives you a ChatGPT-like response but also adds search vertical tabs at the top (web, images, videos, news, and more), like other search engines. When you go to those tabs, there is a link at the top of each set of search results to Google.
More details. ChatGPT Atlas is launching worldwide on macOS today to Free, Plus, Pro, and Go users. Atlas is also available in beta for Business, and if enabled by their plan administrator, for Enterprise and Edu users. Experiences for Windows, iOS, and Android are coming soon.
Google Merchant Center is rolling out a new Issue Details Page (IDP) to help advertisers more easily diagnose and resolve account or product-level problems.
How it works:
Located under the “Needs attention” tab, the page provides a consolidated overview of current issues.
It surfaces recommended actions, business impact metrics, and sample affected products — giving merchants a clearer sense of what to fix first.
Why we care. Until now, identifying and fixing issues in Merchant Center often required navigating multiple sections and reports. The new Issue Details Page (IDP) in Google Merchant Center gives advertisers a single place to view and fix account or product issues.
It highlights the problem’s impact, recommends actions, and shows affected products, helping advertisers resolve issues faster and keep listings active. In short, it saves time, improves visibility, and helps prevent lost sales.
The big picture. The update is part of Google’s broader push to improve Merchant Center usability ahead of the holiday shopping season, when product accuracy and uptime are critical for advertisers.
The bottom line. Google’s new IDP could save advertisers time and guesswork by putting all issue diagnostics and solutions in one place.
First seen. The newly released help doc was spotted by PPC News Feed founder Hana Kobzová.