The AI resume has become a C-suite-level asset that reflects your entire digital strategy.
To use it effectively, we first need to understand where AI is deploying it across the user journey.
How AI has rewritten the user journey
For years, our strategies were shaped by the inbound methodology.
We built content around a user-driven path through awareness, consideration, and decision, with traditional SEO acting as the engine behind those moments.
That journey has now been fundamentally reshaped.
AI assistive engines – conversational systems like Gemini, ChatGPT, and Perplexity – are collapsing the funnel.
They move users from discovery to decision within walled-garden environments.
It’s what I call the BigTech walled garden AI conversational acquisition funnel.
For marketers, that shift can feel like a loss of control.
We no longer own the click, the landing page, or the carefully engineered funnel.
But from the consumer perspective, the change is positive.
Our job is to align with this best-service model by proving to the AI that our brand is the most credible answer.
That requires updating the ultimate goal.
For commercial queries, the win is no longer visibility.
It’s earning the perfect click – the moment when an AI system acts as a trusted advisor and chooses your brand as the best solution.
To get there, we have to broaden our focus from explicit branded searches to the three modes of research AI uses today:
Explicit.
Implicit.
Ambient.
Together, they define the new strategic landscape and lead to one truth.
In an AI-driven ecosystem, brand is what matters most.
3 types of research redefining what search is
These three behaviors reveal how users now discover, assess, and choose brands through AI.
Explicit research (brand): The final perfect click
Explicit research is any query that includes your brand name, such as:
Searches for your name.
“Brand name reviews.”
“Brand vs. competitor.”
They represent deliberate, high-stakes moments when a potential client, partner, or investor is actively researching your brand.
It’s the decision stage of the funnel, where they look for specific information about you or your services, or conduct a final AI-driven due diligence check before committing.
A strong AI assistive engine optimization (AIEO) strategy secures these bottom-of-funnel moments first.
You must engineer an AI resume – the AI equivalent of a brand SERP – that is positive, accurate, and convincing so the prospect who is actively looking for you converts.
Branded terms are the lowest-hanging fruit, the most critical conversion point in the new conversational funnel, and the foundation of AIEO.
Implicit research (industry/topic/comparison): Being top of algorithmic mind
Implicit research includes any topical query that does not contain a brand name.
These are the “best of” comparisons and problem-focused questions that happen at the top and middle of the funnel.
To win this part of the journey, your brand must be top of algorithmic mind, the state where an AI instinctively selects you as the most credible, relevant, and authoritative answer to a user’s query.
Consideration: When a user asks, “Who are the best personal injury law firms in Los Angeles?”, the AI builds a shortlist, and you cannot afford to be missing.
Awareness: When a user asks, “Give me advice about personal injury legal options after a car accident,” your chance to be included depends on whether the AI already understands and trusts your brand.
Implicit research is not about keywords. It is about being understood by the algorithms, demonstrating credibility, and building topical authority.
Here’s how it works:
The algorithms understand who you are.
They can effectively apply credibility signals. (An expanded version of Google’s E-E-A-T framework, N-E-E-A-T-T, incorporates notability and transparency.)
You have provided the content that demonstrates topical authority.
If you meet these three prerequisites, you can become top of algorithmic mind for user-AI interactions at the top and middle of the funnel, where implicit research happens.
Brand is the one constant across all three research modes. AI:
Recommends you in explicit research because it understands your brand’s facts.
Recommends you in implicit research because it trusts your credibility on a topic.
Advocates for you in ambient research because it has learned your brand is the most helpful default solution.
By building understandability, credibility, and deliverability, you are not optimizing for one type of search.
You are systematically teaching the AI to trust your brand at every possible interaction.
The brands that become the best teachers will be the ones an AI recommends across all three research modes.
It’s time to update your strategy or risk being left out of the conversation entirely.
Your final step: The strategic roadmap
You now understand the what – the AI resume – and the where – the three research modes.
Finally, we’ll cover the how: the complete strategic roadmap for mastering the algorithmic trinity with a multi-speed approach that systematically builds your brand’s authority.
By now, we’re all familiar with Google AI Overviews. Many queries you search on Google now surface responses through this quick and prominent search feature.
But AI Overview results aren’t always reliable or accurate.
Google’s algorithms can promote negative or misleading content, making online reputation management (ORM) difficult.
Here’s how to stay on top of AI Overviews and your ORM – by removing, mitigating, or addressing negative content.
How AI Overviews source information
AI Overviews relies on a mix of data sources across Google and the open web, including:
Google’s Knowledge Graph: The Knowledge Graph is Google’s structured database of facts about people, places, and things. It’s built from a range of licensed data sources and publicly available information.
Google’s tools and databases: Google also draws on structured data from its own systems. This includes information from:
Business Profiles.
The Merchant Center.
Other Google-managed datasets that commonly appear in search results.
Websites: AI Overviews frequently cites content from websites across the open web. The links that appear beside answers point to a wide variety of sources, ranging from authoritative publishers to lower-quality sites.
User-generated content (UGC): UGC can also surface in AI Overviews. This may include posts, reviews, photos, or publicly available content from community-driven platforms like Reddit.
Several other factors influence how this data is organized into answers, including topical relevance, freshness, and the authority of the source.
However, even with relevance and authority taken into consideration, harmful or false content can still appear in results.
This can happen for a variety of reasons, including:
Where the information is sourced.
How Google’s AI fills in gaps.
Instances where it may misunderstand the context of a user’s query.
If the content is false and harmful, legal removal may be an option. Defamation claims generally require proving several elements, including damages: some harm caused to the reputation of the person or entity who is the subject of the statement.
Defamation standards vary by jurisdiction, and public figures may face a higher legal standard.
Because of this, proper documentation and professionalism are essential when filing a lawsuit, and working with a legal professional is likely in your best interest.
The other (and perhaps easier) route to take is working with an online reputation management specialist.
These teams are extremely well-versed at handling the multi-layered process of removals.
In an online crisis, they have the tools to respond and mitigate damage. They’re also trained to balance ethical considerations you might not always account for.
Clearer signals make it easier for AI Overviews to present your brand correctly. Focus on the following areas.
Strengthening signals through publishing
One effective method is strategic publishing.
This means building a strong, positive presence around your company, business, or personal brand so AI Overviews have authoritative information to draw from.
A few approaches support this:
Publishing on credible domains: ORM firms often publish content on platforms like Medium, LinkedIn, and reputable industry sites. This strengthens your presence in trusted environments.
Employing consistent branding and factual accuracy: Content must also be factual and consistently branded. This reinforces authority and signals reliability.
Leveraging press releases and thought leadership: Press releases, thought leadership pieces, and expert commentary help create credible backlinks and citations across the web.
Supporting pages that build the narrative: ORM specialists also create supporting pages that reinforce key narratives. With the right linking and content clusters, AI Overviews is more likely to surface this material.
Leveraging structured data and E-E-A-T
Another effective method to establish credibility on AI Overviews is to focus on technical enhancements and experience, expertise, authoritativeness, and trustworthiness (E-E-A-T).
ORM specialists typically focus on two areas:
Structured data and schema markup: This involves adding more context about your brand online (a markup sketch follows this list) by:
Enhancing author bios.
Highlighting positive reviews.
Reinforcing signals that reflect credibility.
Establishing E-E-A-T signals: This includes building a trusted online presence by:
Referencing work published in reputable outlets.
Highlighting real client examples.
Showcasing customer relationships.
Outlining accolades and expertise through your bio.
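To make the structured data point concrete, here is a minimal sketch of Organization markup in JSON-LD, the format Google recommends for schema.org data. Every name and URL below is a placeholder; real markup should reflect your brand's verified facts:

```html
<!-- Minimal Organization schema sketch; all names and URLs are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Brand",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/example-brand",
    "https://www.wikidata.org/wiki/Q0000000"
  ]
}
</script>
```

The sameAs links tie your site to profiles on trusted platforms, reinforcing exactly the kind of credibility signals described above.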
Monitoring AI Overviews and detecting issues early
A final key aspect of staying on top of AI Overviews is to monitor the algorithm and detect issues early.
Tools that track AI Overviews make this far more efficient, helping business owners monitor keywords and detect potential damage early.
For instance, you might use these tools to track your brand name, executive names, or even relevant products.
As discussed, it’s also crucial to have a plan in place in case a crisis ever hits.
This means establishing press outreach contacts, lining up legal support, and knowing how to suppress content via the methods already mentioned.
Ethical considerations
Online reputation management isn’t just generating think pieces. It’s a layered process grounded in ethical integrity and factual accuracy.
To maintain a truthful and durable strategy, keep the following in mind:
Facts matter: Don’t aim to manipulate or deceive. Focus on promoting factual, positive content to AI Overviews.
Avoid aggression: Aggressive tactics rarely work in ORM. There’s a balance between over-optimization and under-optimization, and an ORM firm can help you find it.
Think long-term: You may want negative or false content removed immediately, but lasting suppression requires a long-term plan to promote positive content year after year.
Managing how AI Overviews presents your brand
AI Overviews is already a dominant part of the search experience.
But its design means negative or false content can still rise to the top.
As AI Overviews becomes more prominent, business owners need to monitor their online reputation and strengthen the positive signals that surface in these results.
Over time, that requires strategic publishing, long-term planning, the right technical signals, and a commitment to factual, honest content.
By following these principles, AI Overviews can become an asset for growth instead of a source of harm.
Picture a chocolate company with an elaborate recipe, generations old. They ask an AI system to identify which ingredients they could remove to cut costs. The AI suggests one. They remove it. Sales hold steady. They ask again. The AI suggests another. This continues through four or five iterations until they’ve created the cheapest possible version of their product. Fantastic margins, terrible sales. When someone finally tastes it, the verdict is immediate: “This isn’t even chocolate anymore.”
Aly Blawat, senior director of customer strategy at Blain’s Farm & Fleet, shared this story during a recent MarTech webinar to illustrate why 82% of marketing teams are failing at AI adoption: automation without human judgment doesn’t just fail. It compounds failure faster than ever before. And that failure has nothing to do with the technology itself.
The numbers tell the story. In a Forrester study commissioned by Optimove, only 18% of marketers consider themselves at the leading edge of AI adoption, even though nearly 80% expect AI to improve targeting, personalization and optimization. Forrester’s Rusty Warner, VP and principal analyst, puts this in context: only about 25% of marketers worldwide are in production with any AI use cases. Another third are experimenting but haven’t moved to production. That leaves more than 40% still learning about what AI might do for them.
“This particular statistic didn’t really surprise me,” Warner said. “We find that a lot of people that are able to use AI tools at work might be experimenting with them at home, but at work, they’re really waiting for their software vendors to make tools available that have been deemed safe to use and responsible.”
The caution is widespread. IT teams have controls in place for third-party AI tools. Even tech-savvy marketers who experiment at home often can’t access those tools at work until vendors embed responsible AI, data protections and auditability directly into their platforms.
The problem isn’t the AI tools available today. It’s that marketing work is still structured the same way it was before AI existed.
The individual vs. the organization
Individual marketers are thirsty for AI tools. They see the potential immediately. But organizations are fundamentally built for something different: control over brand voice, short-term optimization and manual processes where work passes from insights teams to creative teams to activation teams, each handoff adding days or weeks to cycle time.
Most marketing organizations still operate like an assembly line. Insights come from one door, creative from another, activation from a third. Warner called this out plainly: “Marketing still runs like an assembly line. AI and automation break that model, letting marketers go beyond their position to do more and be more agile.”
The assembly line model is excellent at governance and terrible at speed. By the time results return, they inform the past more than the present. And in a world where customer behavior shifts weekly, that lag becomes fatal.
The solution is “Positionless Marketing,” a model where a single marketer can access data, generate brand-safe creative and launch campaigns with built-in optimization, all without filing tickets or waiting for handoffs. It doesn’t mean eliminating collaboration. It means reserving human collaboration for major launches, holiday campaigns and sensitive topics while enabling marketers to go end-to-end quickly and safely for everything else.
Starting small, building confidence
Blain’s Farm & Fleet, a 120-year-old retail chain, began its AI journey with a specific problem: launching a new brand campaign and needing to adapt tone consistently across channels. They implemented Jasper, a closed system where they could feed their brand tone and messaging without risk.
“We were teaching it a little bit more about us,” Blawat said. “We wanted to show up cohesively across the whole entire ecosystem.”
Warner recommends this approach. “Start small and pick something that you think is going to be a nice quick win to build confidence,” he said. “Audit your data, make sure it’s cleaned up. Your AI is only going to be as good as the data that you’re feeding it.”
The pattern repeats: start with a closed-loop copy tool, then add scripts to clean product data, then layer in segmentation. Each step frees time, shortens cycles, and builds confidence.
Where data meets speed
Marketers aren’t drowning in too little data. They’re drowning in too much data with too little access. The 20% of marketing organizations that move fast centralize definitions of what “active customer,” “at risk,” and “incremental lift” actually mean. And they put those signals where marketers work, not in a separate BI maze.
“There’s massive potential for AI, but success hinges on embracing the change required,” Warner said. “And change is hard because it involves people and their mindset, not just the technology.”
The adoption lag isn’t about technology readiness. It’s about organizational readiness.
Balancing automation and authenticity
Generative AI took off first in low-risk applications: creative support, meeting notes, copy cleanup. Customer-facing decisions remain slower to adopt because brands pay the price for mistakes. The answer is to deploy AI with guardrails in the highest-leverage decisions, prove lift with holdouts and expand methodically.
Blawat emphasized this balance. “We need that human touch on a lot of this stuff to make sure we’re still showing up as genuine and authentic,” she said. “We’re staying true to who our brand is.”
For Blain’s Farm & Fleet, that means maintaining the personal connection customers expect. The AI handles the mechanics of targeting and timing. But humans ensure every message reflects the values and voice customers trust.
The future of marketing work
AI is moving from analysis to execution. When predictive models, generative AI and decisioning engines converge, marketers stop drawing hypothetical journeys and start letting the system assemble unique paths per person.
What changes? Less canvas drawing, more outcome setting. Less reporting theater, more lift by cohort. Fewer meetings, faster iterations.
Warner points to a future that’s closer than most organizations realize. “Imagine a world where I don’t come to your commerce site and browse. Instead, I can just type to a bot what it is I’m looking for. And I expect your brand to be responsive to that.”
That kind of conversational commerce will require everyone in the organization to become a customer experience expert. “It doesn’t matter what channel the customer uses,” Warner explained. “They’re talking to your brand.”
The path forward
There is no AI strategy without an operating model that can use it. The fix requires three fundamental changes: restructure how marketing work flows, measure lift instead of activity and enable marketers to move from idea to execution without handoffs.
The path forward requires discipline. Pick one customer-facing use case with clear financial upside. Define the minimum signals, audiences and KPIs needed. Enforce holdouts by default. Enable direct access to data, creative generation and activation in one place. Publish weekly lift by cohort. Expand only when lift is proven.
Warner expects adoption to accelerate significantly in 2026 as more vendors embed AI capabilities with proper guardrails. For brands like Blain’s Farm & Fleet, that future is already taking shape. They started with copywriting, proved value and are now expanding. The key was finding specific problems where AI could help and measuring whether it actually did.
AI will not fix a slow system. It will amplify it. Teams that modernize the way work gets done and lift the language of decisions will see the promise translate into performance.
As Blawat’s chocolate story reminds us, automation without judgment optimizes for the wrong outcome. The goal isn’t the cheapest product or the fastest campaign. It’s the one that serves customers while building the brand. That requires humans in the loop to point AI in the right direction.
Google is preparing a new Search bidding model called Journey Aware Bidding, designed to factor in the entire customer journey — not just the final biddable conversion — to improve prediction accuracy and campaign performance.
How it works:
Journey Aware Bidding learns from your primary conversion goal plus additional, non-biddable journey stages.
Advertisers who fully track and properly categorize each step of their purchase funnel stand to benefit the most.
Google recommends mapping the entire journey — from lead submission to final purchase — and labeling all touchpoints as conversions within standard goals.
Why we care. Performance advertisers have long struggled with fragmented signals across the funnel. Journey Aware Bidding brings more of their conversion funnel into Google’s prediction models, potentially improving efficiency for long, multi-step journeys like lead gen.
Instead of optimizing on a single end-stage signal, Google can learn from every meaningful touchpoint, leading to smarter bids and better alignment with real business outcomes. This update rewards advertisers with strong tracking and could deliver a meaningful performance lift once fully launched.
What advertisers need to do:
Choose a single KPI-aligned stage (e.g., purchase, qualified lead) as the optimization target.
Mark other journey stages as primary conversions, but exclude them from campaign-level or account-default bidding optimization.
Ensure clean tracking and clear categorization of every step.
Pilot status. A closed pilot is due to launch this year for a small group of advertisers, with broader availability expected afterward as Google refines the model.
The bottom line. Journey Aware Bidding could represent a major shift in Search optimization: Google wants its bidding systems to understand not just what converts — but how users get there.
First seen. The details of this new bidding model were shared by Senior Consultant Georgi Zayakov on LinkedIn, along with other products featured at Think Week 2025.
Google submitted a compliance plan to the European Commission that proposes changes to its ad-tech operations — but rejects calls to break up its business.
How it works:
Google is offering product-level changes — for example, giving publishers the ability to set different minimum prices for different bidders in Google Ad Manager.
It’s also proposing greater interoperability between Google’s tools and those of rivals, in order to give publishers and advertisers more flexibility.
The company says these tweaks would resolve the European Commission’s concerns without a “disruptive break-up.”
Why we care. Google’s proposed “non-disruptive” fixes could preserve platform stability and avoid the turbulence of a forced breakup — but they may also shape future auction dynamics, pricing transparency, and access to competitive tools. In short, the outcome will influence how much control, choice, and cost efficiency advertisers have in Europe’s ad ecosystem.
Between the lines. Google is leaning on technical fixes rather than major structural overhaul — but critics argue that without deeper reform, the power dynamics in ad tech may not fundamentally shift.
The bottom line. Google is trying to strike a compromise: addressing the EU’s antitrust concerns while keeping its integrated ad-tech business intact. Regulators now face a choice: accept the tweaks — or push harder for a breakup.
Undoubtedly, one of the hot topics in SEO over the last few months has been how to influence LLM answers. Every SEO is trying to come up with strategies. Many have created their own tools using “vibe coding,” where they test their hypotheses and engage in heated debates about what each LLM and Google use to pick their sources.
Some of these debates can get very technical, touching on topics like vector embeddings, passage ranking, retrieval-augmented generation (RAG), and chunking. These theories are great—there’s a lot to learn from them and turn into practice.
However, if some of these AI concepts are going way over your head, let’s take a step back. I’ll walk you through some recent tests I’ve run to help you gain an understanding of what’s going on in AI search without feeling overwhelmed so you can start optimizing for these new platforms.
Create branded content and check for results
A while ago, I went to Austin, Texas, for a business outing. Before the trip, I wondered if I could “teach” ChatGPT about my upcoming travels. There was no public information about the trip on the web, so it was a completely clean test with no competition.
I asked ChatGPT, “is Gus Pelogia going to Austin soon?” The initial answer was what you’d expect: He doesn’t have any trips planned to Austin.
That same day, a few hours later, I wrote a blog post on my website about my trip to Austin. Six hours after I published the post, ChatGPT’s answer changed: Yes, Gus IS going to Austin to meet his work colleagues.
Two ChatGPT prompts, with a blog post published in between; that was enough to change the answer.
ChatGPT used an AI framework called RAG (Retrieval Augmented Generation) to fetch the latest result. Basically, it didn’t have enough knowledge about this information in its training data, so it scanned the web to look for an up-to-date answer.
Interestingly enough, it took a few days until the actual blog post with detailed information was found by ChatGPT. Initially, ChatGPT had found a snippet of the new blog post on my homepage and reindexed the page within the six-hour range. It was using just the blog post’s page title to change its answer before actually “seeing” the whole content days later.
Some learnings from this experiment:
New information on webpages reaches ChatGPT answers in a matter of hours, even for small websites. Don’t think your website is too small or insignificant to get noticed by LLMs—they’ll notice when you add new content or refresh existing pages, so it’s important to have an ongoing brand content strategy.
The answers in ChatGPT are highly dependent on the content published on your website. This is especially true for new companies where there are limited sources of information. ChatGPT didn’t confirm that I had upcoming travel until it fetched the information from my blog post detailing the trip.
Use your webpages to optimize how your brand is portrayed beyond showing up in competitive keywords for search. This is your opportunity to promote a certain USP or brand tagline. For instance, “The Leading AI-Powered Marketing Platform” and “See everyday moments from your close friends” are used, respectively, by Semrush and Instagram on their homepages. While users probably aren’t searching for these keywords, it’s still an opportunity for brand positioning that will resonate with them.
Test to see if ChatGPT is using Bing or Google’s index
The industry has been ringing alarm bells about whether ChatGPT uses Google’s index instead of Bing. So I ran another small test to find out: I added a <meta name="googlebot" content="noindex"> tag on the blog post, allowing only Bingbot for nine days.
If ChatGPT is using Bing’s index, it should find my new page when I prompt about it. Again, this was on a new topic and the prompt specifically asked for an article I wrote, so there wouldn’t be any doubts about what source to show.
The page got indexed by Bing after a couple of days, while Google wasn’t allowed to see it.
New article has been indexed by Bingbot
I kept asking ChatGPT, with multiple prompt variations, if it could find my new article. For nine days, nothing changed—it couldn’t find the article. At one point, ChatGPT even hallucinated a URL (or rather, offered its best guess).
ChatGPT’s made-up URL: https://www.guspelogia.com/learnings-from-building-a-new-product-as-an-seo
Real URL: https://www.guspelogia.com/learnings-new-product-seo
GSC shows that it can’t index the page due to “noindex” tag
I eventually gave up and allowed Googlebot to index the page. A few hours later, ChatGPT changed its answer and found the correct URL.
On the top, ChatGPT’s answer when Googlebot was blocked. On the bottom, ChatGPT’s answer after Googlebot was allowed to see the page.
Interestingly enough, the link to the article was presented on my homepage and blog pages, yet ChatGPT couldn’t display it. It only found that the blog post existed based on the text on those pages, even though it didn’t follow the link.
Yet, there’s no harm in setting up your website for success on Bing. They’re one of the search engines that adopted IndexNow, a simple ping that informs search engines that a URL’s content has changed. This implementation allows Bing to reflect updates in their search results quickly.
While we all suspect (with evidence) that ChatGPT isn’t using Bing’s index, setting up IndexNow is a low-effort task that’s worthwhile.
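For illustration, here’s how small an IndexNow ping is. A minimal sketch, assuming a placeholder URL and key (per the IndexNow documentation, the key must also be hosted as a text file at your site root):

```javascript
// Minimal IndexNow ping sketch — the URL and key below are placeholders.
// The key must also be served at https://www.example.com/<key>.txt
const pageUrl = "https://www.example.com/new-blog-post";
const key = "your-indexnow-key";

fetch(`https://api.indexnow.org/indexnow?url=${encodeURIComponent(pageUrl)}&key=${key}`)
  .then((res) => console.log("IndexNow responded with status:", res.status));
```

Plugins and SEO suites can automate this, but the underlying mechanism really is just this one request per changed URL.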
Test what happens when a mention is removed
We’re all talking about being included in AI search results, but what happens when a company or product loses a mention on a page? Imagine a specific model of earbuds is removed from a “top budget earbuds” list—would the product lose its mention, or would Google find a new source to back up its AI answer?
While the answer could always be different for each user and each situation, I ran another small test to find out.
In a listicle that mentioned multiple certification courses, I identified one course that was no longer relevant, so I removed mentions of it from multiple pages on the same domain. I did this to keep the content relevant, so measuring the changes in AI Mode was a side effect.
Initially, within the first few days of the course getting removed from the cited URL, it continued to be part of the AI answer for a few pre-determined prompts. Google simply found a new URL in another domain to validate its initial view.
However, within a week, the course disappeared from AI Mode and ChatGPT completely. Basically, even though Google found another URL validating the course listing, because the “original source” (in this case, the listicle) was updated to remove the course, Google (and, by extension, ChatGPT) subsequently updated its results as well.
This experiment suggests that changing the content on the source cited by LLMs can impact the AI results. But take this conclusion with a pinch of salt, as it was a small test with a highly targeted query. I specifically had a prompt combining “domain + courses” so the answer would come from one domain.
Nonetheless, while in the real world it’s unlikely one citation URL would hold all the power, I’d hypothesize that losing a mention on a few high-authority pages would have the side effect of losing the mention in an AI answer.
Test small, then scale
Tests in small and controlled environments are important for learning and give confidence that your optimization has an effect. Like everything else I do in SEO, I start with an MVP (Minimum Viable Product), learn along the way, and once (or if) evidence is found, make changes at scale.
Do you want to change the perception of a product on ChatGPT? You won’t get dozens of cited sources to talk about you straight away, so you’d have to reach out to each source individually and request a mention. You’ll quickly learn how hard it is to convince these sources to update their content and whether AI optimization becomes a pay-to-play game or if it can be done organically.
Perhaps you’re a source that’s mentioned often when people search for a product, like earbuds. Run your MVPs to understand how much changing your content influences AI answers before you claim your influence at scale, as the changes you make could backfire. For example, what if you stop being a source for a topic due to removing certain claims from your pages?
There’s no set time for these tests to show results. As a general rule, SEOs say results take a few months to appear. In the first test on this article, it took just a few hours to see results.
Running LLM tests with larger websites
Working in large teams or on large websites can be a challenge when doing LLM testing. My suggestion is to create specific initiatives and inform all stakeholders about changes to avoid confusion later, as they might question why these changes are happening.
After Seer Interactive changed its site footer, ChatGPT 5 started mentioning the new tagline within 36 hours for a prompt like “tell me about Seer Interactive.” I’ve checked repeatedly, and while the answer is different every time, it still mentions the “97% retention rate.”
Imagine if you decide to change the content on a number of pages, but someone else has an optimization plan for those same pages. Always run just one test per page, as results will become less reliable if you have multiple variables.
Make sure to research your prompts, have a tracking methodology, and spread the learnings across the company, beyond your SEO counterparts. Everyone is interested in AI right now, all the way up to C-levels.
Another suggestion is to use a tool like Semrush’s AI SEO toolkit to see the key sentiment drivers about a brand. Start with the listed “Areas for Improvement”—this should give you plenty of ideas for tests beyond “SEO Reason,” as it reflects how the brand is perceived beyond organic results.
Checklist: Getting started with LLM optimization
Things are changing fast with AI, and it’s certainly challenging to keep up to date. There’s an overload of content right now, a multitude of claims, and, I dare to say, not even the LLM platforms running them have things fully figured out.
My recommendation is to find the sources you trust (industry news, events, professionals) and run your own tests using the knowledge you have. The results you find for your brands and clients are always more valuable than what others are saying.
It’s a new world of SEO and everyone is trying to figure out what works for them. The best way to follow the curve (or stay ahead of it) is to keep optimizing and documenting your changes.
To wrap it up, here’s a checklist for your LLM optimization:
Before starting a test, make sure your selected prompts consistently return the answer you expect (such as not mentioning your brand or a feature of your product). Otherwise, the new brand mention or link could be a coincidence, not a result of your work.
If the same claim is made on multiple pages on your website, update them across the board to increase the chances of success.
Use your own website and external sources (e.g., via digital PR) to influence your brand perception. It’s unclear if users will cross-check AI answers or just trust what they’re told.
A leaked file reveals the user interactions that OpenAI is tracking, including how often ChatGPT displays publisher links and how few users actually click on them.
By the numbers. ChatGPT shows links, but hardly anyone clicks on them. For one top-performing page, the OpenAI file reports:
610,775 total link impressions
4,238 total clicks
0.69% overall CTR
Best individual page CTR: 1.68%
Most other pages: 0.01%, 0.1%, 0%
ChatGPT metrics. The leaked file breaks down every place ChatGPT displays links and how users interact with them. It tracks:
Date range (date partition, report month, min/max report dates)
Publisher and URL details (publisher name, base URL, host, URL rank)
Impressions and clicks across:
Response
Sidebar
Citations
Search results
TL;DR
Fast navigation
CTR calculations for each display area
Total impressions and total clicks across all surfaces
Where the links appear. Interestingly, the most visible placements drive the fewest clicks. The document broke down performance by zone:
Main response: Huge impressions, tiny CTR
Sidebar and citations: Fewer impressions, higher CTR (6–10%)
Search results: Almost no impressions, zero clicks
Why we care. Hoping ChatGPT visibility might replace your lost Google organic search traffic? This data says no. AI-driven traffic is rising, but it’s still a sliver of overall traffic – and it’s unlikely to ever behave like traditional organic search traffic.
About the data. It was shared on LinkedIn by Vincent Terrasi, CTO and co-founder of Draft & Goal, which bills itself as “a multistep workflow to scale your content production.”
Microsoft Advertising is rolling out Image Animation, a new Copilot-powered feature that automatically converts static images into short, dynamic video assets — giving advertisers a faster path into video without traditional production.
How it works:
Copilot transforms existing static images into scroll-stopping animated video formats.
The tool extends the lifespan of strong image creatives by repurposing them for video placements across Microsoft’s global publisher network.
The feature is now in global pilot (excluding mainland China) and accessible through Ads Studio’s video templates.
Why we care. Video continues to dominate digital attention, with the average American now watching more than four hours of digital video per day. As video becomes essential in performance campaigns, advertisers need scalable ways to produce it — especially when budgets or resources are tight.
This update reduces production barriers, extends the value of top-performing images, and unlocks broader inventory across Microsoft’s premium video network.
Between the lines. For many advertisers, the biggest bottleneck to entering video isn’t strategy — it’s production. Microsoft is positioning Copilot as a creative multiplier, letting performance marketers upgrade image-based campaigns with lightweight, AI-generated motion.
We’re always looking for new ways to help you understand your data and make smarter decisions when it comes to Google Search. That’s why we’re happy to announce a new feature within the Search Console performance reports: custom annotations. This feature is designed to empower you to add your own contextual notes directly to your performance charts. Think of it as a personal notebook for your Search data.
Did you know that even a one-second delay in page loading speed can cause up to 11% fewer page views? That’s right, you might have the best content strategy and a solid plan to drive traffic, but visitors won’t stay long if your site lags. Page speed is one of the biggest factors in keeping users engaged and converting.
In this guide, we’ll uncover the most common causes of slow websites and explore proven ways to boost website performance. Whether your site feels sluggish or you simply want to make it faster, these insights will help you identify what’s holding it back and how to fix it.
Key takeaways
Page speed significantly affects user experience and conversion rates, with even minor delays leading to increased bounce rates
What do we mean by ‘website performance’ and why is it important for you?
Website performance is all about how efficiently your site loads and responds when someone visits it. It’s not just about how fast a page appears; it’s about how smoothly users can interact with your content across devices, browsers, and locations. In simple terms, it’s the overall quality of your site’s experience that should feel fast, responsive, and effortless to use.
When your page loading speed is optimized, you’re not only improving the user experience but also setting the foundation for long-term website performance.
Here’s why it matters for every website owner:
Fast-loading sites have higher conversion rates and lower bounce rates
Attention spans are notoriously short. As the internet gets faster, they’re getting shorter still. Numerous studies have found a clear link between the time it takes a page to load and the percentage of visitors who become impatient while waiting.
By offering a fast site, you encourage your visitors to stay longer. Not to mention, you’re helping them complete their checkout journey more quickly. That helps improve your conversion rate and build trust and brand loyalty. Think of all the times you’ve been cursing the screen because you had to wait for a page to load or were running in circles because the user experience was atrocious. It happens so often; don’t be that site.
A fast page improves user experience
Google understands that the time it takes for a page to load is vital to the overall user experience. Waiting for content to appear, the inability to interact with a page, and even noticing delays create friction.
That friction costs time and money, and degrades your visitor’s experience. Research shows that waiting for slow mobile pages can be more stressful than watching a horror movie. Surely not, you say? That’s what the fine folks at Ericsson Research found a few years back.
Ericsson Mobility Report MWC Edition, February 2016
Improving your site speed across the board means making people happy. They’ll enjoy using your site, make more purchases, and return more frequently. This means that Google will view your site as a great search result because you are delivering high-quality content. Eventually, you might get a nice ranking boost.
Frustration hurts your users and hurts your rankings
It’s not just Google – research from every corner of the web on all aspects of consumer behavior shows that speed has a significant impact on outcomes.
Nearly 70% of consumers say that page speed impacts their willingness to buy (unbounce)
20% of users abandon their cart if the transaction process is too slow (radware.com)
The BBC found that they lost an additional 10% of users for every additional second their site took to load
These costs and site abandonment happen because users dislike being frustrated. Poor experiences lead them to leave, visit other websites, and switch to competitors. Google easily tracks these behaviors (through bounces back to search engine results pages, short visits, and other signals), and they are a strong indicator that the page shouldn’t be ranking where it is.
Google needs fast sites
Speed isn’t only good for users – it’s good for Google, too. Slow websites are often inefficient: they may load too many large files, leave their media unoptimized, or fail to use modern technologies to serve their pages. That means Google has to consume more bandwidth, allocate more resources, and spend more money.
Across the whole web, every millisecond Google can save and every byte it doesn’t have to process adds up quickly. And quite often, simple changes to configuration, processes, or code can make websites much faster with no drawbacks. That may be why Google is so vocal about educating site owners on performance.
A faster web is better for users and significantly reduces Google’s operating costs. Either way, that means that they’re going to continue rewarding fast(er) sites.
Improving page speed helps to improve crawling for search engines
Modern sites are incredibly unwieldy, and untangling that mess can make a big difference. The larger your site is, the greater the impact page speed optimizations will have. That not only impacts user experience and conversion rates but also affects crawl budget and crawl rate.
When a Googlebot comes around and crawls your webpage, it crawls the HTML file. Any resources referenced in the file, like images, CSS, and JavaScript, will be fetched separately. The more files you have and the heavier they are, the longer it will take for the Googlebot to go through them.
On the flip side, the more time Google spends on crawling a page and its files, the less time and resources Google has to dedicate to other pages. That means Google may miss out on other important pages and content on your site.
Optimizing your website and content for speed will provide a good user experience for your visitors and help Googlebots better crawl your site. They can come around more often and accomplish more.
Page speed is a ranking factor
Google has repeatedly said that a fast site helps you rank better. It’s no surprise, then, that Google has been measuring the speed of your site and using that information in its ranking algorithms since 2010.
In 2018, Google launched the so-called ‘Speed Update,’ making page speed a ranking factor for mobile searches. Google emphasized that it would only affect the slowest sites and that fast sites would not receive a boost; however, they are evaluating website performance across the board.
In 2021, Google announced the page experience algorithm update, demonstrating that page speed and user experience are intertwined. Core Web Vitals clearly state that speed is an essential ranking factor. The update also gave site owners metrics and standards to work with.
Of course, Google still wants to serve searchers the most relevant information, even if the page experience is somewhat lacking. Creating high-quality content remains the most effective way to achieve a high ranking. However, Google also states that page experience signals become more important when many pages with relevant content compete for visibility in the search results.
Google mobile-first index
Another significant factor in page speed for ranking is Google’s mobile-first approach to indexing content. That means Google uses the mobile version of your pages for indexing and ranking. This approach makes sense as we increasingly rely on mobile devices to access the internet. In recent research, Semrush found that 66% of all website visits come from mobile devices.
To compete for a spot in the search results, your mobile page needs to meet Core Web Vitals standards and other page experience signals. And this is not easy at all. Pages on mobile take longer to load compared to their desktop counterparts, while attention span stays the same. People might be more patient on mobile devices, but not significantly so.
Take a look at some statistics:
The average website loading time is 2.5 seconds on desktop and 8.6 seconds on mobile, based on an analysis of the top 100 web pages worldwide (tooltester)
The average mobile web page takes 15.3 seconds to load (thinkwithgoogle)
On average, webpages on mobile take 70.9% longer to load than on desktop (tooltester)
A loading speed of 10 seconds increases the probability of a mobile site visitor bouncing by 123% compared to a one-second loading speed (thinkwithgoogle)
All the more reasons to optimize your website and content if your goal is to win a spot in the SERP.
Understanding the web page loading process
When you click a link or type a URL and press Enter, your browser initiates a series of steps to load the web page. It might seem like magic, but behind the scenes, there’s a lot happening in just a few seconds. Understanding this process can help you see what affects your page loading speed and what you can do to boost website performance.
The process of loading a page can be divided into three key stages:
Network stage
This is where the connection begins. When someone visits your site, their browser looks up your domain name and connects to your server. This process, known as DNS lookup and TCP connection, enables data to travel between your website and the visitor’s device.
You don’t have much direct control over this stage, but technologies like content delivery networks (CDNs) and smart routing can make a big difference, especially if you serve visitors from around the world. For local websites, optimizing your hosting setup can still help improve overall page loading speed.
Server response stage
Once the connection is established, the visitor’s browser sends a request to your server asking for the web page and its content. This is when your server processes that request and sends back the necessary files.
The quality of your hosting, server configuration, and even your website’s theme or plugins all influence how quickly your server responds. A slow response is one of the most common issues with slow websites, so investing in a solid hosting environment is crucial if you want to boost your website’s performance.
One popular choice is Bluehost, which offers reliable infrastructure, SSD storage, and built-in CDN support, making it a go-to hosting solution for many website owners.
Browser rendering stage
Now it’s time for the browser to put everything together. It retrieves data from your server and begins displaying it by loading images, processing CSS and JavaScript, and rendering all visible elements.
Browsers typically load content in order, starting with what’s visible at the top (above the fold) and then proceeding down the page. That’s why optimizing the content at the top helps users interact with your site sooner. Even if the entire page isn’t fully loaded yet, a quick initial render can make it feel fast and keep users engaged.
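If you’re curious how these stages break down for your own pages, the browser records a timestamp for each one. A minimal sketch using the standard Navigation Timing API; paste it into the browser console after a page has finished loading:

```javascript
// Rough per-stage timings for the current page via the Navigation Timing API.
const [nav] = performance.getEntriesByType("navigation");
console.log("DNS lookup:", (nav.domainLookupEnd - nav.domainLookupStart).toFixed(1), "ms");
console.log("TCP connection:", (nav.connectEnd - nav.connectStart).toFixed(1), "ms");
console.log("Server response (TTFB):", (nav.responseStart - nav.requestStart).toFixed(1), "ms");
console.log("Full load event:", (nav.loadEventEnd - nav.startTime).toFixed(1), "ms");
```

The first two lines correspond to the network stage, the third to the server response stage, and the last roughly to the end of browser rendering.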
Key causes that are causing your website to slow down
While you can’t control the quality of your visitors’ internet connection, most slow website issues come from within your own setup. Let’s examine the key areas that may be holding your site back and how to address each one.
Your hosting service
Your hosting plays a big role in your website’s performance because it’s where your site lives. The speed and stability of your host determine how quickly your site responds to visitors. Factors such as server configuration, uptime, and infrastructure all impact this performance.
Choosing a reliable host eliminates one major factor that affects speed optimization. Bluehost, for example, offers robust servers, reliable uptime, and built-in performance tools, making it a go-to hosting choice for anyone serious about speed and stability.
Your website theme
Themes define how your website looks and feels, but they also impact its loading speed. Some themes are designed with clean, lightweight code that’s optimized for performance, while others are heavy with animations and complex design elements. To boost website performance, opt for a theme that prioritizes simplicity, efficiency, and clean coding.
Large file size
From your HTML and CSS files to heavy JavaScript, large file sizes can slow down your website. Modern websites often rely heavily on JavaScript for dynamic effects, but overusing it can cause your pages to load slowly, especially on mobile devices. Reducing file sizes, compressing assets, and minimizing unnecessary scripts can significantly improve the perceived speed of your pages.
Badly written code
Poorly optimized code can cause a range of issues, from JavaScript errors to broken layouts. Messy or redundant code makes it harder for browsers to load your site efficiently. Cleaning up your code and ensuring it’s well-structured helps improve both performance and maintainability.
Images and videos
Unoptimized images and large video files are among the biggest causes of slow websites. Heavy media files increase your page weight, which directly impacts loading times. If your header image or hero banner is too large, it can delay the appearance of the main content. Optimizing your media files through compression, resizing, and Image SEO can dramatically improve your website’s speed.
Too many plugins and widgets
Plugins are what make WordPress so flexible, but adding too many can slow down your site. Each plugin adds extra code that your browser needs to process. Unused or outdated plugins can also conflict with your theme or other extensions, further reducing performance. Audit your plugins regularly and only keep the ones that truly add value.
Absence of a CDN
A content delivery network (CDN) helps your website load faster for users worldwide. It stores copies of your site’s static content, such as images and CSS files, across multiple servers located in different regions. This means that users access your site from the nearest available server, reducing loading time. If your audience is global, using a CDN is one of the easiest ways to boost website performance.
Redirects
Redirects are useful for managing URLs and maintaining SEO, but too many can slow down your site. Each redirect adds an extra step before reaching the final page. While a few redirects won’t hurt, long redirect chains can significantly affect performance. Whenever possible, try to link directly to the final URL to maintain consistent page loading speed.
For WordPress users, the redirect manager feature in Yoast SEO Premium makes handling URL changes effortless and performance-friendly. You can pick from redirect types such as 301, 302, 307, 410, and 451 right from the dashboard. Since server-side redirects tend to load faster than PHP-based ones, Yoast lets you choose the type your stack supports, allowing you to avoid slow website causes and boost website performance.
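For context, a server-side redirect can be a single line of server configuration. A minimal sketch for Apache’s .htaccess (paths and domain are placeholders); pointing old URLs straight at the final destination keeps visitors out of slow redirect chains:

```apache
# Minimal 301 redirect sketch for Apache (.htaccess) — paths are placeholders.
Redirect 301 /old-page/ https://www.example.com/new-page/
```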
How to measure page speed and diagnose performance issues
Before you can improve your website performance, you need to know how well (or poorly) your pages are performing. Measuring your page speed helps you identify what’s slowing down your website and provides a direction for optimization.
What is page speed, really?
Page speed refers to how quickly your website’s content loads and becomes usable. But it’s not as simple as saying, ‘My website loads in 4 seconds.’ Think of it as how fast a visitor can start interacting with your site.
A page might appear to load quickly, but still feel slow if buttons, videos, or images take time to respond. That’s why website performance isn’t defined by one single metric — it’s about the overall user experience.
Did you know?
There is a difference between page speed and site speed. Page speed measures how fast a single page loads, while site speed reflects your website’s overall performance. Since every page behaves differently, measuring site speed is a more challenging task. Simply put, if most pages on your website perform well in terms of Core Web Vitals, it is considered fast.
Core metrics that define website performance
Core Web Vitals are Google’s standard for evaluating how real users experience your website. These metrics focus on the three most important aspects of page experience: loading performance, interactivity, and visual stability. Improving them helps both your search visibility and your user satisfaction.
Largest Contentful Paint (LCP): Measures how long it takes for the main content on your page to load. Aim for LCP within 2.5 seconds for a smooth loading experience
Interaction to Next Paint (INP): Replaces the older First Input Delay metric and measures how quickly your site responds to user interactions like taps, clicks, or key presses. An INP score under 200 milliseconds ensures your site feels responsive and intuitive
Cumulative Layout Shift (CLS): Tracks how stable your content remains while loading. Elements shifting on screen can frustrate users, so keep CLS below 0.1 for a stable visual experience
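To see how real visitors experience these metrics, you can measure them in the field. A minimal sketch using Google’s open-source web-vitals JavaScript library; the /analytics endpoint is a placeholder for wherever you collect data:

```javascript
// Minimal field-measurement sketch using the web-vitals library (npm install web-vitals).
// The "/analytics" endpoint is a placeholder for your own collection endpoint.
import { onCLS, onINP, onLCP } from "web-vitals";

function report(metric) {
  // metric.name is "CLS", "INP", or "LCP"; value is ms for LCP/INP, unitless for CLS.
  navigator.sendBeacon("/analytics", JSON.stringify({ name: metric.name, value: metric.value }));
}

onCLS(report);
onINP(report);
onLCP(report);
```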
How to interpret and improve your scores
Perfection is not the target. Progress and user comfort are what count. If you notice issues in your Core Web Vitals report, here are some practical steps:
If your LCP is slow: Compress images, serve modern formats like WebP, use lazy loading, or upgrade hosting to reduce load times
If your INP score is high: Reduce heavy JavaScript execution, minimize unused scripts, and avoid main thread blocking
If your CLS score is poor: Set defined width and height for images, videos, and ad containers so the layout does not jump around while loading
If your TTFB is high: Time to First Byte is not a Core Web Vital, but it still impacts loading speed. Improve server performance, use caching, and consider a CDN
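Two of those fixes can live in a single image tag. A minimal sketch (file names and dimensions are placeholders): explicit width and height reserve layout space so the page doesn’t shift, lazy loading defers below-the-fold images, and a WebP source with a fallback keeps file sizes down:

```html
<!-- Width/height reserve space (helps CLS); loading="lazy" defers offscreen
     images (helps LCP for content above them); WebP with a JPEG fallback
     keeps files small. All file names are placeholders. -->
<picture>
  <source srcset="/images/gallery-photo.webp" type="image/webp">
  <img src="/images/gallery-photo.jpg" alt="Product gallery photo"
       width="800" height="600" loading="lazy">
</picture>
```

One caveat: don’t lazy-load the hero image at the top of the page, since that delays your LCP element instead of helping it.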
Remember that even small improvements create a noticeable difference. Faster load times, stable layouts, and quicker interactions directly contribute to a smoother experience that users appreciate and search engines reward.
Tools to measure and analyze your website’s performance
Here are some powerful tools that help you measure, analyze, and improve your page loading speed:
Google PageSpeed Insights
Google PageSpeed Insights is a free tool from Google that provides both lab data (simulated results) and field data (real-world user experiences). It evaluates your page’s Core Web Vitals, highlights problem areas, and even offers suggestions under ‘Opportunities’ to improve load times.
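PageSpeed Insights also exposes the same data through a free API, which is handy for checking many URLs on a schedule. A minimal sketch; the tested URL is a placeholder, and an API key is only needed for heavier use:

```javascript
// Minimal PageSpeed Insights API sketch — the tested URL is a placeholder.
const endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";
const target = "https://www.example.com/";

fetch(`${endpoint}?url=${encodeURIComponent(target)}&strategy=mobile`)
  .then((res) => res.json())
  .then((data) => {
    // Lighthouse lab score, 0–1; field data lives under data.loadingExperience.
    console.log("Performance score:", data.lighthouseResult.categories.performance.score);
  });
```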
Google Search Console (Page Experience Report)
The ‘Page Experience’ section gives you an overview of how your URLs perform for both mobile and desktop users. It groups URLs that fail Core Web Vitals, helping you identify whether you need to improve LCP, INP, or CLS scores.
Lighthouse (in Chrome DevTools)
Lighthouse is a built-in auditing tool in Chrome that measures page speed, accessibility, SEO, and best practices. It’s great for developers who want deeper insights into what’s affecting site performance.
WebPageTest
WebPageTest lets you test how your website performs across various networks, locations, and devices. Its ‘waterfall’ view shows exactly when each asset on your site loads, perfect for spotting slow resources or scripts that delay rendering.
Chrome Developer Tools (Network tab)
If you’re hands-on, Chrome DevTools is your real-time lab. Open your site, press F12, and monitor how each resource loads. It’s perfect for debugging and understanding what’s happening behind the scenes.
A quick checklist for diagnosing performance issues
Use this checklist whenever you’re analyzing your website performance:
Run your URL through PageSpeed Insights for Core Web Vitals data
Check your Page Experience report in Google Search Console
Use Lighthouse for a detailed technical audit
Review your WebPageTest waterfall to spot bottlenecks
Monitor your server performance (ask your host or use plugins like Query Monitor)
Re-test after every major update or plugin installation
Speed up, but with purpose
As Mahatma Gandhi once said, ‘There is more to life than increasing its speed.’ The same goes for your website. While optimizing speed is vital for better engagement, search rankings, and conversions, it is equally important to focus on creating an experience that feels effortless and meaningful to your visitors. A truly high-performing website strikes a balance between speed, usability, accessibility, and user intent.
When your pages load quickly, your content reads clearly, and your navigation feels intuitive, you create more than just a fast site; you create a space where visitors want to stay, explore, and connect.