Google AI Overviews caught linking back to its own search results

Google is testing placing special, very clickable links in its AI Overviews – not to publishers or external websites, but back to its own search results. You read that right: Google is testing linking AI Overviews back to new search queries on Google.com.

What it looks like. I posted a screenshot and video on the Search Engine Roundtable from Sachin Patel, who spotted this change – here they are:

Clicking the underlined links in the text of the AI Overview, both at the top and in the middle section, takes you back to a new Google Search. The smaller link icons open the side panel links, which go to publishers and external websites.

Here is a video:

Why we care. Google has said repeatedly – even recently, with its new AI Mode – that it “prominently surfaces relevant links to help people find web pages and content they may not have discovered before” (that quote is from Google). So what does all that talk mean here?

Those link icons in AI Overviews have been shown to result in a drop in CTR from Google Search to external websites. We’ve seen a large company sue Google over traffic drops due to AI Overviews.

These links back to Google results will grow Google’s search volume, but do they actually help the searcher, or do they only help Google’s bottom line?

What about Google saying they value independent publishers and are prioritizing ways to send traffic to publishers in AI Overviews?

Bing Webmaster Tools Copilot feature now available to all users

Microsoft released Copilot within Bing Webmaster Tools to all users, the company announced today. Initially, Microsoft limited the beta release to only 10,000 trusted users of Bing Webmaster Tools, but now the feature is generally available to everyone.

What is Copilot in Bing Webmaster Tools. Copilot in Bing Webmaster Tools promises to provide “instant, accurate and contextually relevant answers to [user] queries.” It is engineered to “streamline workflows, enhance productivity, and provide valuable insights,” Microsoft initially said.

Copilot in Bing Webmaster Tools aims at “helping webmasters manage their search optimization efforts more efficiently, improving workflows, productivity, and providing actionable insights,” the company said.

Top features. Microsoft outlined these four features as the top features for Copilot in Bing Webmaster Tools:

  • Real-Time Question/Answer Chat: Offers personalized, instant, and accurate responses tailored to your specific site, enhancing efficiency and effectiveness in managing your SEO.
  • Deep Data Insights: Provides comprehensive, site-specific insights into performance, helping you quickly identify and resolve issues.
  • Learning Resources: Access built-in guidance to perform tasks, utilize tools, and improve your website’s visibility and performance.
  • User Feedback Integration: Continuously improves based on user feedback, ensuring the tool evolves to meet user needs.

English only. This feature is currently available in English only, but Microsoft is working to expand it to more languages in the future.

What it looks like. Here are two screenshots of Copilot in Bing Webmaster Tools:

Why we care. Copilot in Bing Webmaster Tools may help you find answers to your data questions on how your site is performing in Bing Search, what issues your site may be experiencing, and how to use Bing Webmaster Tools.

It seems this feature has improved over the last few months and hopefully will continue to improve over time.

Bing Webmaster Tools Search Performance report gains comparisons

Microsoft has added the ability to compare date ranges within the Search Performance report in Bing Webmaster Tools. This allows you to compare metrics and data points such as clicks, impressions, CTR, keywords, and pages across various date ranges.

What Microsoft said. Microsoft wrote:

“These enhancements enable in-depth analysis, providing invaluable insights for optimizing online presence and improving visibility. This means businesses can now make more informed decisions, identify key trends, and effectively adjust their marketing strategies to achieve better results.”

Microsoft also made user experience improvements in time filters, the company added.

What it looks like. Here is a screenshot showing the new “compare” option in the Search Performance report in Bing Webmaster Tools:

Why is this helpful. Microsoft posted a few ways these comparisons help you, including:

  • Identifying key trends and patterns
  • Evaluating the effectiveness of your marketing campaigns
  • Gaining insights from seasonal search data
  • Setting benchmarks and goals using historical data
  • Identifying areas for improvement in search performance

Why we care. Google has offered date-range comparisons in Search Console for a while, as do most third-party tools.

But now you can compare this data directly in Bing Webmaster Tools without having to export this data to a third-party tool. This should save you a lot of time, by not having to use another tool to compare your Bing Search data.

No rookies. No fluff. Just the most advanced SMX agenda ever.

The wait is over: I am beyond thrilled to finally reveal the agenda for SMX Advanced – June 11-13 in Boston.

Still here? Weird. You should already be knee-deep in the 500-level program by now… but hey, if you want a preview, I’ve got some mind-melting sessions to share, including:

And that’s just the start. There’s also an exclusive keynote conversation with Google’s Ginny Marvin, an opening keynote with the one and only Wil Reynolds, and of course, our hands-on Q&A-fueled clinics that address your specific needs and curiosities.

Still here? Okay then. You’ll also be the first to see the results of a groundbreaking study by Search Engine Land and Fractl on how consumers and marketers are adapting in the age of AI-driven search.

And you’ll have the chance to participate in all-new Mastermind Sessions, no-holds-barred 10-person roundtables that deliver uncensored, practical advice on what really works – hosted by iconic industry experts. Stay tuned for the complete lineup!

Networking! I can’t believe I didn’t mention networking yet. We’re bringing a fabulous mix of both structured and serendipitous networking experiences to Boston, including:

  • The ever-classic Meet & Greet Reception and Networking Happy Hour
  • Casual cocktails with the SMX crew
  • A scenic morning jog along the harbor
  • Morning mindfulness to set you up for a day of success
  • Topic-driven lunch discussion tables
  • SEO and PPC meetups with your friends from Search Engine Land
  • A Magic the Gathering game night

For nearly 20 years, 200k+ search marketers from around the world have attended SMX to learn game-changing tactics and make career-defining connections. This is your chance to join them.

Super Early Bird rates – $500 off on-site prices – expire next Saturday, March 29, so get a move on and secure your spot today.

4 SEO practices with diminishing returns

Whether it’s time, money, or expertise, marketing resources are finite.

In today’s world, with advances in AI and more efficient tools than ever, businesses expect better results with fewer resources.

That means every second spent on an SEO campaign matters.

To keep up, SEOs must focus on actions that truly move the needle – without wasting time or unnecessary effort.

This article highlights SEO activities that have diminishing returns – where the effort eventually outweighs the benefits – and offers tips on optimizing more effectively.

1. Page speed improvements

Not long ago, I had a client who was obsessed with page speed.

The site’s page speed was excellent, with 100% of its pages passing Core Web Vitals on desktop and more than 95% passing on mobile.

And yet, they still wanted every URL to be rated “good” for mobile and every page speed test score to be a perfect 100/100. 

Achieving that would have required painstaking hours digging through code and cleaning up things like unused JavaScript to rework how pages loaded.

Brand with a good CWV score on desktop

Let’s be clear – page speed is important! 

Plenty of data shows that improving page speed can boost conversion rates, especially for ecommerce stores. 

But if a page loads in under two seconds, is interactive quickly, and doesn’t have disruptive layout shifts, a site can gain only minor performance boosts by shaving off additional milliseconds.

Once a site meets Core Web Vitals standards, further page speed optimizations have diminishing returns. 

Unless a business handles a high volume of on-page transactions daily, it’s usually better to focus on other areas for improvement.
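Google’s published “good” thresholds make “meets Core Web Vitals” concrete: LCP ≤ 2.5 seconds, INP ≤ 200 ms, and CLS ≤ 0.1, each measured at the 75th percentile of field data. A minimal sketch of that pass/fail check (the example metric values below are hypothetical):

```python
# Core Web Vitals "good" thresholds published by Google
# (75th-percentile field data).
THRESHOLDS = {"lcp_ms": 2500, "inp_ms": 200, "cls": 0.1}

def passes_cwv(lcp_ms: float, inp_ms: float, cls: float) -> bool:
    """True if all three metrics are within the 'good' range."""
    return (lcp_ms <= THRESHOLDS["lcp_ms"]
            and inp_ms <= THRESHOLDS["inp_ms"]
            and cls <= THRESHOLDS["cls"])

print(passes_cwv(1800, 150, 0.05))  # comfortably in the "good" range
print(passes_cwv(4200, 500, 0.25))  # the kind of page worth prioritizing
```

A page already passing this check is exactly the case where shaving more milliseconds yields little.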

Now, if your page takes 10 seconds to load, has poor interactivity, and webpage elements move around while users try to click, then this should be a priority. 

But if most of your pages pass Core Web Vitals and the user experience is solid, agonizing over page speed makes no sense.

The client in the example above had much bigger priorities, like cleaning up rampant over-indexing or pruning their stockpile of old, outdated content. 

Although this was explained to them, they ignored the advice and kept sinking time into a dead end.

2. Increasing backlink authority

Many businesses fixate on backlink authority, seeing it as a silver bullet for rankings while overlooking issues like:

  • Poor keyword targeting.
  • Content cannibalization.
  • Weak internal linking.

Link building is not dead. Plenty of studies and anecdotal evidence prove it.

Building link authority can be the missing piece that pushes rankings onto Page 1, especially for new websites or brand-new pages on an existing site.

But if your website’s authority is well established and content ranks the second it’s published, then building links can have diminishing returns.

It can even start to work against a site that:

  • Chases poor-quality link opportunities.
  • Gets overly spammy with anchor text.
  • Pursues links from places that aren’t relevant to its content.

Does that mean you should stop link building altogether? 

No! Always take a quality link if you can get it. 

But depending on a website’s level of authority, more success may come from focusing efforts elsewhere – such as content strategy, public relations, or conversion rate optimization (CRO).

3. Publishing net-new content

Another client insisted on producing only new content.

Refreshing old articles or removing underperforming pages was off the table.

For a while, the strategy worked, and the website saw gains in nonbranded performance.

However, there was a tipping point – continuously adding new content eventually led to performance declines.

Publishing net-new content - tipping point

Why?

Writing only net-new content for years caused internal cannibalization, as multiple articles covered similar topics. 

Relevant topics also became scarce, leading to content that was only loosely related – or even unrelated – to the company’s services.

HubSpot has been in SEO news recently for this exact reason. 

When a marketing automation software company writes on topics around business credit cards, performance starts to move in the wrong direction.

HubSpot's blog posts with credit card topics
HubSpot's organic performance

So what does this all mean? 

Writing net-new content can not only have diminishing returns but, if done incorrectly, can actually hurt performance. 

Instead, it may be more effective to focus on refreshing existing content.

4. Refreshing old content

I know – I just said refreshing old content is often better than focusing only on new content.

But stick with me.

There’s no debate: Refreshing old content is both a successful and necessary SEO tactic. 

Search engines prioritize fresh content, and some of the easiest and most effective wins come from updating existing content to be more accurate and relevant.

We surveyed about 850 enterprise-level marketers and found that updating existing content provided a greater performance lift than creating new content.

It should come as no surprise that the biggest growth comes from content refreshes. 

As noted in the previous section, creating new content was cited as the largest source of traffic loss between the two strategies.

NP Digital's survey on creating vs. updating content

However, even content refreshes have diminishing returns. 

The Pareto Principle, also known as the 80/20 rule, states that roughly 80% of results come from 20% of the inputs. 

This applies to on-site content – about 80% of traffic or conversions typically come from just 20% of a website’s pages.

Naturally, returns will be much higher when focusing on top-performing pages and gradually decline as you move further down the list.
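You can sanity-check that 80/20 concentration against your own page-level traffic export. A quick sketch with invented per-page traffic numbers:

```python
# Hypothetical monthly organic visits per page (made-up numbers).
traffic = [5200, 3100, 900, 400, 250, 120, 80, 60, 40, 30]

total = sum(traffic)
top = sorted(traffic, reverse=True)
top_20pct = top[: max(1, len(top) // 5)]  # the top 20% of pages
share = sum(top_20pct) / total

print(f"Top 20% of pages drive {share:.0%} of traffic")
```

With this toy data, two of the ten pages account for roughly four-fifths of the traffic – the shape the Pareto Principle predicts, and a quick way to see which refreshes will pay off.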

Does that mean you should only focus on the top 20% of your content and ignore the rest?

Absolutely not! 

But it does help put things into perspective. 

While it may be tempting to squeeze more performance out of every article, refreshing low-volume or low-relevance content eventually will not provide value. 

Sometimes, it’s best to leave an article alone or retire it – and that’s OK.

Dig deeper: SEO prioritization – How to focus on what moves the needle

Where should you spend your time?

Where to focus SEO efforts depends on various factors. 

A newer website, for example, will likely need to prioritize link building and generating new content. 

Meanwhile, a website with a large library of existing content and strong authority should focus more on content refreshes and leveraging existing traffic through UX and CRO strategies.

With so many ranking factors at play, no single tactic ensures success. 

Winning in SEO requires a combination of strategies and tactics. 

It’s on you to allocate resources wisely. Make sure every effort contributes meaningful value, avoiding the law of diminishing returns.

Dig deeper: Prioritizing SEO strategies: Where to focus your efforts

Bluesky for PPC: What you need to know

Bluesky is quickly gaining attention as a growing social media platform – but what does that mean for the PPC community? 

Here’s what you need to know and how to prepare for potential ad opportunities.

What is Bluesky?

Bluesky is a social media platform that was originally founded as a research initiative at Twitter in 2019. 

After becoming an independent company in 2021, it launched its invite-only beta in 2023 before opening to the public in February 2024.

As of writing, Bluesky has a total of 33.1 million registered users, and its growth shows no signs of slowing down.

How does Bluesky differ from other social media platforms?

The platform, which describes itself as “social media as it should be,” has a similar look and feel to how X appeared back when it was Twitter. 

Like most social media platforms, it allows users to post, repost, like, and share content.

The most significant difference between Bluesky and many other platforms is that it is decentralized, meaning it allows users to host their own data and create custom feeds. 

This helps reduce the risk of data breaches and puts control back in the hands of users.

This decentralized approach, combined with the platform’s strong content moderation tools, is highly appealing to social media users who have become discouraged by recent changes to platforms such as X.

Above: An example of content moderation controls available for Bluesky users

So, how can I advertise on Bluesky?

Hold your horses!

There is no advertising on Bluesky.

While Bluesky hasn’t ruled out the possibility of ads in the future, advertisers won’t be able to add the platform to their PPC strategy anytime soon.

Even if or when Bluesky does introduce advertising, it is likely to look different from the models used by other social media platforms.

Speaking to CNBC in November, COO Rose Wang stated that while the company is open to the idea of including ads on the platform, they don’t want it to be an “intrusive experience.”

She referenced Reddit’s advertising model as an example of a more intent-based approach to ads.

Dig deeper: PPC keyword strategy: How to align search intent with funnel stages

If there are no ads, then how is Bluesky making money?

Bluesky is funded by investors and venture capital firms.

In November, Bluesky announced it was developing an optional subscription model. Subscriptions are rumored to provide users access to additional features, such as the ability to use a custom domain in their username and higher video upload limits.

Why ads on Bluesky would be a good thing

While there aren’t any opportunities to advertise on Bluesky, that doesn’t mean the time won’t come. 

With a growing user base and increasing operational costs, Bluesky is likely to introduce some form of advertising.

When that happens, here are some reasons why PPC advertisers might want to give it a try:

  • PPC advertisers are becoming increasingly frustrated with the developments – or lack thereof – on platforms like Meta and LinkedIn, which have long dominated the market.
  • The introduction of Bluesky to advertisers’ PPC strategy would allow them to try something new and potentially avoid the grievances and pitfalls they are experiencing on other advertising platforms.
  • Early adopters are likely to gain an advantage over competitors if they quickly embrace any potential advertising openings.
  • Opportunities to be the first business in your niche to advertise on a platform are rare, so advertisers could benefit from staking their claim early and cementing their presence.
  • Bluesky’s strong focus on privacy, moderation, and protection from misinformation could offer advertisers a more brand-safe advertising opportunity.

With recent reports of Google Ads placements appearing on illegal and compromising websites, PPC advertisers have increasing and legitimate concerns regarding where their ads are displayed.

Why ads on Bluesky might not be a good thing

If Bluesky introduces advertising, PPC advertisers should also consider potential drawbacks. These concerns include:

  • One of the biggest challenges in advertising on Bluesky stems from what makes it unique – its decentralized approach. While this is appealing and important for users, it may complicate advertising due to data and privacy restrictions.
  • Depending on the type of advertising model offered, PPC advertisers may need to rethink their targeting and messaging strategies, which could limit their efficiency.
  • Many Bluesky users have migrated from X, seeking a platform that values privacy over profit. As a result, Bluesky users are more likely to resist advertising, which could impact ad performance in terms of driving leads or sales.
  • Early adopters of new technology often face challenges such as technical bugs and unknown performance benchmarks. Not all businesses may be in a position to take on such risks if or when advertising reaches the Bluesky platform.

What should PPC advertisers be doing now?

In preparation for the likely launch of some form of advertising model on the platform, there are steps PPC advertisers can take now:

1. Claim your handle

Advertisers should secure the handle for their business or clients to ensure they have access to usernames that accurately represent their brand. 

For example: 

  • https://bsky.app/profile/searchengineland.bsky.social 
Above: A screenshot of the Search Engine Land account on Bluesky

2. Explore the platform

Advertisers should spend time navigating Bluesky to:

  • Understand its interface.
  • Identify the types of organic content that perform well.
  • Become familiar with the platform’s extensive moderation tools.

3. Build an organic presence

By posting content and engaging with the community, advertisers can build trust and recognition for their brand. 

This proactive approach can help businesses connect with Bluesky’s user base – especially if the audience remains resistant to traditional ads.

What’s next?

PPC advertisers will need to hold off a little longer before adding Bluesky to their PPC strategy.

However, with the platform growing rapidly, some form of advertising model is likely to arrive within the next 18 months – and we’ll be sure to keep you updated.

Dig deeper: Here’s why PPC now looks more like paid social and what it means

How to use Google Search Console for keyword research

When it comes to keyword research, SEO professionals often rely on expensive tools to find the right keywords. 

However, Google Search Console (GSC) provides a completely free way to access insights straight from Google itself.

GSC is a powerful and often underused tool that shows exactly what is working on a site and where improvements can be made.

Unlike other keyword research tools that provide generic suggestions and estimated data, GSC delivers real-life search data based on actual searches leading to a website. It can often uncover interesting insights.

Here’s how to use GSC to find valuable keyword opportunities and improve rankings.

Why use Google Search Console for keyword research?

Google Search Console is a goldmine for keyword insights. Here’s why you should use it.

  • Free and requires no subscriptions: Many SEO tools require costly subscriptions, but GSC is completely free, making it accessible to businesses of all sizes.
  • Provides real keyword performance data: Most keyword research tools provide estimated search volumes, but GSC shows actual data on searches that lead users to your site, ensuring accuracy.
  • Helps identify keywords with high optimization potential: Analyzing existing keyword rankings allows you to optimize content and improve visibility with small tweaks.
  • Uncovers content gaps and new topic opportunities: GSC reveals queries that may not have been intentionally targeted but are already driving traffic, providing ideas for new content.
  • Tracks keyword performance over time: You can monitor how rankings fluctuate, which keywords are growing in importance, and how search behavior is evolving.
  • Helps understand search intent: By analyzing query data, you can refine content to better match user intent and increase engagement.
  • Provides device-specific insights: Performance can vary between desktop and mobile users, and GSC helps fine-tune SEO strategies accordingly.

Dig deeper: 6 vital lenses for effective keyword research

5-step process for using Google Search Console

Step 1: Discover what you’re already ranking for

Rather than focusing solely on new keywords, GSC helps identify keywords that are already ranking but could perform better with some optimization.

How to find ranking keywords

  • Log into Google Search Console and select a website.
  • Click on Performance > Search Results.
  • Scroll down to the Queries section to see the search terms leading visitors to the site.

What to look for

  • Keywords ranking in positions 11-30 (Pages 2-3 of Google). These have potential to break into Page 1 with slight optimizations.
  • Unexpected keywords that weren’t intentionally targeted but are ranking anyway, presenting new content opportunities.
  • High-impression, low-CTR keywords, indicating that page titles or meta descriptions may need optimization to improve click-through rates.
  • Seasonal search trends, allowing content to be optimized ahead of high-traffic periods.
  • Queries with declining CTRs, which may indicate changing search intent or increased competition.
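The checks above are easy to run over a downloaded GSC query export. A minimal sketch in Python (the queries and numbers are made up; a real export has Query, Clicks, Impressions, CTR, and Position columns):

```python
# Hypothetical sample rows from a GSC "Queries" export.
rows = [
    {"query": "best crm software", "clicks": 120, "impressions": 2400, "position": 3.2},
    {"query": "crm pricing",       "clicks": 40,  "impressions": 1800, "position": 14.5},
    {"query": "what is a crm",     "clicks": 2,   "impressions": 2600, "position": 22.1},
    {"query": "acme login",        "clicks": 900, "impressions": 1000, "position": 1.1},
]

# Striking-distance keywords: positions 11-30 (pages 2-3), where a
# small optimization may be enough to reach page 1.
striking = [r["query"] for r in rows if 11 <= r["position"] <= 30]

# High-impression, low-CTR keywords: the page title or meta
# description may need work to earn more clicks.
low_ctr = [r["query"] for r in rows
           if r["impressions"] > 1000 and r["clicks"] / r["impressions"] < 0.01]

print(striking)  # queries within striking distance of page 1
print(low_ctr)   # queries whose snippets may need optimization
```

The 11-30 and 1% thresholds are just illustrative starting points – tune them to your site’s size and typical CTR curve.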

Dig deeper: Why traditional keyword research is failing and how to fix it with search intent

Step 2: Find new blog topics

GSC is useful for content ideation by revealing long-tail search queries that are ideal for blog topics.

How to do it

  • Navigate to the Performance report and look for long-tail search queries.
  • Identify keywords that are not well-covered on the site.
  • Create dedicated blog posts answering those exact queries.
  • Optimize existing content by incorporating these new long-tail keywords naturally.
  • Cross-link between related blog posts to build topical authority.

Step 3: Identify and manage irrelevant keywords

GSC can also reveal irrelevant search terms bringing traffic to a site. Some queries may drive traffic that does not align with the intended audience, leading to vanity traffic that skews reports.

How to manage irrelevant search terms

  • Identify keywords bringing in non-relevant traffic that do not contribute to conversions or engagement.
  • Adjust on-page content and metadata to clarify the intent of the page.
  • Use negative keywords in paid search campaigns if these terms are also appearing in PPC reports.
  • Monitor engagement rates and session duration for traffic from these terms to assess engagement levels.

Example

  • If a bathroom renovation site ranks for “how to clean a kitchen splashback,” that traffic is unlikely to convert into meaningful engagement.
  • Identifying and minimizing such cases ensures that a site is optimized for relevant search terms.

Step 4: Track overall keyword performance

GSC provides detailed performance tracking without the need for a paid keyword tracking tool.

Key metrics to check

  • Total clicks: The number of visitors coming from search results.
  • Total impressions: The number of times a site appears in search results.
  • Click-through rate (CTR): The percentage of users who click after seeing a result.
  • Average position: The ranking in Google search results.
  • Branded vs. non-branded search terms: Understanding the balance between brand visibility and new audience acquisition.
  • Device-specific performance: Identifying whether certain keywords perform better on mobile vs. desktop.

Unlike most SEO tools that limit the number of keywords tracked, GSC offers unlimited data on how a site is performing.
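GSC doesn’t split branded from non-branded traffic for you, so a common approach is to classify exported queries against your own brand terms. A hypothetical sketch (“acme” stands in for your brand name; all click counts are invented):

```python
# Brand terms to match against (hypothetical).
BRAND_TERMS = {"acme"}

# Hypothetical query -> clicks pairs from a GSC export.
queries = {
    "acme login": 900,
    "acme crm pricing": 150,
    "best crm software": 120,
    "what is a crm": 2,
}

def is_branded(query: str) -> bool:
    # A query counts as branded if it contains any brand term.
    return any(term in query for term in BRAND_TERMS)

branded = sum(c for q, c in queries.items() if is_branded(q))
nonbranded = sum(c for q, c in queries.items() if not is_branded(q))

print(branded, nonbranded)
```

Substring matching is crude – misspellings and brand variants need extra terms in the set – but it’s usually enough to track the branded/non-branded balance over time.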

Step 5: Monitor and adjust regularly

SEO requires ongoing monitoring and adjustments.

Google’s algorithms evolve, competitors optimize, and search trends change. 

Regularly checking GSC data helps refine content strategies over time.

Quick SEO wins using GSC

  • Find and optimize underperforming pages with high impressions but low CTR.
  • Prioritize content updates for topics already driving traffic.
  • Fix technical SEO issues flagged in Search Console, such as slow-loading pages or mobile usability errors.
  • Create new content based on keyword discoveries.
  • Identify and address keyword cannibalization by ensuring the right page is ranking for a given query.
  • Use internal linking strategically to strengthen authority for key landing pages.

Dig deeper: How to use Google Search Console to unlock easy SEO wins

More advanced strategies

While not strictly related to keyword research, resolving site errors supports a deeper understanding of how your keywords perform.

After all, if a page isn’t indexed due to a technical issue, then no amount of tweaking to the content will help.

Identifying errors

The URL Inspection tool in GSC is invaluable for understanding how Google views and indexes a specific page. 

You can use it to troubleshoot indexing problems and ensure that pages are properly crawled.

How to use it

  • Open Google Search Console and navigate to the URL Inspection Tool.
  • Enter the URL of the page you want to inspect.
  • Press Enter to see the latest indexing status.
  • Review key insights, including:
    • Indexing status: Is the page indexed or not?
    • Crawl errors: Identifies issues preventing the page from appearing in search results.
    • Canonical URL: Ensures that Google recognizes the correct version of the page.
    • Last crawl date: Shows when Google last crawled the page.
    • Rendered page view: Displays how Googlebot sees the page.

If the page is not indexed, click Request Indexing to prompt Google to crawl it again.

If there are errors, follow the recommendations provided and resolve issues such as robots.txt blocking, noindex tags, or canonical conflicts.

By regularly inspecting URLs, you can ensure that critical pages are properly indexed and visible in search results.

Dig deeper: How to fix ‘Crawled – Currently not indexed’ error in Google Search Console

International SEO

For websites targeting multiple countries, understanding geographic search performance can help refine international SEO strategies and localize content for different markets.

How to use it

  • In Google Search Console, navigate to Performance > Search Results.
  • Click on the Countries tab to see a breakdown of traffic by region.
  • Identify which countries are driving the most organic traffic and how search trends vary between locations.

Dig deeper: Advanced SEO: How to level up your keyword strategy

Harness Google Search Console for SEO wins

Google Search Console is a powerful and often overlooked tool for keyword research.

It provides real data directly from Google, showing exactly how a site is performing in search.

  • Use it to find quick-win keyword opportunities.
  • Identify new content ideas based on real user searches.
  • Eliminate vanity traffic that does not convert.
  • Track performance trends and adjust SEO strategies accordingly.

By using GSC effectively, you can uncover high-impact opportunities, refine strategies, and drive meaningful improvements in search performance – all without spending a penny on keyword research tools.

Generative AI use surging among consumers for online shopping: Report

Traffic from generative AI sources to U.S. retail sites surged over the holiday season, and that trend has continued into 2025, according to new Adobe data.

Between Nov. 1 and Dec. 31, traffic from generative AI sources increased by 1,300% compared to the year prior (up 1,950% YoY on Cyber Monday). 

This trend continued beyond the holiday season, Adobe found. In February, traffic from generative AI sources increased by 1,200% compared to July 2024. 

The percentages are high because generative AI tools are so new. ChatGPT debuted its research preview on Nov. 30, 2022. Generative AI traffic remains modest compared to other channels, such as paid search or email, but the growth is notable: it has doubled every two months since September 2024.

By the numbers. Adobe’s survey of 5,000 U.S. consumers found AI generates more engaged traffic:

  • 39% used generative AI for online shopping, with 53% planning to do so in 2025. 
  • 55% of respondents use generative AI for conducting research.
  • 47% use it for product recommendations.
  • 43% use generative AI for seeking deals.
  • 35% for getting gift ideas.
  • 35% for finding unique products. 
  • 33% for creating shopping lists.

One of the most interesting findings from Adobe covers what happens once generative AI users land on a retail website. Compared to non-AI traffic sources (including paid search, affiliates and partners, email, organic search, social media), generative AI traffic shows:

  • More engagement: Adobe found 8% higher engagement as individuals linger on the site for longer. 
  • More pages: Generative AI visitors browse 12% more pages per visit.
  • Fewer bounces: They have a 23% lower bounce rate. 

Yes, but. While engaged traffic is good, conversions are better.

  • Adobe found that traffic from generative AI sources is 9% less likely to convert than traffic from other sources.
  • However, the data shows this gap has narrowed significantly since July 2024, indicating growing consumer comfort.

Generative AI for travel planning. In February 2025, traffic to U.S. travel, leisure and hospitality sites (including hotels) from generative AI sources increased by 1,700% compared to July 2024. In Adobe’s survey, 29% have used generative AI for travel-related tasks, with 84% saying it improved their experience. 

The top use cases among AI users include:

  • General research, 54% of respondents.
  • Travel inspiration, 43%.
  • Local food recommendations, 43%.
  • Transportation planning, 41%.
  • Itinerary creation, 37%.
  • Budget management, 31%.
  • Packing assistance, 20%. 

Once users land on a travel site, Adobe Analytics data shows a 45% lower bounce rate.

Gen AI for financial services research. In February 2025, traffic to U.S. banking sites from generative AI sources increased by 1,200% compared to July 2024. 

Adobe’s survey of U.S. consumers found 27% have used generative AI for banking and financial needs. The top use cases include:

  • Recommendations for checking and savings accounts, 42%.
  • Asking for explainers on investment strategies and terminology, 40%.
  • Creating a personalized budget, 39%.
  • Understanding the tax implications of financial decisions, 35%. 

Once generative AI traffic lands on a banking site, visitors spend 45% more time browsing (versus non-AI sources).  

About the data. Adobe’s data comes from the company’s Adobe Analytics platform and is based on more than 1 trillion visits to U.S. retail sites. Adobe also conducted a companion survey of more than 5,000 U.S. consumers to understand how they use AI in daily life.


AI search engines often make up citations and answers: Study

AI search engines and chatbots often provide wrong answers and make up article citations, according to a new study from Columbia Journalism Review.

Why we care. AI search tools have ramped up the scraping of your content so they can serve answers to their users, often resulting in no clicks to your website. Also, click-through rates from AI search and chatbots are much lower than those from Google Search, according to a separate, unrelated study. But hallucinated citations make an already bad situation even worse.

By the numbers. More than half of the responses from Gemini and Grok 3 cited fabricated or broken URLs that led to error pages. Also, according to the study:

  • Overall, chatbots provided incorrect answers to more than 60% of queries:
    • Grok 3 (the highest error rate) answered 94% of the queries incorrectly.
    • Gemini only provided a completely correct response on one occasion (in 10 attempts).
    • Perplexity, which had the lowest error rate, answered 37% of queries incorrectly.

What they’re saying. The study authors (Klaudia Jaźwińska and Aisvarya Chandrasekar), who also noted that “multiple chatbots seemed to bypass Robot Exclusion Protocol preferences,” summed up this way:

“The findings of this study align closely with those outlined in our previous ChatGPT study, published in November 2024, which revealed consistent patterns across chatbots: confident presentations of incorrect information, misleading attributions to syndicated content, and inconsistent information retrieval practices. Critics of generative search like Chirag Shah and Emily M. Bender have raised substantive concerns about using large language models for search, noting that they ‘take away transparency and user agency, further amplify the problems associated with bias in [information access] systems, and often provide ungrounded and/or toxic answers that may go unchecked by a typical user.’”

About the comparison. This analysis of 1,600 queries compared the ability of generative AI tools (ChatGPT search, Perplexity, Perplexity Pro, DeepSeek search, Microsoft Copilot, xAI’s Grok-2 and Grok-3 search, and Google Gemini) to identify an article’s headline, original publisher, publication date, and URL, based on direct excerpts of 10 articles chosen at random from each of 20 publishers.
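The 1,600-query figure is consistent with that design. A quick sanity check, assuming each excerpt was run through every one of the eight tools listed:

```python
# Sanity check of the study's query count: 10 article excerpts from
# each of 20 publishers, each run through all 8 AI tools.
tools = 8        # ChatGPT search, Perplexity, Perplexity Pro, DeepSeek search,
                 # Microsoft Copilot, Grok-2, Grok-3, Google Gemini
publishers = 20
excerpts_per_publisher = 10

total_queries = tools * publishers * excerpts_per_publisher
print(total_queries)  # 1600
```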

The study. AI Search Has A Citation Problem


As AI scraping surges, AI search traffic fails to follow: Report


AI-powered search engines (e.g., OpenAI’s ChatGPT, Perplexity) are failing to drive meaningful traffic to publishers while their web scraping activities increase. That’s one big takeaway from a recent report from TollBit, a platform that says it helps publishers monetize their content.

CTR comparison. Google’s average search click-through rate (CTR) was 8.63%, according to the report. The CTR for AI search engines, however, was 0.74%, and for AI chatbots just 0.33%. That means AI search sends 91% fewer referrals than traditional search, and chatbots send 96% fewer.
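Those relative drops follow directly from the reported CTRs. A quick sketch of the arithmetic:

```python
# Derive the relative referral drop from the CTRs in the TollBit report.
google_ctr = 8.63      # % - traditional Google search
ai_search_ctr = 0.74   # % - AI search engines
chatbot_ctr = 0.33     # % - AI chatbots

def pct_fewer(ctr: float, baseline: float) -> int:
    """Percent fewer referrals than the baseline, rounded to a whole percent."""
    return round((1 - ctr / baseline) * 100)

print(pct_fewer(ai_search_ctr, google_ctr))  # 91
print(pct_fewer(chatbot_ctr, google_ctr))    # 96
```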

Why we care. This is bad news for publishers because it suggests AI search traffic won’t replace the referrals traditional search delivers. As AI-generated answers replace direct website visits, expect this trend to continue.

By the numbers. AI bot scraping doubled (+117%) between Q3 and Q4 2024. Also:

  • The average number of scrapes from AI bots per website for Q4 was 2 million, with another 1.89 million done by hidden AI scrapers.
  • 40% more AI bots ignored robots.txt in Q4 than in Q3.
  • ChatGPT-User bot activity skyrocketed by 6,767.60%, making it the most aggressive scraper.
  • Top AI bots by share of scraping activity:
    • ChatGPT-User (15.6%)
    • Bytespider (ByteDance/TikTok) (12.44%)
    • Meta-ExternalAgent (11.34%)
  • PerplexityBot continued sending referrals to sites that had explicitly blocked it, raising concerns about undisclosed scraping.
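Publishers who want to verify what their robots.txt actually permits can test it with Python’s standard-library parser. A minimal sketch, using a hypothetical rule set that disallows two of the crawlers named above:

```python
from urllib import robotparser

# Hypothetical robots.txt disallowing two AI crawlers named in the report,
# while leaving the site open to everything else.
rules = """\
User-agent: ChatGPT-User
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# can_fetch() answers: would a *compliant* bot with this user agent fetch this URL?
print(rp.can_fetch("ChatGPT-User", "https://example.com/article"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/article"))     # True
```

Note that robots.txt only states a preference. The report’s finding that 40% more AI bots ignored it in Q4 underscores that enforcement has to happen elsewhere, such as server-side user-agent blocking.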

Context. One company, Chegg, is attempting to sue Google over AI Overviews. Chegg claims Google’s search feature has severely damaged its traffic and revenue.

About the data. There’s no methodology section, so it’s not entirely clear how many websites were analyzed, only that the report is based on “all onboarded TollBit sites in Q4.” TollBit says it “helps over 500 publisher sites.”

The report. TollBit State of the Bots – Q4 2024 (registration required)
