Google Ads has begun rolling out channel control for select Demand Gen campaigns. This feature will let you specify where your ads appear across Google’s properties.
Yes, but. While the feature is live, segmentation by individual channel (e.g., YouTube, Discover, Gmail) is not yet available. This will limit your ability to make data-driven adjustments.
Why we care. This update, first announced in January, gives advertisers more control over campaign placement, but the full impact remains unclear since performance data is still aggregated under “Google-owned channels.”
What they’re saying. Greg Kholer, director of digital marketing at ServiceMaster, shared his reaction to the update on LinkedIn:
“While exciting, we won’t be making any changes until we’re able to see channel performance segmented out – as of today it’s still all lumped together as ‘Google owned channels’”.
What’s next: More search marketers will likely hold off on changes until Google provides detailed channel performance breakdowns.
Google is testing placing special and very clickable links in its AI Overviews, but they don’t point to publishers or your own website; they point back to Google’s own search results. You read that right: Google is testing linking AI Overviews back to new search queries on Google.com.
Clicking those underlined links in the text of the AI Overview, both at the top and in the middle section, takes you to a new Google Search. The smaller link icons open the side panel, which links out to publishers and external websites.
Why we care. Google keeps saying, most recently about its new AI Mode, that it “prominently surfaces relevant links to help people find web pages and content they may not have discovered before” (that quote is Google’s). What does that promise mean here?
Those link icons in AI Overviews have been shown to result in a drop in CTR from Google Search to external websites. We’ve already seen a large company sue Google over traffic drops attributed to AI Overviews.
These links back to Google will grow Google’s search volume, but do they actually help the searcher, or do they only help Google’s bottom line?
What about Google saying they value independent publishers and are prioritizing ways to send traffic to publishers in AI Overviews?
Microsoft has added the ability to compare date ranges within the Search Performance report in Bing Webmaster Tools. This allows you to compare metrics and data points such as clicks, impressions, CTR, keywords, and pages across various date ranges.
Microsoft said: “These enhancements enable in-depth analysis, providing invaluable insights for optimizing online presence and improving visibility. This means businesses can now make more informed decisions, identify key trends, and effectively adjust their marketing strategies to achieve better results.”
Microsoft also made user experience improvements in time filters, the company added.
What it looks like. Here is a screenshot showing the new “compare” option in the Search Performance report in Bing Webmaster Tools:
Why it’s helpful. Microsoft listed a few ways these comparisons help you, including:
Identifying key trends and patterns
Evaluating the effectiveness of their marketing campaigns
Gaining insights from seasonal search data
Setting benchmarks and goals using historical data
Identifying areas for improvement in search performance
Why we care. Google has offered date-range comparisons in Search Console for a while, and most third-party tools offer them too.
But now you can compare this data directly in Bing Webmaster Tools without first exporting it to a third-party tool. That should save you time when analyzing your Bing Search data.
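If you have been doing this comparison outside the tool, the logic is easy to reproduce for your own reporting. Here’s a minimal pandas sketch, assuming two CSV exports of the Search Performance keywords view (file and column names are illustrative, not the tool’s actual export format):

```python
import pandas as pd

# Assumed exports of the Search Performance keywords view for two periods.
# Adjust file paths and column names to match your actual export.
current = pd.read_csv("bing_keywords_2025-02.csv")    # columns: Keyword, Clicks, Impressions
previous = pd.read_csv("bing_keywords_2025-01.csv")

merged = current.merge(previous, on="Keyword", suffixes=("_current", "_previous"))

# Period-over-period deltas, the same comparison the new UI option surfaces.
merged["clicks_delta"] = merged["Clicks_current"] - merged["Clicks_previous"]
merged["ctr_current"] = merged["Clicks_current"] / merged["Impressions_current"]
merged["ctr_previous"] = merged["Clicks_previous"] / merged["Impressions_previous"]
merged["ctr_delta"] = merged["ctr_current"] - merged["ctr_previous"]

# Biggest keyword losers first.
print(merged.sort_values("clicks_delta").head(10))
```

With the native compare option, Bing Webmaster Tools now does this join and delta calculation for you.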
How does Bluesky differ from other social media platforms?
The platform, which describes itself as “social media as it should be,” has a similar look and feel to how X appeared back when it was Twitter.
Like most social media platforms, it allows users to post, repost, like, and share content.
The most significant difference between Bluesky and many other platforms is that it is decentralized, meaning it allows users to host their own data and create custom feeds.
This helps reduce the risk of data breaches and puts control back in the hands of users.
This decentralized approach, combined with the platform’s strong content moderation tools, is highly appealing to social media users who have become discouraged by recent changes to platforms such as X.
Above: An example of content moderation controls available for Bluesky users
So, how can I advertise on Bluesky?
Hold your horses!
There is no advertising on Bluesky.
While they haven’t ruled out the possibility of ads being available in the future, advertisers won’t be able to add Bluesky to their PPC strategy anytime soon.
Even if or when Bluesky does introduce advertising, it is likely to look different from the models used by other social media platforms.
Speaking to CNBC in November, COO Rose Wang stated that while the company is open to the idea of including ads on the platform, they don’t want it to be an “intrusive experience.”
If there are no ads, then how is Bluesky making money?
Bluesky is funded by investors and venture capital firms.
In November, Bluesky announced it was developing an optional subscription model. Subscriptions are rumored to provide users access to additional features, such as the ability to use a custom domain in their username and higher video upload limits.
Why ads on Bluesky would be a good thing
While there aren’t any opportunities to advertise on Bluesky, that doesn’t mean the time won’t come.
With a growing user base and increasing operational costs, Bluesky is likely to introduce some form of advertising eventually.
When that happens, here are some reasons why PPC advertisers might want to give it a try:
PPC advertisers are becoming increasingly frustrated with the developments – or lack thereof – on platforms like Meta and LinkedIn, which have long dominated the market.
The introduction of Bluesky to advertisers’ PPC strategy would allow them to try something new and potentially avoid the grievances and pitfalls they are experiencing on other advertising platforms.
Early adopters are likely to gain an advantage over competitors if they quickly embrace any potential advertising openings.
Opportunities to be the first business in your niche to advertise on a platform are rare, so advertisers could benefit from staking their claim early and cementing their presence.
Bluesky’s strong focus on privacy, moderation, and protection from misinformation could offer advertisers a more brand-safe advertising opportunity.
With recent reports of Google Ads placements appearing on illegal and compromising websites, PPC advertisers have increasing and legitimate concerns regarding where their ads are displayed.
If Bluesky introduces advertising, PPC advertisers should also consider potential drawbacks. These concerns include:
One of the biggest challenges in advertising on Bluesky stems from what makes it unique – its decentralized approach. While this is appealing and important for users, it may complicate advertising due to data and privacy restrictions.
Depending on the type of advertising model offered, PPC advertisers may need to rethink their targeting and messaging strategies, which could limit their efficiency.
Many Bluesky users have migrated from X, seeking a platform that values privacy over profit. As a result, Bluesky users are more likely to resist advertising, which could impact ad performance in terms of driving leads or sales.
Early adopters of new technology often face challenges such as technical bugs and unknown performance benchmarks. Not all businesses may be in a position to take on such risks if or when advertising reaches the Bluesky platform.
What should PPC advertisers be doing now?
In preparation for the likely launch of some form of advertising model on the platform, there are steps PPC advertisers can take now:
1. Claim your handle
Advertisers should secure the handle for their business or clients to ensure they have access to usernames that accurately represent their brand.
Above: A screenshot of the Search Engine Land account on Bluesky
2. Explore the platform
Advertisers should spend time navigating Bluesky to:
Understand its interface.
Identify the types of organic content that perform well.
Become familiar with the platform’s extensive moderation tools.
3. Build an organic presence
By posting content and engaging with the community, advertisers can build trust and recognition for their brand.
This proactive approach can help businesses connect with Bluesky’s user base – especially if the audience remains resistant to traditional ads.
What’s next?
PPC advertisers will need to hold off a little longer before adding Bluesky to their PPC strategy.
However, with the platform growing rapidly, some form of advertising model is likely to arrive within the next 18 months – and we’ll be sure to keep you updated.
When it comes to keyword research, SEO professionals often rely on expensive tools to find the right keywords.
However, Google Search Console (GSC) provides a completely free way to access insights straight from Google itself.
GSC is a powerful and often underused tool that shows exactly what is working on a site and where improvements can be made.
Unlike other keyword research tools that provide generic suggestions and estimated data, GSC delivers real-life search data based on actual searches leading to a website. It can often uncover interesting insights.
Here’s how to use GSC to find valuable keyword opportunities and improve rankings.
Why use Google Search Console for keyword research?
Google Search Console is a goldmine for keyword insights. Here’s why you should use it.
Free and requires no subscriptions: Many SEO tools require costly subscriptions, but GSC is completely free, making it accessible to businesses of all sizes.
Provides real keyword performance data: Most keyword research tools provide estimated search volumes, but GSC shows actual data on searches that lead users to your site, ensuring accuracy.
Helps identify keywords with high optimization potential: Analyzing existing keyword rankings allows you to optimize content and improve visibility with small tweaks.
Uncovers content gaps and new topic opportunities: GSC reveals queries that may not have been intentionally targeted but are already driving traffic, providing ideas for new content.
Tracks keyword performance over time: You can monitor how rankings fluctuate, which keywords are growing in importance, and how search behavior is evolving.
Helps understand search intent: By analyzing query data, you can refine content to better match user intent and increase engagement.
Provides device-specific insights: Performance can vary between desktop and mobile users, and GSC helps fine-tune SEO strategies accordingly.
GSC can also reveal irrelevant search terms bringing traffic to a site. Some queries may drive traffic that does not align with the intended audience, leading to vanity traffic that skews reports.
How to manage irrelevant search terms
Identify keywords bringing in non-relevant traffic that do not contribute to conversions or engagement.
Adjust on-page content and metadata to clarify the intent of the page.
Use negative keywords in paid search campaigns if these terms are also appearing in PPC reports.
Monitor engagement rates and session duration for traffic from these terms to gauge whether it delivers any real value.
Example
If a bathroom renovation site ranks for “how to clean a kitchen splashback,” that traffic is unlikely to convert into meaningful engagement.
Identifying and minimizing such cases ensures that a site is optimized for relevant search terms.
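If you prefer pulling this query data programmatically rather than through the UI, the Search Console API exposes the same Performance report data. Below is a minimal sketch using google-api-python-client, assuming you have already verified the property and created OAuth credentials; the property URL, token file, and the “kitchen” filter term are placeholders for illustration:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assumes you previously completed the OAuth flow and saved a token file.
creds = Credentials.from_authorized_user_file("token.json")
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",   # placeholder property
    body={
        "startDate": "2025-01-01",
        "endDate": "2025-03-01",
        "dimensions": ["query", "page"],
        "rowLimit": 5000,
    },
).execute()

# Flag queries that look off-topic for manual review (placeholder term).
off_topic = [r for r in response.get("rows", []) if "kitchen" in r["keys"][0]]
for row in off_topic:
    print(row["keys"], row["clicks"], row["impressions"])
```

From there, you can decide whether to adjust the page’s content and metadata or simply accept the traffic as incidental.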
Step 4: Track overall keyword performance
GSC provides detailed performance tracking without the need for a paid keyword tracking tool.
Key metrics to check
Total clicks: The number of visitors coming from search results.
Total impressions: The number of times a site appears in search results.
Click-through rate (CTR): The percentage of users who click after seeing a result.
Average position: The ranking in Google search results.
Branded vs. non-branded search terms: Understanding the balance between brand visibility and new audience acquisition.
Device-specific performance: Identifying whether certain keywords perform better on mobile vs. desktop.
Unlike most SEO tools that limit the number of keywords tracked, GSC offers unlimited data on how a site is performing.
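As a quick illustration, here’s a short pandas sketch that computes these metrics from a CSV export of the Performance report’s queries table and splits branded from non-branded terms. The brand terms and column names are assumptions; adjust them to match your export:

```python
import pandas as pd

# Expected columns: Query, Clicks, Impressions, Position (rename if your export differs).
df = pd.read_csv("gsc_queries_export.csv")

brand_terms = ["acme", "acme co"]  # placeholder brand variants
df["branded"] = df["Query"].str.lower().str.contains("|".join(brand_terms))

summary = df.groupby("branded").agg(
    clicks=("Clicks", "sum"),
    impressions=("Impressions", "sum"),
    avg_position=("Position", "mean"),
)
# Recompute CTR from clicks and impressions rather than parsing the export's CTR strings.
summary["ctr"] = summary["clicks"] / summary["impressions"]
print(summary)
```

This gives you a branded vs. non-branded view of clicks, impressions, CTR, and average position in a few lines.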
For websites targeting multiple countries, understanding geographic search performance can help refine international SEO strategies and localize content for different markets.
How to use it
In Google Search Console, navigate to Performance > Search Results.
Click on the Countries tab to see a breakdown of traffic by region.
Identify which countries are driving the most organic traffic and how search trends vary between locations.
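The same breakdown is available programmatically: the searchanalytics.query call shown earlier accepts "country" as a dimension. A brief sketch, reusing the service object from the earlier example:

```python
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",   # placeholder property
    body={
        "startDate": "2025-01-01",
        "endDate": "2025-03-01",
        "dimensions": ["country"],   # returned as ISO 3166-1 alpha-3 codes, e.g. "usa", "gbr"
        "rowLimit": 250,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])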
Google Search Console is a powerful and often overlooked tool for keyword research.
It provides real data directly from Google, showing exactly how a site is performing in search.
Use it to find quick-win keyword opportunities.
Identify new content ideas based on real user searches.
Eliminate vanity traffic that does not convert.
Track performance trends and adjust SEO strategies accordingly.
By using GSC effectively, you can uncover high-impact opportunities, refine strategies, and drive meaningful improvements in search performance – all without spending a penny on keyword research tools.
With the robots.txt file, site owners have a simple way to control which parts of a website are accessible by crawlers. To help site owners further express how search engines and web crawlers can use their pages, the web standards group came up with robots meta tags in 1996, just a few months after meta tags were proposed for HTML (and anecdotally, also before Google was founded). Later, X-Robots-Tag HTTP response headers were added. These instructions are sent together with a URL, so crawlers can only take them into account if they’re not disallowed from crawling the URL through the robots.txt file. Together, they form the Robots Exclusion Protocol (REP).
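For anyone building a crawler, Python ships a small parser for the robots.txt half of the REP. A minimal sketch (the user agent and URLs are placeholders), which also illustrates why page-level rules depend on crawl access: robots meta tags and X-Robots-Tag headers can only be honored if the crawler is allowed to fetch the URL in the first place:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the robots.txt file

# Only if crawling is allowed can the crawler fetch the URL and then honor
# any robots meta tag or X-Robots-Tag header it finds in the response.
if rp.can_fetch("ExampleBot", "https://www.example.com/private/report.html"):
    print("Allowed to crawl; next, check the page's robots meta tag / X-Robots-Tag.")
else:
    print("Disallowed by robots.txt; page-level robots rules will never be seen.")
```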
Start by diagnosing what’s working on your site and what isn’t.
Then, apply targeted fixes based on real data. Not hunches.
In this guide, I’m sharing my lessons and strategies from 10+ years in digital marketing.
Plus, I interviewed four leading ecommerce website optimization experts for their best conversion-driving insights:
Leigh McKenzie from UnderFit (also head of SEO @ Backlinko)
Rishi Rawat from Frictionless Commerce
Anna Bolton from Conversion Copy Co.
Kurt Philip from Convertica
Let’s start by identifying the biggest roadblocks standing between you and more revenue.
Phase 1: Analyze and Diagnose Your Site’s Existing Issues
Every effective ecommerce website optimization strategy starts with a solid, data-driven diagnosis.
As statistician W. Edwards Deming once said:
“Without data, you’re just another person with an opinion.”
Quantitative Research: Finding Patterns in the Numbers
Quantitative research focuses on analyzing data to identify trends and behaviors.
It helps you answer questions about your online store’s performance, such as:
Where are visitors dropping off in the funnel?
What are users actually doing on each page (scrolling, clicking)?
How does behavior differ across traffic segments (e.g., mobile vs. desktop, organic vs. paid)?
The good news:
There are many tools to help you with this analysis.
Google Analytics (GA4)
Google Analytics provides helpful insights into user behavior and website performance.
Including how visitors from different traffic sources behave.
For example, to uncover drop-off points during checkout:
Navigate to Reports > Monetization > Checkout journey.
This lets you examine the flow from checkout to purchase.
And analyze abandonment rates for each stage to identify potential bottlenecks.
For example, a high abandonment rate on the payment page might signal technical issues.
Or trust barriers, such as last-minute doubt about product quality.
Pro tip: There’s no universal definition of a high abandonment rate. It varies by industry, funnel, and goals. Compare it against your historical data to see if there’s a problem.
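If you’d rather pull those funnel numbers programmatically than through the UI, the GA4 Data API exposes the underlying event counts. A rough sketch, assuming the google-analytics-data client library, application-default credentials, and a placeholder property ID (the exact metric names depend on how your ecommerce events are set up):

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import DateRange, Metric, RunReportRequest

client = BetaAnalyticsDataClient()  # uses GOOGLE_APPLICATION_CREDENTIALS

request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    date_ranges=[DateRange(start_date="28daysAgo", end_date="yesterday")],
    metrics=[Metric(name="checkouts"), Metric(name="ecommercePurchases")],
)
report = client.run_report(request)

checkouts = float(report.rows[0].metric_values[0].value)
purchases = float(report.rows[0].metric_values[1].value)
print(f"Checkout-to-purchase abandonment: {1 - purchases / checkouts:.1%}")
```

Compare that abandonment figure against your own historical baseline, as noted above, rather than an arbitrary industry number.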
Hotjar
Hotjar, a heatmap and behavior analytics tool, is incredibly powerful for qualitative research (more on that soon).
It gives you a clear picture of how online shoppers interact with your site.
And lets you uncover friction points that frustrate users.
For example, click tracking reveals where visitors interact with your site.
And which elements get the most engagement.
Scroll heatmaps show you how far users make it down a page. And where they drop off.
Warm colors (like red) highlight areas of high engagement, while cool colors (like blue) signal lower engagement.
Move heatmaps track how shoppers move their mouse across the page.
This reveals areas of interest and hesitation.
Session replays let you watch real user recordings, showing exactly how visitors navigate your site.
Watch this in action below:
Semrush Site Audit
Semrush’s Site Audit tool uncovers technical issues that affect SEO and the user experience (UX).
For example, it flags crawl errors, which are usually caused by broken links or incorrect redirects.
These dead ends confuse users and make it harder for search engines to crawl your pages.
(And if Google struggles to crawl them, your ranking can take a hit.)
The tool also identifies slow-loading pages that frustrate visitors.
It can also identify code bloat (aka too much JavaScript or unused CSS) that makes pages sluggish.
This can cause delayed interactions that lower conversion rates.
Qualitative Research: Uncover the “Why” Behind the Data
Qualitative research helps you understand why customers behave the way they do.
Including their pain points, motivations, and desires.
It also helps you identify barriers to conversion, such as hesitations about buying.
And learn about other products your web visitors are considering.
Qualitative Research Methods
There are many data sources for qualitative insights.
And each one can reveal different issues and opportunities:
Research Method | What to Look For | Effort Level
Recorded sales calls | Patterns in customer questions, objections, or recurring themes | Low
Live chat transcripts | Common pain points, frequently asked questions, or sources of confusion | Low
Customer reviews | Trends in positive and negative feedback, including specific phrases or words that highlight desires, frustrations, or expectations | Low
Online surveys | Customer sentiment toward brand messaging and tone, and reasons for abandonment | Low to medium
Customer interviews | Insights into customer motivations, needs, anxieties, and desires in their own words | Medium to high
User testing sessions | Usability issues, unexpected user behaviors, or areas where users struggle to complete tasks | High
But you don’t need to go all-in on every qualitative method right off the bat.
Start with the data you already have.
Then, gradually level up as time and resources allow.
Turn Your Research Into Actionable Insights
You’ve got the research.
Now, you need a system to organize it.
As Anna Bolton, chief CRO and conversion copywriter of Conversion Copy Co., says:
The challenge isn’t just gathering research—it’s making sense of it. Whether you’re analyzing heatmaps, surveys, or reviews, you need to turn that data into meaningful insights. This starts with proper analysis to identify key patterns and trends. And then you need to understand that data in context—what it means for your business, audience, and goals. That’s what turns raw data into results.
So, what do you do?
Build a research repository to bring all your insights together in one place.
Think of it as a living database of findings and insights. This way, it’s easier for you to act on data.
But you don’t need anything fancy.
Start with a simple spreadsheet.
Include everything from customer research (interviews, surveys) to conversion rate optimization (CRO) results and survey data.
For example, Anna and I use a spreadsheet like the one below for one-off client projects.
For larger-scale projects, use UX research tools like Aurelius Lab and Dovetail.
These tools offer more advanced ways to store, categorize, and retrieve insights.
Phase 2: Apply Ecommerce Website Optimization Fixes to Increase Conversions
The ecommerce website optimization best practices we’re about to cover are designed to do one thing:
Improve the customer experience.
And when you do that, conversions naturally follow.
Side note: If you’re here for SEO tips, stick around. While I’m focusing on conversion rate optimization, CRO and SEO are becoming increasingly intertwined. Anything you do to make your site better for humans will also make Google happy.
As Leigh McKenzie, head of SEO at Backlinko and owner of UnderFit, says:
“Conversion rate optimization is becoming more and more an SEO responsibility. Google heavily rewards websites that deliver a positive user experience. It’s no longer about just bringing traffic. It’s also about what happens when people get there.”
Begin with the pages that offer the quickest wins, such as product and checkout pages.
This is what Rishi Rawat, product page optimization specialist at Frictionless Commerce, does.
I work exclusively on bestselling product pages because they have the highest impact. My goal is to turn first-time visitors into buyers. Since these pages already drive a big share of the store’s revenue, I don’t spread optimization efforts thin. Instead, I improve the sales pitch and sharpen the product story. And then I make what’s already working even more persuasive.
So, how do you identify your site’s high-impact pages?
These are the pages that attract visitors in the decision and action stages. Such as product pages or the cart page.
But you might also include other pages based on user behavior.
For example, optimize the product and cart pages if your site has high cart abandonment.
This ensures the product page sets the right expectations.
So, when shoppers get to checkout, they feel confident in their choice.
But, if your goal is to boost mobile sales, optimize the mobile experience first.
Want to maximize paid ads conversions? Make product landing pages a priority.
3. Make Navigation and Search Intuitive
Shoppers don’t always leave because they dislike your products.
Sometimes, they leave because they can’t find what they’re looking for.
That’s why navigation plays a big role in ecommerce website optimization.
If your navigation makes users rethink their next step, you’re already losing them.
For example, imagine you’re searching for dog crates on pet company Chewy’s website.
You sort the results by price.
But now, the first products you see are lock latch replacements and crate pans—not dog crates.
That’s a bad user experience.
And it might cost them the sale.
The solution?
Always test filters before launch to ensure they work as expected.
And design navigation to adapt to various browsing behaviors.
Make backtracking easy with breadcrumbs and a “Recently Viewed” section.
Plus, use AI to suggest relevant filters, related categories, and top products.
Navigation also impacts SEO.
As Leigh put it,
Good navigation isn’t just about getting users to a page. It’s about keeping them engaged in the shopping process. Shoppers want to see product variations, compare options, and refine their choices easily. When they do, they stay longer. And that’s what Google values. It favors sites where users engage rather than bounce back to search results. That’s why you want to optimize for getting people deeper into the experience.
Forcing people to create an account is an unnecessary barrier. You can just auto-generate one for them. Let them check out first, and then send them a confirmation email with their details. And a ‘Set Your Password’ option later. That way, the process stays frictionless, and they still get an account without effort.
But consider this:
Your job doesn’t stop when someone adds an item to the cart.
This is your chance to remove any last-minute hesitation and get the sale.
Ridge Wallet, an accessories manufacturer, does this well.
It displays social proof at the top of the checkout page by highlighting its “100K+ 5-star reviews.”
It also includes trust boosters like a risk-free trial and fast shipping.
Outdoor gear company Patagonia highlights its “Ironclad Guarantee” on the checkout page.
This reassures buyers that buying is risk-free.
And it also strengthens Patagonia’s credibility.
Clothing company Everlane also understands the power of timing.
It reminds shoppers of first-time buyer discounts at checkout to encourage them to take advantage of savings.
CRM data (buying history, abandoned carts): Powers retargeting campaigns and perfectly timed offers
Predictive insights: Uses AI to analyze patterns and predict needs
9. A/B Test to Learn. Not Just to Win.
At the heart of ecommerce website optimization is A/B testing.
But here’s the thing:
Your goal isn’t just about finding a “winning variation.”
It’s to learn more about the psychology of your buyers.
As Jonny Longden, chief growth officer at Speero, puts it:
When you run a test, whether it wins or loses is in some ways irrelevant because you can learn something from it. Some of the most successful tests that you will run happen as a result of a test that lost. When you chase winners, you ignore that fact.
For example, if a trust badge increases conversions, the real takeaway isn’t just that the badge works.
It’s that customers need more reassurance before they give you their credit card.
This insight goes beyond checkout.
It suggests that trust signals should be reinforced earlier in the buying journey. On product pages, in the cart, and even in post-purchase messaging.
Why?
If hesitation exists at checkout, it likely started long before.
One more thing.
A/B testing only works if you have enough traffic to reach statistical significance.
Kurt says your test page should receive at least 10,000 visits per month.
This gives you meaningful insights in a reasonable timeframe.
But traffic alone isn’t enough.
What matters is whether you can reach statistical significance. This ensures your results aren’t just due to chance.
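As a rough illustration of that significance check (not a full testing framework), here’s a two-sided, two-proportion z-test on made-up conversion counts for a control page and a variant with a trust badge:

```python
from math import sqrt, erf

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical test: 3.0% vs. 3.6% conversion on 10,000 visits each.
p = two_proportion_p_value(conv_a=300, n_a=10_000, conv_b=360, n_b=10_000)
print(f"p-value: {p:.4f}")  # below 0.05 here, so unlikely to be chance alone
```

The point isn’t the formula itself; it’s that without enough traffic, the p-value stays high and you can’t tell a real lift from noise.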
AI search engines and chatbots often provide wrong answers and make up article citations, according to a new study from Columbia Journalism Review.
Why we care. AI search tools have ramped up the scraping of your content so they can serve answers to their users, often resulting in no clicks to your website. Also, click-through rates from AI search and chatbots are much lower than Google Search, according to a separate, unrelated study. But hallucinating citations makes an already bad situation even worse.
By the numbers. More than half of the responses from Gemini and Grok 3 cited fabricated or broken URLs that led to error pages. Also, according to the study:
Overall, chatbots provided incorrect answers to more than 60% of queries:
Grok 3 (the highest error rate) answered 94% of the queries incorrectly.
Gemini only provided a completely correct response on one occasion (in 10 attempts).
Perplexity, which had the lowest error rate, answered 37% of queries incorrectly.
What they’re saying. The study authors (Klaudia Jaźwińska and Aisvarya Chandrasekar), who also noted that “multiple chatbots seemed to bypass Robot Exclusion Protocol preferences,” summed up this way:
“The findings of this study align closely with those outlined in our previous ChatGPT study, published in November 2024, which revealed consistent patterns across chatbots: confident presentations of incorrect information, misleading attributions to syndicated content, and inconsistent information retrieval practices. Critics of generative search like Chirag Shah and Emily M. Bender have raised substantive concerns about using large language models for search, noting that they ‘take away transparency and user agency, further amplify the problems associated with bias in [information access] systems, and often provide ungrounded and/or toxic answers that may go unchecked by a typical user.’”
About the comparison. This analysis of 1,600 queries compared the ability of generative AI tools (ChatGPT search, Perplexity, Perplexity Pro, DeepSeek search, Microsoft Copilot, xAI’s Grok-2 and Grok-3 search, and Google Gemini) to identify an article’s headline, original publisher, publication date, and URL, based on direct excerpts of 10 articles chosen at random from each of 20 publishers.
AI-powered search engines (e.g., OpenAI’s ChatGPT, Perplexity) are failing to drive meaningful traffic to publishers while their web scraping activities increase. That’s one big takeaway from a recent report from TollBit, a platform that says it helps publishers monetize their content.
CTR comparison. Google’s average search click-through rate (CTR) was 8.63%, according to the report. However, the CTR for AI search engines was just 0.74%, and for AI chatbots, 0.33%. That means AI search sends about 91% fewer referrals than traditional search, and chatbots send about 96% fewer.
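For anyone who wants to sanity-check the referral math behind those percentages, here’s the arithmetic spelled out:

```python
google_ctr = 8.63     # % per the report
ai_search_ctr = 0.74
ai_chatbot_ctr = 0.33

print(f"AI search referral drop: {1 - ai_search_ctr / google_ctr:.0%}")    # ~91%
print(f"AI chatbot referral drop: {1 - ai_chatbot_ctr / google_ctr:.0%}")  # ~96%
```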
Why we care. This is bad news for publishers because it shows AI search won’t replace traditional search traffic. As AI-generated answers replace direct website visits, you should expect to see this trend continue.
By the numbers. AI bot scraping doubled (+117%) between Q3 and Q4 2024. Also:
The average number of scrapes from AI bots per website for Q4 was 2 million, with another 1.89 million done by hidden AI scrapers.
40% more AI bots ignored robots.txt in Q4 than in Q3.
ChatGPT-User bot activity skyrocketed by 6,767.60%, making it the most aggressive scraper.
Top AI bots by share of scraping activity:
ChatGPT-User (15.6%)
Bytespider (ByteDance/TikTok) (12.44%)
Meta-ExternalAgent (11.34%)
PerplexityBot continued sending referrals to sites that had explicitly blocked it, raising concerns about undisclosed scraping.
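If you want to see how much of this scraping hits your own site, a rough starting point is counting requests by user agent in your server access logs. A minimal sketch, assuming a standard combined-format log at a placeholder path and matching a few known AI crawler user-agent strings (the list is illustrative, not exhaustive, and won’t catch scrapers that hide their identity):

```python
import re
from collections import Counter

AI_BOTS = ["ChatGPT-User", "GPTBot", "Bytespider", "meta-externalagent",
           "PerplexityBot", "ClaudeBot"]

counts = Counter()
with open("access.log") as f:  # placeholder path to a combined-format access log
    for line in f:
        ua_match = re.search(r'"([^"]*)"\s*$', line)  # user agent is the last quoted field
        if not ua_match:
            continue
        ua = ua_match.group(1).lower()
        for bot in AI_BOTS:
            if bot.lower() in ua:
                counts[bot] += 1

for bot, hits in counts.most_common():
    print(f"{bot}: {hits} requests")
```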
Context. One company, Chegg, is attempting to sue Google over AI Overviews. Chegg claims Google’s search feature has severely damaged its traffic and revenue.
Google announced last week an expansion of AI Overviews. It is now starting to show AI Overviews to users who aren’t logged in.
About the data. There’s no methodology section, so it’s not entirely clear how many websites were analyzed, just that it’s based on “all onboarded TollBit sites in Q4.” TollBit says it “helps over 500 publisher sites.”
Google Ads is significantly increasing the negative keyword limit for Performance Max (PMax) campaigns, raising the cap from 100 to 10,000 per campaign, aligning with Search campaigns.
By the numbers:
Previous cap: 100 negative keywords per PMax campaign
New cap: 10,000 negative keywords per PMax campaign
Rollout timeline: Next few weeks for all PMax advertisers
Why we care. Advertisers had expressed frustration that the previous 100-keyword limit was too restrictive, limiting control over where their ads appeared. The update provides greater flexibility while maintaining campaign effectiveness.
The big picture: In an update on X, Google Ads Liaison Ginny Marvin said the cap ensures system flexibility while giving advertisers more control. She also advised using negative keywords carefully to avoid limiting conversions.
What’s next: Google is working on further enhancements, including support for negative keyword lists in PMax later this year. Advertisers can also use tools like brand exclusions and account-level negative keywords for additional control.
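For advertisers who manage negatives through the Google Ads API rather than the UI, adding campaign-level negative keywords generally looks like the sketch below, using the official Python client. Treat it as an assumption that Performance Max campaigns accept these criteria in your account once the rollout reaches you; the IDs and keyword list are placeholders:

```python
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")  # your API config file
customer_id = "1234567890"   # placeholder account ID
campaign_id = "9876543210"   # placeholder Performance Max campaign ID

criterion_service = client.get_service("CampaignCriterionService")
campaign_resource = client.get_service("CampaignService").campaign_path(customer_id, campaign_id)

operations = []
for text in ["free", "cheap", "jobs"]:  # placeholder negative keywords
    operation = client.get_type("CampaignCriterionOperation")
    criterion = operation.create
    criterion.campaign = campaign_resource
    criterion.negative = True                # mark the keyword criterion as a negative
    criterion.keyword.text = text
    criterion.keyword.match_type = client.enums.KeywordMatchTypeEnum.EXACT
    operations.append(operation)

response = criterion_service.mutate_campaign_criteria(
    customer_id=customer_id, operations=operations
)
print(f"Added {len(response.results)} negative keywords.")
```

The higher 10,000-keyword cap mainly matters if you batch large negative lists this way; as Marvin notes, over-applying negatives can still choke off conversions.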