Google VP of Ads bets on AI to transform ads into tailored consumer journeys

Google’s Ads and Commerce product lead, Vidhya Srinivasan, today outlined how the company is reimagining advertising as “avenues for tailored exploration” in response to unpredictable consumer behavior.

The big picture: Google is focusing on three key solutions to help advertisers break through:

  • AI-powered shopping innovations. Google launched several new shopping features, including ads in Lens, AI-powered Google Shopping, 3D product spins, and virtual try-on experiences for clothing items.
  • YouTube creator partnerships. The platform’s highly engaged audiences, particularly Gen Z, trust creator recommendations 98% more than those on other social platforms (according to Google figures). Google is developing more interactive ads with the aim of helping brands connect with relevant creators.
  • Enhanced search experiences. AI-powered features like AI Overviews, Circle to Search, and Google Lens are expanding the types of questions people can ask. These new search capabilities have the potential to increase commercial query volume.

Why we care. As consumer behavior becomes increasingly fragmented across devices and platforms, Google is betting on AI to help advertisers create more personalized, relevant content that can break through the noise.

With consumers rapidly switching between devices and platforms, these AI-powered solutions have the potential to help advertisers maintain visibility throughout the entire customer journey, from discovery to purchase, while leveraging trusted creator relationships that drive higher engagement, particularly among younger audiences.

Still, it is key to ensure that adequate human oversight remains in place as AI capabilities keep improving and evolving.

By the numbers (according to Google internal research):

  • People shop more than a billion times daily across Google
  • Consumers used Google or YouTube in approximately two-thirds of purchases where they discovered something new
  • YouTube viewers watch over 1 billion hours of content daily on TVs
  • Google processes more than 5 trillion searches annually (roughly 417 billion searches per month)

Between the lines. Srinivasan’s letter emphasizes that simply creating compelling content isn’t enough. Brands need to “show up everywhere people are, from discovery to decision” to capture attention in today’s fragmented media landscape.

Bottom line. Srinivasan points to several AI-powered advertising innovations already launched, including ads in Lens, AI-powered shopping, 3D spins for ad images, and virtual try-on features for clothing, with promises of “much more to come.”

Google is positioning itself as the solution to fragmented consumer attention by helping brands create more relevant content and appear at critical moments across the customer journey, from discovery to purchase decision.


Google now sees more than 5 trillion searches per year

Google processes more than 5 trillion searches per year. This is the first time Google has publicly shared such a figure since 2016, when the company confirmed it was handling “more than 2 trillion” queries annually.

By the numbers. Google revealed the new figure in a blog post today, saying it is based on internal Google data:

  • “We already see more than 5 trillion searches on Google annually.”

Google added another tidbit in the same blog post: that “the volume of commercial queries has increased” since the launch of AI Overviews. However, Google didn’t share any data or percentage to explain by how much commercial queries have increased.

Searches per second, minute, day and month. Now that we have an updated figure, we can also estimate how many Google searches there are pretty much down to the second. Here’s a breakdown based on this new Google data point (a short script after the list reproduces the arithmetic):

  • Searches per second: 158,548.
  • Searches per minute: 9.5 million.
  • Searches per hour: 571 million.
  • Searches per day: 14 billion.
  • Searches per month: 417 billion.
  • Searches per year: More than 5 trillion.
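
These figures are straight division from the annual total. Here is a minimal Python sketch of the arithmetic, assuming a 365-day year (rounding accounts for the small differences from the list above):

    # Derive per-period volumes from "more than 5 trillion searches annually"
    ANNUAL_SEARCHES = 5_000_000_000_000  # 5 trillion, per Google

    per_day = ANNUAL_SEARCHES / 365
    per_hour = per_day / 24
    per_minute = per_hour / 60
    per_second = per_minute / 60
    per_month = ANNUAL_SEARCHES / 12

    print(f"Per second: {per_second:,.0f}")  # ~158,549
    print(f"Per minute: {per_minute:,.0f}")  # ~9.5 million
    print(f"Per hour:   {per_hour:,.0f}")    # ~571 million
    print(f"Per day:    {per_day:,.0f}")     # ~13.7 billion
    print(f"Per month:  {per_month:,.0f}")   # ~417 billion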

Google searches per year, over time. Curious about how the number of Google search queries has grown over time, at least based on what Google self-reported? Here’s a brief recap:

  • 1999: 1 billion. This figure was based on 3 million searches per day, reported in August 1999 by John Battelle in his book, “The Search.”
  • 2000: 14 billion. This figure was based on 18 million searches per day for the first half of 2000 and 60 million for the second half, as reported by Battelle.
  • 2001–2003: 55 billion+. This figure was based on reports by Google for its Zeitgeist in 2001, 2002, and 2003.
  • 2004–2008: 73 billion. This figure was based on Google saying it was doing 200 million searches per day in 2004. After that, it said only “billions” in Google Zeitgeist for 2005 and 2007. No updates were shared in 2006 or 2008.
  • 2009: 365 billion+. This figure was based on a Google blog post, “Google Instant, behind the scenes,” which said Google was doing more than 1 billion searches per day. No updates were shared for 2010 or 2011.
  • 2012–2015: 1.2 trillion. This figure is based on a 100-billion-per-month figure Google released during a special press briefing on search in 2012. Google repeated this figure in 2015, when expressing it as 3 billion searches per day.
  • 2016–2024: 2 trillion+. Google confirmed to Search Engine Land that because it said it handles “trillions” of searches per year worldwide, the figure could be safely assumed to be 2 trillion or above.
  • 2025: 5 trillion+. This figure is based on internal Google data and was reported in Google’s blog post, AI, personalization and the future of shopping.

Why we care. Since 2016, we’ve known that Google processes “at least 2 trillion” searches per year. Now, nearly nine years later, we have a new official figure from Google for how many searches are conducted on Google annually: more than 5 trillion.

5.9 trillion? Hours after we published our story, Rand Fishkin published new research that estimated the number of Google searches per year to be 5.9 trillion. From the study:

  • “Our math above puts the number at 5.9 Trillion, a little high, likely because Datos’ panel focuses on wealthier countries where more search activity per person is to be expected. Still incredible that they’d come out with numbers the day we publish that help back up the veracity of these results, and the quality of Datos’ panel.”

Dig deeper. Americans search Google 126 times per month on average: Study


Google expands Vehicle ads to include RVs and campers

Google vehicle ads now accept listings for recreational vehicles (RVs) and campers. This broadens the scope beyond traditional automobiles.

The details. The expansion, announced Feb. 28, allows RV and camper dealers to showcase their inventory directly in Google search results, similar to how car dealerships have been using the platform.

The catch. Dealers must maintain valid dealership licenses in all states, territories, or provinces where their RVs and campers are located or offered for sale – the same requirement that applies to other vehicle categories.

Why we care. Dealerships can now reach potential buyers searching for recreational vehicles directly through Google’s vehicle ad format, expanding their reach.

What’s next. Interested dealers should review Google’s Vehicle ads policies (Beta) to ensure their RV and camper inventory meets all eligibility requirements before listing.

Bottom line. This expansion gives dealers a new way to connect with potential buyers actively searching for RVs and campers.


Google Merchant Center renames Conversion Settings as Key Event Setup


Google updated its conversion terminology in Google Merchant Center, renaming “Conversion settings” to “Key event setup” in the Merchant Center interface.

The terminology change aligns Google Merchant Center with Google Analytics 4’s shift from “conversions” to “key events,” creating more consistent language across Google’s marketing platforms.

The big picture: This change reflects Google’s broader move toward standardizing measurement terms across its suite of marketing tools, which began with Google Analytics 4’s introduction of the “key events” terminology in March 2024.

Why we care. The alignment with Google Analytics 4 means you will need to adapt your workflows, reporting, and possibly your strategic approach to measuring customer interactions.

Between the lines: The shift from “conversions” to “key events” represents more than just a naming convention – it’s part of Google’s evolving approach to how businesses track and measure meaningful user interactions.

First seen. We were first made aware of this update by Emmanuel Flossie, who posted about seeing the change on LinkedIn.

What to watch: As Google continues to align terminology across its platforms, marketers should expect similar updates to appear in other Google marketing tools to create a more unified measurement framework.


A guide to web crawlers: What you need to know


Understanding the difference between search bots and scrapers is crucial for SEO

Website crawlers fall into two categories: 

  • First-party bots, which you use to audit and optimize your own site.
  • Third-party bots, which crawl your site externally – sometimes to index your content (like Googlebot) and other times to extract data (like competitor scrapers).

This guide breaks down first-party crawlers that can improve your site’s technical SEO and third-party bots, exploring their impact and how to manage them effectively.

First-party crawlers: Mining insights from your own website

Crawlers can help you identify ways to improve your technical SEO. 

Enhancing your site’s technical foundation, architectural depth, and crawl efficiency is a long-term strategy for increasing search traffic.

Occasionally, you may uncover major issues – such as a robots.txt file blocking all search bots on a staging site that was left active after launch. 

Fixing such problems can lead to immediate improvements in search visibility.

Now, let’s explore some crawl-based technologies you can use.

Googlebot via Search Console

You don’t work in a Google data center, so you can’t launch Googlebot to crawl your own site. 

However, by verifying your site with Google Search Console (GSC), you can access Googlebot’s data and insights. (Follow Google’s guidance to set yourself up on the platform.)

GSC is free to use and provides valuable information – especially about page indexing. 

GSC page indexing

There’s also data on mobile-friendliness, structured data, and Core Web Vitals:

GSC Core Web Vitals

Technically, this is third-party data from Google, but only verified users can access it for their site. 

In practice, it functions much like the data from a crawl you run yourself.

Screaming Frog SEO Spider

Screaming Frog is a desktop application that runs locally on your machine to generate crawl data for your website. 

The company also offers a log file analyzer, which is useful if you have access to server log files. For now, we’ll focus on Screaming Frog’s SEO Spider.

At $259 per year, it’s highly cost-effective compared to other tools that charge this much per month. 

However, because it runs locally, crawling stops if you turn off your computer – it doesn’t operate in the cloud. 

Still, the data it provides is fast, accurate, and ideal for those who want to dive deeper into technical SEO.

Screaming Frog main interface

From the main interface, you can quickly launch your own crawls. 

Once completed, export Internal > All data to an Excel-readable format and get comfortable handling and pivoting the data for deeper insights. 

Screaming Frog also offers many other useful export options.

Screaming Frog export options

It provides reports and exports for internal linking, redirects (including redirect chains), insecure content (mixed content), and more.

The drawback is it requires more hands-on management, and you’ll need to be comfortable working with data in Excel or Google Sheets to maximize its value.

Dig deeper: 4 of the best technical SEO tools

Ahrefs Site Audit

Ahrefs is a comprehensive cloud-based platform that includes a technical SEO crawler within its Site Audit module. 

To use it, set up a project, configure the crawl parameters, and launch the crawl to generate technical SEO insights.

Ahrefs Overview

Once the crawl is complete, you’ll see an overview that includes a technical SEO health rating (0-100) and highlights key issues. 

You can click on these issues for more details, and a helpful button appears as you dive deeper, explaining why certain fixes are necessary.

Ahrefs why and how to fix

Since Ahrefs runs in the cloud, your machine’s status doesn’t affect the crawl. It continues even if your PC or Mac is turned off. 

Compared to Screaming Frog, Ahrefs provides more guidance, making it easier to turn crawl data into actionable SEO insights. 

However, it’s less cost-effective. If you don’t need its additional features, like backlink data and keyword research, it may not be worth the expense.

Semrush Site Audit

Next is Semrush, another powerful cloud-based platform with a built-in technical SEO crawler. 

Like Ahrefs, it also provides backlink analysis and keyword research tools.

Semrush Site Audit

Semrush offers a technical SEO health rating, which improves as you fix site issues. Its crawl overview highlights errors and warnings.

As you explore, you’ll find explanations of why fixes are needed and how to implement them.

Semrush why and how to fix

Both Semrush and Ahrefs have robust site audit tools, making it easy to launch crawls, analyze data, and provide recommendations to developers. 

While both platforms are pricier than Screaming Frog, they excel at turning crawl data into actionable insights. 

Semrush is slightly more cost-effective than Ahrefs, making it a solid choice for those new to technical SEO.




Third-party crawlers: Bots that might visit your website

Earlier, we discussed how third parties might crawl your website for various reasons. 

But what are these external crawlers, and how can you identify them?

Googlebot

As mentioned, you can use Google Search Console to access some of Googlebot’s crawl data for your site. 

Without Googlebot crawling your site, there would be no data to analyze.

(You can learn more about Google’s common crawl bots in this Search Central documentation.)

Google’s most common crawlers are:

  • Googlebot Smartphone.
  • Googlebot Desktop.

Each uses separate rendering engines for mobile and desktop, but both contain “Googlebot/2.1” in their user-agent string.

If you analyze your server logs, you can isolate Googlebot traffic to see which areas of your site it crawls most frequently. 

This can help identify technical SEO issues, such as pages that Google isn’t crawling as expected. 

To analyze log files, you can create spreadsheets to process and pivot the data from raw .txt or .csv files. If that seems complex, Screaming Frog’s Log File Analyzer is a useful tool.
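
If you’d rather script it than pivot spreadsheets, here is a minimal Python sketch of the idea. It assumes the common Apache/Nginx combined log format and a hypothetical access.log filename, so adjust both to your server:

    import re
    from collections import Counter

    # Count Googlebot requests per URL path from a combined-format access log.
    # The request line looks like: "GET /some/path HTTP/1.1"
    REQUEST_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]*"')

    googlebot_hits = Counter()
    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:  # cheap pre-filter on the UA string
                continue
            match = REQUEST_LINE.search(line)
            if match:
                googlebot_hits[match.group("path")] += 1

    # The 20 paths Googlebot requests most often
    for path, count in googlebot_hits.most_common(20):
        print(f"{count:6d}  {path}")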

In most cases, you shouldn’t block Googlebot, as this can negatively affect SEO. 

However, if Googlebot gets stuck in highly dynamic site architecture, you may need to block specific URLs via robots.txt. Use this carefully – overuse can harm your rankings.

Fake Googlebot traffic

Not all traffic claiming to be Googlebot is legitimate. 

Many crawlers and scrapers allow users to spoof user-agent strings, meaning they can disguise themselves as Googlebot to bypass crawl restrictions.

For example, Screaming Frog can be configured to impersonate Googlebot. 

However, many websites – especially those hosted on large cloud networks like AWS – can differentiate between real and fake Googlebot traffic. 

They do this by checking if the request comes from Google’s official IP ranges. 

If a request claims to be Googlebot but originates outside of those ranges, it’s likely fake.
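
Google documents a two-step DNS check for this: a reverse lookup of the requesting IP should resolve to a googlebot.com or google.com hostname, and a forward lookup of that hostname should return the original IP. A minimal Python sketch:

    import socket

    def is_real_googlebot(ip: str) -> bool:
        """Verify a Googlebot claim with a reverse-then-forward DNS check."""
        try:
            hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
            if not hostname.endswith((".googlebot.com", ".google.com")):
                return False
            return socket.gethostbyname(hostname) == ip  # forward confirmation
        except (socket.herror, socket.gaierror):  # either lookup failed
            return False

    print(is_real_googlebot("66.249.66.1"))   # a Googlebot address -> True
    print(is_real_googlebot("203.0.113.50"))  # reserved documentation IP -> False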

Other search engines

In addition to Googlebot, other search engines may crawl your site. For example:

  • Bingbot (Microsoft Bing).
  • DuckDuckBot (DuckDuckGo).
  • YandexBot (Yandex, a Russian search engine, though not well-documented).
  • Baiduspider (Baidu, a popular search engine in China).

In your robots.txt file, you can create wildcard rules to disallow all search bots or specify rules for particular crawlers and directories.

However, keep in mind that robots.txt entries are directives, not commands – meaning they can be ignored.

Unlike redirects, which prevent a server from serving a resource, robots.txt is merely a strong signal requesting bots not to crawl certain areas.

Some crawlers may disregard these directives entirely.
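
To make this concrete, here is a sketch of a robots.txt combining a wildcard rule with bot-specific rules. The directory paths are hypothetical placeholders:

    # Ask all bots to stay out of a staging area (a request, not a guarantee)
    User-agent: *
    Disallow: /staging/

    # Disallow one specific crawler entirely
    User-agent: AhrefsBot
    Disallow: /

    # Let Googlebot crawl everything except internal search result pages
    User-agent: Googlebot
    Disallow: /internal-search/

Note that a crawler follows the most specific group matching its user agent, so Googlebot here would obey only the last block, not the wildcard rules.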

Screaming Frog’s Crawl Bot

Screaming Frog typically identifies itself with a user agent like Screaming Frog SEO Spider/21.4.

The “Screaming Frog SEO Spider” text is always included, followed by the version number.

However, Screaming Frog allows users to customize the user-agent string, meaning crawls can appear to be from Googlebot, Chrome, or another user-agent. 

This makes it difficult to block Screaming Frog crawls. 

While you can block user agents containing “Screaming Frog SEO Spider,” an operator can simply change the string.

If you suspect unauthorized crawling, you may need to identify and block the IP range instead. 

This requires server-side intervention from your web developer, as robots.txt cannot block IPs – especially since Screaming Frog can be configured to ignore robots.txt directives.
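
For example, a web developer might handle both cases at the server level in Nginx. This is a minimal sketch; 198.51.100.0/24 is a reserved documentation range standing in for whatever range you actually identified in your logs:

    # Inside a server block: reject requests whose user agent contains the
    # default Screaming Frog signature (easily spoofed, as noted above)
    if ($http_user_agent ~* "screaming frog seo spider") {
        return 403;
    }

    # Reject a suspect IP range identified in your server logs
    deny 198.51.100.0/24;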

Be cautious, though. It might be your own SEO team conducting a crawl to check for technical SEO issues. 

Before blocking Screaming Frog, try to determine the source of the traffic, as it could be an internal employee gathering data.

Ahrefs Bot

Ahrefs has a crawl bot and a site audit bot for crawling.

  • When Ahrefs crawls the web for its own index, you’ll see traffic from AhrefsBot/7.0.
  • When an Ahrefs user runs a site audit, traffic will come from AhrefsSiteAudit/6.1.

Both bots respect robots.txt disallow rules, per Ahrefs’ documentation. 

If you don’t want your site to be crawled, you can block Ahrefs using robots.txt. 

Alternatively, your web developer can deny requests from user agents containing “AhrefsBot” or “AhrefsSiteAudit”.

Semrush Bot

Like Ahrefs, Semrush operates multiple crawlers with different user-agent strings. 

Be sure to review all available information to identify them properly.

The two most common user-agent strings you’ll encounter are:

  • SemrushBot: Semrush’s general web crawler, used to improve its index.
  • SiteAuditBot: Used when a Semrush user initiates a site audit.

Rogerbot, Dotbot, and other crawlers

Moz, another widely used cloud-based SEO platform, deploys Rogerbot to crawl websites for technical insights. 

Moz also operates Dotbot, a general web crawler. Both can be blocked via your robots.txt file if needed.

Another crawler you may encounter is MJ12Bot, used by the Majestic SEO platform. Typically, it’s nothing to worry about.

Non-SEO crawl bots

Not all crawlers are SEO-related. Many social platforms operate their own bots. 

Meta (Facebook’s parent company) runs multiple crawlers, while Twitter previously used Twitterbot – and it’s likely that X now deploys a similar, though less-documented, system.

Crawlers continuously scan the web for data. Some can benefit your site, while others should be monitored through server logs.

Understanding search bots, SEO crawlers and scrapers for technical SEO

Managing both first-party and third-party crawlers is essential for maintaining your website’s technical SEO.

Key takeaways

  • First-party crawlers (e.g., Screaming Frog, Ahrefs, Semrush) help audit and optimize your own site.
  • Googlebot insights via Search Console provide crucial data on indexation and performance.
  • Third-party crawlers (e.g., Bingbot, AhrefsBot, SemrushBot) crawl your site for search indexing or competitive analysis.
  • Managing bots via robots.txt and server logs can help control unwanted crawlers and improve crawl efficiency in specific cases.
  • Data handling skills are crucial for extracting meaningful insights from crawl reports and log files.

By balancing proactive auditing with strategic bot management, you can ensure your site remains well-optimized and efficiently crawled.


PPC budgeting in 2025: When to adjust, scale, and optimize with data


Budgeting for paid ad campaigns has long been a static process – set a monthly budget, monitor spending, and adjust incrementally as needed. 

This method works for industries with stable demand and predictable conversion rates but falls short in dynamic, competitive markets.

Still, static budgets aren’t obsolete. In industries with long sales cycles, consistent conversion trends, or strict financial planning – like B2B SaaS and healthcare – planned budgets remain essential.

The key isn’t choosing between static and dynamic budgeting; it’s knowing when and how to adjust PPC spend using data-driven signals.

The role of Smart Bidding and Performance Max in budgeting

Automation has changed our budgeting strategies, but it hasn’t eliminated the need for human oversight. 

While Google’s Smart Bidding and Performance Max (PMax) campaigns help optimize performance, they do not fully control budget allocation the way some advertisers may assume.

Smart Bidding: What it does (and doesn’t do) for budgeting

Smart Bidding (i.e., Target ROAS, Target CPA, Maximize Conversions, and Maximize Conversion Value) uses real-time auction signals to adjust bids but does not shift budgets between campaigns. 

If a campaign has an insufficient budget, Smart Bidding won’t automatically pull spend from another campaign; this still requires manual adjustments or automated budget rules.

To overcome the budget allocation limitations of Smart Bidding, use:

  • Portfolio bidding strategies: Setting bid strategies at the campaign level lets you use a common bidding approach (e.g., Target ROAS or Target CPA) across multiple campaigns. This enables more efficient spending across campaigns with similar goals without manual adjustments.
  • Shared budgets: Assigning a single budget across multiple campaigns ensures high-performing campaigns receive adequate funding while preventing overspending on lower-performing ones.

Dig deeper: How each Google Ads bid strategy influences campaign success

Performance Max: A black box for budget allocation?

PMax automates asset and bid optimization across multiple Google properties (Search, Display, YouTube, Discovery, etc.), but you don’t control which channel your budget goes to. 

Google’s algorithm decides how much to allocate to each network, which can sometimes result in excessive spend on lower-performing placements like Display rather than Search.

Instead of relying solely on PMax, run separate Search campaigns alongside it to ensure an adequate budget is allocated to high-intent traffic.

Dig deeper: How to make search and PMax campaigns complement each other

Balancing automation and control: Avoid these PPC budget pitfalls

While automation streamlines bidding, it can also lead to costly mistakes. 

Watch out for these common budget-wasting pitfalls and learn to stay in control.

Overspending on low-value traffic

Smart Bidding sometimes aggressively increases bids to meet a Target ROAS or Target CPA, which can inflate CPCs without increasing conversion volume.

Solution

  • Set bid caps when using Maximize Conversion Value to prevent excessive CPC increases.
  • Monitor search terms to ensure increased bids aren’t capturing low-intent queries.

Advanced tip

When setting a tCPA or tROAS, allow a 10-20% margin for flexibility to help Google’s algorithm optimize effectively.

For example, if your ideal tCPA is $100, setting it to $115 gives Google room to secure conversions that may exceed your target while still delivering strong performance. 

Since tCPA operates as an average, not every lead will cost the same amount.

Once you are consistently hitting your target, gradually lower the tCPA (or raise the tROAS) to improve budget efficiency without restricting conversions.
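
Because tCPA works as an average, a quick sanity check shows why the margin helps. A minimal Python sketch with hypothetical lead costs:

    # Hypothetical lead costs under a $115 tCPA (the target is an average)
    lead_costs = [82, 95, 140, 110, 128]

    average_cpa = sum(lead_costs) / len(lead_costs)
    print(f"Average CPA: ${average_cpa:.2f}")          # $111.00 - within target
    print(f"Most expensive lead: ${max(lead_costs)}")  # $140 - over target

Individual leads can exceed the ideal $100 figure while the blended average still lands under the $115 target.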

Underfunding efficient campaigns

If a campaign has a long conversion delay (e.g., B2B lead gen), Smart Bidding may incorrectly shift the budget elsewhere before enough data accumulates.

Solution

  • Extend conversion windows in Smart Bidding settings. The default is 30 days, but advertisers can adjust the window from one day up to 90 days.
  • Manually monitor lagging conversions and adjust budgets proactively.

Lack of budget control in PMax campaigns

Performance Max doesn’t allow advertisers to set separate budgets for Search, YouTube, and Display. 

As a result, Google may favor low-cost clicks from Display rather than higher-intent Search traffic (advertiser sentiment is that it does).

Solution

  • Run branded and high-intent non-branded Search campaigns separately to control budget spend on direct-response traffic.
  • Use brand exclusions in PMax to prevent Google from serving brand search queries within PMax, ensuring that branded traffic remains in the dedicated Search campaign.
  • Apply negative keywords via account-level negatives. While PMax doesn’t allow campaign-level negatives, account-level negative keyword lists can help block irrelevant or redundant queries. Google caps these at 100 negative keywords, a limit it says exists because PMax isn’t meant to be a heavily restricted campaign type.
  • By monitoring your search impression share, you can identify when branded queries are slipping into PMax instead of the dedicated Search campaign. This will allow you to adjust bid strategies and audience signals accordingly. 
  • Use audience exclusions in PMax to prevent excessive Display spend on irrelevant audiences.

Advanced tip

Tools like Optmyzr can help advertisers determine how their budget is allocated in PMax with the PMax Channel Distribution feature. 

Although we may not have much control over the allocation, we can at least be aware of it. 

Dig deeper: How to manage a paid media budget: Allocation, risk and scaling

How to use first-party data to improve budget allocation

An underutilized strategy for improving budgeting is leveraging first-party data to allocate spend toward high-value audiences. 

As privacy restrictions tighten and tracking capabilities decline, it’s important to shift your focus from broad automated bidding to first-party audience targeting.

Use customer match to prioritize high-value audiences

Instead of spending equally across all users, advertisers can upload Customer Match lists (based on past purchasers, high-LTV customers, or CRM data) and adjust budgets accordingly.

Example

  • If historical data shows that repeat customers generate a higher ROAS than new users, more budget should be allocated to remarketing campaigns targeting Customer Match audiences.

Advanced tip

To maximize campaign efficiency, consider using value-based bidding (VBB) to ensure your budget prioritizes high-value conversions rather than just the volume of leads. 

By assigning different conversion values based on customer lifetime value (LTV), using Customer Match, GA4 insights, or CRM data, you can direct more spending toward audiences that generate the highest long-term revenue.

Changes to customer match lists

Google recently introduced two key updates to Customer Match lists that will impact how advertisers manage audience data.

To stay compliant and maximize audience targeting, be sure to regularly refresh your lists and align your data collection with Google’s updated policies.

Apply GA4 data for smarter budget scaling

Google Analytics 4 (GA4) provides insights into conversion paths, high-value audience segments, and multi-channel attribution. 

Instead of relying solely on Google Ads conversion tracking, use GA4 to determine which audience segments should receive higher budgets.

Best practice

  • Create custom lists/audiences around users with high engagement signals (repeat visits, add-to-cart actions, lead form interactions) and allocate more budget toward these users.
  • Create custom lists/audiences around low-intent users who bounce after viewing one page. To reduce wasted ad spend, decrease your bids or exclude them.

Dig deeper: How to leverage Google Analytics 4 and Google Ads for better audience targeting




Budget scaling strategies: When and how to increase PPC spend

Scaling your PPC campaigns requires a structured, gradual approach. 

Increasing budgets too aggressively can cause Smart Bidding to overcompensate, leading to inefficient scaling and missed revenue opportunities.

Incremental budget scaling

Instead of doubling your budget overnight, it is better to gradually increase it by 10-20% daily. 

This gives Smart Bidding algorithms time to adjust without overspending or wasting budget.

This also gives you better control, as you can monitor performance changes due to budget shifts more closely.

Example

  • If a campaign is hitting its conversion goals consistently, increase the budget by 15% per week while monitoring conversion trends (the sketch below shows how quickly this compounds).
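
To see how quickly 15% weekly increases compound, here is a minimal Python sketch with a made-up $1,000 starting budget:

    # Compound a hypothetical $1,000 weekly budget by 15% per week
    budget = 1_000.0
    for week in range(1, 9):
        budget *= 1.15
        print(f"Week {week}: ${budget:,.2f}")

    # After 8 weeks the budget has roughly tripled (1.15 ** 8 is ~3.06),
    # which is why each increase should be gated on stable conversion data.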

Cross-campaign budget reallocation

Rather than increasing spend across the board, shift budget strategically between:

  • Branded campaigns (lower-funnel, high-converting).
  • Non-branded search campaigns (high-growth potential).
  • Remarketing campaigns (high-value repeat customers).

Dayparting for more efficient spend

Instead of distributing the budget equally across all hours, allocate more to high-converting time periods.

Example

  • If the lead volume is highest between 8 a.m. and 2 p.m., increase bids and budget during these hours.
  • If your business hours are from 12 p.m. to 10 p.m., lower your bids during the hours you aren’t operating to prevent unnecessary ad expenses.

Industry-specific budgeting approaches

No two industries are the same, so the approach to budgeting should also differ. Here’s how different business models should think about budget allocation:

B2B lead generation

Budgeting for B2B lead generation requires a long-term view. 

Unlike ecommerce, where purchases can happen quickly, B2B sales cycles can range from a week to over a year, depending on the contract size and decision-making process. 

As such, budget pacing should be planned over months. Don’t make frequent (i.e., daily or weekly) adjustments that could cause instability in the account. 

Because the cycle is longer, conversions often take some time to materialize, so conversion delays should be considered when evaluating Smart Bidding performance. 

If budgets are adjusted too soon based on incomplete data, campaigns may be underfunded before the true impact of conversions is realized.

Dig deeper: Paid search for lead gen: Tips for new accounts with limited budgets

Ecommerce

Seasonality plays a large role in budgeting decisions for ecommerce brands. 

Aggressively increase budgets ahead of major sales events, like Black Friday, Cyber Monday, and holiday shopping, to capitalize on higher purchase intent. 

Reacting to performance mid-season will likely result in missed opportunities if the budget is exhausted too early. 

Also, rather than spreading spend evenly across all potential buyers, prioritize high-LTV customers using Customer Match lists and past purchase data. 

This ensures that ad spend is directed toward audiences likely to generate repeat purchases and higher average order values (AOVs).

Dig deeper: Lead gen vs. ecommerce: How to tailor your PPC strategies for success

Local businesses

Budget allocation for local businesses should be narrowly geo-targeted. 

Instead of distributing spend evenly across an entire service area (although you should have some presence in the area), analyze past geographic conversion data to determine which locations typically generate the highest return. 

The budget should then be allocated accordingly, ensuring that high-performing areas receive the majority of ad spend.

Another important factor is setting up call tracking. 

Since many conversions happen over the phone rather than through online forms, integrate call-tracking data to identify which campaigns generate high-quality leads. 

By analyzing call duration, lead quality, and customer inquiries, you can refine budget allocation to optimize for calls that convert into sales or appointments.

Dig deeper: 9 essential geotargeting tactics for Google Ads

Each industry requires a different budgeting approach tailored to its sales cycles, customer behavior, and conversion patterns. 

Understanding these nuances ensures that your PPC budgets are allocated strategically for maximum impact, whether it’s long-term pacing for B2B, seasonal surges for ecommerce, or localized targeting for service-based businesses.

A smarter approach to budgeting

Budgeting for your PPC campaigns doesn’t involve choosing between static and dynamic models; it involves strategically using both.

  • Smart Bidding and PMax improve efficiency but require human oversight.
  • First-party data should play a bigger role in spend allocation.
  • Budget scaling should be incremental and structured.
  • Industry-specific needs should dictate budget pacing strategies.

The best budgets are adaptable, data-driven, and aligned with long-term profitability rather than short-term spend fluctuations. 

Those who master this approach will gain a competitive advantage in an increasingly automated advertising landscape.


Google Search Console API delayed

Screenshot of Google Search Console

There are numerous reports that the Google Search Console API is delayed, with no data more recent than this past Thursday, February 20. If you use this API for your own tools, or bring this data into Looker Studio reports, BigQuery or other tools, your reports may be delayed.

More details. The delays started around last Wednesday. Typically, data available through the Search Console API runs only a couple of days behind.

The web interface is not impacted, so you can get data by going to Google Search Console directly.

Some are saying data for Thursday is now coming in, but others are not sure yet.
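
If you want to verify the delay yourself, here is a minimal sketch using the Search Console API’s search analytics query grouped by date. It assumes you have already completed the OAuth setup; token.json and the property URL are placeholders:

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    creds = Credentials.from_authorized_user_file("token.json")
    service = build("searchconsole", "v1", credentials=creds)

    # Query clicks by date for a recent window on a property you own
    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",
        body={
            "startDate": "2025-02-10",
            "endDate": "2025-02-25",
            "dimensions": ["date"],
        },
    ).execute()

    # The most recent date returned shows how fresh the API data is
    dates = [row["keys"][0] for row in response.get("rows", [])]
    print("Latest date with data:", max(dates) if dates else "none")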

Google has not commented on this issue yet.

Why we care. If you are noticing weird data in your tools or reports and that data generally comes from Google Search Console’s API, this is why.

I suspect the data flow will return to normal in the coming days, but if you run reports and notice weirdness in them, this is your explanation.

In the meantime, if you need that data, access it directly through the web interface.


SEO prioritization: How to focus on what moves the needle


If you feel like you’re being pulled in different directions with your SEO program, you aren’t alone. 

How do you know where to focus first for the most impact? And when that’s done, what do you do next?

It can be challenging to decide which SEO tasks to prioritize because they all impact the end user in some way – but some more than others. This is where discernment comes into play.

This article will help you build a path to get your SEO program organized from point A to point B and figure out how to prioritize tasks to get ROI quicker.

Frameworks for identifying high-impact SEO opportunities

When every SEO task feels urgent, knowing where to focus first can make or break your strategy. These three frameworks can help you prioritize what moves the needle.

1. Technical SEO audit

A technical SEO audit is your roadmap for identifying and fixing the issues that directly impact search visibility and user experience. 

The right audit reveals the most urgent technical barriers to ranking – and helps you prioritize based on impact.

But not all audits are created equal. Here’s a breakdown of the different types:

Basic SEO audit

  • This is where automated software scans your site and flags common SEO issues. While the insights can be helpful, they come in a generic, one-size-fits-all report. 
  • This type of audit is ideal if you’re working with a tight budget or just want to get a basic overview before bringing in an expert. 
  • It’s never a bad idea, but it won’t provide an in-depth analysis.

Mid-level SEO audit

  • Here, you can expect a professional SEO specialist or vendor to go beyond automated reports and offer additional insights that software alone might miss. 
  • While these can pinpoint issues that require attention, they may not provide detailed solutions. 
  • This approach is useful when you need to identify potential problem areas but aren’t ready for a full-scale SEO strategy.

Comprehensive SEO audit

  • This is a full technical audit conducted by experienced technical SEOs. 
  • This deep dive involves top-tier tools, data analysis, and an in-depth website and SEO review by skilled analysts specializing in technical SEO and business strategy. 
  • Tools assist the process, but the real value comes from expert analysis, which makes it a time-intensive but highly valuable investment.

Knowing these key differences in audits can help you make an informed decision before you invest. 

Dig deeper: Technical SEO: Don’t rush the process

2. The Eisenhower Matrix

The Eisenhower Matrix is a powerful tool for prioritizing tasks by urgency and importance. 

Applying it to your SEO strategy helps you determine which tasks need immediate attention and which can wait.

To get started, divide tasks into four quadrants:

Quadrant 1: Urgent and important

  • These are the critical issues that directly impact rankings and user experience. 
  • For example, this could be a slow site or fixing a misconfigured robots.txt file that is blocking search engines from crawling and indexing key pages.
  • Whatever tasks you put in this category will be non-negotiable. Addressing these items can sometimes have an immediate impact on your ability to compete.

Quadrant 2: Important but not urgent

  • These will be the longer-term strategies that build sustainable growth.
  • For instance, maybe developing a long-term content strategy focused on topic authority and evergreen content falls here.
  • These efforts don’t require immediate attention but are essential for long-term SEO success.  

Quadrant 3: Urgent but not important

  • This bucket is for handling tasks that are time-sensitive but don’t significantly influence rankings or user experience.
  • This could be something like responding to a minor Google Search Console alert about a non-critical issue.
  • While these tasks may not have a high impact, taking care of them prevents minor issues from accumulating into big projects.

Quadrant 4: Neither urgent nor important

  • Anything that falls into this category is something you avoid. 
  • One example might be spending hours tweaking meta descriptions that already meet best practices without significant SEO gains.
  • These activities consume time and resources without delivering meaningful results.
Eisenhower matrix example

Using the Eisenhower Matrix helps your SEO by enhancing:

  • Clarity: Identify and fix what demands attention now versus what can wait.
  • Efficiency: Prioritize the highest ROI tasks without getting bogged down.
  • Focus: Stay aligned with business goals, eliminating distractions.

3. The Pareto Principle (80/20 Rule)

The Pareto Principle suggests that 80% of outcomes come from 20% of efforts. 

In SEO, focusing on the most impactful tasks helps you drive faster, more meaningful results without spreading yourself too thin.

Keyword targeting

It’s common for a small subset of your keywords to drive most organic traffic. 

Instead of spreading your efforts thin across all keywords, focus on optimizing the ones that deliver the most value.

  • Use SEO tools to identify the top-performing 20% of keywords that bring in most of your traffic and conversions (a short sketch after this list shows the idea).
  • Prioritize pages that rank between positions 5 and 20 for those high-value keywords. These are low-hanging fruit that can move up with improvements.
  • Expand content for high-value keywords by answering related questions and creating supporting content. 
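
As a sketch of that first step, assume you have exported a keyword report to a CSV; the filename and column names here are hypothetical. A few lines of pandas can surface the top 20% and the position-5-to-20 opportunities:

    import pandas as pd

    # Hypothetical export with columns: keyword, clicks, avg_position
    df = pd.read_csv("keyword_report.csv")

    # Keep the top 20% of keywords by clicks
    cutoff = df["clicks"].quantile(0.80)
    top_keywords = df[df["clicks"] >= cutoff]

    # Low-hanging fruit: high-value keywords ranking in positions 5-20
    opportunities = top_keywords[
        top_keywords["avg_position"].between(5, 20)
    ].sort_values("avg_position")

    print(opportunities[["keyword", "clicks", "avg_position"]].head(20))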

Content focus

Most of your website’s traffic and engagement likely comes from a handful of high-performing pages. 

Instead of endlessly creating new content, invest in improving the 20% of pages that already generate the most traffic and leads.

  • Identify your top 20% of pages by traffic and conversions using analytics tools.
  • Revamp those pages by updating outdated content to enhance optimization and engagement. 
  • Create supporting content to build topical authority around your best pages.

Technical fixes

Technical SEO can feel overwhelming because there’s always more to fix. But a small subset of technical issues typically has the most impact on site performance.

Focus on fixing the top 20% of technical issues that cause 80% of your performance problems.

Prioritize high-impact fixes like: 

  • Resolving crawl errors so search engines can access your site.
  • Improving page load speed for user experience and rankings.
  • Fixing broken links to avoid losing link equity and frustrating users.
  • Optimizing usability to retain visitors and improve your ability to compete in the search results.

Dig deeper: Prioritizing SEO strategies: Where to focus your efforts

Having a framework for approaching your SEO program helps you stay organized.

Within that framework, you must consider how you will execute both short-term wins and longer-term strategies.

Balancing long-term strategies with quick wins

To succeed in SEO, you must balance short-term wins with long-term growth.

Quick wins can show immediate improvements, but foundational efforts are what build lasting authority.

To achieve the best results, it’s important to devote resources to both.

Quick wins

Quick wins are tactical SEO tasks that can be implemented quickly to produce noticeable results. 

These tasks usually involve optimizing existing content or resolving certain technical issues. 

They may not require large investments of time or resources but can lead to meaningful improvements in rankings, traffic, or user experience.

What constitutes a quick win? 

  • Tasks that are simple to implement. 
  • Things that address known website performance issues. 
  • Fixes that improve both search engine visibility and user experience. 

Examples of SEO quick wins include:

  • Fixing technical errors, like resolving 404 pages, broken links, and crawl issues.
  • Improving site speed.
  • Optimizing existing content by adding internal links, updating outdated information, or including relevant keywords.

Quick wins are valuable because they deliver early signs of progress. This helps build momentum and gain stakeholder buy-in. 

However, relying solely on quick wins isn’t enough to achieve a sustainable SEO program. 

That’s where long-term strategies come in. 

Long-term strategies

Long-term strategies require more time and effort but are key to creating a strong foundation. 

These strategies help your website become more authoritative, trustworthy, and relevant in the eyes of both search engines and your audience.

Examples of long-term strategies include:

  • Content creation that targets important keywords and answers user questions in-depth. Try SEO siloing to build authority around a topic.
  • Earning backlinks through your high-quality content and partnerships. 
  • Refreshing top-performing content to make sure it remains evergreen and relevant. I recommend spending 50% of your content resources on maintaining older but high-performing content. 
  • Continuing education so you can stay ahead of the curve. Consider annual SEO training with additional learning opportunities throughout the year. Search evolves fast, and you want to be able to forecast what’s coming up so you can start working on it early. 

Foundational efforts don’t deliver instant results, but as your site’s authority grows, you’ll see compounding benefits with higher rankings, more traffic, and increased user trust.

Fast gains, lasting growth: Crafting a balanced SEO plan

A good SEO roadmap should include both short-term quick wins and long-term projects. But where to start? 

Here’s one scenario: You could focus 70% of your time on quick wins early on to show immediate results and 30% on long-term efforts. 

Over time, you might adjust the balance to a 50/50 split as your site becomes more stable and foundational work becomes a bigger priority.

Dig deeper: 3 quick SEO wins to kick-start growth next year

Focus on what matters most for lasting results

Prioritizing your SEO strategies is the key to driving meaningful results. 

SEO isn’t about doing everything at once. It’s about doing the right things at the right time. 

When you focus on high-impact tasks and continuously refine your approach, you’ll build a more competitive search engine presence that pays off for years to come.


Google sued by Chegg over AI Overviews hurting traffic and revenue

Chegg, the publicly traded education technology company, has sued Google over its AI Overviews, claiming they have hurt its traffic and revenue. The company said that AI Overviews is “materially impacting our acquisitions, revenue, and employees.”

What Chegg said. Chegg wrote:

Second, we announced the filing of a complaint against Google LLC and Alphabet Inc. These two actions are connected, as we would not need to review strategic alternatives if Google hadn’t launched AI Overviews, or AIO, retaining traffic that historically had come to Chegg, materially impacting our acquisitions, revenue, and employees. Chegg has a superior product for education, as evident by our brand awareness, engagement, and retention. Unfortunately, traffic is being blocked from ever coming to Chegg because of Google’s AIO and their use of Chegg’s content to keep visitors on their own platform. We retained Goldman Sachs as the financial advisor in connection with our strategic review and Susman Godfrey with respect to our complaint against Google.

More details. CNBC reports that “Chegg is worth less than $200 million, and in after-hours trading Monday, the stock was trading just above $1 per share.” Chegg has engaged Goldman Sachs to look at options to get acquired or other strategic options for the company.

Chegg reported a $6.1 million net loss on $143.5 million in fourth-quarter revenue, a 24% decline year over year, according to a statement. Analysts polled by LSEG had expected $142.1 million in revenue. Management called for first-quarter revenue between $114 million and $116 million, but analysts had been targeting $138.1 million. The stock was down 18% in extended trading.

The report goes on to quote Chegg CEO Nathan Schultz, who said Google forces companies like Chegg to “supply our proprietary content in order to be included in Google’s search function,” adding that the search company uses its monopoly power, “reaping the financial benefits of Chegg’s content without having to spend a dime.”

Here is more from Chegg’s statement:

While we made significant headway on our technology, product, and marketing programs, 2024 came with a series of challenges, including the rapid evolution of the content landscape, particularly the rise of Google AIO, which as I previously mentioned, has had a profound impact on Chegg’s traffic, revenue, and workforce. As already mentioned, we are filing a complaint against Google LLC and Alphabet Inc. in the U.S. District Court for the District of Columbia, making three main arguments.

  • First is reciprocal dealing, meaning that Google forces companies like Chegg to supply our proprietary content in order to be included in Google’s search function.
  • Second is monopoly maintenance, or that Google unfairly exercises its monopoly power within search and other anti-competitive conduct to muscle out companies like Chegg.
  • And third is unjust enrichment, meaning Google is reaping the financial benefits of Chegg’s content without having to spend a dime.

As we allege in our complaint, Google AIO has transformed Google from a “search engine” into an “answer engine,” displaying AI-generated content sourced from third-party sites like Chegg. Google’s expansion of AIO forces traffic to remain on Google, eliminating the need to go to third-party content source sites. The impact on Chegg’s business is clear. Our non-subscriber traffic plummeted to negative 49% in January 2025, down significantly from the modest 8% decline we reported in Q2 2024.

We believe this isn’t just about Chegg—it’s about students losing access to quality, step-by-step learning in favor of low-quality, unverified AI summaries. It’s about the digital publishing industry. It’s about the future of internet search.

In summary, our complaint challenges Google’s unfair competition, which is unjust, harmful, and unsustainable. While these proceedings are just starting, we believe bringing this lawsuit is both necessary and well-founded.

Google statement. Google spokesperson Jose Castaneda said, “With AI Overviews, people find Search more helpful and use it more, creating new opportunities for content to be discovered. Every day, Google sends billions of clicks to sites across the web, and AI Overviews send traffic to a greater diversity of sites.”

Why we care. Will Chegg win in court against Google? Will Google have to rethink its AI Overviews and find better ways to send traffic to publishers and site owners? It is hard to imagine, but this may be the first large lawsuit over Google’s new AI Overviews.


Microsoft Bing testing Copilot Search

Microsoft is testing a new version of Bing named Copilot Search, which uses Copilot AI to provide a different style of search results. It looks different from the main Bing Search, from Copilot, and from the Bing generative search experience.

More details. The folks over at Windows Latest reported, “Microsoft is testing a new feature on Bing called “AI Search,” which replaces blue links with AI-summarized answers. Sources tell me it’s part of Microsoft’s efforts to bridge the gap between “traditional search” and “Copilot answers” to take on ChatGPT. However, the company does not plan to make “AI search” the default search mode.”

You can access it at bing.com/copilotsearch?q=addyourqueryhere – just replace the text “addyourqueryhere” with your query.

What it looks like. Here is a screenshot I captured of this interface.

Why we care. Everyone is looking to build the future of search now. With Google Gemini, Google’s AI Overviews, Microsoft Bing, Copilot, ChatGPT search, Perplexity and dozens of other startup AI search engines, the future of search is something they are all trying to crack.

This seems to be one new test that Microsoft is trying out for a new approach to AI search.
