Data providers: Google March 2025 core update had similar volatility to the previous update

The Google March 2025 core update finished rolling out on March 27, 2025, about 14 days after it started on March 13, 2025. That made the March core update roughly a week longer than Google’s December 2024 core update, which started December 12, 2024, and completed about six days later on December 18, 2024.

Please understand that if a core update impacts a site, it can result in a huge change for that site’s search visibility. So, I do not want to diminish any core updates, including the March core update; those could have been really big for you or the sites you manage.

Data providers on the Google March 2025 core update

Semrush. Semrush (our parent company) was the first to send us data that seemed to show that the Google March 2025 core update had similar volatility to the previous core update, the December 2024 core update. Keep in mind that the December 2024 core update was more volatile than the November 2024 core update, based on the data.

If you glance at the Semrush Sensor, you can see the overall volatility the tool reported over that time period:

Mordy Oberstein, who sent me the data from Semrush, told me the two were “similar in size” when comparing the volatility. He sent me this chart showing the volatility of the past two core updates, broken down by vertical. It shows the peak volatility numbers were pretty similar between the two updates:

If you look at the overall ranking volatility change comparison, you can see that the Health sector saw a much bigger change, for some reason:

But when you compare this to the baseline rank volatility, both the December and March core updates were within very similar ranges, Mordy Oberstein told us. “There’s a mere .1 difference between the two,” Oberstein added. Oberstein said he has a theory, which he will share at his session at SMX Advanced, on why this update hit different verticals differently.

When you dive into the top ten results, you can see a notable difference in the ranking changes between these two core updates:

Similarweb. Similarweb’s SERP Seismometer showed the spikes in volatility cooled down with the March update. You can see it get a bit more volatile on March 13th, 14th and 15th but then start to cool again as the core update rolled out.

Darrel Mordechai from Similarweb told us the March 2025 core update was not the most volatile core update they’ve seen; compared to the December core update, it showed “similar levels of volatility.”

Here is a chart showing the core update volatility by average position change for the past core updates, as documented by Similarweb:

Here is what you see when you zoom in to compare the March 2025 and December 2024 core updates; they are super close:

The current update showed slightly lower fluctuations in the top three positions but increased volatility across the top five. You can see that in this chart:

When you compare it by vertical or niche, you can see the volatility the March 2025 core update caused across the health, finance, retail, and travel industries. The finance industry showed the highest levels of fluctuation, particularly in the top five results. In contrast, the travel industry experienced notably low volatility in the top three positions.

seoClarity. The folks at seoClarity also sent me some winners and losers reports, showing the biggest winners and losers from February to March 2025:

Other tools. There are a lot of Google search ranking volatility tools. Here is what they looked like after the core update finished rolling out and over the course of the update:

Mozcast:

Algoroo:

Advanced Web Rankings:

Accuranker:

Cognitive SEO:

Wincher:

Mangools (looks broken?):

Sistrix:

Data For SEO:

SERPstat:

Industry. The initial rollout seemed to kick in within a few days after the update was announced. Some sites saw big swings both up and down in terms of ranking improvements or declines. But this update did not seem as widespread as some previous core updates, which had a wider impact on a more diverse set of sites. That is not to say this update was not big for those who were impacted by it; it 100% was very big for those sites.

During the update, some of the tracking tools were tripped up by changes to Google’s search results pages. That may make it hard for some to track the impact of this update. But you can use Google Search Console to see the impact on your site and the position changes for your most popular keywords.

Then, we saw some additional volatility spike at the tail end of this update.

What to do if you are hit. In the past, Google has given advice on what to consider if you are negatively impacted by a core update. Google has not really offered much new advice here.

  • There aren’t specific actions to take to recover. A negative rankings impact may not signal anything is wrong with your pages.
  • Google has offered a list of questions to consider if your site is hit by a core update.
  • Google said you may see a bit of recovery between core updates, but the biggest change would come after another core update.

In short, write helpful content for people and not to rank in search engines.

  • “There’s nothing new or special that creators need to do for this update as long as they’ve been making satisfying content meant for people. For those that might not be ranking as well, we strongly encourage reading our creating helpful, reliable, people-first content help page,” Google said previously.

More on Google updates

You can read more of our coverage in Search Engine Land’s Google Algorithm Updates history.

Why we care. While the data above shows how sites in general fared with the latest core update, it does not represent how your individual site did. If your site was hit by this update, it can be devastating. If you were hit by previous updates and saw no improvement with this one, that is devastating all over again. But some sites saw big improvements.

Feel free to compare this to our December core update report.

We hope you saw improvements with this March 2025 core update.


Your guide to Google Ads Smart Bidding


Are you controlling your paid search campaigns, or are they controlling you? 

If you can’t confidently articulate your smart bidding strategies, you lose conversions and credibility. 

True mastery isn’t just about setting up a campaign and picking a bid strategy; it’s about owning and communicating the process effectively. 

This guide is your roadmap to clarity and control, breaking down 2025’s Smart Bidding into actionable insights.

We’ll cover key concepts, common mistakes, and actionable tips for picking the right strategy. 

Smart Bidding in Google Ads: AI-powered bid optimization

Smart Bidding is Google Ads’ advanced form of automated bidding.

It leverages machine learning and real-time auction signals to optimize bids for conversions or conversion value. 

It dynamically adjusts bids to achieve specific goals, such as maximizing conversions at a target cost or achieving a desired return on ad spend.

Key Smart Bidding strategies include: 

Target CPA (cost per action)

  • Optimizes bids to achieve conversions at a target cost per action. 
  • Ideal for campaigns where you have a specific cost you’re willing to pay for each conversion (e.g., lead, sale).
  • Example: “We aim to acquire leads at a CPA of $50.”

Dig deeper: Everything you need to know about Target CPA bidding

Target ROAS (return on ad spend)

  • Focuses on achieving your desired revenue for every dollar spent. 
  • Best for ecommerce or campaigns with clear revenue goals.
  • Example: “We want to achieve a ROAS of 400%, meaning $4 in revenue for every $1 spent.”
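
Both targets above boil down to simple ratios: CPA is spend divided by conversions, and ROAS is conversion value divided by spend. A minimal sketch of the arithmetic, with hypothetical numbers rather than figures from any real account:

```python
# CPA and ROAS arithmetic with made-up figures (not from a real account).
spend = 5_000.00        # ad spend for the period
conversions = 100       # leads or sales attributed to the ads
revenue = 20_000.00     # conversion value attributed to the ads

cpa = spend / conversions   # 50.0 -> hits a $50 target CPA
roas = revenue / spend      # 4.0  -> 400%, i.e., $4 back per $1 spent

print(f"CPA: ${cpa:.2f} | ROAS: {roas:.0%}")
```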

Maximize Conversions

  • Automatically sets bids to achieve the most conversions within your budget.
  • Useful when you want to drive as many conversions as possible, regardless of cost.
  • Example: “Our goal is to maximize the number of sign-ups within our daily budget.”

Dig deeper: Mastering Maximize conversions bidding in Google Ads

Maximize Conversion Value

  • Prioritizes higher-value conversions for greater overall return. 
  • Effective when different conversions have varying values to your business.
  • Tends to favor selling more expensive products or services, as they contribute more to the total conversion value.
  • Example: “We value a ‘request for quote’ more than a ‘newsletter sign-up,’ so we want to maximize the total value of conversions.”

Dig deeper: Maximize Conversion Value: Google Ads bidding explained

Maximize Clicks

  • Automatically sets your bids to get as many clicks as possible within your budget.
  • Useful for top-of-funnel campaigns where the goal is to drive traffic to a site.
  • Example: “This campaign is designed to drive as much traffic to our new blog post as possible.”

Enhanced CPC (ECPC)

  • A semi-automated bidding strategy that adjusts your manual bids to try and get more conversions.
  • Google Ads adjusts your manual bid up or down based on the likelihood of a conversion.
  • Example: “We are using manual bidding but want to use Google’s signals to increase conversions where possible.”

Viewable CPM (YouTube)

  • Focuses on maximizing viewable impressions of your display or skippable in-stream video ads.
  • Ideal for brand awareness campaigns where the goal is to get your message seen by as many people as possible.
  • Example: “We want to ensure our brand message is visibly displayed to our target audience on YouTube.”

Cost Per View (YouTube)

  • Optimizes bids to get the most video views or interactions within your budget.
  • Best for campaigns focused on driving engagement with your video content.
  • Example: “We are running a video campaign on YouTube and want to maximize the number of views we receive.”

It’s crucial to understand that while setting a Target CPA or ROAS provides strategic direction, achieving those exact targets isn’t guaranteed.

I’ve had situations where a media planner pushed for an immediate switch to a specific CPA goal. 

They wanted the target set at four times and wouldn’t budge or try to understand why the campaign was set at two times.

A common misconception is that simply setting a desired metric will automatically yield the desired results. 

In practice, achieving optimal performance often requires a nuanced approach.

This may involve:

  • Gradual bid adjustments.
  • A willingness to accept temporary fluctuations in ROAS for broader account health.
  • A comprehensive evaluation of multiple factors, including budget, historical campaign performance, and keyword strategy.

It’s essential to understand that Smart Bidding strategies, while powerful, require strategic oversight and a holistic understanding of account dynamics. 

Success should be measured within the context of overarching account objectives, not solely focusing on individual campaign metrics.

Understanding manual, automated and smart bidding in Google Ads


Manual bidding allows you to control bid adjustments completely, making it ideal for certain industries, such as legal or home services, where fluctuating competition requires ongoing oversight. However, it requires more time and effort.

It’s like driving a car where you control every gear shift and pedal movement.

Automated bidding simplifies bid management by using algorithms to adjust bids. 

While automated bidding can save time, its generic approach doesn’t account for nuanced conversion goals.

Think of this as engaging cruise control. You tell the car (Google Ads) your general desired speed (goal), and it adjusts the engine (bids) to maintain that pace.

Smart Bidding, however, takes automated bidding further by using real-time signals and advanced machine learning to predict the likelihood of conversions and their value, tailoring bids to individual auctions. 

It’s especially effective for campaigns with clear conversion goals and sufficient historical data.

This is like having a self-driving car with an incredibly sophisticated navigation system.

It’s important to know that while all Smart Bidding is automated, not all automated bidding qualifies as Smart Bidding.

Automated bidding covers a wider range of strategies, some of which are more basic and don’t rely on real-time signals or advanced machine learning.

In essence:

  • Manual: You control every bid.
  • Automated: Google’s algorithms handle bid adjustments based on your chosen strategy.
  • Smart: Google’s machine learning optimizes bids in real-time for conversions and conversion value.


Smart Bidding: Advantages and risks

There are significant advantages to using Smart Bidding.

  • Improved efficiency: Saves time by automating bid adjustments. 
  • Auction-time optimization: Factors in user intent, device, location, and other data points to optimize bids for each auction. 
  • Goal alignment: Customizes bids to match your campaign objectives, whether it’s maximizing volume or focusing on high-value actions.  

While Smart Bidding offers significant advantages, missteps in implementation can lead to underwhelming results. 

Here’s how to avoid common pitfalls and optimize your campaign performance.

Data dependency

Smart Bidding algorithms rely on robust historical data to make accurate predictions. 

Campaigns with fewer than 30 conversions in the last 30 days may struggle to optimize effectively.

Start with manual bidding or Maximize Clicks to build a data foundation before switching to Smart Bidding. Boris Beceric, a Google Ads consultant and coach, said:

  • “I guess most try Smart Bidding too early – without enough conversion volume. What usually helps: consolidate campaigns so you get more data flowing through a single campaign. Portfolio bidding – kinda the same, but consolidation takes place at the bid strategy level.
  • “Micro conversions – try to add in the micro conversion that had the most volume and is closest to the ‘real’ conversion. Bonus: Reverse engineer CVR and conv value from micro to macro conversion and adjust tCPA accordingly.”
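
One way to read that micro-conversion tip: if you bid toward a micro conversion, scale the target down by the observed micro-to-macro ratio so the effective cost per “real” conversion stays where you want it. A hedged sketch with hypothetical numbers:

```python
# Reverse-engineering a micro-conversion tCPA from a macro-conversion goal.
# All figures are hypothetical illustrations of the idea in the quote above.
macro_tcpa = 200.00        # what you're willing to pay per real conversion
micro_per_macro = 8        # micro conversions observed per macro conversion

micro_tcpa = macro_tcpa / micro_per_macro   # $25 per micro conversion
print(f"Set tCPA on the micro conversion to roughly ${micro_tcpa:.2f}")
```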

Goal misalignment

Using the wrong bidding strategy can hinder performance. 

For example, applying Target ROAS to a new campaign with limited data can set unrealistic expectations and reduce reach.

Align bidding strategies with your goals.

Use Maximize Conversions for volume, and Target ROAS or Target CPA when prioritizing profitability. Harrison Hepp, owner of Industrious Marketing, said:

  • “I had a client who was hybrid ecommerce and lead gen (they sold products, but high-priced deals were lead gen), and they insisted on tracking purchases and leads in every campaign. We constantly battled major fluctuations in the campaigns as they’d swing back and forth between getting purchases or leads and trying to optimize to both.
  • “It also made bid strategy selection really hard, as conversion value bidding would deprioritize leads (no value was tracked), but CPA bidding wasn’t efficient for purchases because of differences in product prices. It really showed how aligning your goals and bid strategy is critical for steady performance. It also underlined how the right bidding strategy can prioritize success in campaigns.”

Monitoring is non-negotiable

Despite its automation, Smart Bidding is not a “set it and forget it” tool. 

Failing to monitor campaigns can lead to wasted ad spend and missed optimization opportunities.

Regularly review performance metrics, adjust campaign parameters, and stay proactive in managing Smart Bidding strategies.

  • “Custom columns/Segment views: We want to measure efficiency, so things like conv value/conv, search impression share, etc.,” said Ameet Khabra, owner of Hop Skip Media.

Even with the most advanced AI behind Smart Bidding, performance optimization requires vigilance. 

Regularly review the following metrics to ensure your strategy is working as intended:

  • CPA: Is your Target CPA being met?
  • ROAS: Are the conversions driving sufficient revenue?
  • Conversion rates: Are conversions coming from the right audience segments? Or are you paying for competitors to download your white papers and marking that down as a lead?
  • Search term reports: Are irrelevant keywords consuming a significant portion of your budget? Unprofitable keywords can be why a campaign is not meeting goals.
  • Conversion tracking accuracy: If conversion tracking is improperly implemented, Smart Bidding will optimize based on inaccurate data, reducing effectiveness.
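
As a rough illustration of that review loop, a script like the following can flag campaigns drifting off target. The data structure and thresholds are hypothetical placeholders; substitute your own report export:

```python
# Flag campaigns whose trailing performance misses a target CPA or ROAS.
# The campaign list and thresholds are placeholders, not a Google Ads API call.
campaigns = [
    {"name": "Brand",   "spend": 1200.0, "conversions": 40, "value": 6000.0},
    {"name": "Generic", "spend": 3000.0, "conversions": 25, "value": 4500.0},
]
TARGET_CPA, TARGET_ROAS = 50.0, 4.0

for c in campaigns:
    cpa = c["spend"] / c["conversions"] if c["conversions"] else float("inf")
    roas = c["value"] / c["spend"]
    if cpa > TARGET_CPA or roas < TARGET_ROAS:
        print(f"{c['name']}: CPA ${cpa:.2f}, ROAS {roas:.1f}x -- review")
```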

Double-check your conversion tracking setup. Assign accurate values to conversions to reflect their true business impact. Khabra said:

  • “My favorite saying lately is ‘garbage in, garbage out,’ and that is definitely a large component of conversion tracking. Ensuring that we’ve identified the correct conversions that move the needle is half the battle. Implementing the tracking and double-checking that it is correct – collecting conversions – is the second half.”

Budgetary awareness

Strategies like Maximize Conversions and Maximize Clicks will attempt to spend your entire daily budget. 

If your budget is set too high, this can lead to overspending.

Start with smaller daily budgets and gradually increase them while monitoring performance.

Realistic targets

Setting overly aggressive Target CPA or Target ROAS goals can limit your campaign’s reach, as the algorithm will avoid auctions it deems unprofitable.

Begin with realistic targets slightly higher or lower than your current average. Allow time for the algorithm to learn before refining the target.
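
For example, a sketch of that gradual approach, assuming a 10% step per adjustment cycle (the step size is a common rule of thumb, not a Google-documented value):

```python
# Start near the observed average, then tighten the target in small steps.
avg_cpa = 62.00    # trailing 30-day average CPA (hypothetical)
step = 0.10        # ~10% tightening once performance stabilizes

initial_target = avg_cpa            # realistic starting tCPA
next_target = avg_cpa * (1 - step)  # first refinement after learning
print(f"Start tCPA at ${initial_target:.2f}, then test ${next_target:.2f}")
```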

Best practices for Smart Bidding in Google Ads

To ensure optimal performance, follow these best practices for implementing Smart Bidding in your Google Ads campaigns.

1. Feed accurate data 

Ensure your conversion tracking is set up correctly. 

Assign meaningful values to conversions – whether it’s a purchase, lead form submission, or newsletter signup. 

2. Leverage seasonality adjustments 

Use seasonality adjustments in Google Ads to guide Smart Bidding algorithms for short-term changes (e.g., holiday sales or promotions). 

This prevents excessive or insufficient bids during periods of fluctuating demand. 
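
The adjustment you enter is the conversion rate change you expect during the event. One way to estimate it from a comparable past promotion, with hypothetical numbers:

```python
# Estimate a seasonality adjustment from a past promotion's lift.
baseline_cvr = 0.020   # normal conversion rate (2.0%)
promo_cvr = 0.027      # conversion rate during last year's comparable sale

adjustment = promo_cvr / baseline_cvr - 1
print(f"Enter a seasonality adjustment of about {adjustment:+.0%}")  # +35%
```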


3. Start with conservative budgets 

Begin with smaller budgets and avoid aggressive bid caps that may limit auction participation. Allow the algorithm to learn and adapt gradually. 

4. Prioritize business value over conversion volume 

Align your bidding tactics with broader business goals. Instead of focusing solely on conversion volume, consider how each conversion contributes to revenue or lifetime customer value. 

5. Test and adapt 

Use Google Ads experiments to test different strategies. 

For example, compare Target CPA with Target ROAS to identify which delivers better results for your campaigns. 

Google Ads Experiments let you directly compare bid strategies in real-world scenarios.

Duplicate your campaign, allocate a split percentage to a new strategy (like comparing Target CPA vs. Target ROAS), and see concrete results with statistical significance.
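
If you want to sanity-check significance yourself, a two-proportion z-test on the two arms’ conversion rates is a reasonable first pass. A minimal sketch; the counts are hypothetical, and Google Ads experiments report their own confidence figures:

```python
import math

# Two-proportion z-test comparing conversion rates of the base and trial arms.
def z_score(conv_a: int, clicks_a: int, conv_b: int, clicks_b: int) -> float:
    p_a, p_b = conv_a / clicks_a, conv_b / clicks_b
    pooled = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))
    return (p_b - p_a) / se

z = z_score(conv_a=120, clicks_a=4000, conv_b=150, clicks_b=4000)
print(f"z = {z:.2f}; |z| > 1.96 ~ significant at the 95% level")
```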

Final thoughts

Smart Bidding isn’t just about knowing which technical settings to adjust. 

It’s about understanding how to make Google’s automated tools align with your business goals.

The digital landscape evolves quickly, so it’s essential to stay adaptable, continuously monitor performance, and make adjustments as needed. 

Nail the strategy, stay proactive, and you’ll set yourself up for long-term success.


Microsoft Advertising will start enforcing Consent Mode in May

Microsoft Advertising will require advertisers to provide explicit user consent signals starting May 5.

First communicated to advertisers a few weeks ago, this change ensures compliance with global privacy regulations while maintaining the ability to gather insights that optimize ad performance.

Why we care. As data privacy concerns grow, businesses face increasing pressure to protect personal information. Microsoft’s enforcement of Consent Mode offers a way to balance privacy with performance, reinforcing trust while meeting regulatory requirements.

What is Consent Mode? Consent Mode is a feature from Microsoft Advertising that respects user privacy preferences while allowing advertisers to track conversions and optimize campaigns. It adjusts cookie access based on user consent, using the ad_storage parameter to either allow or block cookies. This applies to:

  • Universal Event Tracking (UET) on the Microsoft Advertising Platform.
  • Universal Pixel, Segment, and Conversion pixels within Microsoft Invest, Curate, or Monetize.

Consent signals can also be shared through the IAB’s Transparency and Consent Framework (TCF) or directly via a Consent Management Platform (CMP).

How to implement Consent Mode. Businesses can send user consent signals using one of these three options:

  • Direct integration. Implement Consent Mode with UET, Universal Pixel, Segment, or Conversion pixels.
  • IAB framework. Pass consent signals directly in a TCF 2.0 string or through a CMP.
  • Third-party tools. Integrate Microsoft’s Consent Mode through tools like Google Tag Manager.


Google Ads policy update: More ads, new rules


Google Ads will update its Unfair Advantage Policy to clarify that the restriction on showing more than one ad at a time for the same business, app, or site only applies within a single ad location.

This change, starting April 14, follows recent experiments allowing multiple ads from the same advertiser in different locations on the search results page.

What’s happening.

  • Double serving now permitted: Advertisers can now run multiple ads for the same business, app, or site on a single search results page—provided they occupy different ad locations. This could potentially increase visibility and clicks for top advertisers but may also intensify competition for smaller players.
  • Shifting auction dynamics: Google’s updated policy leverages different ad locations to run separate auctions, allowing businesses to secure multiple placements. This adjustment aligns with Google’s evolving approach to ads, such as mixing ads with organic results and redefining top ad placements last year.

Why we care. This update opens up opportunities to dominate search results by showing multiple ads for the same business in different ad locations. This could lead to increased visibility, higher click-through rates, and more conversions.

However, it may also drive up competition and costs, especially for smaller advertisers, as larger brands gain more SERP real estate. Understanding this change is crucial for adapting bidding and placement strategies to stay competitive.

Industry reactions. Digital marketing expert Navah Hopkins of Optmyzr noted on LinkedIn:

  • “Google is officially making it fair game to have more than one spot on the SERP. I have thoughts on this, but I want to see how performance actually shakes out in Q2.”

Digital marketing expert Boris Beceric commented that Google is only chasing the money:

  • “Another case of Google liking money more than a good user experience…not even talking from an advertiser’s perspective.”

Bigger picture. This policy shift marks another fundamental change in Google Ads’ long-standing practices, raising questions about how SERP real estate and competition will evolve.

Bottom line. This update could create new opportunities for advertisers to dominate search results, but it might also make it harder for smaller businesses to compete. The real impact will become clearer as the industry adapts in the coming months.


Reddit Ads rolls out new SMB tools to boost campaign performance


Reddit Ads is introducing a suite of new tools aimed at helping small and medium-sized businesses (SMBs) streamline campaign management, optimize ad performance, and improve data accuracy.

Easier Campaign Setup and Management:

  • Campaign Import. Reddit Ads now allows advertisers to import campaigns directly from Meta in just three steps. After signing into their Meta account within Reddit Ads Manager, users can select an ad account and campaign to import, then customize it to fit Reddit’s platform. This seamless process enables advertisers to leverage high-performing Meta ads on Reddit quickly.
  • Simplified Campaign QA. A new review page in the Reddit Ads Manager now consolidates all campaign details for a clear overview. Advertisers can easily identify errors or inconsistencies and make edits before publishing.

Enhanced Signal Quality and Conversion Tracking:

  • 1-Click GTM Integration for Reddit Pixel. Setting up Reddit’s website conversions tag just got easier. With the new Google Tag Manager (GTM) integration, advertisers can install the Reddit Pixel in a few clicks, enabling fast and accurate conversion tracking. This simplifies measuring customer journeys and optimizing lower-funnel strategies.
  • Event Manager QA. The Events Manager’s enhanced Events Overview page now provides a detailed breakdown of conversion events from the Reddit Pixel or Conversions API (CAPI). This update helps advertisers verify event data accuracy, troubleshoot issues, and run effective lower-funnel campaigns.

Why we care. The new Campaign Import feature lets advertisers quickly repurpose high-performing Meta ads on Reddit, saving time and effort. The simplified QA tools help catch as many errors as possible before launch, while the 1-click GTM integration and improved Events Manager provide deeper insights into customer behavior and campaign performance.

Bottom line. These updates reflect Reddit’s ongoing commitment to making its ad platform more accessible and effective for SMBs. By reducing setup friction and providing better visibility into campaign performance, Reddit Ads aims to help businesses reach niche communities and drive impactful results.


The next wave of search: AI Mode, deep research and beyond


With the rise of AI-powered features, search engines are not just directing users to information but delivering answers directly. 

This shift is redefining how people interact with the web, raising questions about the future of SEO, content discovery, and digital marketing. 

Here’s what’s coming next.

From ChatGPT to Grok 3: The breakneck pace of AI advancements

The world has seen rapid and significant advances in AI technology and large language models (LLMs) within two years. 

Looking back just three years, Google’s Gemini and Meta’s Llama did not exist, and OpenAI’s ChatGPT had not yet launched (it arrived in late November 2022).

  • Fast-forward to January 2025, when the public was introduced to DeepSeek R1. This open-source large language reasoning model astounded the AI community with its speed, efficiency, and affordability, especially compared to OpenAI’s o1 model.
  • A few weeks later, Elon Musk’s company xAI launched Grok 3, which impressed users by topping a key AI leaderboard with its complexity and fewer guardrails (see: unhinged mode).
  • More recently, Anthropic released Claude 3.7 Sonnet and Claude Code, an LLM that excels at code creation and debugging to a degree that has made many software engineers a bit uneasy.

These LLMs are just the beginning of AI’s rapid progress, with more breakthroughs on the way. 

Google’s AI Mode: A glimpse of the future 

AI isn’t just bringing new products – it’s transforming existing ones, too.

On March 5, Google announced they were expanding AI Overviews with a new experimental feature called AI Mode.

This interactive feature allows users to:

  • Engage with web search in a chat-like manner through multimodal understanding.
  • Refine long-tail queries in a back-and-forth manner. 

AI Mode, powered by Gemini 2.0, enhances research using a “query fan-out” technique to gather real-time data from multiple sources and generate detailed, in-depth summaries.

This may make SEOs uncomfortable, as it potentially reduces clicks to publisher sites and further promotes a zero-click ecosystem. 

With Google integrating Gemini 2.0 into its suite of products and holding roughly 89% of the search market, its AI innovations demand close attention.

These technologies will likely be added to search, and AI Mode offers a preview of what’s ahead.

Two terms for the future of search: Agentic and deep research 

We’ll likely hear two terms used more often in the AI and search space: agentic and deep research.

Deep research models can browse the web and focus on conducting intensive, in-depth research to provide users with informative summaries on complex topics. 

Unlike previous LLMs, which use a single-step information retrieval system through RAG (retrieval-augmented generation), deep research and agentic models can:

  • Conduct multi-step research through a series of actions, pulling information from multiple sources to provide comprehensive summaries to the user. 
  • Take proactive actions, such as executing tasks and complex instructions. 

Google’s Project Mariner and OpenAI’s Operator already showcase these capabilities by allowing users to perform tasks within their browsers while understanding multi-modal elements such as text, images, and forms.

Dig deeper: How to use OpenAI’s Deep Research for smarter SEO strategies

How these models could change search

Suppose you want to plan a trip to Tokyo and know the best season to go, the weather, and where to stay. 

Typically, this type of research takes a few days or weeks, and you gather information from various sources, such as travel websites or YouTube videos.

A deep research model can do the heavy lifting by searching the web, gathering information, and summarizing relevant content, which saves you time. 

It can also “read, listen, and watch” various sources to provide a thorough answer. 

An agentic model could also book your hotels and flights, navigating checkout flows to complete the purchase.

AI is moving in this direction as companies like Google work toward AGI (artificial general intelligence) – machines that can reason across diverse tasks like humans.

Deep research and agentic models are key milestones in building practical AI solutions for everyday use.

AI Overviews have already impacted click behavior and organic traffic

Now, we must consider these AI features’ long-term effects on the content ecosystem.


What could the future search landscape look like?

Google’s AI Overviews and agentic advancements are here to stay. 

If AI Mode succeeds, it will be the first deep research feature in Google Search. 

So, what’s next for the search landscape? 

Here are some possibilities.

Continual rise of zero-click searches

Since launching in May 2024, AI Overviews have significantly reduced clicks for informational queries.

As AI search capabilities advance, users will likely rely even more on AI tools for quick answers rather than clicking through to websites or articles. 

AI Mode and future search innovations could accelerate this shift by prioritizing fast, AI-generated summaries over traditional browsing.

As zero-click searches become the norm, you must rethink how you measure value and engagement. 

Traditional KPIs may no longer accurately reflect user behavior, so focusing on brand visibility and awareness will be more critical than ever.

Increased personalization

LLMs and AI systems are revolutionizing search by personalizing responses with unmatched speed and scale, surpassing traditional algorithms. 

Leveraging Google’s vast user data, AI can train on existing information and refine queries in real-time to deliver more tailored results. 

As these systems continuously learn, they will become even better at recognizing, remembering, and adapting to individual user preferences.

As AI-driven search becomes more personalized, it’s worth considering whether hyper-niche content is the key to reaching your audience.

Multimodal search

Google’s AI-powered multimodal capabilities are already embedded in many of its products, including Project Astra, an AI assistant unveiled at Google I/O 2024.

During a live demonstration, Astra used multiple tools – such as Google Lens – to identify objects in real time and respond to voice queries.

In my own experience at Google I/O, the AI assistant:

  • Accurately classified animal figurines.
  • Distinguished between similar names (“Bob” vs. “Rob”).
  • Even created a story about the figures.

While some of these advanced features haven’t been integrated into Google Search yet, multimodal search through Google Lens and voice search is already shaping how users submit queries. 

As Google develops these capabilities, you should anticipate what’s next, look beyond text-based queries, and optimize for image, video, and audio search.

Dig deeper: From search to AI agents: The future of digital experiences

Commercial queries can still draw users to websites

AI-generated results have reduced clicks for informational queries, but commercial and transactional searches still offer opportunities for website traffic.

During the decision-making process, potential buyers research extensively – comparing products, reading reviews, and exploring multiple channels before making a purchase.

While it’s unclear how AI-generated search will impact this journey, think about how AI can streamline multi-touchpoint decision-making while still driving users to your website.

When users move closer to making a purchase, user-generated content – like reviews – will still play a crucial role in conversions.

Content quality still rules

Despite AI’s growing role in search, one thing remains constant: high-quality content is essential. 

Whether users rely on traditional search engines or LLMs, visibility will still depend on the strength of the content itself.

Since both Google Search and LLMs use RAG to pull from vast datasets, ensuring these systems have access to accurate, high-quality information is critical. 

Content demonstrating E-E-A-T (experience, expertise, authoritativeness, and trustworthiness) will continue to rank higher in AI-driven search results.

Your brand will also play a bigger role in search visibility, making it essential to create valuable, well-optimized content across multiple formats.

Dig deeper: Decoding Google’s E-E-A-T: A comprehensive guide to quality assessment signals


Pagination and SEO: What you need to know in 2025


Ever wondered why some of your ecommerce products or blog posts never appear on Google? 

The way your site handles pagination could be the reason.

This article explores the complexities of pagination – what it is, whether your site needs it for SEO, and how it affects search in 2025. 

What is pagination?

Pagination is the coding and technical framework on webpages that allows content to be divided across multiple pages while remaining thematically connected to the original parent page.

When a single page contains too much content to load efficiently, pagination helps by breaking it into smaller sections.

This improves user experience and unburdens the client (i.e., web browser) from loading too much information – much of which may not even be reviewed by the user.

Examples of pagination in action

Product listings

One common example of pagination is navigating multiple pages of product results within a single product feed or category. 

Let’s look at Virgin Experience Days, a site that sells gifted experiences similar to Red Letter Days.

Take their Mother’s Day experiences page:

  • https://www.virginexperiencedays.co.uk/mothers-day-gifts

Scroll down to the “All Mother’s Day Experiences & Gift Ideas Experiences” section, and you’ll see a staggering 1,635 experiences to choose from. 

That’s a lot.


Clearly, listing all of them on a single page wouldn’t be practical. 

It would result in excessive vertical scrolling and could slow down page loading times.

Further down the page, you’ll find pagination links:


Clicking a pagination link moves users to separate product listing pages, such as page 2:

  • https://www.virginexperiencedays.co.uk/mothers-day-gifts?page=2

In the URL, ?page=2 appears as a parameter extension, a common pagination syntax. 

Variations include ?p=2 or /page/2/, but the purpose remains the same – allowing users to browse additional pages of listings. 

Even major retailers like Amazon use similar pagination structures.
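
To illustrate how interchangeable these patterns are, here is a small sketch that extracts the page number from any of the three variants mentioned above. It is a simplified illustration, not production URL handling:

```python
import re
from urllib.parse import urlparse, parse_qs

# Pull the page number out of ?page=2, ?p=2, or /page/2/ style URLs.
def page_number(url: str) -> int:
    parts = urlparse(url)
    query = parse_qs(parts.query)
    for key in ("page", "p"):
        if key in query:
            return int(query[key][0])
    match = re.search(r"/page/(\d+)/?", parts.path)
    return int(match.group(1)) if match else 1  # no marker = first page

print(page_number("https://www.virginexperiencedays.co.uk/mothers-day-gifts?page=2"))  # 2
print(page_number("https://searchengineland.com/library/seo/page/2"))                  # 2
```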

Pagination also helps search engines discover deeply nested products. 

If a site is so large that all its products can’t be listed in a single XML sitemap, pagination links provide an additional way for crawlers to access them. 

Even when XML sitemaps are in place, internal linking remains important for SEO. 

While pagination links aren’t the strongest ranking signal, they serve a foundational role in ensuring content is discoverable.

Dig deeper: Internal linking for ecommerce: The ultimate guide

Blog and news feeds

Pagination isn’t limited to product listings, it’s also widely used in blog and news feeds. 

Take Search Engine Land’s SEO article archive:

  • https://searchengineland.com/library/seo

On this page, you can access a feed of all SEO-related posts on Search Engine Land.


Scrolling down, you’ll find pagination links. 

Clicking “2” takes you to the next set of SEO articles:

  • https://searchengineland.com/library/seo/page/2

Pagination inside content

Pagination can also exist within individual pieces of content rather than at a feed level. 

For example, some news websites paginate comment sections when a single article receives thousands of comments. 

Similarly, forum threads with extensive discussions often use pagination to break up replies across multiple pages.

Consider this post from WPBeginner:

  • https://www.wpbeginner.com/beginners-guide/how-to-choose-the-best-blogging-platform/

Scroll to the bottom, and you’ll see that even the comment section uses pagination to organize user responses.


Why is pagination important for SEO?

Pagination plays a crucial role in SEO for several reasons:

Indexing

Without pagination, search crawlers may struggle to find deeply nested content such as blog posts, news articles, products, and comments.

Crawl efficiency

Pagination increases the number of URLs on a site, which might seem counterproductive to efficient crawling.

However, most search engines recognize common pagination structures – even without rich markup.

This understanding allows them to prioritize crawling more valuable content while ignoring less important paginated pages.

Internal linking

Pagination also contributes to internal linking.

While pagination links don’t carry significant link authority, they provide structure.

Google tends to pay less attention to orphaned pages – those without inbound links – so pagination can help ensure content remains connected.

Managing content duplication

If URLs aren’t structured properly, search engines may mistakenly identify them as duplicate content.

Pagination isn’t as strong a signal for content consolidation as redirects or canonical tags.

Still, when implemented correctly, it helps search engines differentiate between paginated pages and true duplicates.

Google’s deprecation of rel=prev/next

Google previously supported rel=prev/next for declaring paginated content. 

However, in March 2019, it was revealed that Google had not used this markup for some time.

As a result, these tags are no longer necessary in a website’s code.

Google likely used rel=prev/next to study common pagination structures. 

Over time, those insights were integrated into its core algorithms, making the markup redundant. 

Some SEOs believe these tags may still help with crawling, but there is little evidence to support this.

If your site doesn’t use this markup, there’s no need to worry. Google can still recognize paginated URLs. 

If your site uses it, there’s also no urgent need to remove it, as it won’t negatively impact your SEO.


Why pagination is still important in 2025: The infinite scroll debate

Alternate methods for browsing large amounts of content have emerged over the past couple of decades.

“View more” or “Load more” buttons often appear under comment streams, while infinite scroll or lazy-loaded feeds are common for posts and products. 

Some argue these features are more user-friendly. 

Originally pioneered by social networks such as Twitter (now X), this form of navigation helped boost social interactions. 

Some websites have adopted it, but why isn’t it more widespread?

From an SEO perspective, the issue is that search engine crawlers interact with webpages in a limited way. 

While headless browsers may sometimes execute JavaScript-based content during a page load, search crawlers typically don’t “scroll down” to trigger new content. 

A search engine bot certainly won’t scroll indefinitely to load everything. 

As a result, websites relying solely on infinite scroll or lazy loading risk orphaning articles, products, and comments over time.

For major news brands with strong SEO authority and extensive XML sitemaps, this may not be a concern. 

The trade-off between SEO and user experience may be acceptable. 

But for most websites, implementing these technologies is likely a bad idea. 

Search crawlers may not spend time scrolling through content feeds, but they will click hyperlinks – including pagination links.

How JavaScript can interfere with pagination

Even if your site doesn’t use infinite scroll plugins, JavaScript can still interfere with pagination. 

Since July 2024, Google has at least attempted to render JavaScript for all visited pages. 

However, details on this remain vague. 

  • Does Google render all pages, including JavaScript, at the time of the crawl? 
  • Or is execution deferred to a separate processing queue? 
  • How does this affect Google’s ranking algorithms? 
  • Does Google make initial determinations before executing JavaScript weeks later?

There are no definitive answers to these questions.

What we do know is that “dynamic rendering is on the decline,” according to the 2024 Web Almanac SEO Chapter.

If Google’s effort to execute JavaScript for all crawled pages is progressing well – which seems unlikely given the potential efficiency drawbacks – why are so many sites reverting to a non-dynamic state? 

This doesn’t mean JavaScript use is disappearing. 

Instead, more sites may be shifting to server-side or edge-side rendering.

If your site uses traditional pagination but JavaScript interferes with pagination links, it can still lead to crawling issues.

For example, your site might use traditional pagination links, but the main content of your page is lazy-loaded.

In turn, the pagination links only appear when a user (or bot) scrolls the page. 
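
A quick way to test for this is to compare the raw, unrendered HTML against what you see in a browser. A hedged sketch follows; the URL is a placeholder, and the "page" substring match is deliberately crude:

```python
import requests
from bs4 import BeautifulSoup

# Does the unrendered HTML already contain pagination links? If this prints
# "none found" but links are visible in your browser, they are likely being
# injected by JavaScript, and crawlers may never see them.
html = requests.get("https://example.com/blog/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

links = [a["href"] for a in soup.find_all("a", href=True)
         if "page" in a["href"].lower()]
print(f"Pagination links in raw HTML: {links or 'none found'}")
```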

Dig deeper: A guide to diagnosing common JavaScript SEO issues

How to handle indexing and canonical tags for paginated URLs

SEO professionals often recommend using canonical tags to point paginated URLs to their parent pages, marking them as non-canonical. 

This practice was especially common before Google introduced rel=prev/next.

Since Google deprecated rel=prev/next, many SEOs remain uncertain about the best way to handle pagination URLs.

Avoid blocking paginated content via robots.txt or with canonical tags.

Doing so prevents Google from crawling or indexing those pages. 

In the case of news posts, certain comment exchanges might be considered valuable by Google, potentially connecting a paginated version of an article with keywords that wouldn’t otherwise be associated with it. 

This can generate free traffic – something worth keeping in 2025.

Similarly, restricting the crawling and indexing of paginated product feeds could leave some products effectively soft-orphaned.

In SEO, there’s a tendency to chase perfection and aim for complete crawl control. 

But being overly aggressive here can do more harm than good, so tread carefully.

There are cases where it makes sense to de-canonicalize or limit the crawling of paginated URLs. 

Before taking that step, make sure you have data showing that crawl-efficiency issues outweigh the potential free traffic gains. 

If you don’t have that data, don’t block the URLs. Simple!


Ad hijacking: Understanding the threat and learning from Adidas by Bluepear

t affiliate ad hijacking?

Ad hijacking occurs when dishonest affiliates create ads almost identical to a brand’s official ads. 

They copy headlines, text, and display URLs so potential customers assume these ads are legitimate. 

In reality, these affiliates, often involved in affiliate hijacking and other affiliate program scams, send clicks through their own tracking links to collect commissions they haven’t truly earned.

When this happens inside an affiliate program, it’s called affiliate ad hijacking. 

Many hijackers use an affiliate link cloaker to hide the final redirect, preventing brands or ad platforms from seeing the trick. If someone clicks on one of these fake ads, they land on the brand’s site with a hidden affiliate tag, causing the brand to pay a commission for a visitor who would have likely arrived directly or through a proper paid search ad.

How affiliate hijacking hurts your brand

If ad hijacking and other affiliate scams aren’t stopped, they can damage your business and reputation:

  • Affiliate hijacking makes brands pay extra commissions on sales they would’ve made anyway. 
  • By running ads on a brand’s keywords, hijackers compete with, or even outrank, the official ads, leading to higher cost-per-click (CPC).
  • Affiliate ad hijacking also distorts performance data by boosting affiliate sales numbers and cutting into your direct or organic traffic. 

Over time, you might make bad decisions, like raising affiliate commissions, based on inflated sales reports. If the hijacker uses an affiliate link cloaker, it becomes even harder to figure out where these sales are coming from.

Spotting ad hijacking


Recognizing ad hijacking can be tricky since the fake ads often look exactly like yours. 

However, these signs might help:

  • Imitation ads: Be cautious of ads that copy your official wording, style, or domain but don’t show up in your ad account. Sometimes the displayed URL is identical except for a small punctuation change or extra keyword.
  • Sudden sales spikes: If a single affiliate sees a big jump in sales without any new promotion or change in commission, it could be affiliate ad hijacking.
  • Redirect clues: An affiliate link cloaker may hide the path users take, but you might spot unusual tracking codes in your analytics or strange referral tags appearing at odd times or in certain locations.
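
As a toy illustration of the “imitation ads” sign above, you could fuzzily compare display URLs against your official domain and flag near misses. The threshold and domains here are arbitrary examples, not Bluepear’s method:

```python
from difflib import SequenceMatcher

# Flag display URLs that are nearly, but not exactly, the official domain.
OFFICIAL = "adidas.com"

def suspicious(display_url: str, threshold: float = 0.8) -> bool:
    ratio = SequenceMatcher(None, display_url.lower(), OFFICIAL).ratio()
    return display_url.lower() != OFFICIAL and ratio >= threshold

for url in ["adidas.com", "adidas-shop.com", "adldas.com", "example.com"]:
    print(url, "->", "suspicious" if suspicious(url) else "ok")
```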

Why manual checks often fail

Many brands do a quick check, typing their name into a search engine, to spot suspicious ads. But dishonest affiliates can be sneaky: they might only run these ads late at night or in small cities far from your headquarters.

They may also use cloaking, which sends brand monitors or bots to the real site, hiding any wrongdoing. This means you need continuous monitoring in multiple places, plus advanced detection methods; simple, random checks won’t catch everything.

The Adidas example: Over 100 incidents in 40 days


A clear example is Adidas. Over 40 days, Bluepear uncovered repeated ad hijacking and online ad fraud targeting Adidas’s branded search results. 

More than 100 cases of affiliate hijacking were found, with some ads appearing above the official ones. Bluepear also saw at least 245 variations of these ads, all designed to stay hidden.

This shows why brands can struggle to catch affiliate ad hijacking on their own. Scammers often place ads in overlooked regions or at off-peak times.

A quick check at the main office might not show any problems, while they’re actively abusing your brand name elsewhere. Some fraudsters see this deception as standard practice, creating new ad variations until they’re exposed.

How Bluepear helps


Bluepear takes several steps to fight ad hijacking:

  • 24/7 global monitoring: It tracks different locations and time zones, so if an affiliate starts bidding on your keyword at 3 AM in a small city, Bluepear will see it.
  • Detailed evidence: Every instance of affiliate hijacking gets recorded with clear proof.
  • Affiliate identification: You can see exactly which affiliate is responsible.
  • Ads and landing pages: The system stores both the ad and the final landing page, making it easy to show proof if there’s a dispute.
  • Screenshots: You get actual images of the search engine results page, showing where the fake ad appeared.
  • Easy violation reporting: Send a summary of the offense (with timestamps and URLs) straight to the affiliate through Bluepear.

In Adidas’s case, Bluepear identified over 100 infringing ads in just 40 days, proof that some affiliates consider trickery a “hijack industry standard.” Because Bluepear constantly checks search engines around the world, it sets a higher bar for compliance.

Some scammers even use multiple affiliate link cloakers or rotate domains to hide. Bluepear’s continuous scanning and data comparisons make it tough for them to stay hidden. 

It also simplifies your process – no more struggling with spreadsheets or piecing together incomplete ad reports.

Conclusion

Ad hijacking seriously threatens brands that value their online reputation and affiliate partnerships. 

Bluepear’s continuous global checks, advanced cloaking and click-fraud detection, and in-depth reporting features allowed Adidas to uncover more than 100 affiliate hijacking incidents in 40 days, highlighting how common these schemes can be.

By monitoring your branded keywords and using strong tools like Bluepear, you can protect valuable traffic, keep trust in your affiliate program, and guard against needless spending on fraudulent commissions.


Ex-Google exec: Giving traffic to publishers ‘a necessary evil’

A new profile of Elizabeth Reid, the head of Google Search, confirms that Google is moving away from its longstanding model of sending its users to websites. As one former unnamed senior executive put it: “Giving traffic to publisher sites is kind of a necessary evil.”

As for the iconic Google Search bar? It will slowly lose prominence in the Google Search experience, due to the continuing growth of voice and visual search, Reid said.

Necessary evil. Google has been increasingly focused on keeping users inside Google properties, reducing the need to click through to external sites. A former Google senior executive told Bloomberg that supporting publishers was incidental to Google’s larger aims:

  • “Giving traffic to publisher sites is kind of a necessary evil. The main thing they’re trying to do is get people to consume Google services.”
  • “So there’s a natural tendency to want to have people stay on Google pages, but it does diminish the sort of deal between the publishers and Google itself.”

Alphabet CEO Sundar Pichai said in December Google spends a lot of time “thinking about the traffic we send to the ecosystem.” But, of late, he has stopped short of promising that Google will send more of it to websites – and there’s probably good reason for that.

Look no further than Barry Schwartz’s article, Google: Not all sites will fully recover with future core algorithm updates, in which Google’s Search Liaison Danny Sullivan said that websites shouldn’t expect to recover from core updates. Sullivan also said this in September. And Google reiterated it again in October.

Instead, Pichai now mentions how AI Overviews are increasing search usage. (Even though, I thought the whole point of AI Overviews was to reduce the number of searches – remember the idea of “let Google do the searching for you” to get “quick answers”?)

As a reminder, Google sees more than 5 trillion searches per year. But for every 1,000 Google searches, only 360 clicks in the U.S. go to the open web (Context: Nearly 60% of Google searches end without a click).

Google Search hovering. The Google Search bar won’t go away, according to Reid. However, it will become less prominent over time as Google prepares for the rise of voice and visual searches. Here’s the full section from the Bloomberg article (Google Is Searching for an Answer to ChatGPT):

“Reid predicts that the traditional Google search bar will become less prominent over time. Voice queries will continue to rise, she says, and Google is planning for expanded use of visual search, too. Rajan Patel, a vice president for search experience, demonstrated how parents can use Google’s visual search tools to help their kids with homework, or to surreptitiously take a photo of a stylish stranger’s sneakers in a coffee shop to buy the same pair (something Patel did recently). The search bar isn’t going away anytime soon, Reid says, but the company is moving toward a future in which Google is always hovering in the background. ‘The world will just expand,’ she says. ‘It’s as if you can ask Google as easily as you could ask a friend, only the friend is all-knowing, right?’”

Other Reid quotes of note. For what is being considered a “profile” of Reid, the article didn’t contain many direct quotes. Here are the few interesting quotes from the piece:

  • “We learned what people really wanted two months faster” (on launching early features in her Google Maps days).
  • “[Search is a] constant evolution [rather than a complete overhaul].”
  • “Things start slowly and then quickly. Suddenly the combination of the tech and the product and the use and the understanding and the polish and everything comes together, and then everyone needs it.”
  • “It’s really exciting to work on search at a time when you think the tech can genuinely change what people can search for.”
  • “[Before generative AI] people did not go to Google Search and say, ‘How many rocks should I eat per day?’ They just didn’t.” (Context: Google AI Overviews under fire for giving dangerous and wrong answers)

And one indirect quote, where Bloomberg summarizes her thoughts on AI:

“Google’s generative AI products still carry disclaimers that the technology is experimental. Testing tools in public helps them get better, Reid says. She’s convinced that, as with other changes to search, AI will get people to use Google even more than they did before.”

Why we care. Many websites started to lose traffic when Google launched AI Overviews last May and as AI Overviews expanded. Google was a fairly reliable source of organic search traffic for over two decades – but the rules are changing. No, SEO isn’t dead. But old SEO strategies and tactics will need to evolve and playbooks will need to be rewritten.


How geotagging photos affects Google Business Profile rank: Study

How does adding coordinates to the EXIF data affect local rank? Our team wanted to find out, so we recently conducted a 10-week study on the effects of geotagging on local rank.

The geotagged images seemed to only affect the ranking for “near me” queries in the areas the EXIF data coordinates specified. Their impact on those queries in those areas was positive and statistically significant.

However, the study also found that queries that mentioned specific towns saw a decrease in ranking during the same period.

In other words, when EXIF data targeted Salt Lake City, Utah, the query [lawn care near me] saw a significant increase in rank.

For the same targeted area, the query for [lawn care salt lake city utah] saw, on average, decreases in rank.

The geotagging debate

SEOs have argued for years about whether adding coordinates to image EXIF data (known as geotagging) affects a business’s Google Business Profile (GBP) rank.

The theory is that if a business owner or customer takes a photo from their phone and uploads it to a GBP, Google reviews the EXIF (metadata) of that image and uses the location of where it was taken as a ranking signal.

Phones automatically use location details to input EXIF data on each photo taken from the device.

It’s speculated that Google uses the EXIF location data before stripping it.

On the surface, it makes sense.

However, skeptics don’t believe Google does this, because the data can easily be manipulated using any free EXIF editor.

Two years ago on Reddit, Google’s John Mueller said it was unnecessary for SEO purposes:

  • “No need to geotag images for SEO.”

In February on Bluesky, Mueller also told me he didn’t know much about what GBPs do.

  • Joy Hawkins, owner and president of local SEO agency Sterling Sky, performed a test on this in January 2024. She tested five GBP locations and saw no measurable increases over several weeks.
  • A month later, consultant Tim Kahlert, CEO of Hypetrix, performed a test. He also concluded that “this tactic currently has no effect on local rankings.”

These tests were better than nothing, but still weren’t enough. Plus, the sample sizes of the locations tested were quite small.

Those who say geotagging works never post their data or case studies, only offering anecdotal evidence.

Geotaggers aren’t publishing their tests and skeptics aren’t conducting them at scale. Google flip-flopping on their position doesn’t help either.

It was time this test was done justice.

Methodology and testing

Our test included 27 of our lawn care business clients. All SEO efforts were paused for the sole purpose of this test.

Every week on Tuesday and Thursday, we would post a client-owned image to their GBP (two images per week).

We then selected two towns in their service area grid that needed improvement. We based these on a baseline report taken from Local Falcon at the beginning of the test period. We kept these towns moderately far apart to avoid any kind of bleedover.

In this example, we might have selected “Little Falls” and “Garrisonville.”

During the test period, coordinates would be added to the EXIF data of the images. On Tuesday’s image, we’d add the center of Little Falls. On Thursday’s image, we’d add the center of Garrisonville.
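
For readers who want to replicate the mechanics, here is one way to write GPS coordinates into a JPEG’s EXIF with the piexif library. The coordinates and filename are illustrative, not the study’s actual values:

```python
import piexif

# Convert decimal degrees to the degrees/minutes/seconds rationals EXIF uses.
def to_dms(deg: float):
    d = int(deg)
    m = int((deg - d) * 60)
    s = (deg - d - m / 60) * 3600
    return ((d, 1), (m, 1), (int(s * 100), 100))

lat, lon = 38.55, 77.47  # illustrative values, treated as N latitude, W longitude
gps = {
    piexif.GPSIFD.GPSLatitudeRef: b"N",
    piexif.GPSIFD.GPSLatitude: to_dms(lat),
    piexif.GPSIFD.GPSLongitudeRef: b"W",
    piexif.GPSIFD.GPSLongitude: to_dms(lon),
}
exif_dict = {"0th": {}, "Exif": {}, "GPS": gps, "1st": {}, "thumbnail": None}
piexif.insert(piexif.dump(exif_dict), "tuesday-photo.jpg")  # writes in place
```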

We ran a report, monitored position changes, and charted them every week.

For each location, we tracked three keywords. Following the example above, we tracked:

  • “Lawn care garrisonville”
  • “Lawn care little falls”
  • “Lawn care near me”

For [lawn care near me] we monitored how it affected position changes in both of the target towns.

The control period

Establishing a proper control period was crucial.

The control period had to run for the same duration as the test period (five weeks). To establish consistency and isolate variables, we:

  • Maintained the image posting schedule. This ensured adding images on different days didn’t influence rank.
  • Stripped all EXIF data to ensure the only variables in the test period were the coordinates.
  • Monitored the same keywords to set a baseline.
  • Paused all SEO efforts for all 27 locations.

We continued as normal when the control period ended. The only change was adding town #1’s coordinates to Tuesday’s image and town #2’s coordinates to Thursday’s image.

Findings

Most of what we found validated the skeptics’ statements. But that doesn’t mean we ignored the geotaggers.

Service + city

In our example, when images were geotagged with their coordinates, both Garrisonville and Little Falls saw decreases in rank for “lawn care garrisonville” and “lawn care little falls.”

The conclusion? Geotagging had no impact whatsoever.

Service + near me

This one surprised me – and it had statistical significance. Garrisonville and Little Falls saw an overall increase in rank for [lawn care near me] queries.

Service + near me (CoA)

Local Falcon also produces reports on Center of Business Address. This monitors the rank of your target keywords where the business pin is actually located.

The end result: EXIF data had no effect on the business’s actual location for “near me” queries. Ranking dropped a lot more when EXIF data was added to the images targeting different areas.

Service + city (ATRP)

Average Total Rank Position (ATRP) is the average position across the entire target area. It shows whether adding images targeting only those two towns affected the rest of the service area.

The end result: There was no impact. Across the full service area, the average rank decreased further when EXIF data was added.

Service + near me (ATRP)

The “near me” queries for ATRP yielded the same result as above.

No impact, yet rankings plummeted further with geotagging.

Service + city (SoLV)

Share of Local Voice is another metric Local Falcon tracks. It shows how often a location shows in the top 3 positions of the map pack for the target queries.

The results started to deviate from Center of Address and ATRP reports. However, not by much.

The final result was that geotagged images had no impact. However, this time, the ranking didn’t continue to plummet during the test period.

Service + near me (SoLV)

We had the same results with “near me” queries on both images as we did with the [service] + [city] queries.

Geotagged images had no impact here.

Final thoughts

Out of the seven metrics we looked at:

  • Only one saw an improvement.
  • Six had no impact.
  • Of those six, four saw a decrease in rank when images were geotagged.

The last five metrics focused on the service area as a whole, not the specific areas where the EXIF data was pointing.

I can draw one main conclusion from this:

Although geotagging helps “near me” queries in the targeted areas, it hurts everywhere you don’t add geotagged images.

The solution?

Upload tons of images to every town in the area to combat that. But you’re going to run into two problems if you do this:

  1. Your GBP will be spammed with low-quality images for the sake of adding images. Wouldn’t it be better to just make sure the GBP is using good photos? Adding images for the sake of rank diminishes the user-facing quality.
  2. You’re still losing rank for queries that use the target city in the keyword. It’s a trade-off that only looks at one version of a search term. The other version appears to have negative consequences.

For these reasons, our agency won’t geotag our clients’ GBP images. Instead, we’ll focus on things that have a greater impact on local rank.
