Reddit Ads rolls out new SMB tools to boost campaign performance

Reddit Ads is introducing a suite of new tools aimed at helping small and medium-sized businesses (SMBs) streamline campaign management, optimize ad performance, and improve data accuracy.

Easier Campaign Setup and Management:

  • Campaign Import. Reddit Ads now allows advertisers to import campaigns directly from Meta in just three steps. After signing into their Meta account within Reddit Ads Manager, users can select an ad account and campaign to import, then customize it to fit Reddit’s platform. This seamless process enables advertisers to leverage high-performing Meta ads on Reddit quickly.
  • Simplified Campaign QA. A new review page in the Reddit Ads Manager now consolidates all campaign details for a clear overview. Advertisers can easily identify errors or inconsistencies and make edits before publishing.

Enhanced Signal Quality and Conversion Tracking:

  • 1-Click GTM Integration for Reddit Pixel. Setting up Reddit’s website conversions tag just got easier. With the new Google Tag Manager (GTM) integration, advertisers can install the Reddit Pixel in a few clicks, enabling fast and accurate conversion tracking. This simplifies measuring customer journeys and optimizing lower-funnel strategies.
  • Event Manager QA. The Events Manager’s enhanced Events Overview page now provides a detailed breakdown of conversion events from the Reddit Pixel or Conversions API (CAPI). This update helps advertisers verify event data accuracy, troubleshoot issues, and run effective lower-funnel campaigns.

Why we care. The new Campaign Import feature lets advertisers quickly repurpose high-performing Meta ads on Reddit, saving time and effort. The simplified QA tools help catch errors before launch, while the 1-click GTM integration and improved Events Manager provide deeper insight into customer behavior and campaign performance.

Bottom line. These updates reflect Reddit’s ongoing commitment to making its ad platform more accessible and effective for SMBs. By reducing setup friction and providing better visibility into campaign performance, Reddit Ads aims to help businesses reach niche communities and drive impactful results.

The next wave of search: AI Mode, deep research and beyond

With the rise of AI-powered features, search engines are not just directing users to information but delivering answers directly. 

This shift is redefining how people interact with the web, raising questions about the future of SEO, content discovery, and digital marketing. 

Here’s what’s coming next.

From ChatGPT to Grok 3: The breakneck pace of AI advancements

The world has seen rapid, significant advances in AI technology and large language models (LLMs) in just a few years. 

Three years ago, Google’s Gemini and Meta’s Llama did not exist, and OpenAI’s ChatGPT wasn’t released until late November 2022. 

  • Fast-forward to January 2025: the public was introduced to DeepSeek R1. This open-source large language reasoning model astounded the AI community with its speed, efficiency, and affordability, especially compared to OpenAI’s o1 reasoning model. 
  • A few weeks later, Elon Musk’s company xAI launched Grok 3, which impressed users by topping a key AI leaderboard with its complexity and fewer guardrails (see: unhinged mode).
  • More recently, Anthropic released Claude 3.7 Sonnet, an LLM that excels at code creation and debugging to a degree that has made many software engineers a bit uneasy, along with Claude Code, a coding tool built on it.

These LLMs are just the beginning of AI’s rapid progress, with more breakthroughs on the way. 

Google’s AI Mode: A glimpse of the future 

AI isn’t just bringing new products – it’s transforming existing ones, too.

On March 5, Google announced it was expanding AI Overviews with a new experimental feature called AI Mode.

This interactive feature allows users to:

  • Engage with web search in a chat-like manner through multimodal understanding.
  • Refine long-tail queries in a back-and-forth manner. 

AI Mode, powered by Gemini 2.0, enhances research using a “query fan-out” technique to gather real-time data from multiple sources and generate detailed, in-depth summaries.
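As a rough illustration, a query fan-out can be sketched as expanding one query into several sub-queries, retrieving each, and merging the results for summarization. The helper names and expansion heuristics below are assumptions for the sketch, not Google’s actual implementation:

```python
# Hypothetical sketch of a "query fan-out": one user query is expanded into
# several sub-queries, each retrieved independently, then merged into one
# context for summarization. All functions here are illustrative stubs.

def fan_out(query: str) -> list[str]:
    """Expand a broad query into narrower sub-queries (stubbed heuristics)."""
    aspects = ["reviews", "pricing", "alternatives"]
    return [f"{query} {aspect}" for aspect in aspects]

def retrieve(sub_query: str) -> str:
    """Stand-in for a real-time search call; returns a fake snippet."""
    return f"snippet for: {sub_query}"

def answer(query: str) -> str:
    """Gather snippets for every sub-query and merge them for summarization."""
    snippets = [retrieve(q) for q in fan_out(query)]
    return "\n".join(snippets)

print(answer("best mirrorless camera"))
```

A real system would run the retrieval calls in parallel and pass the merged context to an LLM for the final summary.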

This may make SEOs uncomfortable, as it potentially reduces clicks to publisher sites and further promotes a zero-click ecosystem. 

With Google integrating Gemini 2.0 into its suite of products and holding roughly 89% of the search market, its AI innovations demand close attention. 

These technologies will likely be added to search, and AI Mode offers a preview of what’s ahead.

Two terms for the future of search: Agentic and deep research 

We’ll likely hear two terms used more often in the AI and search space: agentic and deep research. 

Deep research models can browse the web and focus on conducting intensive, in-depth research to provide users with informative summaries on complex topics. 

Unlike previous LLMs, which use a single-step information retrieval system through RAG (retrieval-augmented generation), deep research and agentic models can:

  • Conduct multi-step research through a series of actions, pulling information from multiple sources to provide comprehensive summaries to the user. 
  • Take proactive actions, such as executing tasks and complex instructions. 
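The contrast between single-step RAG and a multi-step research loop can be sketched roughly as follows. The functions are stubs standing in for real search and LLM calls, not any vendor’s API:

```python
# Minimal sketch contrasting single-step RAG with a multi-step research loop.
# All functions are stubs; a real system would call a search API and an LLM.

def search(query: str) -> str:
    """Stand-in for one retrieval call."""
    return f"facts about {query}"

def single_step_rag(query: str) -> str:
    # Classic RAG: retrieve once, then generate from that single context.
    return f"answer built from: {search(query)}"

def multi_step_research(query: str, steps: int = 3) -> str:
    # Agentic loop: each round's findings shape the next query,
    # and all findings are accumulated before summarizing.
    findings = []
    current = query
    for _ in range(steps):
        result = search(current)
        findings.append(result)
        current = f"follow-up on {result}"  # next action depends on prior result
    return "summary of: " + " | ".join(findings)

print(multi_step_research("trip to Tokyo"))
```

The key difference is the loop: each retrieval can depend on what the previous step found, rather than answering from one fixed context.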

Google’s Project Mariner and OpenAI’s Operator already showcase these capabilities by allowing users to perform tasks within their browsers while understanding multi-modal elements such as text, images, and forms.

Dig deeper: How to use OpenAI’s Deep Research for smarter SEO strategies

How these models could change search

Suppose you’re planning a trip to Tokyo and want to know the best season to go, what the weather will be like, and where to stay. 

Typically, this type of research takes days or weeks as you gather information from various sources, such as travel websites or YouTube videos.

A deep research model can do the heavy lifting by searching the web, gathering information, and summarizing relevant content, which saves you time. 

It can also “read, listen, and watch” various sources to provide a thorough answer. 

An agentic model could also book your hotels and flights, navigating checkout flows to complete the purchase.

AI is moving in this direction as companies like Google work toward AGI (artificial general intelligence) – machines that can reason across diverse tasks like humans.

Deep research and agentic models are key milestones in building practical AI solutions for everyday use.

AI Overviews have already impacted click behavior and organic traffic

Now, we must consider these AI features’ long-term effects on the content ecosystem.

What could the future search landscape look like?

Google’s AI Overviews and agentic advancements are here to stay. 

If AI Mode succeeds, it will be the first deep research feature in Google Search. 

So, what’s next for the search landscape? 

Here are some possibilities.

Continual rise of zero-click searches

Since launching in May 2024, AI Overviews have significantly reduced clicks for informational queries.

As AI search capabilities advance, users will likely rely even more on AI tools for quick answers rather than clicking through to websites or articles. 

AI Mode and future search innovations could accelerate this shift by prioritizing fast, AI-generated summaries over traditional browsing.

As zero-click searches become the norm, you must rethink how you measure value and engagement. 

Traditional KPIs may no longer accurately reflect user behavior, so focusing on brand visibility and awareness will be more critical than ever.

Increased personalization

LLMs and AI systems are revolutionizing search by personalizing responses with unmatched speed and scale, surpassing traditional algorithms. 

Leveraging Google’s vast user data, AI can train on existing information and refine queries in real-time to deliver more tailored results. 

As these systems continuously learn, they will become even better at recognizing, remembering, and adapting to individual user preferences.

As AI-driven search becomes more personalized, it’s worth considering whether hyper-niche content is the key to reaching your audience.

Multimodal search

Google’s AI-powered multimodal capabilities are already embedded in many of its products, including Project Astra, an AI assistant unveiled at Google I/O 2024.

During a live demonstration, Astra used multiple tools – such as Google Lens – to identify objects in real time and respond to voice queries.

In my own experience at Google I/O, the AI assistant:

  • Accurately classified animal figurines.
  • Distinguished between similar names (“Bob” vs. “Rob”).
  • Even created a story about the figures.

While some of these advanced features haven’t been integrated into Google Search yet, multimodal search through Google Lens and voice search is already shaping how users submit queries. 

As Google develops these capabilities, you should anticipate what’s next, look beyond text-based queries, and optimize for image, video, and audio search.

Dig deeper: From search to AI agents: The future of digital experiences

Commercial queries can still draw users to websites

AI-generated results have reduced clicks for informational queries, but commercial and transactional searches still offer opportunities for website traffic.

During the decision-making process, potential buyers research extensively – comparing products, reading reviews, and exploring multiple channels before making a purchase.

While it’s unclear how AI-generated search will impact this journey, think about how AI can streamline multi-touchpoint decision-making while still driving users to your website.

When users move closer to making a purchase, user-generated content – like reviews – will still play a crucial role in conversions.

Content quality still rules

Despite AI’s growing role in search, one thing remains constant: high-quality content is essential. 

Whether users rely on traditional search engines or LLMs, visibility will still depend on the strength of the content itself.

Since both Google Search and LLMs use RAG to pull from vast datasets, ensuring these systems have access to accurate, high-quality information is critical. 

Content demonstrating E-E-A-T (experience, expertise, authoritativeness, and trustworthiness) will continue to rank higher in AI-driven search results.

Your brand will also play a bigger role in search visibility, making it essential to create valuable, well-optimized content across multiple formats.

Dig deeper: Decoding Google’s E-E-A-T: A comprehensive guide to quality assessment signals

Pagination and SEO: What you need to know in 2025

Ever wondered why some of your ecommerce products or blog posts never appear on Google? 

The way your site handles pagination could be the reason.

This article explores the complexities of pagination – what it is, whether your site needs it for SEO, and how it affects search in 2025. 

What is pagination?

Pagination is the coding and technical framework on webpages that allows content to be divided across multiple pages while remaining thematically connected to the original parent page.

When a single page contains too much content to load efficiently, pagination helps by breaking it into smaller sections.

This improves user experience and unburdens the client (i.e., web browser) from loading too much information – much of which may not even be reviewed by the user.

Examples of pagination in action

Product listings

One common example of pagination is navigating multiple pages of product results within a single product feed or category. 

Let’s look at Virgin Experience Days, a site that sells gift experiences similar to Red Letter Days.

Take their Mother’s Day experiences page:

  • https://www.virginexperiencedays.co.uk/mothers-day-gifts

Scroll down to the “All Mother’s Day Experiences & Gift Ideas Experiences” section, and you’ll see a staggering 1,635 experiences to choose from. 

That’s a lot.

Clearly, listing all of them on a single page wouldn’t be practical. 

It would result in excessive vertical scrolling and could slow down page loading times.

Further down the page, you’ll find pagination links:

Clicking a pagination link moves users to separate product listing pages, such as page 2:

  • https://www.virginexperiencedays.co.uk/mothers-day-gifts?page=2

In the URL, ?page=2 appears as a parameter extension, a common pagination syntax. 

Variations include ?p=2 or /page/2/, but the purpose remains the same – allowing users to browse additional pages of listings. 

Even major retailers like Amazon use similar pagination structures.
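To make the patterns concrete, here’s a small Python sketch that extracts the page number from the three URL styles mentioned above (the helper name is illustrative):

```python
# Sketch: extracting a page number from the common pagination patterns the
# article mentions (?page=2, ?p=2, /page/2/). Purely illustrative parsing.
import re
from urllib.parse import urlparse, parse_qs

def page_number(url: str) -> int:
    """Return the page number implied by a URL, defaulting to 1."""
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    for key in ("page", "p"):  # query-parameter style
        if key in params and params[key][0].isdigit():
            return int(params[key][0])
    match = re.search(r"/page/(\d+)/?$", parsed.path)  # path style
    if match:
        return int(match.group(1))
    return 1

print(page_number("https://www.virginexperiencedays.co.uk/mothers-day-gifts?page=2"))  # → 2
```

Whichever syntax a site uses, the page component is what lets both users and crawlers address each slice of the listing directly.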

Pagination also helps search engines discover deeply nested products. 

If a site is so large that all its products can’t be listed in a single XML sitemap, pagination links provide an additional way for crawlers to access them. 

Even when XML sitemaps are in place, internal linking remains important for SEO. 

While pagination links aren’t the strongest ranking signal, they serve a foundational role in ensuring content is discoverable.

Dig deeper: Internal linking for ecommerce: The ultimate guide

Blog and news feeds

Pagination isn’t limited to product listings; it’s also widely used in blog and news feeds. 

Take Search Engine Land’s SEO article archive:

  • https://searchengineland.com/library/seo

On this page, you can access a feed of all SEO-related posts on Search Engine Land. 

Scrolling down, you’ll find pagination links. 

Clicking “2” takes you to the next set of SEO articles:

  • https://searchengineland.com/library/seo/page/2

Pagination inside content

Pagination can also exist within individual pieces of content rather than at a feed level. 

For example, some news websites paginate comment sections when a single article receives thousands of comments. 

Similarly, forum threads with extensive discussions often use pagination to break up replies across multiple pages.

Consider this post from WPBeginner:

  • https://www.wpbeginner.com/beginners-guide/how-to-choose-the-best-blogging-platform/

Scroll to the bottom, and you’ll see that even the comment section uses pagination to organize user responses.

Why is pagination important for SEO?

Pagination plays a crucial role in SEO for several reasons:

Indexing

Without pagination, search crawlers may struggle to find deeply nested content such as blog posts, news articles, products, and comments.

Crawl efficiency

Pagination increases the number of URLs on a site, which might seem counterproductive to efficient crawling.

However, most search engines recognize common pagination structures – even without rich markup.

This understanding allows them to prioritize crawling more valuable content while ignoring less important paginated pages.

Internal linking

Pagination also contributes to internal linking.

While pagination links don’t carry significant link authority, they provide structure.

Google tends to pay less attention to orphaned pages – those without inbound links – so pagination can help ensure content remains connected.

Managing content duplication

If URLs aren’t structured properly, search engines may mistakenly identify them as duplicate content.

Pagination isn’t as strong a signal for content consolidation as redirects or canonical tags.

Still, when implemented correctly, it helps search engines differentiate between paginated pages and true duplicates.

Google’s deprecation of rel=prev/next

Google previously supported rel=prev/next for declaring paginated content. 

However, in March 2019, it was revealed that Google had not used this markup for some time.

As a result, these tags are no longer necessary in a website’s code.

Google likely used rel=prev/next to study common pagination structures. 

Over time, those insights were integrated into its core algorithms, making the markup redundant. 

Some SEOs believe these tags may still help with crawling, but there is little evidence to support this.

If your site doesn’t use this markup, there’s no need to worry. Google can still recognize paginated URLs. 

If your site uses it, there’s also no urgent need to remove it, as it won’t negatively impact your SEO.

Why pagination is still important in 2025: The infinite scroll debate

Alternate methods for browsing large amounts of content have emerged over the past couple of decades.

“View more” or “Load more” buttons often appear under comment streams, while infinite scroll or lazy-loaded feeds are common for posts and products. 

Some argue these features are more user-friendly. 

Originally pioneered by social networks such as Twitter (now X), this form of navigation helped boost social interactions. 

Some websites have adopted it, but why isn’t it more widespread?

From an SEO perspective, the issue is that search engine crawlers interact with webpages in a limited way. 

While headless browsers may sometimes execute JavaScript-based content during a page load, search crawlers typically don’t “scroll down” to trigger new content. 

A search engine bot certainly won’t scroll indefinitely to load everything. 

As a result, websites relying solely on infinite scroll or lazy loading risk orphaning articles, products, and comments over time.

For major news brands with strong SEO authority and extensive XML sitemaps, this may not be a concern. 

The trade-off between SEO and user experience may be acceptable. 

But for most websites, implementing these technologies is likely a bad idea. 

Search crawlers may not spend time scrolling through content feeds, but they will click hyperlinks – including pagination links.
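One common compromise is to render plain pagination links server-side and let JavaScript layer “load more” behavior on top. Here’s a minimal sketch of such a server-side helper; the function name and URL pattern are assumptions for illustration:

```python
# Sketch: a server-side pagination helper that slices items and emits real
# <a href> links crawlers can follow, even if the client later layers
# infinite scroll on top. Names and URL pattern are illustrative.

def paginate(items: list, page: int, per_page: int, base_url: str) -> dict:
    """Return one page of items plus crawlable prev/next link markup."""
    total_pages = max(1, -(-len(items) // per_page))  # ceiling division
    start = (page - 1) * per_page
    links = []
    if page > 1:
        links.append(f'<a href="{base_url}?page={page - 1}">Previous</a>')
    if page < total_pages:
        links.append(f'<a href="{base_url}?page={page + 1}">Next</a>')
    return {
        "items": items[start:start + per_page],
        "links": links,
    }

result = paginate(list(range(25)), page=2, per_page=10, base_url="/products")
print(result["items"][:3])  # → [10, 11, 12]
print(result["links"])
```

Because the prev/next links are ordinary anchors in the server-rendered HTML, a crawler that never scrolls can still reach every page of the feed.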

How JavaScript can interfere with pagination

Even if your site doesn’t use infinite scroll plugins, JavaScript can still interfere with pagination. 

Since July 2024, Google has at least attempted to render JavaScript for all visited pages. 

However, details on this remain vague. 

  • Does Google render all pages, including JavaScript, at the time of the crawl? 
  • Or is execution deferred to a separate processing queue? 
  • How does this affect Google’s ranking algorithms? 
  • Does Google make initial determinations before executing JavaScript weeks later?

There are no definitive answers to these questions.

What we do know is that “dynamic rendering is on the decline,” according to the 2024 Web Almanac SEO Chapter.

If Google’s effort to execute JavaScript for all crawled pages is progressing well – which seems unlikely given the potential efficiency drawbacks – why are so many sites reverting to a non-dynamic state? 

This doesn’t mean JavaScript use is disappearing. 

Instead, more sites may be shifting to server-side or edge-side rendering.

If your site uses traditional pagination but JavaScript interferes with pagination links, it can still lead to crawling issues.

For example, your site might use traditional pagination links, but the main content of your page is lazy-loaded.

In turn, the pagination links only appear when a user (or bot) scrolls the page. 

Dig deeper: A guide to diagnosing common JavaScript SEO issues

How to handle indexing and canonical tags for paginated URLs

SEO professionals often recommend using canonical tags to point paginated URLs to their parent pages, marking them as non-canonical. 

This practice was especially common before Google introduced rel=prev/next.

Since Google deprecated rel=prev/next, many SEOs remain uncertain about the best way to handle pagination URLs.

Avoid blocking paginated content via robots.txt or canonicalizing it away to the parent page.

Blocking prevents Google from crawling those pages, and de-canonicalizing them discourages indexing. 

In the case of news posts, certain comment exchanges might be considered valuable by Google, potentially connecting a paginated version of an article with keywords that wouldn’t otherwise be associated with it. 

This can generate free traffic – something worth keeping in 2025.

Similarly, restricting the crawling and indexing of paginated product feeds could leave some products effectively soft-orphaned.

In SEO, there’s a tendency to chase perfection and aim for complete crawl control. 

But being overly aggressive here can do more harm than good, so tread carefully.

There are cases where it makes sense to de-canonicalize or limit the crawling of paginated URLs. 

Before taking that step, make sure you have data showing that crawl-efficiency issues outweigh the potential free traffic gains. 

If you don’t have that data, don’t block the URLs. Simple!
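Concretely, the common post-rel=prev/next setup is a self-referencing canonical on each paginated URL, so deep pages stay crawlable and indexable instead of being folded into page 1. A minimal sketch, with an illustrative URL pattern and helper name:

```python
# Sketch: each paginated URL keeps a self-referencing canonical tag, so
# Google can crawl and index deep pages rather than treating them as
# copies of page 1. URL pattern and helper are illustrative.

def canonical_tag(base_url: str, page: int) -> str:
    """Page 1 canonicalizes to the clean URL; deeper pages to themselves."""
    url = base_url if page <= 1 else f"{base_url}?page={page}"
    return f'<link rel="canonical" href="{url}">'

print(canonical_tag("https://example.com/mothers-day-gifts", 1))
print(canonical_tag("https://example.com/mothers-day-gifts", 2))
```

The point is what the tag does not do: page 2 does not point its canonical at page 1, which would be the de-canonicalization the section warns against.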

Ad hijacking: Understanding the threat and learning from Adidas by Bluepear

What is affiliate ad hijacking?

Ad hijacking occurs when dishonest affiliates create ads almost identical to a brand’s official ads. 

They copy headlines, text, and display URLs so potential customers assume these ads are legitimate. 

In reality, these affiliates, often involved in affiliate hijacking and other affiliate program scams, send clicks through their own tracking links to earn commissions they haven’t really earned.

When this happens inside an affiliate program, it’s called affiliate ad hijacking. 

Many hijackers use an affiliate link cloaker to hide the final redirect, preventing brands or ad platforms from seeing the trick. If someone clicks on one of these fake ads, they land on the brand’s site with a hidden affiliate tag, causing the brand to pay a commission for a visitor who would have likely arrived directly or through a proper paid search ad.

How affiliate hijacking hurts your brand

If ad hijacking and other affiliate scams aren’t stopped, they can damage your business and reputation:

  • Affiliate hijacking makes brands pay extra commissions on sales they would’ve made anyway. 
  • By running ads on a brand’s keywords, hijackers compete with, or even outrank, the official ads, leading to higher cost-per-click (CPC).
  • Affiliate ad hijacking also distorts performance data by boosting affiliate sales numbers and cutting into your direct or organic traffic. 

Over time, you might make bad decisions, like raising affiliate commissions, based on inflated sales reports. If the hijacker uses an affiliate link cloaker, it becomes even harder to figure out where these sales are coming from.

Spotting ad hijacking

Recognizing ad hijacking can be tricky since the fake ads often look exactly like yours. 

However, these signs might help:

  • Imitation ads: Be cautious of ads that copy your official wording, style, or domain but don’t show up in your ad account. Sometimes the displayed URL is identical except for a small punctuation change or extra keyword.
  • Sudden sales spikes: If a single affiliate sees a big jump in sales without any new promotion or change in commission, it could be affiliate ad hijacking.
  • Redirect clues: An affiliate link cloaker may hide the path users take, but you might spot unusual tracking codes in your analytics or strange referral tags appearing at odd times or in certain locations.
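As a hypothetical illustration of the “imitation ads” check, a monitoring script might flag ads whose display domain closely resembles the official one but which don’t appear in your own ad account. Everything here (the domain, ad IDs, and similarity threshold) is an assumption for the sketch:

```python
# Hypothetical heuristic for the "imitation ads" sign above: flag ads whose
# display domain closely resembles the official one while the ad itself is
# absent from your own ad account. All values are illustrative.
from difflib import SequenceMatcher

OFFICIAL_DOMAIN = "adidas.com"     # illustrative official domain
OWN_AD_IDS = {"ad-001", "ad-002"}  # ads you actually run

def looks_like_hijack(display_domain: str, ad_id: str) -> bool:
    if ad_id in OWN_AD_IDS:
        return False               # it's genuinely yours
    similarity = SequenceMatcher(None, display_domain, OFFICIAL_DOMAIN).ratio()
    return similarity > 0.8        # near-identical domain, unknown ad

print(looks_like_hijack("adidas.com", "ad-999"))   # exact copy of your domain, unknown ad
print(looks_like_hijack("example.org", "ad-999"))  # unrelated domain
```

A production system would add per-region, per-time-of-day sampling, which is exactly the coverage manual spot checks lack.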

Why manual checks often fail

Many brands do a quick check, typing their name into a search engine, to spot suspicious ads. But dishonest affiliates can be sneaky: they might only run these ads late at night or in small cities far from your headquarters.

They may also use cloaking, which sends brand monitors or bots to the real site, hiding any wrongdoing. This means you need continuous monitoring in multiple places, plus advanced detection methods; simple, random checks won’t catch everything.

The Adidas example: Over 100 incidents in 40 days

A clear example is Adidas. Over 40 days, Bluepear uncovered repeated ad hijacking and online ad fraud targeting Adidas’s branded search results. 

More than 100 cases of affiliate hijacking were found, with some ads appearing above the official ones. Bluepear also saw at least 245 variations of these ads, all designed to stay hidden.

This shows why brands can struggle to catch affiliate ad hijacking on their own. Scammers often place ads in overlooked regions or at off-peak times.

A quick check at the main office might not show any problems, while they’re actively abusing your brand name elsewhere. Some fraudsters see this deception as standard practice, creating new ad variations until they’re exposed.

How Bluepear helps

Bluepear takes several steps to fight ad hijacking:

  • 24/7 global monitoring: It tracks different locations and time zones, so if an affiliate starts bidding on your keyword at 3 AM in a small city, Bluepear will see it.
  • Detailed evidence: Every instance of affiliate hijacking gets recorded with clear proof.
  • Affiliate identification: You can see exactly which affiliate is responsible.
  • Ads and landing pages: The system stores both the ad and the final landing page, making it easy to show proof if there’s a dispute.
  • Screenshots: You get actual images of the search engine results page, showing where the fake ad appeared.
  • Easy violation reporting: Send a summary of the offense (with timestamps and URLs) straight to the affiliate through Bluepear.

In Adidas’s case, Bluepear identified over 100 infringing ads in just 40 days, proof that some affiliates consider trickery a “hijack industry standard.” Because Bluepear constantly checks search engines around the world, it sets a higher bar for compliance.

Some scammers even use multiple affiliate link cloakers or rotate domains to hide. Bluepear’s continuous scanning and data comparisons make it tough for them to stay hidden. 

It also simplifies your process – no more struggling with spreadsheets or piecing together incomplete ad reports.

Conclusion

Ad hijacking seriously threatens brands that value their online reputation and affiliate partnerships. 

Bluepear’s continuous global checks, advanced cloaking and click-fraud detection, and in-depth reporting features allowed Adidas to uncover more than 100 affiliate hijacking incidents in 40 days, highlighting how common these schemes can be.

By monitoring your branded keywords and using strong tools like Bluepear, you can protect valuable traffic, keep trust in your affiliate program, and guard against needless spending on fraudulent commissions.

Ex-Google exec: Giving traffic to publishers ‘a necessary evil’

A new profile of Elizabeth Reid, the head of Google Search, confirms that Google is moving away from its longstanding model of sending its users to websites. As one former unnamed senior executive put it: “Giving traffic to publisher sites is kind of a necessary evil.”

As for the iconic Google Search bar? It will slowly lose prominence in the Google Search experience, due to the continuing growth of voice and visual search, Reid said.

Necessary evil. Google has been increasingly focused on keeping users inside Google properties, reducing the need to click through to external sites. A former Google senior executive told Bloomberg that supporting publishers was incidental to Google’s larger aims:

  • “Giving traffic to publisher sites is kind of a necessary evil. The main thing they’re trying to do is get people to consume Google services.”
  • “So there’s a natural tendency to want to have people stay on Google pages, but it does diminish the sort of deal between the publishers and Google itself.”

Alphabet CEO Sundar Pichai said in December that Google spends a lot of time “thinking about the traffic we send to the ecosystem.” But, of late, he has stopped short of promising that Google will send more of it to websites – and there’s probably good reason for that.

Look no further than Barry Schwartz’s article, Google: Not all sites will fully recover with future core algorithm updates, in which Google’s Search Liaison Danny Sullivan said that websites shouldn’t expect to recover from core updates. Sullivan also said this in September. And Google reiterated it again in October.

Instead, Pichai now mentions how AI Overviews are increasing search usage. (Even though I thought the whole point of AI Overviews was to reduce the number of searches – remember the idea of “let Google do the searching for you” to get “quick answers”?)

As a reminder, Google sees more than 5 trillion searches per year. But for every 1,000 Google searches, only 360 clicks in the U.S. go to the open web (Context: Nearly 60% of Google searches end without a click).

Google Search hovering. The Google Search bar won’t go away, according to Reid. However, it will become less prominent over time as Google prepares for the rise of voice and visual searches. Here’s the full section from the Bloomberg article (Google Is Searching for an Answer to ChatGPT):

“Reid predicts that the traditional Google search bar will become less prominent over time. Voice queries will continue to rise, she says, and Google is planning for expanded use of visual search, too. Rajan Patel, a vice president for search experience, demonstrated how parents can use Google’s visual search tools to help their kids with homework, or to surreptitiously take a photo of a stylish stranger’s sneakers in a coffee shop to buy the same pair (something Patel did recently). The search bar isn’t going away anytime soon, Reid says, but the company is moving toward a future in which Google is always hovering in the background. ‘The world will just expand,’ she says. ‘It’s as if you can ask Google as easily as you could ask a friend, only the friend is all-knowing, right?’”

Other Reid quotes of note. For what is being considered a “profile” of Reid, the article didn’t contain many direct quotes. Here are the few interesting quotes from the piece:

  • “We learned what people really wanted two months faster” (on launching early features in her Google Maps days).
  • “[Search is a] constant evolution [rather than a complete overhaul].”
  • “Things start slowly and then quickly. Suddenly the combination of the tech and the product and the use and the understanding and the polish and everything comes together, and then everyone needs it.”
  • “It’s really exciting to work on search at a time when you think the tech can genuinely change what people can search for.”
  • “[Before generative AI] people did not go to Google Search and say, ‘How many rocks should I eat per day?’ They just didn’t.” (Context: Google AI Overviews under fire for giving dangerous and wrong answers)

And one indirect quote, where Bloomberg summarizes her thoughts on AI:

“Google’s generative AI products still carry disclaimers that the technology is experimental. Testing tools in public helps them get better, Reid says. She’s convinced that, as with other changes to search, AI will get people to use Google even more than they did before.”

Why we care. Many websites started to lose traffic when Google launched AI Overviews last May and as AI Overviews expanded. Google was a fairly reliable source of organic search traffic for over two decades – but the rules are changing. No, SEO isn’t dead. But old SEO strategies and tactics will need to evolve and playbooks will need to be rewritten.

How geotagging photos affects Google Business Profile rank: Study

How does adding coordinates to the EXIF data affect local rank? Our team wanted to find out, so we recently conducted a 10-week study on the effects of geotagging on local rank.

The geotagged images seemed to only affect the ranking for “near me” queries in the areas the EXIF data coordinates specified. Their impact on those queries in those areas was positive and statistically significant.

However, the study also found that queries that mentioned specific towns saw a decrease in ranking during the same period.

For example, when EXIF data targeted Salt Lake City, Utah, the query [lawn care near me] saw a significant increase in rank there.

For the same targeted area, the query for [lawn care salt lake city utah] saw, on average, decreases in rank.

The geotagging debate

SEOs have argued for years about whether adding coordinates to image EXIF data (known as geotagging) affects a business’s Google Business Profiles (GBP) rank.

The theory is that if a business owner or customer takes a photo from their phone and uploads it to a GBP, Google reviews the EXIF (metadata) of that image and uses the location of where it was taken as a ranking signal.

Phones automatically use location details to input EXIF data on each photo taken from the device.

It’s speculated that Google uses the EXIF location data before stripping it.

On the surface, it makes sense.

Skeptics, however, don’t believe Google does this, because the data can easily be manipulated using any free EXIF editor.

Two years ago on Reddit, Google’s John Mueller said it was unnecessary for SEO purposes:

  • “No need to geotag images for SEO.”

In February on Bluesky, Mueller also told me he didn’t know much about what GBPs do.

  • Joy Hawkins, owner and president of local SEO agency Sterling Sky, performed a test on this in January 2024. She tested five GBP locations and saw no measurable increases over several weeks.
  • A month later, consultant Tim Kahlert, CEO of Hypetrix, performed a test. He also concluded that “this tactic currently has no effect on local rankings.”

These tests were better than nothing, but still weren’t enough. Plus, the sample sizes of the locations tested were quite small.

Those who say geotagging works never post their data or case studies, only offering anecdotal evidence.

Geotaggers aren’t publishing their tests and skeptics aren’t conducting them at scale. Google flip-flopping on its position doesn’t help either.

It was time this test was done justice.

Methodology and testing

Our test included 27 of our lawn care business clients. All SEO efforts were paused for the sole purpose of this test.

Every week on Tuesday and Thursday, we would post a client-owned image to their GBP (two images per week).

We then selected two towns in their service area grid that needed improvement. We based these on a baseline report taken from Local Falcon at the beginning of the test period. We kept these towns moderately far apart to avoid any kind of bleedover.

In this example, we might have selected “Little Falls” and “Garrisonville.”

During the test period, coordinates would be added to the EXIF data of the images. On Tuesday’s image, we’d add the center of Little Falls. On Thursday’s image, we’d add the center of Garrisonville.
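
For anyone replicating the setup, note that EXIF stores GPS positions as degree/minute/second rational pairs rather than decimal degrees. Here is a minimal sketch of that conversion (the dictionary keys and the approximate Salt Lake City coordinates are illustrative, not part of the study):

```python
def deg_to_dms_rationals(value):
    """Convert a decimal coordinate to the (degrees, minutes, seconds)
    rational pairs that EXIF GPS tags expect."""
    value = abs(value)
    degrees = int(value)
    rem = (value - degrees) * 60
    minutes = int(rem)
    seconds = round((rem - minutes) * 60 * 100)  # seconds scaled by 100
    return ((degrees, 1), (minutes, 1), (seconds, 100))

# Illustrative coordinates: downtown Salt Lake City, Utah (approximate)
lat, lon = 40.7608, -111.8910
gps = {
    "GPSLatitudeRef": "N" if lat >= 0 else "S",
    "GPSLatitude": deg_to_dms_rationals(lat),
    "GPSLongitudeRef": "E" if lon >= 0 else "W",
    "GPSLongitude": deg_to_dms_rationals(lon),
}
```

A free EXIF editor (or a library like piexif) writes these same rational pairs into the image, which is all “geotagging” amounts to under the hood.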

Every week, we ran a report, monitored position changes, and charted them.

For each location, we tracked three keywords. Following the example above, we tracked:

  • “Lawn care garrisonville”
  • “Lawn care little falls”
  • “Lawn care near me”

For [lawn care near me], we monitored position changes in both of the target towns.

The control period

Establishing a proper control period was crucial.

The control period had to run for the same duration as the test period (five weeks). To establish consistency and isolate variables, we:

  • Maintained the image posting schedule. This ensured adding images on different days didn’t influence rank.
  • Stripped all EXIF data to ensure the only variables in the test period were the coordinates.
  • Monitored the same keywords to set a baseline.
  • Paused all SEO efforts for all 27 locations.

We continued as normal when the control period ended. The only change was adding town #1’s coordinates to Tuesday’s image and town #2’s coordinates to Thursday’s image.

Findings

Most of what we found validated the skeptics’ statements. But that doesn’t mean we ignored the geotaggers.

Service + city

In our example, when images were geotagged with their coordinates, both Garrisonville and Little Falls saw decreases in rank for “lawn care garrisonville” and “lawn care little falls.”

The conclusion? No positive impact from geotagging; if anything, rankings declined.

Service + near me

This one surprised me – and it had statistical significance. Garrisonville and Little Falls saw an overall increase in rank for [lawn care near me] queries.

Service + near me (CoA)

Local Falcon also produces reports on Center of Business Address. This monitors the rank of your target keywords where the business pin is actually located.

The end result: EXIF data had no effect on the business’s actual location for “near me” queries. If anything, rankings dropped further when the images’ EXIF data targeted other areas.

Service + city (ATRP)

Average Total Rank Position (ATRP) is the average position across the entire target area. It shows whether adding images targeting only those two towns affects the rest of the service area.

The end result: There was no impact. Across the full service area, average rank decreased further after EXIF data was added.

Service + near me (ATRP)

The “near me” queries for ATRP yielded the same result as above.

No impact, yet rankings plummeted further with geotagging.

Service + city (SoLV)

Share of Local Voice is another metric Local Falcon tracks. It shows how often a location appears in the top 3 positions of the map pack for the target queries.
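
As a rough illustration (not Local Falcon’s exact formula), SoLV can be thought of as the fraction of scan-grid points where the listing makes the top 3:

```python
def share_of_local_voice(ranks, top_n=3):
    """Fraction of grid points where the listing ranks in the top N
    of the map pack (None = not found at that point)."""
    in_pack = sum(1 for r in ranks if r is not None and r <= top_n)
    return in_pack / len(ranks)

# Hypothetical ranks across a 3x2 scan grid for one query
grid = [1, 2, 5, None, 3, 8]
share_of_local_voice(grid)  # 3 of 6 points in the top 3 -> 0.5
```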

The results started to deviate from the Center of Address and ATRP reports, though not by much.

The final result was that geotagged images had no impact. However, this time, the ranking didn’t continue to plummet during the test period.

Service + near me (SoLV)

The “near me” queries yielded the same results as the [service] + [city] queries.

Geotagged images had no impact here.

Final thoughts

Out of the seven metrics we looked at:

  • Only one saw an improvement.
  • Six had no impact.
  • Of those six, four saw a decrease in rank when images were geotagged.

The last five metrics focused on the service area as a whole, not the specific areas where the EXIF data was pointing.

I can draw one main conclusion from this:

Although it helps the “near me” queries in those targeted areas, it hurts everywhere you don’t add geotagged images.

The solution?

Upload tons of images to every town in the area to combat that. But you’re going to run into two problems if you do this:

  1. Your GBP will be spammed with low-quality images for the sake of adding images. Wouldn’t it be better to just make sure the GBP is using good photos? Adding images for the sake of rank diminishes the user-facing quality.
  2. You’re still losing rank for queries that use the target city in the keyword. It’s a trade-off that only looks at one version of a search term. The other version appears to have negative consequences.

For these reasons, our agency won’t geotag our clients’ GBP images. Instead, we’ll focus on things that have a greater impact on local rank.


Google Business Profile bug prevents adding new businesses


If you try to add a new business to Google Business Profiles today, you may run into difficulties. When you get to the screen where you add your phone number and website address, Google won’t let you proceed to the next step.

It is unclear whether a broader issue with phone numbers is causing this bug. Businesses are also reporting that phone numbers are being removed from, or rejected by, their Google Business Profiles.

More details. As noted, on this screen, the “Next” button simply won’t take you to the next screen:

Google product expert Vinay Toshniwal wrote in the Google Business Profile forums:

I’ve come across several posts about users facing issues when creating a Google Business Profile—specifically where the “Next” button becomes unresponsive after entering the phone number and website details.

Please note that I’ve already escalated this issue to the Google team. I’ll share any updates here as soon as I receive more information.

Vinay Toshniwal also noted that phone numbers are disappearing from some Google Business Profile listings.

Why we care. If you are trying to get a new business added to Google Business Profiles and run into this issue, you should know that this is impacting everyone. There seems to be a bug with Google where you cannot add new businesses right now. I suspect this will be fixed in the coming hours or days.


Answer engine optimization: 6 AI models you should optimize for

How to evolve your organic approach for the rise of the answer engine

AI-powered search is drastically changing the way people find information. For years, ranking well in Google and Microsoft Bing has been the foundation of search visibility.

But as AI-driven search gains traction, is that ranking priority shifting or staying the same? 

We’re in an era of uncertainty we haven’t seen since the early days of SEO – when search engines rose and fell in a battle for relevance.

Time will tell which AI answer engine wins the game. SEO is still relevant, but AI is rewriting the rules.

This article will break down: 

  • Six AI search players to watch and how they source content.
  • Market share and usage trends that show where search behavior is shifting.
  • How AI answer engines still depend on traditional search engines.
  • What businesses should do now to optimize for AI-driven search.

 Here’s what you need to know.

AI models and real-time search 

Before we move forward, a quick note on how these AI platforms generally work. AI models are trained on data available up to specific cutoff dates. 

For instance, Google’s Gemini 2.0 Pro had a knowledge cutoff of August 2024, while OpenAI’s GPT-4o extended its training data to June 2024. These cutoffs vary by model and change as new versions ship.

This means that for recent events or emerging trends, among other things, these models rely on real-time data retrieval rather than their internal knowledge base. 

In other words, their ability to provide accurate and up-to-date results is directly tied to their ability to access and process new information from the web. That is key.

Another thing to note is that AI search engines can build their own web indexes. 

For instance, Perplexity AI’s PerplexityBot crawls the web directly, creating its own content database rather than relying on Google’s or Bing’s indexes. But even so, AI search engines can and do still rely on search engine results, too.

Website owners who want to control how AI search engines access their content can manage these crawlers in their robots.txt settings.
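
As a sketch of how such a rule behaves, Python’s standard-library robotparser can verify what a directive blocks (PerplexityBot is the crawler token Perplexity documents; the paths are illustrative):

```python
from urllib import robotparser

# Illustrative robots.txt rules: keep PerplexityBot out of one directory.
rules = """\
User-agent: PerplexityBot
Disallow: /members/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

rp.can_fetch("PerplexityBot", "https://example.com/members/page")  # False
rp.can_fetch("PerplexityBot", "https://example.com/blog/post")     # True
```

The same pattern applies to any AI crawler that honors robots.txt; each vendor publishes its own user-agent token.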

Now let’s discuss the different ways AI search engines are relevant to SEO today.

Google: AI Overviews

AI Overviews represented a huge shift in Google Search.  

Google’s AI-generated responses – powered by Google’s Gemini AI model – are designed to provide quick but comprehensive answers by pulling information from multiple sources.

For SEO professionals, this introduced both opportunities and challenges. 

AI Overviews rely on Google’s search index to determine what information to present, but they also change how users interact with search results.

Google AI Overview result for [how will AI overviews impact SEO]

How AI Overviews work

  • Google’s Gemini 2.0 AI model powers AI Overviews, generating instant summaries for certain queries.
  • AI-generated responses appear at the top of Google’s search results, often before traditional organic listings.

Market share and adoption

  • Google still dominates, holding about 87.28% of the U.S. search market.
  • With billions of searches per day, AI Overviews have the potential to reshape organic search traffic and user behavior.

What this means for SEO

  • Ranking in Google still matters – AI Overviews primarily pull from the search results.
  • A study by Rich Sanger and Authoritas found that 46% of AI Overview citations come from the top 10 organic search results.
  • Anecdotal data from my SEO agency suggests you need to be in the top 20 for a better chance of inclusion in AI Overviews. There are outliers, and some resources cited in AI Overviews will rank outside of what we would traditionally call “being ranked at all.” 
  • Inclusion in AI Overviews can boost clicks to the cited sources and, according to some research, harm performance for those that don’t show up. For instance, for transactional queries, webpages included in AI Overviews had 3.2 times as many clicks as pages that were excluded. For informational queries, webpages with a presence in AI Overviews had 1.5 times as many clicks compared to webpages excluded by AI Overviews.
  • If your content doesn’t rank well in Google, it’s unlikely to appear in AI Overviews, reinforcing the need for strong SEO. There are many opinions on which tactics you need to succeed. I advocate continuing to stay the course with a strong SEO program with a balanced mix of technical SEO, on-page optimization and excellent content.
  • Website owners can control whether their content is included in AI-generated answers. Google-Extended is an opt-out setting that allows websites to block Google from using their content for AI models like Gemini, while still allowing Googlebot to crawl their site for search rankings. Blocking Google-Extended won’t affect your rankings in Google Search, but it will stop Gemini from using your content in AI-generated responses. 
  • Take note: Research shows that inclusion in AI Overviews can be more volatile than the organic search results. 
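
The Google-Extended opt-out described above is an ordinary robots.txt group using Google’s documented token; blocking it sitewide looks like this (it does not affect Googlebot crawling):

```
User-agent: Google-Extended
Disallow: /
```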

Takeaway: AI Overviews aren’t replacing traditional search, but they are changing how search results are consumed. For now, the strategy remains the same: Optimize for Google Search, and AI Overviews will follow.

Google: AI Mode

In March 2025, Google announced AI Mode, an optional feature designed to offer a more AI-driven search experience. 

Unlike standard Google Search, where AI Overviews appear alongside organic results, AI Mode allows users to toggle into a search environment where AI-generated answers take center stage.

Image credit: The Keyword blog, Google

How AI Mode works

  • A separate search option where AI-generated responses are more detailed, conversational, and visually enhanced. Reminiscent of Bing’s Copilot toggle.
  • AI Mode “brings together advanced model capabilities with Google’s best-in-class information systems, and it’s built right into Search. You can not only access high-quality web content, but also tap into fresh, real-time sources like the Knowledge Graph, info about the real world, and shopping data for billions of products. It uses a ‘query fan-out’ technique, issuing multiple related searches concurrently across subtopics and multiple data sources and then brings those results together to provide an easy-to-understand response,” according to Google (announcement link above). 
  • Google told Search Engine Land that, like AI Overviews, AI Mode surfaces relevant links to help people find webpages and content, and that Google teaches the model to decide when to include hyperlinks in the response. For example, if it’s likely that users want to take action on a website (like booking tickets), then links would be useful. AI mode will also decide when to prioritize visual information, such as images or videos, for queries like how-to searches.

What this means for SEO 

  • Google says that AI Mode “is rooted in our core quality and ranking systems, and we’re also using novel approaches with the model’s reasoning capabilities to improve factuality. We aim to show an AI-powered response as much as possible, but in cases where we don’t have high confidence in helpfulness and quality, the response will be a set of web search results.”
  • AI Mode is now in testing. Whether it will impact click-through rates in the same way as AI Overviews remains to be seen. 

Takeaway: AI Mode signals an ongoing shift towards AI-dominated search results. For now, we can assume that the importance of ranking well in traditional search remains the same.

Dig deeper. Google’s AI Mode: Here’s what matters for SEOs and marketers

Google: Gemini

Gemini is Google’s competitor to ChatGPT and other generative AI tools. Gemini functions as both an independent chatbot and the power behind AI Overviews in Google Search.

Over the coming months, Google plans to upgrade virtually all Assistant-enabled devices – from phones to smart home gadgets – to use Gemini as the default assistant.

This shift shows Google’s long-term commitment to AI as a core part of search and user interactions.

How Gemini works

  • Gemini pulls from Google search results and third-party content partners (for example, AP) to generate responses, integrating search rankings into its answers. 
  • Gemini can also personalize results based on a user’s Google search history, YouTube activity, and app usage, making responses adaptive rather than purely search-driven.

Market share and adoption

  • According to Statista, Gemini ranks No. 3 among the most downloaded gen AI apps globally as of January 2025, with approximately 9 million downloads.
  • Similarweb data shows that the majority of users are aged 25 to 34 (approximately 30%), with the second highest usage among 18- to 24-year-olds at about 21%.
Image credit: gemini.google.com analysis, Similarweb

What this means for SEO

  • Ranking in Google Search is still crucial, but there’s more. Gemini pulls from Google’s search index, but it also sources data from content partnerships.
  • Consider content that’s optimized for natural language queries, structured data to help enhance context and education-focused content (where teaching something is front and center).
  • When Gemini personalizes responses based on user history, visibility in Gemini answers may vary between users. For example, if a user frequently engages with a particular brand’s YouTube channel, Gemini might be more inclined to mention or draw from that brand’s content when that user asks a related question. 
  • Click-through rates from Gemini remain uncertain, as Gemini doesn’t always provide direct links. Gemini comes in third for referral traffic, behind ChatGPT and Perplexity, according to one study. 

Takeaway: Visibility in Gemini means business as usual in terms of having an excellent site that can rank in Google Search, but it also adds complexity with Gemini’s AI-driven personalization and conversational search trends. 

Microsoft: Bing Copilot

Bing was the first major search engine to integrate AI directly into its results, launching Bing Copilot (formerly Bing Chat) in February 2023. It’s no surprise Bing beat Google here, as Microsoft has been a big investor in OpenAI since 2019.

How Bing Copilot works

  • Powered by Microsoft’s Prometheus model, which builds on OpenAI’s GPT-4.
  • Generates AI summaries based on real-time Bing search results and external data sources.
  • AI-generated responses appear at the top of search results, sometimes before traditional web listings.
  • You can also click on “Deep Search” for more in-depth AI-powered answers. These answers are also linked to sources found on the web. 
  • In addition, there’s a Copilot toggle in Bing for a more interactive, fully powered AI search mode.

Market share and adoption

  • Bing accounts for about 7.48% of the U.S. search market.
  • While small compared to Google, it’s possible that Bing’s market share may grow more in the future due to its early adoption of AI-powered search and the reliance of other AI platforms on Bing results.

What this means for SEO

  • Unlike Google’s AI Overviews, Bing Copilot is more likely to cite sources outside the top-ranked pages, but ranking higher still increases the likelihood of inclusion.
  • A study by Rich Sanger found that over 70% of URLs included in Bing Copilot summaries rank in the top 20 Bing search results.
  • Bing may present a growing opportunity as AI search adoption increases. 

Takeaway: Bing may no longer be just an afterthought in many companies’ SEO strategies. You’ll want to continue to have a robust SEO program that takes into account ranking signals for Bing.

OpenAI: ChatGPT Search

ChatGPT search is OpenAI’s initiative to enhance traditional search by integrating AI-powered real-time web search into ChatGPT. 

It was initially launched as the SearchGPT prototype in mid-2024 and later integrated into ChatGPT, allowing users to access live search capabilities rather than relying solely on pre-trained knowledge.

By October 2024, OpenAI fully integrated SearchGPT into ChatGPT, enabling it to perform real-time web searches and provide more current, sourced information for user queries.

This positioned ChatGPT search as a direct competitor to traditional search engines, offering users an AI-powered alternative to platforms like Google and Bing. 

But here’s the kicker: It still relies on search engine results.

How ChatGPT search works

  • Powered by a fine-tuned version of GPT-4o, post-trained using synthetic data-generation techniques. This includes distilling outputs from OpenAI’s o1-preview model, meaning some responses are AI-synthesized rather than directly retrieved from the web.
  • SearchGPT pulls data from multiple sources, including third-party search providers like Bing and direct content partnerships that supply proprietary information.

Market share and adoption

  • ChatGPT is the most widely used text generation AI tool, holding nearly 20% of the global generative AI user share in 2023, according to Statista.
  • ChatGPT’s weekly active user base doubled in six months, with 400 million weekly active users now relying on its search capabilities, according to TechCrunch.
Image credit: “Leading generative artificial intelligence (AI) text tools market share of users globally in 2023,” Statista.com

What this means for SEO

  • Since SearchGPT relies on Bing’s indexing system, ensuring your content ranks in Bing is essential. Content not indexed by Bing is unlikely to appear in SearchGPT’s responses. 
  • Chatter in the SEO industry suggests that SearchGPT might favor trusted, high-ranking sources in Bing but that it also relies on sources outside the top rankings in Bing.
  • SearchGPT’s responses can include clickable sources, potentially driving traffic back to a site. A study analyzing traffic data from 391 SMB websites found that ChatGPT’s referral traffic increased by 123% between September 2024 and February 2025, making it the largest referrer among AI-driven search engines during that period. Additionally, ChatGPT has been sending more traffic to education and technology sites, with more than 30,000 unique domains receiving referrals by November 2024.
  • The conversational nature of ChatGPT is changing how users search and consume information. Continuing to emphasize helpful content can only make a website more competitive. 
  • In the early days of SEO, search engines were highly susceptible to simple manipulation tactics. Similarly, ChatGPT’s AI-powered search may be vulnerable to manipulation, with tests showing that it could be influenced to return misleading or biased results.

Takeaway: ChatGPT could be the biggest threat to search engine usage. However, SearchGPT’s reliance on Bing means SEO strategies must prioritize Bing to improve the chances of being surfaced in AI-generated results as well.

Perplexity AI 

Perplexity AI is an independent, AI-powered search engine that blends large language models with real-time web data to provide concise AI-powered responses with direct citations. 

The citations piece is probably one of the more compelling things about Perplexity. 

In an interview with Lex Fridman, Perplexity’s founder Aravind Srinivas said:

  • “When I wrote my first paper, the senior people who were working with me on the paper told me this one profound thing, which is that every sentence you write in a paper should be backed with a citation, with a citation from another peer-reviewed paper, or an experimental result in your own paper. Anything else that you say in the paper is more like an opinion.
  • “It’s a simple statement, but pretty profound in how much it forces you to say things that are only right.
  • “We took this principle and asked ourselves, what is the best way to make chatbots accurate, is force it to only say things that it can find on the internet, and find from multiple sources.” 

Launched in late 2022, it has positioned itself as an alternative and direct competitor to traditional search engines like Google. 

How Perplexity works

  • Perplexity AI operates as an independent search engine, actively crawling and indexing the web to provide real-time, AI-generated responses to user queries. 
  • Instead of building a massive index like Google’s, Perplexity prioritizes indexing high-quality, frequently searched topics based on user behavior. By focusing on trusted and helpful sources, it optimizes for accuracy and truthfulness while maintaining efficiency.
  • Each response includes direct source links, differentiating Perplexity from AI chatbots that provide answers without attribution.

Market share and adoption

What this means for SEO

  • Perplexity relies on trusted sources from the web. This means you must have an authoritative presence on the web.
  • One study showed that 60% of Perplexity citations overlap with the top 10 Google organic results.
  • Other research indicates that Perplexity has a group of favored, authoritative sources on the web to pull from.
  • Because Perplexity is an independent search engine, ranking factors will be different from Google or Bing. 
  • Content formatted in a certain way may have a leg up, including clear headings, well-organized sections and succinct answers like FAQs embedded in your content—all of which can be quickly understood and extracted by Perplexity’s model.​
  • While Perplexity links to sources, data suggests referral traffic is still quite low.

Takeaway: Perplexity AI is another contender that could continue to gain traction in AI search and take users away from major search engines. It’s important to remember that it still relies on sources across the web, making an authoritative site with the right content optimized for AI an important step in visibility. 

The future of AI search and SEO

While some predict that the rise of AI will reduce search engine volume significantly (Gartner predicts a 25% drop by 2026), the importance of having a reputable website with trustworthy, optimized content remains critical for the foreseeable future. 

Time will tell which AI wins the game. With many AI platforms facing legal challenges (like Google’s AI Overviews and Perplexity’s lawsuits), legal decisions will also likely shape the winners and how AI search ultimately operates.

So, which AI search engine should you optimize for right now? 

I suggest gathering research on the potential for referral traffic and the audience demographics using the AI search engine. Does it align with your industry and business? 

For those AI search engines that require “extra SEO effort” on top of what you’re already doing, make sure it’s worth it. Track your referral traffic to see if any patterns emerge.

We know that Google is the dominant search engine, so continuing to optimize for Google is key. 

The situation is not perfect, however. While some websites report clicks are up from things like AI Overviews, others are losing big time.

For example, research shows that for queries where AI Overviews appeared in Google, organic CTR fell sharply from 1.41% to 0.64% year over year. 

Image credit: Seer Interactive

Similarly, a different study looking specifically at AI search engines like ChatGPT, Perplexity, and others found that they send 96% less referral traffic to news sites and blogs than traditional Google Search.

Emarketer data echoes this:

Image credit: emarketer

Some data already suggest a basic hierarchy of referral traffic coming from certain AI search engines. 

For example, one study found ChatGPT to be a clear winner in referral traffic overall, but things fluctuate based on industry.

Image credit: “SMB websites see rising traffic from ChatGPT and other AI engines,” William Kammer, Search Engine Land

As we continue to see all this play out, we can relax knowing that the fundamentals of SEO are not going away. 

Yes, the approach may change, but the foundation is the same: Put the user first, make a great website that’s optimized for the platforms your target audience uses, and continue to adapt to the different ways you can remain visible in search.


Google AI Mode rolling out to second batch of users now

Google is now rolling out access to AI Mode to its second batch of users. Google first gave Google One AI Premium subscribers access to AI Mode when it launched on March 5. If you opted into AI Mode and are based in the United States, you may now have access.

How to access AI Mode. Once you gain access, here is how to use it:

  • Go to www.google.com, enter a question in the Search bar, and tap the “AI Mode” tab below the Search bar.
  • Go directly to the AI Mode tab on Google Search at: google.com/aimode.
  • In the Google app, tap the AI Mode icon below the Search bar on the home screen.

The initial bug. When Google emailed me and hundreds of other searchers with their invites to try AI Mode at around 5:20pm ET today, many were unable to access it. When you clicked the “Try now” button, it told you to opt in and wait to get access.

I covered these details on the Search Engine Roundtable.

What is AI Mode. AI Mode is a new tab within Google Search, right now only for those accepted into the Google Search Labs experiment, that brings you into a more AI-like interface. Google said AI Mode “is particularly helpful for queries where further exploration, reasoning, or comparisons are needed.” AI Mode lets you explore a topic and get comprehensive AI-based answers without you needing to do those comparisons and analyses yourself. We saw rumors of this news and it is finally officially here, for some of you.

I have a detailed write up on AI Mode over here.

Why we care. AI Mode may reveal the future of Google Search and the features that may be incorporated into Google Search in the days ahead.

So see if you have access and play around with it so you can understand how this new Google Search feature works.

Confirmed. Google’s Robby Stein confirmed the expanded rollout of AI Mode.


Google Merchant Center to align click reporting with Google Ads


Google Merchant Center click reporting is changing on April 21, 2025: clicks will be reported the same way Google Ads reports them. Google said this aligns click reporting across the two platforms and may impact some current and historical data reported in Merchant Center.

What is changing. Google wrote in this email, “As of April 21, 2025, we’re updating Google Merchant Center to align click reporting with Google Ads.”

The email goes on to say:

“This change reflects new advertising formats that have different types of interactions. While Google Ads reports clicks separately from other interactions, Merchant Center currently reports all interactions as product clicks. With this update, the definition of product clicks will be the same across both platforms.
As a result, you’ll notice some changes to your current and historical data reported in Merchant Center. There will be no change to your reporting experience in Google Ads, where you’ll continue to see clicks and interactions for your ad campaigns.”

More details. Arpan Banerjee, who notified me of this, said the email has a hyperlink to the Google Ads definition of interactions, which reads:

“The main user action associated with an ad format—clicks and swipes for text and Shopping ads, views for video ads, calls for call assets, and so on.”

Why we care. If you run Google Merchant Center and notice a change in click reporting around April 21st (in about a month), then this is why. This is just a reporting change and the changes you see in the clicks in your reports are not related to any changes in performance of those listings within Google Search.
