Google partners with Roblox on video ads

Roblox unveiled a new immersive video advertising format on its platform and announced a strategic partnership with Google to expand its advertising business reach.

Google will roll out these ads across AdMob and Ad Manager, creating significant new revenue opportunities for publishers while giving advertisers contextually relevant ways to reach engaged audiences.

Big picture. The gaming giant is introducing rewarded video ads up to 30 seconds long that offer in-game benefits, marking a significant evolution of its business model beyond traditional gaming revenue.

These ad formats are designed to blend seamlessly into virtual environments – appearing as billboards in digital cities or on screens during virtual sporting events – creating less disruptive advertising experiences that maintain user engagement.

By the numbers:

  • 85.3 million: Daily active users on Roblox.
  • Majority: Users aged 13+.
  • 30 seconds: Maximum length of new video ad format.

Between the lines. This move addresses a key challenge in gaming advertising: maintaining user engagement without disrupting gameplay experiences. As Google executive Scott Sheffer noted, “traditional ad formats haven’t always been the right choice” in gaming environments.

The Roblox partnership also represents a strategic pivot for Google’s advertising business toward emerging virtual spaces where younger audiences spend significant time, positioning the company ahead of industry shifts toward more immersive digital experiences.

What Google is saying. Google announced it’s extending Immersive Ads capabilities to more publishers following successful testing, with Roblox becoming a cornerstone partner for the technology.

Why we care. This partnership opens access to 85.3 million daily active users, predominantly Gen Z—a notoriously difficult audience to reach effectively. The rewarded video format offers a non-disruptive way to engage with these highly attentive users, creating authentic brand interactions rather than intrusive interruptions.

Additionally, integration with Google’s ad platform dramatically simplifies campaign management, allowing advertisers to incorporate Roblox into their broader digital strategies without learning new systems.

What’s next. Brands will soon be able to purchase ads directly or through Google Ad Manager, with additional formats like billboards due in the coming months. Partnerships with measurement firms Cint, DoubleVerify, and Nielsen will help track campaign performance.


Indexing and SEO: 9 steps to get your content indexed by Google and Bing


Sick of seeing the error “Discovered – currently not indexed” in Google Search Console (GSC)?

So am I.

Too much SEO effort is focused on ranking.

But many sites would benefit from looking one level up – to indexing.

Why?

Because your content can’t compete until it’s indexed.

Whether the selection system is ranking or retrieval-augmented generation (RAG), your content won’t matter unless it’s indexed.

The same goes for where it appears – traditional SERPs, AI-generated SERPs, Discover, Shopping, News, Gemini, ChatGPT, or whatever AI agents come next.

Without indexing, there’s no visibility, no clicks, and no impact.

And indexing issues are, unfortunately, very common. 

Based on my experience working with hundreds of enterprise-level sites, an average of 9% of valuable deep content pages (products, articles, listings, etc.) fail to get indexed by Google and Bing.

Percentage of page types not indexed by Google and Bing

So, how do you ensure your deep content gets indexed? 

Follow these nine proven steps to accelerate the process and maximize your site’s visibility.

Step 1: Audit your content for indexing issues

A sitemap for each deep content page type

In Google Search Console and Bing Webmaster Tools, submit a separate sitemap for each page type:

  • One for products.
  • One for articles.
  • One for videos.
  • And so on.
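As a minimal sketch, generating a separate sitemap per page type can look like the following (the URLs and file names are hypothetical placeholders):

```python
# Hedged sketch: build one XML sitemap per deep-content page type so each
# can be submitted and tracked separately in GSC and Bing Webmaster Tools.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # Standard sitemap namespace per sitemaps.org
    root = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

# Hypothetical page types and example URLs
page_types = {
    "sitemap-products.xml": ["https://example.com/product/123"],
    "sitemap-articles.xml": ["https://example.com/article/abc"],
    "sitemap-videos.xml": ["https://example.com/video/xyz"],
}
sitemaps = {name: build_sitemap(urls) for name, urls in page_types.items()}
```

Each generated file would then be submitted individually so the Pages report can break indexing down by page type.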

After submitting a sitemap, it may take a few days to appear in the Pages interface. 

Use this interface to filter and analyze how much of your content has been excluded from indexing and, more importantly, the specific reasons why.

GSC - Why pages aren't indexed

All indexing issues fall into three main categories:

  • Poor SEO directives
  • Low content quality
    • If submitted pages are showing soft 404 or content quality issues, first ensure all SEO-relevant content is rendered server-side. 
    • Once confirmed, focus on improving the content’s value – enhance the depth, relevance, and uniqueness of the page.
  • Processing issues

While the first two categories can often be resolved relatively quickly, processing issues demand more time and attention. 

By using sitemap indexing data as benchmarks, you can track your progress in improving your site’s indexing performance.

Dig deeper: The 4 stages of search all SEOs need to know

Step 2: Submit a news sitemap for faster article indexing

News sitemap for articles

For article indexing in Google, be sure to submit a News sitemap.

This specialized sitemap includes specific tags designed to speed up the indexing of articles published within the last 48 hours. 

Importantly, your content doesn’t need to be traditionally “newsy” to benefit from this submission method.
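For illustration, a News sitemap entry adds a namespaced `<news:news>` block to the standard `<url>` element. The URL, publication name, and headline below are hypothetical:

```python
# Hedged sketch of a single Google News sitemap entry. Only articles
# published in the last 48 hours belong in this feed; all values here
# are placeholders.
from datetime import datetime, timezone

TEMPLATE = """\
<url>
  <loc>{loc}</loc>
  <news:news>
    <news:publication>
      <news:name>{publication}</news:name>
      <news:language>en</news:language>
    </news:publication>
    <news:publication_date>{date}</news:publication_date>
    <news:title>{title}</news:title>
  </news:news>
</url>"""

def news_entry(loc, publication, title, published):
    return TEMPLATE.format(
        loc=loc, publication=publication, title=title,
        date=published.isoformat(),  # W3C datetime format
    )

entry = news_entry(
    "https://example.com/article/abc",
    "Example News",
    "Example headline",
    datetime(2025, 3, 27, 12, 0, tzinfo=timezone.utc),
)
```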

Step 3: Use Google Merchant Center feeds to improve product indexing

While this applies only to Google and specific categories, submitting your products to Google Merchant Center can significantly improve indexing. 

Ensure your entire active product catalog is added and kept up to date.

Dig deeper: How to optimize your ecommerce site for better indexing

Step 4: Submit an RSS feed to speed up crawling

Submit an RSS feed

Create an RSS feed that includes content published in the last 48 hours. 

Submit this feed in the Sitemaps section of both Google Search Console and Bing Webmaster Tools.

This works effectively because RSS feeds, by their nature, are crawled more frequently than traditional XML sitemaps. 

Plus, indexers still respond to WebSub pings for RSS feeds – a protocol no longer supported for XML sitemaps. 

To maximize benefits, ensure your development team integrates WebSub.
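As a sketch of what that integration involves, a WebSub publish ping is a small form-encoded POST to a hub. The hub and feed URLs below are examples only:

```python
# Hedged sketch: notify a WebSub hub that the RSS feed has new items so
# subscribed indexers can fetch it immediately. Standard library only.
from urllib import parse, request

HUB_URL = "https://pubsubhubbub.appspot.com/"      # a public WebSub hub
FEED_URL = "https://example.com/feeds/latest.xml"  # hypothetical RSS feed

def websub_payload(feed_url):
    return {"hub.mode": "publish", "hub.url": feed_url}

def ping_websub(hub_url, feed_url):
    data = parse.urlencode(websub_payload(feed_url)).encode("utf-8")
    with request.urlopen(request.Request(hub_url, data=data)) as resp:
        return resp.status  # 204 typically means the hub accepted the ping

# Call ping_websub(HUB_URL, FEED_URL) whenever the feed gains a new item.
```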

Step 5: Leverage indexing APIs for faster discovery

Integrate both IndexNow (unlimited) and the Google Indexing API (limited to 200 API calls per day unless you can secure a quota increase).

Officially, the Google Indexing API is only for pages with job posting or broadcast event markup.

(Note the word “officially.” I’ll leave it to you to decide whether you wish to test it.)
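As a rough sketch, an IndexNow submission is a single JSON POST. The host, key, and URL list below are hypothetical, and the key file must be served from your site per the protocol:

```python
# Hedged sketch of an IndexNow batch submission (standard library only).
# The host, key, and URLs are placeholders.
import json
from urllib import request

INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def indexnow_payload(host, key, urls):
    return {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",  # key file at site root
        "urlList": urls,
    }

def submit(host, key, urls):
    body = json.dumps(indexnow_payload(host, key, urls)).encode("utf-8")
    req = request.Request(
        INDEXNOW_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with request.urlopen(req) as resp:
        return resp.status  # 200/202 means the submission was accepted
```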


Step 6: Strengthen internal linking to boost indexing signals

The primary way most indexers discover content is through links. 

URLs with stronger link signals are prioritized higher in the crawl queue and carry more indexing power.

While external links are valuable, internal linking is the real game-changer for indexing large sites with thousands of deep content pages.

Your related content blocks, pagination, breadcrumbs, and especially the links displayed on your homepage are prime optimization points for Googlebot and Bingbot.

When it comes to the homepage, you can’t link every deep content page – but you don’t need to. 

Focus on those that are not yet indexed. Here’s how:

  • When a new URL is published, check it against the log files.
  • As soon as you see Googlebot crawl the URL for the first time, ping the Google Search Console Inspection API.
  • If the response is “URL is unknown to Google,” “Crawled, not indexed,” or “Discovered, not indexed,” add the URL to a dedicated feed that populates a section on your homepage.
  • Re-check the URL periodically. Once indexed, remove it from the homepage feed to maintain relevance and focus on other non-indexed content.
Diagram - internal linking and indexing

This effectively creates a real-time RSS feed of non-indexed content linked from the homepage, leveraging its authority to accelerate indexing.
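The workflow above can be sketched as follows. Here, `inspect_url` stands in for a wrapper around the Search Console URL Inspection API (which requires OAuth credentials), and the verdict strings mirror the coverage states named above:

```python
# Hedged sketch of the homepage-feed workflow for non-indexed URLs.
NOT_INDEXED_STATES = {
    "URL is unknown to Google",
    "Crawled - currently not indexed",
    "Discovered - currently not indexed",
}

def update_homepage_feed(urls, inspect_url, homepage_feed):
    """Add non-indexed URLs to the homepage feed; drop them once indexed."""
    for url in urls:
        verdict = inspect_url(url)  # hypothetical Inspection API wrapper
        if verdict in NOT_INDEXED_STATES:
            homepage_feed.add(url)      # link it from the homepage
        else:
            homepage_feed.discard(url)  # indexed: free the slot for others
    return homepage_feed
```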

Step 7: Block non-SEO relevant URLs from crawlers


Audit your log files regularly and block high-crawl, no-value URL paths using a robots.txt disallow.

URLs such as faceted navigation pages, search result pages, tracking parameters, and other irrelevant content can:

  • Distract crawlers.
  • Create duplicate content.
  • Split ranking signals.
  • Ultimately downgrade the indexer’s view of your site quality.

However, a robots.txt disallow alone is not enough. 

If these pages have internal links, traffic, or other ranking signals, indexers may still index them.

To prevent this:

  • In addition to disallowing the route in robots.txt, apply rel="nofollow" to all possible links pointing to these pages.
  • Ensure this is done not only on-site but also in transactional emails and other communication channels to prevent indexers from ever discovering the URL.
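For illustration, a robots.txt block for such no-value paths might look like this. The paths are hypothetical; base yours on what your log files actually show:

```
User-agent: *
# Hypothetical no-value routes: internal search, faceted navigation,
# and tracking parameters
Disallow: /search/
Disallow: /*?sort=
Disallow: /*?utm_
```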

Dig deeper: Crawl budget: What you need to know in 2025

Step 8: Use 304 responses to help crawlers prioritize new content

For most sites, the bulk of crawling is invested in refreshing already indexed content.


When a site returns a 200 response code, indexers redownload the content and compare it against their existing cache. 

While this is valuable when content has changed, it’s not necessary for most pages.

For content that hasn’t been updated, return a 304 HTTP response code (“Not Modified”). 

This tells crawlers the page hasn’t changed, allowing indexers to allocate resources to content discovery instead.
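As a framework-agnostic sketch (the last-modified lookup is hypothetical), the logic is a date comparison against the `If-Modified-Since` request header:

```python
# Hedged sketch: serve 304 Not Modified when the crawler's cached copy is
# still current, so it can skip redownloading unchanged pages.
from email.utils import parsedate_to_datetime

def respond(request_headers, last_modified):
    """last_modified is a timezone-aware datetime for the page."""
    header = request_headers.get("If-Modified-Since")
    if header:
        cached_at = parsedate_to_datetime(header)
        if last_modified <= cached_at:
            return 304, b""  # unchanged: empty body, crawler keeps its cache
    return 200, b"<html>...</html>"  # changed (or no header): full content
```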

Step 9: Manually request indexing for hard-to-index pages


For stubborn URLs that remain non-indexed, manually submit them in Google Search Console. 

However, keep in mind that there is a limit of 10 submissions per day, so use them wisely.

From my testing, manual submissions in Bing Webmaster Tools offer no significant advantage over submissions via the IndexNow API. 

Therefore, it’s more efficient to use the API.

Maximize your site’s visibility in Google and Bing

If your content isn’t indexed, it’s invisible. Don’t let valuable pages sit in limbo.

Prioritize the steps relevant to your content type, take a proactive approach to indexing, and unlock the full potential of your content.

Dig deeper: Why 100% indexing isn’t possible, and why that’s OK


Data providers: Google March 2025 core update had similar volatility to the previous update

The Google March 2025 core update finished rolling out over a 14-day period, starting on March 13, 2025, and completed about 14 days later on March 27, 2025. This March core update took about a week longer than Google’s December 2024 core update, which started December 12, 2024 and completed about 6 days later on December 18, 2024.

Please understand that if a core update impacts a site, it can result in a huge change for that site’s search visibility. So, I do not want to diminish any core updates, including the March core update; those could have been really big for you or the sites you manage.

Data providers on the Google March 2025 core update

Semrush. Semrush (our parent company) was the first to send us data, which seemed to show that the Google March 2025 core update had similar volatility to the previous core update, the December 2024 core update. Keep in mind that the December 2024 core update was more volatile than the November 2024 core update, based on the data.

If you glance at the Semrush Sensor, you can see the overall volatility the tool reported over that time period:

Mordy Oberstein, who sent me the data from Semrush, told me the two were “similar in size” when comparing the volatility. He sent me this chart showing the volatility of the past two core updates, broken down by vertical. It shows the peak volatility numbers were pretty similar between the two updates:

If you look at overall ranking volatility change comparison, you can see that the Health sector saw a much bigger change, for some reason:

But when you compare this to the baseline rank volatility, both the December and March core updates were within very similar ranges, Mordy Oberstein told us. “There’s a mere 0.1 difference between the two,” Oberstein added. Oberstein said he has a theory, which he will share at his session at SMX Advanced, on why this update hit different verticals differently.

When you dive into the top ten results, you can see a notable difference in the ranking changes between these two past core updates:

Similarweb. Similarweb’s SERP Seismometer showed the spikes in volatility cooled down with the March update. You can see it get a bit more volatile on March 13th, 14th and 15th but then start to cool again as the core update rolled out.

Darrel Mordechai from Similarweb told us the March 2025 core update was not the most volatile core update they’ve seen; compared to the December core update, it showed “similar levels of volatility.”

Here is a chart showing the core update volatility by average position change for the past core updates, as documented by Similarweb:

Here is what you see when you zoom in to compare the March 2025 and December 2024 core updates; they are super close:

The current update showed slightly lower fluctuations in the top three positions but increased volatility across the top five. Here is where you can see that in this chart:

When you compare it by vertical or niche, you can see the volatility the March 2025 core update caused across the health, finance, retail, and travel industries. You can see the finance industry showed the highest levels of fluctuation, particularly in the top five results. In contrast, the travel industry experienced notably low volatility in the top three positions.

seoClarity. The folks at seoClarity also sent me some winners and losers reports, showing the biggest winners and losers from February to March 2025:

Other tools. There are a lot of Google search ranking volatility tools. Here is what they looked like after the core update finished rolling out and over the course of the update:

Mozcast:

Mozcast

Algoroo:

Algoroo

Advanced Web Rankings:

Advancedwebranking

Accuranker:

Accuranker

Cognitive SEO:

Cognitiveseo

Wincher:

Wincher

Mangools (looks broken?):

Mangools

Sistrix:

Sistrix

Data For SEO:

Dataforseo

SERPstat:

Serpstat

Industry. The initial rollout seemed to kick in within a few days after the update was announced. Some sites saw big swings both up and down in terms of ranking improvements or declines. But this update did not seem as widespread as some previous core updates, which had a wider impact on a more diverse set of sites. That is not to say this update was not big for those who were impacted by it – it 100% was very big for those sites.

During the update, some of the tracking tools were tripped up by changes to the Google Search results page. That may make it hard for some to track the impact of this update. But you can use Google Search Console to see the impact on your site and the position changes for your most popular keywords.

Then, we saw some additional volatility spike at the tail end of this update.

What to do if you are hit. Google has previously given advice on what to consider if you are negatively impacted by a core update. Google has not really given much new advice here.

  • There aren’t specific actions to take to recover. A negative rankings impact may not signal anything is wrong with your pages.
  • Google has offered a list of questions to consider if your site is hit by a core update.
  • Google said you can see a bit of a recovery between core updates but the biggest change would be after another core update.

In short, write helpful content for people and not to rank in search engines.

  • “There’s nothing new or special that creators need to do for this update as long as they’ve been making satisfying content meant for people. For those that might not be ranking as well, we strongly encourage reading our creating helpful, reliable, people-first content help page,” Google said previously.

More on Google updates

You can read more of our coverage in Search Engine Land’s Google Algorithm Updates history.

Why we care. While the data above shows how sites in general fared with the latest core update, it does not represent how your individual site did. If your site was hit by this update, it can be devastating. If you were hit by previous updates and saw no improvement with this one, that is devastating as well. But some sites saw big improvements.

Feel free to compare this to our December core update report.

We hope you saw improvements with this March 2025 core update.


Your guide to Google Ads Smart Bidding


Are you controlling your paid search campaigns, or are they controlling you? 

If you can’t confidently articulate your Smart Bidding strategies, you lose conversions and credibility. 

True mastery isn’t just about setting up a campaign and picking a bid strategy; it’s about owning and communicating the process effectively. 

This guide is your roadmap to clarity and control, breaking down 2025’s Smart Bidding into actionable insights.

We’ll cover key concepts, common mistakes, and actionable tips for picking the right strategy. 

Smart Bidding in Google Ads: AI-powered bid optimization

Smart Bidding is Google Ads’ advanced form of automated bidding.

It leverages machine learning and real-time auction signals to optimize bids for conversions or conversion value. 

It dynamically adjusts bids to achieve specific goals, such as maximizing conversions at a target cost or achieving a desired return on ad spend.

Key Smart Bidding strategies include: 

Target CPA (cost per action)

  • Optimizes bids to achieve conversions at a target cost per action. 
  • Ideal for campaigns where you have a specific cost you’re willing to pay for each conversion (e.g., lead, sale).
  • Example: “We aim to acquire leads at a CPA of $50.”

Dig deeper: Everything you need to know about Target CPA bidding

Target ROAS (return on ad spend)

  • Focuses on achieving your desired revenue for every dollar spent. 
  • Best for ecommerce or campaigns with clear revenue goals.
  • Example: “We want to achieve a ROAS of 400%, meaning $4 in revenue for every $1 spent.”

Maximize Conversions

  • Automatically sets bids to achieve the most conversions within your budget.
  • Useful when you want to drive as many conversions as possible, regardless of cost.
  • Example: “Our goal is to maximize the number of sign-ups within our daily budget.”

Dig deeper: Mastering Maximize conversions bidding in Google Ads

Maximize Conversion Value

  • Prioritizes higher-value conversions for greater overall return. 
  • Effective when different conversions have varying values to your business.
  • Tends to favor selling more expensive products or services, as they contribute more to the total conversion value.
  • Example: “We value a ‘request for quote’ more than a ‘newsletter sign-up,’ so we want to maximize the total value of conversions.”

Dig deeper: Maximize Conversion Value: Google Ads bidding explained

Maximize Clicks

  • Automatically sets your bids to get as many clicks as possible within your budget.
  • Useful for top-of-funnel campaigns where the goal is to drive traffic to a site.
  • Example: “This campaign is designed to drive as much traffic to our new blog post as possible.”

Enhanced CPC (ECPC)

  • A semi-automated bidding strategy that adjusts your manual bids to try and get more conversions.
  • Google Ads adjusts your manual bid up or down based on the likelihood of a conversion.
  • Example: “We are using manual bidding but want to use Google’s signals to increase conversions where possible.”

Viewable CPM (YouTube)

  • Focuses on maximizing viewable impressions of your display or skippable in-stream video ads.
  • Ideal for brand awareness campaigns where the goal is to get your message seen by as many people as possible.
  • Example: “We want to ensure our brand message is visibly displayed to our target audience on YouTube.”

Cost Per View (YouTube)

  • Optimizes bids to get the most video views or interactions within your budget.
  • Best for campaigns focused on driving engagement with your video content.
  • Example: “We are running a video campaign on YouTube and want to maximize the number of views we receive.”

It’s crucial to understand that while setting a Target CPA or ROAS provides strategic direction, achieving those exact targets isn’t guaranteed.

I’ve had situations where a media planner pushed for an immediate switch to a specific CPA goal. 

They wanted the target set at 4x and wouldn’t budge or try to understand why the campaign was set at 2x.

A common misconception is that simply setting a desired metric will automatically yield the desired results. 

In practice, achieving optimal performance often requires a nuanced approach.

This may involve:

  • Gradual bid adjustments.
  • A willingness to accept temporary fluctuations in ROAS for broader account health.
  • A comprehensive evaluation of multiple factors, including budget, historical campaign performance, and keyword strategy.

It’s essential to understand that Smart Bidding strategies, while powerful, require strategic oversight and a holistic understanding of account dynamics. 

Success should be measured within the context of overarching account objectives, not solely focusing on individual campaign metrics.

Understanding manual, automated and smart bidding in Google Ads


Manual bidding allows you to control bid adjustments completely, making it ideal for certain industries, such as legal or home services, where fluctuating competition requires ongoing oversight. However, it requires more time and effort.

It’s like driving a car where you control every gear shift and pedal movement.

Automated bidding simplifies bid management by using algorithms to adjust bids. 

While automated bidding can save time, its generic approach doesn’t account for nuanced conversion goals.

Think of this as engaging cruise control. You tell the car (Google Ads) your general desired speed (goal), and it adjusts the engine (bids) to maintain that pace.

Smart Bidding, however, takes automated bidding further by using real-time signals and advanced machine learning to predict the likelihood of conversions and their value, tailoring bids to individual auctions. 

It’s especially effective for campaigns with clear conversion goals and sufficient historical data.

This is like having a self-driving car with an incredibly sophisticated navigation system.

It’s important to know that while all Smart Bidding is automated, not all automated bidding qualifies as Smart Bidding.

Automated bidding covers a wider range of strategies, some of which are more basic and don’t rely on real-time signals or advanced machine learning.

In essence:

  • Manual: You control every bid.
  • Automated: Google’s algorithms handle bid adjustments based on your chosen strategy.
  • Smart: Google’s machine learning optimizes bids in real-time for conversions and conversion value.


Smart Bidding: Advantages and risks

There are significant advantages to using Smart Bidding.

  • Improved efficiency: Saves time by automating bid adjustments. 
  • Auction-time optimization: Factors in user intent, device, location, and other data points to optimize bids for each auction. 
  • Goal alignment: Customizes bids to match your campaign objectives, whether it’s maximizing volume or focusing on high-value actions.  

While Smart Bidding offers significant advantages, missteps in implementation can lead to underwhelming results. 

Here’s how to avoid common pitfalls and optimize your campaign performance.

Data dependency

Smart Bidding algorithms rely on robust historical data to make accurate predictions. 

Campaigns with fewer than 30 conversions in the last 30 days may struggle to optimize effectively.

Start with manual bidding or Maximize Clicks to build a data foundation before switching to Smart Bidding. Boris Beceric, a Google Ads consultant and coach, said:

  • “I guess most try Smart Bidding too early – without enough conversion volume. What usually helps: consolidate campaigns so you get more data flowing through a single campaign. Portfolio bidding – kinda the same, but consolidation takes place at the bid strategy level.
  • “Micro conversions – try to add in the micro conversion that had the most volume and is closest to the ‘real’ conversion. Bonus: Reverse engineer CVR and conv value from micro to macro conversion and adjust tCPA accordingly.”

Goal misalignment

Using the wrong bidding strategy can hinder performance. 

For example, applying Target ROAS to a new campaign with limited data can set unrealistic expectations and reduce reach.

Align bidding strategies with your goals.

Use Maximize Conversions for volume, and Target CPA or Target ROAS when prioritizing profitability. Harrison Hepp, owner of Industrious Marketing, said:

  • “I had a client who was hybrid ecommerce and lead gen (they sold products, but high-priced deals were lead gen), and they insisted on tracking purchases and leads in every campaign. We constantly battled major fluctuations in the campaigns as they’d swing back and forth between getting purchases or leads and trying to optimize to both.
  • “It also made bid strategy selection really hard, as conversion value bidding would deprioritize leads (no value was tracked), but CPA bidding wasn’t efficient for purchases because of differences in product prices. It really showed how aligning your goals and bid strategy is critical for steady performance. It also underlined how the right bidding strategy can prioritize success in campaigns.”

Monitoring is non-negotiable

Despite its automation, Smart Bidding is not a “set it and forget it” tool. 

Failing to monitor campaigns can lead to wasted ad spend and missed optimization opportunities.

Regularly review performance metrics, adjust campaign parameters, and stay proactive in managing Smart Bidding strategies.

  • “Custom columns/Segment views: We want to measure efficiency, so things like conv value/conv, search impression share, etc.,” said Ameet Khabra, owner of Hop Skip Media.

Even with the most advanced AI behind Smart Bidding, performance optimization requires vigilance. 

Regularly review the following metrics to ensure your strategy is working as intended:

  • CPA: Is your Target CPA being met?
  • ROAS: Are the conversions driving sufficient revenue?
  • Conversion rates: Are conversions coming from the right audience segments? Or are you paying for competitors to download your white papers and marking that down as a lead?
  • Search term reports: Are irrelevant keywords consuming a significant portion of your budget? Unprofitable keywords can be why a campaign is not meeting goals.
  • Conversion tracking accuracy: If conversion tracking is improperly implemented, Smart Bidding will optimize based on inaccurate data, reducing effectiveness.

Double-check your conversion tracking setup. Assign accurate values to conversions to reflect their true business impact. Khabra said:

  • “My favorite saying lately is ‘garbage in, garbage out,’ and that is definitely a large component of conversion tracking. Ensuring that we’ve identified the correct conversions that move the needle is half the battle. Implementing the tracking and double-checking that it is correct – collecting conversions – is the second half.”

Budgetary awareness

Strategies like Maximize Conversions and Maximize Clicks will attempt to spend your entire daily budget. 

If your budget is set too high, this can lead to overspending.

Start with smaller daily budgets and gradually increase them while monitoring performance.

Realistic targets

Setting overly aggressive Target CPA or Target ROAS goals can limit your campaign’s reach, as the algorithm will avoid auctions it deems unprofitable.

Begin with realistic targets slightly higher or lower than your current average. Allow time for the algorithm to learn before refining the target.
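As a hypothetical illustration of “gradual,” you might cap each adjustment at a fraction of the current average rather than jumping straight to the goal. The 10% step and dollar figures below are made up, not a Google recommendation:

```python
# Hedged illustration: step a Target CPA toward the goal by at most 10%
# of the current average per adjustment, giving the algorithm time to learn.
def next_target_cpa(current_avg_cpa, goal_cpa, step=0.10):
    max_move = current_avg_cpa * step
    delta = goal_cpa - current_avg_cpa
    if abs(delta) <= max_move:
        return goal_cpa  # close enough: set the goal directly
    return current_avg_cpa + max_move if delta > 0 else current_avg_cpa - max_move

# e.g. the 30-day average CPA is $80 and the eventual goal is $50:
first_target = next_target_cpa(80.0, 50.0)  # 72.0, not 50.0
```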

Best practices for Smart Bidding in Google Ads

To ensure optimal performance, follow these best practices for implementing Smart Bidding in your Google Ads campaigns.

1. Feed accurate data 

Ensure your conversion tracking is set up correctly. 

Assign meaningful values to conversions – whether it’s a purchase, lead form submission, or newsletter signup. 

2. Leverage seasonality adjustments 

Use seasonality adjustments in Google Ads to guide Smart Bidding algorithms for short-term changes (e.g., holiday sales or promotions). 

This prevents excessive or insufficient bids during periods of fluctuating demand. 

Google Ads seasonality adjustments

3. Start with conservative budgets 

Begin with smaller budgets and avoid aggressive bid caps that may limit auction participation. Allow the algorithm to learn and adapt gradually. 

4. Prioritize business value over conversion volume 

Align your bidding tactics with broader business goals. Instead of focusing solely on conversion volume, consider how each conversion contributes to revenue or lifetime customer value. 

5. Test and adapt 

Use Google Ads experiments to test different strategies. 

For example, compare Target CPA with Target ROAS to identify which delivers better results for your campaigns. 

Google Ads Experiments let you directly compare bid strategies in real-world scenarios.

Duplicate your campaign, allocate a split percentage to a new strategy (like comparing Target CPA vs. Target ROAS), and see concrete results with statistical significance.

Final thoughts

Smart Bidding isn’t just about knowing which technical settings to adjust. 

It’s about understanding how to make Google’s automated tools align with your business goals.

The digital landscape evolves quickly, so it’s essential to stay adaptable, continuously monitor performance, and make adjustments as needed. 

Nail the strategy, stay proactive, and you’ll set yourself up for long-term success.


Social media optimization with Yoast SEO

Are you tired of your social media efforts not achieving the results you hoped for? It might be time to scale up your social media optimization efforts. Your content might be good, but various enhancements can make it stand out. For instance, your content needs proper metadata for X, Facebook, and the like to appear properly on each platform. Yoast SEO can help you do this quickly.

Sharing your freshly written (or optimized) content on social media is important. It helps you stay in touch with your audience and update them on news about your business and related topics. But to get their attention, you need to optimize your social media posts before you share them.

In this article, we’ll explain how you can optimize your posts for Facebook and X, and how our plugin can help you with that! Lastly, we’ll briefly discuss Pinterest and the use of Rich Pins.

What is social media optimization?

Social media optimization is about improving how you use social media platforms to build your online presence. You do this not only by creating and sharing content for every platform you’d like to be active on but also by optimizing that content in such a way that you get traffic to your site. The goal is to build strong connections with your audience and to keep them engaged.

Social media optimization starts with well-optimized, highly relevant content that grabs attention. For most platforms, images and video are best suited for this. You can test various formats and ideas to see what your audience prefers. You can use any of the social media analytics tools to do this. Also, find the best times to publish your content to get the best engagement. Your posts should also have metadata for specific platforms like X Cards or OpenGraph for Facebook to help these platforms understand your content.

After posting, remember to engage with your audience. Respond to comments, participate in discussions, and listen to what people say about you and your content. Track your best-performing posts and use data to improve your content to stay relevant and engaging.

Promoting your content on various platforms makes sense in most cases. Remember to share your articles, videos, and other content on whatever social media network makes sense for you and your audience. Read this article if you don’t know where to begin with your social media strategy.

Facebook and other social media

Years ago, Facebook introduced OpenGraph to determine which elements of your page you want to show when someone shares that page. Several social networks and search engines use Facebook’s OpenGraph, but the main reason for adding it is for Facebook itself. Facebook’s OpenGraph support is continuously evolving, but the basics are simple. With a few pieces of metadata, you declare:

  • What type of content is this?
  • What’s the locale?
  • What’s the canonical URL of the page?
  • What’s the name of the site and the title of the page?
  • What’s the page about?
  • Which image/images should be shown when this post or page is shared on Facebook?
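For reference, those declarations translate into a handful of meta tags in your page’s head. Here’s a minimal sketch with placeholder URLs and titles (Yoast SEO generates tags like these for you automatically):

```html
<!-- Minimal Open Graph markup; all values are placeholders -->
<meta property="og:type" content="article" />
<meta property="og:locale" content="en_US" />
<meta property="og:url" content="https://example.com/my-post/" />
<meta property="og:site_name" content="Example Site" />
<meta property="og:title" content="My Post Title" />
<meta property="og:description" content="A short summary of what the post is about." />
<meta property="og:image" content="https://example.com/images/my-post.jpg" />
```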

Social media preview in Yoast SEO

When you use Yoast SEO, most of the values above are filled out automatically based on your post’s data. It uses your site’s locale, the site name, the SEO title, the canonical URL, the meta description, and so on to fill out most of the required OpenGraph tags. You can see what your post will look like when you click on ‘Social media appearance’ in the Yoast SEO sidebar:

The ‘Social media appearance’ button in the sidebar opens the modal for this feature

This preview tab allows you to edit how your Facebook post is shown when shared. Our plugin lets you change your social image, title, and description in your preview. This makes your social media optimization much quicker and easier, as you won’t have to leave your post to make these changes.

Make more impact on social media with Yoast SEO Premium!

Get Yoast SEO Premium today and make it quick and easy to manage how your social media snippets look.

Get Yoast SEO Premium for only $99 / year (ex VAT)

If you use the options for social media optimization in Yoast SEO, your Facebook post could look like this when you share the URL of a post or page:

Example of a Facebook post as seen on Yoast’s profile

So what do you need to do?

  1. First, go to Yoast SEO → Settings → Site representation, and fill in your social media accounts.
  2. Afterward, go to Yoast SEO → Settings → Social sharing, and make sure OpenGraph is enabled.
  3. Then, set a good default image under the site basics settings. This image is used when you have a post or page that does not contain an image. It’s important to set this image to ensure that every post or page has an image when shared. Facebook is forgiving when uploading images, but 1200px by 630px should work well.
  4. Lastly, follow the steps in this article to go to your personal WordPress profile and add a link to your Facebook profile, if you want to associate your Facebook profile with your content. If you do, be sure to also enable the ‘Follow’ functionality on Facebook.

You can complete all of these steps in a few minutes. After that, Yoast SEO takes all of the work out of your hands. However, it is important to remember that Facebook sometimes doesn’t immediately pick up changes. So, if you want to “debug” how Facebook perceives your page, enter your URL in the Facebook Sharing Debugger and click the Debug button. If the preview that you see there isn’t the latest version, you can try the Scrape again button. But remember that it can take a while for Facebook to see your changes.

OpenGraph for video content

If you have video content, you must do more work unless you use our Video SEO plugin. This plugin handles all the needed metadata and lets you share your videos on Facebook.

X

X’s functionality is quite similar to Facebook’s; it’s called X Cards. X “falls back” on Facebook OpenGraph for several of these values, so we don’t have to include everything. But it’s still quite a bit. We’re talking about:

  • the type of content/type of card
  • an image
  • a description
  • the X account of the site/publisher
  • the X account of the author
  • the “name” for the domain to show in an X card
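Those values, too, boil down to a few meta tags. A minimal sketch with placeholder handles (the title, description, and image can fall back to your OpenGraph tags):

```html
<!-- Minimal X Card markup; handles are placeholders -->
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:site" content="@example_site" />
<meta name="twitter:creator" content="@example_author" />
<meta name="twitter:title" content="My Post Title" />
<meta name="twitter:description" content="A short summary of what the post is about." />
<meta name="twitter:image" content="https://example.com/images/my-post.jpg" />
```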

X preview in Yoast SEO

As you might have seen in Yoast SEO, optimizing your X listings is also an option. Simply click that tab to preview how your page appears when it gets shared to X. By default, the plugin uses the title, description, and image you enter in the search appearance preview. Of course, this tab allows you to change these for your X post.

Here’s an example of what your post could look like with all the required metadata our plugin helps you add:

An example of a post on Yoast’s X profile

So what do you need to do?

Ensure X card metadata is enabled by going to Yoast SEO → Settings → Site features → Social sharing and activating the X feature. This leaves a couple of values for you to fill out in the settings, which you can do using this guide on activating X Cards in Yoast SEO.

Use templates for social media snippets

Do you spend a lot of time tweaking the preview appearance of each page or post? You’ll be glad to know that Yoast SEO Premium also offers a very helpful feature: the ability to set default templates for your social snippets. With this powerful feature, you can design the ideal social appearance for all your content and feel certain that the output will always look great to whoever is sharing it.

Use variables to set up templates to optimize your social media postings

What about Pinterest?

Pinterest’s Rich Pins allow for OpenGraph markup as well. Add variables like product name, availability, price, and currency to your page to create a Rich Pin. As this is mainly interesting for products, we added Rich Pin functionality to our Yoast WooCommerce SEO plugin.

Read more: How to promote your products and earn money on Pinterest »

Conclusion on social media optimization

So, go ahead and use Yoast SEO to optimize your social media. It isn’t very hard; it just takes a few minutes of your time, and you will reap the rewards immediately. As these social networks add new features, we’ll keep our plugin and this article up-to-date. So, be sure to update the Yoast SEO plugin regularly.

Keep reading: Social Media Strategy: where to begin? »

The post Social media optimization with Yoast SEO appeared first on Yoast.


Microsoft Advertising will start enforcing Consent Mode in May

Microsoft Advertising will require advertisers to provide explicit user consent signals starting May 5.

First communicated to advertisers a few weeks ago, this change ensures compliance with global privacy regulations while maintaining the ability to gather insights that optimize ad performance.

Why we care. As data privacy concerns grow, businesses face increasing pressure to protect personal information. Microsoft’s enforcement of Consent Mode offers a way to balance privacy with performance, reinforcing trust while meeting regulatory requirements.

What is Consent Mode? Consent Mode is a feature from Microsoft Advertising that respects user privacy preferences while allowing advertisers to track conversions and optimize campaigns. It adjusts cookie access based on user consent, using the ad_storage parameter to either allow or block cookies. This applies to:

  • Universal Event Tracking (UET) on the Microsoft Advertising Platform.
  • Universal Pixel, Segment, and Conversion pixels within Microsoft Invest, Curate, or Monetize.

Consent signals can also be shared through the IAB’s Transparency and Consent Framework (TCF) or directly via a Consent Management Platform (CMP).

How to implement Consent Mode. Businesses can send user consent signals using one of these three options:

  • Direct integration. Implement Consent Mode with UET, Universal Pixel, Segment, or Conversion pixels.
  • IAB framework. Pass consent signals directly in a TCF 2.0 string or through a CMP.
  • Third-party tools. Integrate Microsoft’s Consent Mode through tools like Google Tag Manager.
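For the direct-integration route, the consent signal is pushed onto the UET event queue via the ad_storage parameter. Here’s a rough sketch; the CMP callback name is a placeholder, so check Microsoft’s UET documentation for your exact tag setup:

```javascript
// Sketch of Microsoft Consent Mode with UET.
// Assumes the UET base tag is also loaded on the page.
const w = typeof window !== 'undefined' ? window : globalThis;
w.uetq = w.uetq || [];

// Default state before the user has made a choice: block ad cookies.
w.uetq.push('consent', 'default', { ad_storage: 'denied' });

// Hypothetical CMP callback: update the signal once the user decides.
function onUserConsent(granted) {
  w.uetq.push('consent', 'update', { ad_storage: granted ? 'granted' : 'denied' });
}

onUserConsent(true); // e.g. the user accepted cookies in the consent banner
```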


Google Ads policy update: More ads, new rules


Google Ads will update its Unfair Advantage Policy to clarify that the restriction on showing more than one ad at a time for the same business, app, or site only applies within a single ad location.

This change, starting April 14, follows recent experiments allowing multiple ads from the same advertiser in different locations on the search results page.

What’s happening.

  • Double serving now permitted: Advertisers can now run multiple ads for the same business, app, or site on a single search results page—provided they occupy different ad locations. This could potentially increase visibility and clicks for top advertisers but may also intensify competition for smaller players.
  • Shifting auction dynamics: Google’s updated policy leverages different ad locations to run separate auctions, allowing businesses to secure multiple placements. This adjustment aligns with Google’s evolving approach to ads, such as mixing ads with organic results and redefining top ad placements last year.

Why we care. This update opens up opportunities to dominate search results by showing multiple ads for the same business in different ad locations. This could lead to increased visibility, higher click-through rates, and more conversions.

However, it may also drive up competition and costs, especially for smaller advertisers, as larger brands gain more SERP real estate. Understanding this change is crucial for adapting bidding and placement strategies to stay competitive.

Industry reactions. Digital marketing expert Navah Hopkins of Optmyzr noted on LinkedIn:

  • “Google is officially making it fair game to have more than one spot on the SERP. I have thoughts on this, but I want to see how performance actually shakes out in Q2.”

Digital marketing expert Boris Beceric commented that Google is only chasing the money:

  • “Another case of Google liking money more than a good user experience…not even talking from an advertiser’s perspective.”

Bigger picture. This policy shift marks another fundamental change in Google Ads’ long-standing practices, raising questions about how SERP real estate and competition will evolve.

Bottom line. This update could create new opportunities for advertisers to dominate search results, but it might also make it harder for smaller businesses to compete. The real impact will become clearer as the industry adapts in the coming months.


Reddit Ads rolls out new SMB tools to boost campaign performance


Reddit Ads is introducing a suite of new tools aimed at helping small and medium-sized businesses (SMBs) streamline campaign management, optimize ad performance, and improve data accuracy.

Easier Campaign Setup and Management:

  • Campaign Import. Reddit Ads now allows advertisers to import campaigns directly from Meta in just three steps. After signing into their Meta account within Reddit Ads Manager, users can select an ad account and campaign to import, then customize it to fit Reddit’s platform. This seamless process enables advertisers to leverage high-performing Meta ads on Reddit quickly.
  • Simplified Campaign QA. A new review page in the Reddit Ads Manager now consolidates all campaign details for a clear overview. Advertisers can easily identify errors or inconsistencies and make edits before publishing.

Enhanced Signal Quality and Conversion Tracking:

  • 1-Click GTM Integration for Reddit Pixel. Setting up Reddit’s website conversions tag just got easier. With the new Google Tag Manager (GTM) integration, advertisers can install the Reddit Pixel in a few clicks, enabling fast and accurate conversion tracking. This simplifies measuring customer journeys and optimizing lower-funnel strategies.
  • Event Manager QA. The Events Manager’s enhanced Events Overview page now provides a detailed breakdown of conversion events from the Reddit Pixel or Conversions API (CAPI). This update helps advertisers verify event data accuracy, troubleshoot issues, and run effective lower-funnel campaigns.

Why we care. The new Campaign Import feature lets advertisers quickly repurpose high-performing Meta ads on Reddit, saving time and effort. The simplified QA tools help catch errors before launch, while the 1-click GTM integration and improved Events Manager provide deeper insights into customer behavior and campaign performance.

Bottom line. These updates reflect Reddit’s ongoing commitment to making its ad platform more accessible and effective for SMBs. By reducing setup friction and providing better visibility into campaign performance, Reddit Ads aims to help businesses reach niche communities and drive impactful results.


The next wave of search: AI Mode, deep research and beyond


With the rise of AI-powered features, search engines are not just directing users to information but delivering answers directly. 

This shift is redefining how people interact with the web, raising questions about the future of SEO, content discovery, and digital marketing. 

Here’s what’s coming next.

From ChatGPT to Grok 3: The breakneck pace of AI advancements

The world has seen rapid and significant advances in AI technology and large language models (LLMs) within the past two years. 

Just three years ago, Google’s Gemini and Meta’s Llama did not exist, and OpenAI’s ChatGPT was not released until late November 2022. 

  • Fast-forward to January 2025, when the public was introduced to DeepSeek R1. This open-source large language reasoning model astounded the AI community with its speed, efficiency, and affordability, especially compared to OpenAI’s o1 model. 
  • A few weeks later, Elon Musk’s company xAI launched Grok 3, which impressed users by topping a key AI leaderboard with its complexity and fewer guardrails (see: unhinged mode).
  • More recently, Anthropic released Claude 3.7 Sonnet and Claude Code, an LLM that excels at code creation and debugging to a degree that has made many software engineers a bit uneasy.

These LLMs are just the beginning of AI’s rapid progress, with more breakthroughs on the way. 

Google’s AI Mode: A glimpse of the future 

AI isn’t just bringing new products – it’s transforming existing ones, too.

On March 5, Google announced they were expanding AI Overviews with a new experimental feature called AI Mode.

This interactive feature allows users to:

  • Engage with web search in a chat-like manner through multimodal understanding.
  • Refine long-tail queries in a back-and-forth manner. 

AI Mode, powered by Gemini 2.0, enhances research using a “query fan-out” technique to gather real-time data from multiple sources and generate detailed, in-depth summaries.

This may make SEOs uncomfortable, as it potentially reduces clicks to publisher sites and further promotes a zero-click ecosystem. 

With Google integrating Gemini 2.0 into its suite of products and holding roughly 89% of the search market, its AI innovations demand close attention. 

These technologies will likely be added to search, and AI Mode offers a preview of what’s ahead.

Two terms for the future of search: Agentic and deep research 

We’ll likely hear two terms used more often in the AI and search space: agentic and deep research. 

Agentic models can take proactive actions on a user’s behalf, while deep research models browse the web and focus on conducting intensive, in-depth research to provide users with informative summaries on complex topics. 

Unlike previous LLMs, which use a single-step information retrieval system through RAG (retrieval-augmented generation), deep research and agentic models can:

  • Conduct multi-step research through a series of actions, pulling information from multiple sources to provide comprehensive summaries to the user. 
  • Take proactive actions, such as executing tasks and complex instructions. 

Google’s Project Mariner and OpenAI’s Operator already showcase these capabilities by allowing users to perform tasks within their browsers while understanding multi-modal elements such as text, images, and forms.

Dig deeper: How to use OpenAI’s Deep Research for smarter SEO strategies

How these models could change search

Suppose you want to plan a trip to Tokyo and know the best season to go, the weather, and where to stay. 

Typically, this type of research takes days or even weeks as you gather information from various sources, such as travel websites or YouTube videos.

A deep research model can do the heavy lifting by searching the web, gathering information, and summarizing relevant content, which saves you time. 

It can also “read, listen, and watch” various sources to provide a thorough answer. 

An agentic model could also book your hotels and flights, navigating checkout flows to complete the purchase.

AI is moving in this direction as companies like Google work toward AGI (artificial general intelligence) – machines that can reason across diverse tasks like humans.

Deep research and agentic models are key milestones in building practical AI solutions for everyday use.

AI Overviews have already impacted click behavior and organic traffic

Now, we must consider these AI features’ long-term effects on the content ecosystem.


What could the future search landscape look like?

Google’s AI Overviews and agentic advancements are here to stay. 

If AI Mode succeeds, it will be the first deep research feature in Google Search. 

So, what’s next for the search landscape? 

Here are some possibilities.

Continual rise of zero-click searches

Since launching in May 2024, AI Overviews have significantly reduced clicks to informational queries.

As AI search capabilities advance, users will likely rely even more on AI tools for quick answers rather than clicking through to websites or articles. 

AI Mode and future search innovations could accelerate this shift by prioritizing fast, AI-generated summaries over traditional browsing.

As zero-click searches become the norm, you must rethink how you measure value and engagement. 

Traditional KPIs may no longer accurately reflect user behavior, so focusing on brand visibility and awareness will be more critical than ever.

Increased personalization

LLMs and AI systems are revolutionizing search by personalizing responses with unmatched speed and scale, surpassing traditional algorithms. 

Leveraging Google’s vast user data, AI can train on existing information and refine queries in real-time to deliver more tailored results. 

As these systems continuously learn, they will become even better at recognizing, remembering, and adapting to individual user preferences.

As AI-driven search becomes more personalized, it’s worth considering whether hyper-niche content is the key to reaching your audience.

Multimodal search

Google’s AI-powered multimodal capabilities are already embedded in many of its products, including Project Astra, an AI assistant unveiled at Google I/O 2024.

During a live demonstration, Astra used multiple tools – such as Google Lens – to identify objects in real time and respond to voice queries.

In my own experience at Google I/O, the AI assistant:

  • Accurately classified animal figurines.
  • Distinguished between similar names (“Bob” vs. “Rob”).
  • Even created a story about the figures.

While some of these advanced features haven’t been integrated into Google Search yet, multimodal search through Google Lens and voice search is already shaping how users submit queries. 

As Google develops these capabilities, you should anticipate what’s next, look beyond text-based queries, and optimize for image, video, and audio search.

Dig deeper: From search to AI agents: The future of digital experiences

Commercial queries can still draw users to websites

AI-generated results have reduced clicks for informational queries, but commercial and transactional searches still offer opportunities for website traffic.

During the decision-making process, potential buyers research extensively – comparing products, reading reviews, and exploring multiple channels before making a purchase.

While it’s unclear how AI-generated search will impact this journey, think about how AI can streamline multi-touchpoint decision-making while still driving users to your website.

When users move closer to making a purchase, user-generated content – like reviews – will still play a crucial role in conversions.

Content quality still rules

Despite AI’s growing role in search, one thing remains constant: high-quality content is essential. 

Whether users rely on traditional search engines or LLMs, visibility will still depend on the strength of the content itself.

Since both Google Search and LLMs use RAG to pull from vast datasets, ensuring these systems have access to accurate, high-quality information is critical. 

Content demonstrating E-E-A-T (experience, expertise, authoritativeness, and trustworthiness) will continue to rank higher in AI-driven search results.

Your brand will also play a bigger role in search visibility, making it essential to create valuable, well-optimized content across multiple formats.

Dig deeper: Decoding Google’s E-E-A-T: A comprehensive guide to quality assessment signals


Pagination and SEO: What you need to know in 2025


Ever wondered why some of your ecommerce products or blog posts never appear on Google? 

The way your site handles pagination could be the reason.

This article explores the complexities of pagination – what it is, whether your site needs it for SEO, and how it affects search in 2025. 

What is pagination?

Pagination is the coding and technical framework on webpages that allows content to be divided across multiple pages while remaining thematically connected to the original parent page.

When a single page contains too much content to load efficiently, pagination helps by breaking it into smaller sections.

This improves user experience and unburdens the client (i.e., web browser) from loading too much information – much of which may not even be reviewed by the user.

Examples of pagination in action

Product listings

One common example of pagination is navigating multiple pages of product results within a single product feed or category. 

Let’s look at Virgin Experience Days, a site that sells gifted experiences similar to Red Letter Days.

Take their Mother’s Day experiences page:

  • https://www.virginexperiencedays.co.uk/mothers-day-gifts

Scroll down to the “All Mother’s Day Experiences & Gift Ideas Experiences” section, and you’ll see a staggering 1,635 experiences to choose from. 

That’s a lot.

Large scale product listings

Clearly, listing all of them on a single page wouldn’t be practical. 

It would result in excessive vertical scrolling and could slow down page loading times.

Further down the page, you’ll find pagination links:

Embedded Pagination

Clicking a pagination link moves users to separate product listing pages, such as page 2:

  • https://www.virginexperiencedays.co.uk/mothers-day-gifts?page=2

In the URL, ?page=2 appears as a parameter extension, a common pagination syntax. 

Variations include ?p=2 or /page/2/, but the purpose remains the same – allowing users to browse additional pages of listings. 

Even major retailers like Amazon use similar pagination structures.

Pagination also helps search engines discover deeply nested products. 

If a site is so large that all its products can’t be listed in a single XML sitemap, pagination links provide an additional way for crawlers to access them. 

Even when XML sitemaps are in place, internal linking remains important for SEO. 

While pagination links aren’t the strongest ranking signal, they serve a foundational role in ensuring content is discoverable.
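In practice, discoverable pagination means plain anchor elements that crawlers can follow without executing JavaScript. A hypothetical category page following the ?page=N convention described above might emit:

```html
<!-- Plain <a href> pagination links; crawlers follow these without JavaScript.
     URLs are placeholders. -->
<nav aria-label="Pagination">
  <a href="https://example.com/gifts?page=1">1</a>
  <a href="https://example.com/gifts?page=2">2</a>
  <a href="https://example.com/gifts?page=3">3</a>
  <a href="https://example.com/gifts?page=2">Next</a>
</nav>
```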

Dig deeper: Internal linking for ecommerce: The ultimate guide

Blog and news feeds

Pagination isn’t limited to product listings; it’s also widely used in blog and news feeds. 

Take Search Engine Land’s SEO article archive:

  • https://searchengineland.com/library/seo

On this page, you can access a feed of all SEO-related posts on Search Engine Land. 

Blog news pagination

Scrolling down, you’ll find pagination links. 

Clicking “2” takes you to the next set of SEO articles:

  • https://searchengineland.com/library/seo/page/2

Pagination inside content

Pagination can also exist within individual pieces of content rather than at a feed level. 

For example, some news websites paginate comment sections when a single article receives thousands of comments. 

Similarly, forum threads with extensive discussions often use pagination to break up replies across multiple pages.

Consider this post from WPBeginner:

  • https://www.wpbeginner.com/beginners-guide/how-to-choose-the-best-blogging-platform/

Scroll to the bottom, and you’ll see that even the comment section uses pagination to organize user responses.

UGC Article Comments Pagination

Why is pagination important for SEO?

Pagination plays a crucial role in SEO for several reasons:

Indexing

Without pagination, search crawlers may struggle to find deeply nested content such as blog posts, news articles, products, and comments.

Crawl efficiency

Pagination increases the number of URLs on a site, which might seem counterproductive to efficient crawling.

However, most search engines recognize common pagination structures – even without rich markup.

This understanding allows them to prioritize crawling more valuable content while ignoring less important paginated pages.

Internal linking

Pagination also contributes to internal linking.

While pagination links don’t carry significant link authority, they provide structure.

Google tends to pay less attention to orphaned pages – those without inbound links – so pagination can help ensure content remains connected.

Managing content duplication

If URLs aren’t structured properly, search engines may mistakenly identify them as duplicate content.

Pagination isn’t as strong a signal for content consolidation as redirects or canonical tags.

Still, when implemented correctly, it helps search engines differentiate between paginated pages and true duplicates.

Google’s deprecation of rel=prev/next

Google previously supported rel=prev/next for declaring paginated content. 

However, in March 2019, it was revealed that Google had not used this markup for some time.

As a result, these tags are no longer necessary in a website’s code.
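For reference, the now-deprecated markup looked like this on a hypothetical page 2 of a paginated series:

```html
<!-- Deprecated rel=prev/next markup (no longer used by Google); placeholder URLs -->
<link rel="prev" href="https://example.com/category?page=1" />
<link rel="next" href="https://example.com/category?page=3" />
```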

Google likely used rel=prev/next to study common pagination structures. 

Over time, those insights were integrated into its core algorithms, making the markup redundant. 

Some SEOs believe these tags may still help with crawling, but there is little evidence to support this.

If your site doesn’t use this markup, there’s no need to worry. Google can still recognize paginated URLs. 

If your site uses it, there’s also no urgent need to remove it, as it won’t negatively impact your SEO.


Why pagination is still important in 2025: The infinite scroll debate

Alternate methods for browsing large amounts of content have emerged over the past couple of decades.

“View more” or “Load more” buttons often appear under comment streams, while infinite scroll or lazy-loaded feeds are common for posts and products. 

Some argue these features are more user-friendly. 

Originally pioneered by social networks such as Twitter (now X), this form of navigation helped boost social interactions. 

Some websites have adopted it, but why isn’t it more widespread?

From an SEO perspective, the issue is that search engine crawlers interact with webpages in a limited way. 

While headless browsers may sometimes execute JavaScript-based content during a page load, search crawlers typically don’t “scroll down” to trigger new content. 

A search engine bot certainly won’t scroll indefinitely to load everything. 

As a result, websites relying solely on infinite scroll or lazy loading risk orphaning articles, products, and comments over time.

For major news brands with strong SEO authority and extensive XML sitemaps, this may not be a concern. 

The trade-off between SEO and user experience may be acceptable. 

But for most websites, implementing these technologies is likely a bad idea. 

Search crawlers may not spend time scrolling through content feeds, but they will click hyperlinks – including pagination links.
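One middle ground, if you want a “Load more” experience without orphaning content, is progressive enhancement: render a real pagination link and let JavaScript upgrade it to in-place loading. A sketch, where the URL and element IDs are placeholders:

```html
<!-- A real link crawlers can follow; JavaScript upgrades it for users -->
<a id="load-more" href="/blog/page/2/">Load more posts</a>
<script>
  document.getElementById('load-more').addEventListener('click', function (e) {
    e.preventDefault(); // users get in-place loading ...
    fetch(this.href)    // ... while bots still see the plain href
      .then((response) => response.text())
      .then((html) => { /* append the next page's posts to the feed here */ });
  });
</script>
```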

How JavaScript can interfere with pagination

Even if your site doesn’t use infinite scroll plugins, JavaScript can still interfere with pagination. 

Since July 2024, Google has at least attempted to render JavaScript for all visited pages. 

However, details on this remain vague. 

  • Does Google render all pages, including JavaScript, at the time of the crawl? 
  • Or is execution deferred to a separate processing queue? 
  • How does this affect Google’s ranking algorithms? 
  • Does Google make initial determinations before executing JavaScript weeks later?

There are no definitive answers to these questions.

What we do know is that “dynamic rendering is on the decline,” according to the 2024 Web Almanac SEO Chapter.

If Google’s effort to execute JavaScript for all crawled pages is progressing well – which seems unlikely given the potential efficiency drawbacks – why are so many sites reverting to a non-dynamic state? 

This doesn’t mean JavaScript use is disappearing. 

Instead, more sites may be shifting to server-side or edge-side rendering.

If your site uses traditional pagination but JavaScript interferes with pagination links, it can still lead to crawling issues.

For example, your site might use traditional pagination links, but the main content of your page is lazy-loaded.

As a result, the pagination links only appear once a user (or bot) scrolls the page. 

Dig deeper: A guide to diagnosing common JavaScript SEO issues

How to handle indexing and canonical tags for paginated URLs

SEO professionals often recommend using canonical tags to point paginated URLs to their parent pages, marking them as non-canonical. 

This practice was especially common before Google introduced rel=prev/next.

Since Google deprecated rel=prev/next, many SEOs remain uncertain about the best way to handle pagination URLs.

Avoid blocking paginated content via robots.txt or with canonical tags.

Doing so prevents Google from crawling or indexing those pages. 

In the case of news posts, certain comment exchanges might be considered valuable by Google, potentially connecting a paginated version of an article with keywords that wouldn’t otherwise be associated with it. 

This can generate free traffic – something worth keeping in 2025.

Similarly, restricting the crawling and indexing of paginated product feeds could leave some products effectively soft-orphaned.

In SEO, there’s a tendency to chase perfection and aim for complete crawl control. 

But being overly aggressive here can do more harm than good, so tread carefully.

There are cases where it makes sense to de-canonicalize or limit the crawling of paginated URLs. 

Before taking that step, make sure you have data showing that crawl-efficiency issues outweigh the potential free traffic gains. 

If you don’t have that data, don’t block the URLs. Simple!
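In line with that advice, a common low-risk setup is a self-referencing canonical on each paginated URL, rather than pointing every page back to page 1. For a hypothetical page 2:

```html
<!-- Self-referencing canonical keeps page 2 crawlable and indexable;
     pointing it at page 1 instead would ask Google to consolidate its content away -->
<link rel="canonical" href="https://example.com/category?page=2" />
```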
