
Organic Traffic: What It Is & How to Increase It (6 Ways)

Growing your website starts with reaching the right people.

Not random traffic.

People actively searching for the solutions you provide.

That’s where organic traffic shines.

Unlike paid advertising or social media, organic traffic attracts visitors with clear intent.

At Backlinko, organic traffic brings us over 571K monthly visitors—each one specifically interested in SEO and digital marketing.

GA – Backlinko – Users

In this guide, you’ll learn all about organic traffic, including how to measure and increase it.

Let’s start with what organic traffic is and how it differs from other traffic sources.

What Is Organic Traffic?

Organic traffic refers to visitors who land on your website or blog from unpaid search engine results.

Think of it as people finding your content naturally when they search for information, products, or services.

These unpaid clicks are organic traffic.

For example, if someone searches “seo competitor analysis” on Google and clicks on a regular (non-ad) result, that counts as organic traffic.

Google SERP – SEO competitor analysis

The key word here is “unpaid.”

While you might see “sponsored” or “ad” results at the top of search results, clicks on these aren’t organic traffic—they’re paid traffic.

Google SERP – CRM software – Sponsored

You earn organic traffic by creating high-quality content that matches what people are searching for (search intent) and optimizing it for search engines (SEO).

Understanding Organic vs. Other Traffic Sources

Search engines aren’t the only way people find websites.

Visitors might come from social media, email newsletters, or by typing your URL directly.

Here’s how different traffic sources compare:

  • Organic: Users find you through unpaid search results. Best for building long-term authority and consistent traffic.
  • Paid search: Users click your ads in search results. Best for quick traffic for specific campaigns.
  • Direct: Users type your URL or use bookmarks. Best for returning visitors and brand awareness.
  • Social: Users find you through social media. Best for brand awareness and community building.
  • Email: Users click links in your emails. Best for nurturing leads and customer retention.
  • Referral: Users click links from other websites. Best for building authority and partnerships.

Why Organic Traffic Matters

Organic traffic isn’t just about reaching your target audience.

It’s about building assets that continue delivering value long after you create them.

Here’s why investing in organic search makes sense for businesses of all sizes.

Cost-Effective

Organic traffic is one of the most affordable ways to attract qualified visitors to your website.

This is especially true when you compare it to paid advertising, which costs anywhere from $0.11 to $0.50 per click, according to a WebFX survey.

Now, let’s consider a leading home improvement site, The Spruce, as an example.

The Spruce – Homepage

They attract 9 million organic visitors monthly.

This traffic would cost them an estimated $7.1 million if they paid for it through Google Ads.

Organic Research – The Spruce – Overview

Pretty impressive, right?

As you can see, organic traffic can deliver incredible ROI compared to the ongoing costs of paid advertising.

Myth vs. fact: While organic traffic doesn’t require paying per click, it’s not exactly “free” either. At a minimum, you’ll be investing time into content creation and SEO. Many businesses also hire writers and editors to scale content production.


Builds Authority

Consistently ranking for search terms helps establish your site as an industry leader.

NerdWallet demonstrates this perfectly.

With 13.2 million monthly organic visitors and 5.2 million backlinks, they’ve become the go-to source for financial advice.

Domain Overview – NerdWallet – Overview

Their approach?

Creating comprehensive content that displays E-E-A-T (experience, expertise, authoritativeness, and trustworthiness) signals:

  • Author credentials and expertise
  • Regular content updates with accurate information
  • Citations to credible sources
  • Real-world experience and testing
  • Clear website policies and contact information

NerdWallet – E-E-A-T content

When they rank #1 on Google for terms like “best high yield savings account” or “how to invest in stocks,” it reinforces their position as a trusted voice in personal finance.

Google SERP – How to invest in stocks

And helps them build website authority and confidence with every click.

Drives Targeted Traffic

When someone finds your site through organic search, they’re actively looking for what you offer.

Think about someone searching for “how to optimize a blog post.”

They’re not casually browsing—they’re sitting at their computer, working on content, and need guidance right now.

Google SERP – How to optimize a blog post

If your site ranks highly for this search, you’re reaching them at the perfect moment:

When they’re most likely to read your advice, implement your tips, or purchase a solution that helps them succeed.

That’s the power of organic traffic.

By consistently appearing in these high-intent searches, you connect with people precisely when your expertise matters most.

Generates Long-Term Leads

While paid campaigns stop delivering the moment you pause them, organic traffic compounds over time.

The key is creating content that matches what your audience is searching for at every funnel stage.

Including when they’re researching a problem, comparing solutions, or ready to make a purchase.

This builds a sustainable pipeline of qualified leads that continues growing long after you publish the content.

The best part? You don’t have to depend on daily ad spend.

The Value of Paid vs. Organic Traffic 

Important: Consider organic traffic an investment rather than a quick win. While your exact timeline will vary based on industry, competition, and content strategy, it can take four to six months (or more) before you start seeing significant organic traffic growth.


How to Check Organic Traffic

Tracking your organic traffic reveals which content drives visitors and where your growth opportunities lie.

It also proves your SEO ROI.

These three tools make it easy.

Organic Research

Semrush’s Organic Research tool goes beyond basic traffic metrics to show you the full picture of your organic performance.

Note: A free Semrush account gives you 10 searches in Organic Research per day. Or you can use this link to access a 14-day trial on a Semrush Pro subscription.


Start by entering your domain and clicking “Search.”

For this example, I used Sweet Peas and Saffron, a meal prep site.

Organic Research – Sweetpeasandsaffron – Search

Right away, you’ll see interesting data on your site’s organic performance.

For example, I learned that Sweet Peas and Saffron has:

  • 88.8K ranking keywords
  • 110.2K monthly organic traffic
  • $33.8K in traffic value

Organic Research – Sweetpeasandsaffron – Overview

Click the “Positions” tab to discover every keyword you rank for.

Filter by “Positions,” “Volume,” “Intent,” “SERP Features,” and more for deeper insights into your search performance.

Organic Research – Sweetpeasandsaffron – Positions – Filters

I filtered Sweet Peas and Saffron’s results by “Position #1” and learned they have 791 organic keywords in the first spot on Google.

Terms like “air fryer chickpeas” and “asian salad dressing” drive consistent organic traffic to this site every single month.

Organic Research – Sweetpeasandsaffron – Organic – Position filters

The “Position Changes” report shows where you’re gaining or losing ground in search results.

This helps you spot trends and react quickly to ranking drops.

Organic Research – Sweetpeasandsaffron – Position Changes Trend

For example, Sweet Peas and Saffron’s post, “Easy Homemade Fajita Seasoning,” has recently seen a 2.6K decrease in traffic.

Organic Research – Sweetpeasandsaffron – Position Changes – Top Page Changes

When you see drops like this, you can:

  • Check if your content needs updating
  • See if competitors have published better content
  • Look for technical issues affecting the page
  • Review if the search intent has changed

The sooner you identify these issues, the faster you can fix them and recover your rankings.

Google Analytics

Google Analytics 4 (GA4) shows you where your organic traffic comes from and how visitors behave on your site.

Here’s how to check your organic traffic on this platform:

Click “Reports” in the left-hand menu.

GA4 – Reports

Click “Life cycle” > “Acquisition” > “Traffic acquisition.”

GA4 – Reports – Traffic acquisition

Now, you’ll see an overview of your site’s performance, including organic search.

GA4 – Traffic acquisition report

For a breakdown of which search engines drive the most organic traffic to your site, scroll down to the table in the traffic acquisition report.

Next, click the plus sign to add a secondary dimension.

GA4 – Traffic acquisition report – Plus sign

Select “Session source.”

GA4 – Session source

Now, you’ll see an organic traffic breakdown by search engine.

GA4 – Organic Search – Session source

Pro tip: Want to see which individual pages get the most organic traffic? Go to “Engagement” > “Pages and Screens” and add a secondary dimension of “Session source / medium.”


Google Search Console

While GA4 tracks all search engines, Google Search Console (GSC) focuses solely on Google traffic—giving you detailed data about your Google search performance.

Start by opening your GSC account and clicking “Performance” > “Search results” in the left sidebar.

Google Search Console – Performance – Search Results

Scroll to see the top queries and pages that attract organic traffic to your site.

You’ll learn how many clicks and impressions each one gets.

And each term’s position on the search engine results pages (SERPs).

GSC – Performance – Pages

How to Increase Organic Traffic in 6 Steps

There are dozens of ways to improve organic traffic.

But these six high-impact tactics consistently deliver the best results.

Organic Traffic Growth Checklist

1. Fix Technical Issues

Technical problems can limit your organic traffic growth.

This is why it’s important to identify and fix them.

First things first:

Make sure Google is indexing your content.

Go to Google Search Console and click “Indexing” > “Pages.”

If you see lots of non-indexed pages, don’t panic just yet. This number will vary for every site.

And preventing certain pages from being indexed can actually be a good thing.

What matters is that your most important pages are properly indexed.

GSC – Backlinko – Page indexing

View the “Why pages aren’t indexed” report to check for issues.

This helps you distinguish between:

  • Intentionally excluded pages, such as duplicate content, form submissions, paginated pages, or anything else you don’t want indexed
  • Important pages that should be indexed but aren’t due to redirect errors, accidental noindex tags, and other issues

GSC – Why pages aren't indexed
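
For instance, an accidental noindex tag is one of the most common and easiest-to-fix culprits. Here’s a minimal sketch of what to look for in a page’s HTML (the scenario is hypothetical; the tag values themselves are standard):

```html
<!-- A leftover noindex directive like this keeps the page out of Google's index -->
<meta name="robots" content="noindex">

<!-- To make the page indexable again, remove the tag entirely or use: -->
<meta name="robots" content="index, follow">
```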

Click any error to get details and request immediate indexing if needed.

This can be a fast and easy way to recover lost organic traffic.

GSC – URL is not on Google

Next, use Google’s PageSpeed Insights to check your site’s loading speed and Core Web Vitals—Google’s key metrics for measuring user experience.

Slow, poorly performing pages drive visitors away and hurt your rankings.

PageSpeed Insights – Backlinko – Mobile

The tool will tell you if your site passes or fails the assessment.

It’ll also tell you what went wrong and how to fix it—like resizing images and reducing JavaScript execution time.

PageSpeed Insights – Reducing JavaScript execution time

For a complete technical health checkup, use Semrush’s Site Audit to scan your entire site.

It uncovers hidden issues that could be limiting your organic growth:

  • Crawlability problems
  • Internal linking errors
  • Performance issues
  • Site structure problems

Site Audit – Backlinko – Overview

Review the list of “Errors” first—these are critical issues that could have the biggest impact on your organic traffic potential.

Like duplicate content, broken internal links, and pages returning error status codes.

Site Audit – Backlinko – Errors

2. Select Keywords with Low Difficulty for Your Site

Want to know the secret to ranking faster?

Don’t go straight for keywords with the highest monthly search volume.

Start with lower-difficulty terms that are easier to rank for.

Here’s how to do it with Semrush’s Keyword Magic Tool:

Enter a broad term like “LinkedIn marketing” and click “Search.”

Keyword Magic Tool – LinkedIn marketing – Search

Note: A free Semrush account gives you 10 searches in the Keyword Magic Tool per day. Or you can use this link to access a 14-day trial on a Semrush Pro subscription.


The tool will return a long list of keywords and metrics for each one.

Including:

  • Search volume: How many times per month a keyword is searched
  • Keyword difficulty (KD): This is a score from zero to 100, showing how hard it’ll be to rank in Google’s top 10 for the given term
  • Intent: The reason behind a user’s search: they want information (informational), they want to compare options (commercial), they’re trying to find a specific site (navigational), or they want to buy something (transactional)

Keyword Magic Tool – LinkedIn marketing – Keywords

“LinkedIn marketing” gets 4,400 searches per month but has a daunting keyword difficulty of 95.

Let’s filter the results so we can find terms with lower difficulty.

Click “KD” on the menu and enter “0” and “49.”

Now, you’ll see only terms that are “easy” or “possible” to rank for.

Keyword Magic Tool – LinkedIn marketing – KD filter

As you review the list, look for terms that have:

  • Search intent aligned with your content type (how-to, product reviews, etc.)
  • Realistic competition levels for your site
  • Enough monthly searches to be worth targeting

Pro tip: Enter your domain into the AI-powered search bar in the Keyword Magic Tool. You’ll get a personalized difficulty score that shows which keywords are actually within reach for your site.

Keyword Magic Tool – LinkedIn marketing – AI powered – Backlinko


3. Refresh Existing Content

Want faster results?

Update existing content instead of only creating new posts.

Since Google already knows these pages exist, you’ll be more likely to see improvements quickly.

Your first step is to identify underperforming content that could benefit from a refresh.

Head to Google Search Console and click “Performance” > “Search results.”

Filter for pages with a low click-through rate (CTR) (under 3%).

Click the filter menu icon in the right-hand corner and select “CTR.”

Performance – Search results – CTR – Filter by clicks

Select “Smaller than” and enter “3.”

Click “Done.”

GSC – Filtered by CTR & smaller than

Now, you’ll only see pages on your site with a 3% (or less) CTR.

Take note of the pages that have decent impressions.

Keep in mind that what’s decent for you will depend on your site, but think 500+.

These pages should have the highest potential to gain ground quickly because Google is showing them to users already.

GSC – Pages – Filtered results

Next, create a tracking spreadsheet for this project.

This can be a simple Google or Excel spreadsheet.

Google spreadsheet – Content Refresh Project

Then, search for your target keyword and analyze the top five ranking pages.

Look specifically for:

  • Search intent: Does their content better match what searchers want?
  • Visuals: Do they use more helpful images, tables, infographics, or pro tips?
  • Content depth: What topics, examples, or case studies do they cover that you don’t?
  • Updated information: Are they citing newer stats or trends?
  • Metadata: Are their title tags and meta descriptions more compelling?
  • Structure: Is their content easier to scan and understand, with descriptive subheads?
  • Writer expertise: Do they show more authority on the topic?

Make targeted improvements based on your findings.

Some pages might need a complete rewrite, while others may only need updated statistics or better examples.

Track your results in Google Search Console and Google Analytics.

Look for increases in clicks, impressions, CTR, and average position.

4. Target Evergreen Topics

Unlike news articles or trend pieces that quickly fade, evergreen topics maintain consistent organic search volume because they solve timeless problems.

Think about searches like “how to tie a tie” or “how to cook quinoa”—these questions never go out of style.

Case in point:

The search term “how to cook quinoa” gets 33.1K monthly searches.

Keyword Overview – How to cook quinoa

If you search Google for this term, you’ll see countless articles.

But one of the top-ranking articles is from a recipe site called Love & Lemons.

Google SERP – How to cook quinoa

Love & Lemons gets 149K organic visitors every month from this single article.

So, how’d they turn one evergreen topic into a massive organic traffic source?

Organic Research – Love and Lemons – Traffic

They chose a topic they knew would interest their target audience: cooking quinoa.

And created a guide that included:

  • Clear explanations of quinoa basics and nutrition facts
  • Detailed, tested cooking instructions
  • Expert tips from real kitchen experience
  • Multiple cooking methods for different preferences
  • Strategic internal links to related quinoa recipes

This comprehensive approach works because it answers every question a reader might have about cooking quinoa.

What is quinoa

How can you do this, too?

The key is choosing topics that solve ongoing problems and answering fundamental questions.

(Decent search volume helps, too.)

Find keywords like this for your site by using tools like Exploding Topics, Google Trends, and Google autocomplete searches.

Google Suggest – How to cook

5. Add Internal Links

Internal linking helps Google find, index, and understand your content.

Which is why it’s helpful for increasing organic traffic.

But it can also keep visitors around for longer. Meaning you retain more of the traffic you worked so hard to get.

Start by looking for pages on your site with no internal links (also called “orphan pages”).

Use a tool that automates the process of finding orphan pages, like Site Audit or Rank Math.

Site Audit – Backlinko – Issues – Orphan pages

But don’t add just any links.

Your internal links need to:

  • Match the context of the surrounding content
  • Use descriptive anchor text that explains the destination
  • Help users find related information they want

For example, instead of “click here,” use specific anchor text like “AI writing tools” for an AI writing tools comparison page.

Or “keyword research guide” for a keyword tutorial.

This helps Google and visitors understand what they’ll find on the linked page.

Backlinko – ChatGPT Alternatives – Linked page
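
In HTML, the difference looks like this (the URL is a placeholder for illustration):

```html
<!-- Vague anchor text tells Google and readers nothing about the destination -->
Compare tools <a href="/chatgpt-alternatives/">here</a>.

<!-- Descriptive anchor text signals exactly what the linked page covers -->
Compare the best <a href="/chatgpt-alternatives/">AI writing tools</a> for your workflow.
```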

Pro tip: Every time you publish a new post, spend five to 10 minutes adding relevant internal links from your existing content. This can help Google discover and rank your new content faster.


Bonus Step: Optimize Your Link Flow

  • Prioritize link equity by linking from high-authority pages to newer or lower-performing ones
  • Use topic clusters to organize related content into silos that strengthen overall relevance
  • Ensure crawl efficiency by avoiding excessive links on a single page

6. Build a Backlink Strategy

Backlinks remain one of Google’s strongest ranking signals.

The more relevant sites that link to you, the more organic traffic you can attract.

But how do you earn these valuable links?

Start by creating content that naturally attracts them.

The most linkable content types we’ve found include:

  • Original research and industry studies
  • Comprehensive how-to guides that fill knowledge gaps
  • Free tools and templates
  • Expert roundups with unique insights
  • Attractive visuals (like infographics)

For example, we created a detailed analysis of Google’s ranking factors that has attracted 33.7K backlinks to date.

Backlink Analytics – Backlinko – Ranking Factors – Backlinks

Why did it work so well?

Because it’s a comprehensive resource with over 200 ranking factors.

And we constantly update it to ensure it features the freshest information and studies.

Backlinko – Google Ranking Factors

But creating linkable assets isn’t your only option for boosting backlinks and organic traffic.

Build genuine industry relationships by joining relevant Slack, Facebook, and Reddit communities and participating in discussions.

Focus on adding value first—answer questions, share insights, and build real connections.

Only include links to your site when relevant and helpful (and if allowed by the community).

Reddit – r/NativePlantGardening

Want to accelerate your link building?

Study what’s already working in your industry.

Use Semrush’s Backlink Analytics to monitor your competitors’ backlinks.

Backlink Analytics – Aznps – Overview

When studying competitor backlinks, look for patterns.

Do certain topics or content formats consistently earn more links?

Use these insights to create similar (but better) resources.

And reach out to those same sites to start building your backlink profile.

Future-Proof Your Traffic Strategy

You now have a proven system for growing organic traffic.

But this is just one part of a successful traffic strategy.

Relying solely on organic traffic is risky, even for the most experienced SEO or marketer.

Here’s why:

  • Algorithm updates can instantly impact your rankings
  • Core updates happen multiple times per year
  • Industry shifts can change how people search
  • New competitors can push you down in search results

The smarter approach?

Building a diversified traffic strategy that includes everything from organic and social to email marketing.

Get started by reading 28 Ways to Increase Traffic to Your Website, featuring proven multi-channel strategies.



What Is JavaScript SEO? 6 Best Practices to Boost Rankings

JavaScript has enabled highly interactive and dynamic websites. But it also presents a challenge: ensuring your site is crawlable, indexable, and fast.

That’s why JavaScript SEO is essential.

When applied correctly, these strategies can significantly boost organic search performance.

For instance, book retailer Follett saw a remarkable recovery after fixing JavaScript issues:

JavaScript SEO Improvements

That’s the impact of effective JavaScript SEO.

In this guide, you’ll:

  • Get an introduction to JavaScript SEO
  • Understand the challenges with using JavaScript for search
  • Learn best practices to optimize your JavaScript site for organic search

What Is JavaScript SEO?

JavaScript SEO is the process of optimizing JavaScript websites. It ensures search engines can crawl, render, and index them.

Aligning JavaScript websites with SEO best practices can boost organic search rankings. All without hurting the user experience.

However, uncertainty still surrounds JavaScript’s impact on SEO.

Common JavaScript Misconceptions

  • Misconception: Google can handle all JavaScript perfectly. Reality: JavaScript is rendered in two phases, so delays and errors can occur. These issues can stop Google from crawling, rendering, and indexing content, hurting rankings.
  • Misconception: JavaScript is only for large sites. Reality: JavaScript is versatile and benefits websites of varying sizes. Smaller sites can use JavaScript in interactive forms, content accordions, and navigation dropdowns.
  • Misconception: JavaScript SEO is optional. Reality: JavaScript SEO is key for finding and indexing content, especially on JavaScript-heavy sites.

Benefits of JavaScript SEO

Optimizing JavaScript for SEO can offer several advantages:

  • Improved visibility: Crawled and indexed JavaScript content can boost search rankings
  • Enhanced performance: Techniques like code splitting deliver only the important JavaScript code. This speeds up the site and reduces load times.
  • Stronger collaboration: JavaScript SEO encourages SEOs, developers, and web teams to work together. This helps improve communication and alignment on your SEO project plan.
  • Enhanced user experience: JavaScript boosts UX with smooth transitions and interactivity. It also makes navigation between webpages faster and more dynamic.

Side note: JavaScript can impact PageSpeed and Core Web Vitals scores.


How Search Engines Render JavaScript

To understand JavaScript’s SEO impact, let’s explore how search engines process JavaScript pages.

Google has outlined that it processes JavaScript websites in three phases:

  1. Crawling
  2. Rendering
  3. Indexing

Googlebot – Crawl Render Index

Crawling

When Google finds a URL, it checks the robots.txt file and meta robots tags. This is to see if any content is blocked from being crawled or rendered.

If a link is discoverable by Google, the URL is added to a queue for simultaneous crawling and rendering.

Rendering

For traditional HTML websites, content is immediately available from the server response.

In JavaScript websites, Google must execute JavaScript to render and index the content. Due to resource demands, rendering is deferred until resources are available, at which point a headless Chromium instance executes the scripts.

Indexing

Once rendered, Googlebot reads the HTML, adds new links to the crawl list, and indexes the content.

How JavaScript Affects SEO

Despite its growing popularity, the question often arises: Is JavaScript bad for SEO?

Let’s examine aspects that can severely impact SEO if you don’t optimize JavaScript for search.

Rendering Delays

For Single Page Applications (SPAs) — like Gmail or Twitter, where content updates without page refreshes — JavaScript controls the content and user experience.

If Googlebot can’t execute the JavaScript, it may show a blank page.

This happens when Google struggles to process the JavaScript. It hurts the page’s visibility and organic performance.

To test how Google will see your SPA site if it can’t execute JavaScript, use the web crawler Screaming Frog. Configure the render settings to “Text Only” and crawl your site.

Note: You’ll need an SEO Spider Licence to access this setting.

Screaming Frog – Crawl Config – Rendering


Expert tip: Use Screaming Frog’s “Disable JavaScript” feature. It simulates how search engines crawl your site without executing scripts. This allows you to identify missing content or rendering issues.


Indexing Issues

JavaScript frameworks (like React or Angular, which help build interactive websites) can make it harder for Google to read and index content.

For example, Follett’s online bookstore migrated millions of pages to a JavaScript framework.

Google had trouble processing the JavaScript, causing a sharp decline in organic performance:

Impact from Rendering Issues

Crawl Budget Challenges

Websites have a crawl budget. This refers to the number of pages Googlebot can crawl and index within a given timeframe.

Large JavaScript files consume significant crawling resources. They also limit Google’s ability to explore deeper pages on the site.

Core Web Vitals Concerns

JavaScript can affect how quickly the main content of a web page is loaded. This affects Largest Contentful Paint (LCP), a Core Web Vitals score.

For example, check out this performance timeline:

LCP Breakdown – Render Delay

Section #4 (“Element Render Delay”) shows a JavaScript-induced delay in rendering an element.

This negatively impacts the LCP score.

JavaScript Rendering Options

When rendering webpages, you can choose from three options:

Server-Side Rendering (SSR), Client-Side Rendering (CSR), or Dynamic Rendering.

Let’s break down the key differences between them.

Server-Side Rendering (SSR)

SSR creates the full HTML on the server. It then sends this HTML directly to the client, like a browser or Googlebot.

Server Side Rendering Process

This approach means the client doesn’t need to render the content.

As a result, the website loads faster and offers a smoother experience.

  • Benefits of SSR: Improved performance, search engine optimization, enhanced accessibility, and a consistent experience
  • Drawbacks of SSR: Higher server load, longer time to interactivity, complex implementation, and limited caching

Client-Side Rendering (CSR)

In CSR, the client—like a user, browser, or Googlebot—receives a blank HTML page. Then, JavaScript runs to generate the fully rendered HTML.

Client Side Rendering Process
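
As a rough sketch, the initial HTML a CSR site sends often looks like little more than this (file names are illustrative):

```html
<!DOCTYPE html>
<html>
  <body>
    <!-- The server sends an almost-empty shell... -->
    <div id="root"></div>
    <!-- ...and this script builds the actual page content in the browser -->
    <script src="/static/js/bundle.js"></script>
  </body>
</html>
```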

Google can render client-side, JavaScript-driven pages. But it may delay rendering and indexing.

  • Benefits of CSR: Reduced server load, enhanced interactivity, improved scalability, and faster page transitions
  • Drawbacks of CSR: Slower initial load times, SEO challenges, increased complexity, and performance variability

Dynamic Rendering

Dynamic rendering, or prerendering, is a hybrid approach.

Tools like Prerender.io detect Googlebot and other crawlers. They then send a fully rendered webpage from a cache.

Dynamic Rendering Process

This way, search engines don’t need to run JavaScript.

At the same time, regular users still get a CSR experience. JavaScript is executed and content is rendered on the client side.

Google says dynamic rendering isn’t cloaking. The content shown to Googlebot just needs to be the same as what users see.

However, it warns that dynamic rendering is a temporary solution. This is due to its complexity and resource needs.

  • Benefits of dynamic rendering: Better SEO, crawler compatibility, optimized UX, and scalability for large sites
  • Drawbacks of dynamic rendering: Complex setup, risk of cloaking, tool dependency, and performance latency

Which Rendering Approach Is Right for You?

The right rendering approach depends on several factors.

Here are key considerations to help you determine the best solution for your website:

Server-Side Rendering (SSR)

  • Best for: SEO-critical sites (e.g., ecommerce, blogs) and sites relying on organic traffic
  • When to choose: You need faster Core Web Vitals (e.g., LCP) and timely indexing and visibility, and users expect fast, fully-rendered pages upon load
  • Requirements: Strong server infrastructure to handle the higher load, plus expertise in SSR frameworks (e.g., Next.js, Nuxt.js)

Client-Side Rendering (CSR)

  • Best for: Highly dynamic user interfaces (e.g., dashboards, web apps) and content not dependent on organic traffic (e.g., behind a login)
  • When to choose: SEO is not a top priority, and the focus is on reducing server load and scaling for large audiences
  • Requirements: JavaScript optimization to address performance issues and fallback content to ensure crawlability

Dynamic Rendering

  • Best for: JavaScript-heavy sites needing search engine access and large-scale, dynamic content websites
  • When to choose: SSR is too resource-intensive for the entire site, and you need to balance bot crawling with user-focused interactivity
  • Requirements: A pre-rendering tool like Prerender.io, bot detection and routing configuration, and regular audits to avoid cloaking risks

Knowing these technical solutions is important. But the best approach depends on how your website uses JavaScript.

Where does your site fit?

  • Minimal JavaScript: Most content is in the HTML (e.g., WordPress sites). Just make sure search engines can see key text and links.
  • Moderate JavaScript: Some elements load dynamically, like live chat, AJAX-based widgets, or interactive product filters. Use fallbacks or dynamic rendering to keep content crawlable.
  • Heavy JavaScript: Your site depends on JavaScript to load most content, like SPAs built with React or Vue. To make sure Google can see it, you may need SSR or pre-rendering.
  • Fully JavaScript-rendered: Everything from content to navigation relies on JavaScript (e.g., Next.js, Gatsby). You’ll need SSR or Static Site Generation (SSG), optimized hydration, and proper metadata handling to stay SEO-friendly.

The more JavaScript your site relies on, the more important it is to optimize for SEO.

JavaScript SEO Best Practices

So, your site looks great to users—but what about Google?

If search engines can’t properly crawl or render your JavaScript, your rankings could take a hit.

The good news? You can fix it.

Here’s how to make sure your JavaScript-powered site is fully optimized for search.

1. Ensure Crawlability

Avoid blocking JavaScript files in the robots.txt file to ensure Google can crawl them.

In the past, HTML-based websites often blocked JavaScript and CSS.

Now, crawling JavaScript files is crucial for accessing and rendering key content.
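
A quick gut check is your robots.txt file. Here’s a sketch of the pattern to avoid and the one to aim for (the directory paths are hypothetical; adapt them to your site’s structure):

```
# Problematic: blocking script directories stops Google from rendering your pages
# User-agent: Googlebot
# Disallow: /assets/js/

# Safer: let crawlers fetch the JavaScript and CSS needed to render content
User-agent: *
Allow: /assets/js/
Allow: /assets/css/
```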

2. Choose the Optimal Rendering Method

It’s crucial to choose the right approach based on your site’s needs.

This decision may depend on your resources, user goals, and vision for your website. Remember:

  • Server-side rendering: Ensures content is fully rendered and indexable upon page load. This improves visibility and user experience.
  • Client-side rendering: Renders content on the client side, offering better interactivity for users
  • Dynamic rendering: Sends crawlers pre-rendered HTML and users a CSR experience

Rendering Options

3. Reduce JavaScript Resources

Reduce JavaScript size by removing unused or unnecessary code. Even unused code must be accessed and processed by Google.

Combine multiple JavaScript files to reduce the resources Googlebot needs to execute. This helps improve efficiency.
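
For example, code splitting with a dynamic import() loads non-critical code only when it’s needed, so neither users nor Googlebot have to process it up front. A minimal sketch (the element ID and file path are hypothetical):

```html
<script type="module">
  // Load the chat widget's code only when someone actually opens it
  document.querySelector('#chat-button').addEventListener('click', async () => {
    const { openChat } = await import('/js/chat-widget.js');
    openChat();
  });
</script>
```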

Pro tip: Use PageSpeed Insights to find JavaScript issues, like render-blocking scripts. Follow its suggestions, such as deferring non-critical scripts or minifying code.


4. Defer Scripts Blocking Content

You can defer render-blocking JavaScript to speed up page loading.

Use the “defer” attribute to do this, as shown below:

<script src="your-script.js" defer></script>

This tells browsers and search engines to download the script in parallel and run it only after the main HTML document has been parsed.

5. Manage JavaScript-Generated Content

Managing JavaScript content is key. It must be accessible to search engines and provide a smooth user experience.

Here are some best practices to optimize it for SEO:

Provide Fallback Content

  • Use the <noscript> tag to show essential info if JavaScript fails or is disabled
  • Ensure critical content like navigation and headings is included in the initial HTML

For example, Yahoo uses a <noscript> tag. It shows static product details for JavaScript-heavy pages.

Yahoo – Noscript tag
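
Here’s a minimal sketch of that pattern (the product details and file names are made up for illustration):

```html
<!-- The interactive version, built by JavaScript -->
<div id="product-details"></div>
<script src="/js/product-details.js" defer></script>

<!-- The fallback shown when JavaScript fails or is disabled -->
<noscript>
  <h2>Everyday Running Shoe</h2>
  <p>Lightweight trainer with a cushioned sole. $89.99.</p>
</noscript>
```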

Optimize JavaScript-Based Pagination

  • Use HTML <a> tags for pagination to ensure Googlebot can crawl each page
  • Dynamically update URLs with the History API for “Load More” buttons
  • Add rel=”prev” and rel=”next” to indicate paginated page relationships

For instance, Skechers employs a “Load More” button that generates accessible URLs:

Skechers – Load More
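
Here’s a sketch of how those pieces fit together (the URL pattern and class name are hypothetical):

```html
<!-- A real link Googlebot can crawl, even if JavaScript never runs -->
<a href="/products?page=2" class="load-more">Load More</a>

<script>
  document.querySelector('.load-more').addEventListener('click', (event) => {
    event.preventDefault();
    const nextUrl = event.currentTarget.href;
    // Fetch and append the next page of products here...
    // ...then update the address bar so this state has its own crawlable URL
    history.pushState({}, '', nextUrl);
  });
</script>
```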

Test and Verify Rendering

  • Use Google Search Console’s (GSC) URL Inspection Tool and Screaming Frog to check JavaScript content. Is it accessible?
  • Test JavaScript execution using browser automation tools like Puppeteer to ensure proper rendering (see the sketch below)
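
Here’s a minimal Puppeteer sketch for that kind of check: it loads a page, waits for network activity to settle, and confirms that expected content appears in the rendered HTML (the URL and headline are placeholders):

```javascript
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Wait until network activity settles so JavaScript has a chance to run
  await page.goto('https://example.com/some-page', { waitUntil: 'networkidle0' });

  // page.content() returns the fully rendered DOM, after script execution
  const renderedHtml = await page.content();
  console.log(
    renderedHtml.includes('Expected headline')
      ? 'Rendered content found'
      : 'Rendered content missing'
  );

  await browser.close();
})();
```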

Confirm Dynamic Content Loads Correctly

  • Use loading=”lazy” for lazy-loaded elements and verify they appear in rendered HTML
  • Provide fallback content for dynamically loaded elements to ensure visibility to crawlers

For example, Backlinko lazy loads images within HTML:

Image loading – Lazy
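
In practice, that can be as simple as this (the image path and dimensions are illustrative):

```html
<!-- Below-the-fold image: deferred with loading="lazy", with explicit
     dimensions so the page doesn't shift as it loads -->
<img src="/images/guide-screenshot.png" alt="Keyword research walkthrough"
     width="800" height="450" loading="lazy">
```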

6. Create Developer-Friendly Processes

Working closely with developers is key to integrating JavaScript and SEO best practices.

Here’s how you can streamline the process:

  1. Spot the issues: Use tools like Screaming Frog or Chrome DevTools. They can find JavaScript rendering issues. Document these early.
  2. Write actionable tickets: Write clear SEO dev tickets with the issue, its SEO impact, and step-by-step instructions to fix it. For example, here’s a sample dev ticket:

    Sample – Dev Ticket

  3. Test and validate fixes: Conduct quality assurance (QA) to ensure fixes are implemented correctly. Share updates and results with your team to maintain alignment.
  4. Collaborate in real time: Use project management tools like Notion, Jira, or Trello. These help ensure smooth communication between SEOs and developers.

By building developer-friendly processes, you can solve JavaScript SEO issues faster. This also creates a collaborative environment that helps the whole team.

Communicating SEO best practices for JavaScript usage is as crucial as its implementation.

JavaScript SEO Resources + Tools

As you learn how to make your JavaScript SEO friendly, several tools can assist you in the process.

Educational Resources

Google has provided or contributed to some great resources:

Understand JavaScript SEO Basics

Google’s JavaScript basics documentation explains how it processes JavaScript content.

Google JavaScript – Basics

What you’ll learn:

  • How Google processes JavaScript content, including crawling, rendering, and indexing
  • Best practices for ensuring JavaScript-based websites are fully optimized for search engines
  • Common pitfalls to avoid and strategies to improve SEO performance on JavaScript-driven websites

Who it’s for: Developers and SEO professionals optimizing JavaScript-heavy sites.

Rendering on the Web

The web.dev article Rendering on the Web is a comprehensive resource. It explores various web rendering techniques, including SSR, CSR, and prerendering.

Webdev – Rendering on Web

What you’ll learn:

  • An in-depth overview of web rendering techniques
  • Performance implications of each rendering method. And how they affect user experience and SEO.
  • Actionable insights for choosing the right rendering strategy based on your goals

Who it’s for: Marketers, developers, and SEOs wanting to boost performance and visibility.

Diagnostic Tools

Screaming Frog & Sitebulb

Crawlers such as Screaming Frog and Sitebulb help identify JavaScript-related issues on your site.

How? By simulating how search engines process your site.

Key features:

  • Crawl JavaScript websites: Detect blocked or inaccessible JavaScript files using robots.txt configurations
  • Render simulation: Crawl and visualize how JavaScript-rendered pages appear to search engines
  • Debugging capabilities: Identify rendering issues, missing content, or broken resources preventing proper indexing

Example use case:

  • Use Screaming Frog’s robots.txt settings to emulate Googlebot. The tool can confirm if critical JavaScript files are accessible.

Screaming Frog – Robots Settings

When to use:

  • Debugging JavaScript-related indexing problems
  • Testing rendering issues with pre-rendered or dynamic content

Semrush Site Audit

Semrush’s Site Audit is a powerful tool for diagnosing JavaScript SEO issues.

Key features:

  • Crawlability checks: Identifies JavaScript files that hinder rendering and indexing
  • Rendering insights: Detects JavaScript-related errors impacting search engines’ ability to process content
  • Performance metrics: Highlights Core Web Vitals like LCP and Total Blocking Time (TBT)
  • Actionable fixes: Provides recommendations to optimize JavaScript code, improve speed, and fix rendering issues

Site Audit – Backlinko – Overview

Site Audit also includes a “JS Impact” report, which focuses on uncovering JavaScript-related issues.

It highlights blocked files, rendering errors, and performance bottlenecks. The report provides actionable insights to enhance SEO.

Site Audit – Backlinko – JS Impact

When to use:

  • Identify render-blocking issues caused by JavaScript
  • Troubleshoot performance issues after rolling out large JavaScript implementations

Google Search Console

Google Search Console’s Inspection Tool helps analyze your JavaScript pages. It checks how Google crawls, renders, and indexes them.

GSC – URL Inspection – JS Console Messages

Key features:

  • Rendering verification: Check if Googlebot successfully executes and renders JavaScript content
  • Crawlability insights: Identify blocked resources or missing elements impacting indexing
  • Live testing: Use live tests to ensure real-time changes are visible to Google

Example use case:

  • Inspecting a JavaScript-rendered page to see if all critical content is in the rendered HTML

When to use:

  • Verifying JavaScript rendering and indexing
  • Troubleshooting blank or incomplete content in Google’s search results

Pro tip: Use GSC’s “Coverage Report.” It can find resources blocked by robots.txt or delayed by heavy JavaScript. Regularly reviewing this report helps maintain optimal crawlability.


Performance Optimization

You may need to test your JavaScript website’s performance. These tools granularly break down performance:

WebPageTest

WebPageTest helps analyze website performance, including how JavaScript affects load times and rendering.

The screenshot below shows high-level performance metrics for a JavaScript site, including when the webpage became visible to users.

Webpagetest – Page Performance Metrics

Key features:

  • Provides waterfall charts to visualize the loading sequence of JavaScript and other resources
  • Measures critical performance metrics like Time to First Byte (TTFB) and LCP
  • Simulates slow networks and mobile devices to identify JavaScript bottlenecks

Use case: Finding scripts or elements that slow down page load and affect Core Web Vitals.

GTmetrix

GTmetrix helps measure and optimize website performance, focusing on JavaScript-related delays and efficiency.

Key features:

  • Breaks down page performance with actionable insights for JavaScript optimization
  • Provides specific recommendations to minimize and defer non-critical JavaScript
  • Visualizes load behavior with video playback and waterfall charts to pinpoint render delays

GTmetrix – Waterfall – Images – Report

Use case: Optimizing JavaScript delivery to boost page speed and user experience. This includes minifying, deferring, or splitting code.

Chrome DevTools & Lighthouse

Chrome DevTools and Lighthouse are free Chrome tools. They assess site performance and accessibility. Both are key for JavaScript SEO.

Key features:

  • JavaScript execution analysis: Audits JavaScript execution time. It also identifies scripts that delay rendering or impact Core Web Vitals.
  • Script optimization: Flags opportunities for code splitting, lazy loading, and removing unused JavaScript
  • Network and coverage insights: Identifies render-blocking resources, unused JavaScript, and large file sizes
  • Performance audits: Lighthouse measures critical Core Web Vitals to pinpoint areas for improvement
  • Render simulation: It emulates devices, throttles network speeds, and disables JavaScript. This helps diagnose rendering issues.

For example, the screenshot below was taken with DevTools’ Performance panel. After page load, various pieces of data are recorded to pinpoint the cause of heavy load times.

Chrome – Performance report

Use cases:

  • Testing JavaScript-heavy pages for performance bottlenecks, rendering issues, and SEO blockers
  • Identifying and optimizing scripts, ensuring key content is crawlable and indexable

Specialized Tools

Prerender.io helps JavaScript-heavy websites by serving pre-rendered HTML to bots.

This allows search engines to crawl and index content while users get a dynamic CSR experience.

PreRender – Cache Manager

Key features:

  • Pre-rendered content: Serves a cached, fully rendered HTML page to search engine crawlers like Googlebot
  • Easy integration: Compatible with frameworks like React, Vue, and Angular. It also integrates with servers like NGINX or Apache.
  • Scalable solution: Ideal for large, dynamic sites with thousands of pages
  • Bot detection: Identifies search engine bots and serves optimized content
  • Performance optimization: Reduces server load by offloading rendering to Prerender.io’s service

Benefits:

  • Ensures full crawlability and indexing of JavaScript content
  • Improves search engine rankings by eliminating blank or incomplete pages
  • Balances SEO performance and user experience for JavaScript-heavy sites

When to use:

  • For Single-Page Applications or dynamic JavaScript frameworks
  • As an alternative to SSR when resources are limited

Find Your Next JavaScript SEO Opportunity Today

Most JavaScript SEO problems stay hidden—until your rankings drop.

Is your site at risk?

Don’t wait for traffic losses to find out.

Run an audit, fix rendering issues, and make sure search engines see your content.

Want more practical fixes?

Check out our guides on PageSpeed and Core Web Vitals for actionable steps to speed up your JavaScript-powered site.


Semrush Pricing: How to Choose the Right Plan

Choosing a Semrush plan isn’t always obvious.

Paid plans start at $139.95 per month and go up to enterprise solutions. Not to mention the various add-ons and apps.

You don’t want to overpay for features you don’t need. Or pick a cheaper plan that limits your ability to grow.

In this guide, you’ll learn which Semrush plan matches your needs, whether you’re a solo blogger tracking 100 keywords or an agency managing 40+ client websites.

Semrush’s Core Pricing Plans Explained

Semrush offers four main subscription tiers:

  • Pro plan at $139.95/month: Best for freelancers and startups
  • Guru plan at $249.95/month: Caters to growing businesses and small agencies
  • Business plan at $499.95/month: Serves larger agencies and enterprises
  • Enterprise plan (custom pricing): For organizations that need custom solutions

Semrush Pricing

Beyond these core plans, Semrush also offers a range of other tools through the App Center. This is where you can add specialized tools for needs like local SEO and social media management.

There’s also a free plan, and you can get a free trial of the Pro and Guru subscriptions too.

Note: The Semrush free trial is usually 7 days, but you can use this link to access a 14-day trial on a Semrush Pro subscription.


Pro Plan ($139.95/month): Entry-Level Features

Semrush’s Pro plan offers a wealth of keyword research, backlink analysis, and competitor research features.

At $139.95/mo, it’s Semrush’s cheapest plan, and is ideal for freelance SEOs, bloggers, and small business owners.

Semrush Pricing – Pro

The Pro plan lets you set up 5 projects and track up to 500 keywords with Position Tracking.

But these limits don’t apply to things like keyword and competitor research. Instead, you’re limited in the number of daily “requests” you can make.

This is the number you’ll want to pay attention to if you plan to use the tool suite for more than just tracking your own projects.

With the Pro plan, you can generate up to 3,000 reports per day across various analytics tools, with each report showing up to 10,000 results.

For example, you can use the Keyword Magic Tool to generate thousands of keyword ideas:

Keyword Magic Tool – Content creation – Keywords

You won’t have access to historical data in these reports. But you’ll still be able to filter for metrics like search volume, keyword difficulty, search intent, and more.

In terms of technical SEO, Pro plan users can crawl up to 100,000 pages with Site Audit. This is enough for beginners and owners of smaller sites. But it can be a bit limiting for large ecommerce stores or agencies managing massive sites.

You won’t get API access with the Pro plan (probably not a concern for most people). You also won’t get access to some content marketing features (see the Guru plan section below).

Pro Plan Limits

  • 5 projects
  • 500 keywords to track
  • 3,000 daily reports
  • 100,000 pages to crawl
  • 10,000 results per report
  • 250 keyword metrics updates per month
  • 500 SEO Ideas Units (used in tools like the On Page SEO Checker)
  • 5 scheduled PDF reports

Guru Plan ($249.95/month): Advanced Features for Growing Teams

Semrush’s Guru plan significantly expands on the Pro plan’s capabilities.

At $249.95/mo, it’s ideal for growing marketing teams and small agencies that need more comprehensive tools and data access.

Semrush Pricing – Guru

The plan increases your project limit to 15 and lets you track up to 1,500 keywords.

Other limit increases over the Pro plan include:

  • 5,000 reports per day (vs. 3,000)
  • 30,000 results per report (vs. 10,000)
  • 1,000 keyword metrics updates per month (vs. 250)
  • 300,000 Site Audit URL crawls (vs. 100,000)

You’ll also get access to tools like:

Topic research, for finding and prioritizing new content ideas:

Topic Research – Organic coffee – Content Ideas

Content Marketing Template, to streamline your optimizations:

SEO Content Template – Recommendations

You’ll also be able to integrate with Looker Studio, which further expands your reporting capabilities.

And you’ll get access to historical data within Semrush itself—all the way back to 2012:

Domain Overview – Backlinko – Historical data

The Guru plan gives you access to the essential Semrush toolkit. And its limits are likely enough for most SEOs and business owners, with the exception of large agencies, big ecommerce stores (300K+ pages), and enterprises.

Guru Plan Limits

  • 15 Projects
  • 1,500 keywords to track
  • 5,000 daily reports
  • 300,000 pages to crawl
  • 30,000 results per report
  • 1,000 keyword metrics updates per month
  • 800 SEO Ideas Units
  • 20 scheduled PDF reports

Business Plan ($499.95/month): Enterprise-Grade Capabilities

The Business plan, at $499.95/mo, targets larger agencies and marketing teams that need extensive data access and advanced features.

This plan also offers much higher limits than the Pro and Guru plans across the board.

Semrush Pricing – Business

Business plans allow for 40 projects, 5,000 keywords to track, and 10,000 daily reports.

You can use Semrush’s most advanced features, including API access, extended limits for site audits, and white-label reporting options.

You’ll also get access to new metrics, like Share of Voice for tracking your overall online presence compared to your competitors:

Position Tracking – Backlinko – Share of Voice

And for PPC optimization, you’ll also be able to see up to 50,000 results per PLA listings report (as opposed to 10 on the Pro and Guru plans):

PLA Research – Ebay – PLA Copies

These capabilities make it particularly valuable for agencies managing multiple client accounts. It’s also ideal for large in-house teams coordinating complex marketing campaigns.

For example, the API access allows teams to integrate Semrush data directly into their custom reporting dashboards or internal tools. This can streamline workflows and provide more customizable (and therefore more impactful) data analysis.
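
As a rough illustration, a short script could pull top organic keywords into an internal dashboard. The endpoint and parameters below follow the general shape of Semrush’s Analytics API, but treat them as assumptions and verify against the official API documentation:

```javascript
// Illustrative sketch only: confirm endpoint and parameter names in Semrush's API docs
const key = process.env.SEMRUSH_API_KEY;
const url =
  `https://api.semrush.com/?type=domain_organic&key=${key}` +
  `&domain=example.com&database=us&display_limit=10`;

fetch(url)
  .then((response) => response.text()) // the API responds with CSV-style text
  .then((csv) => console.log(csv))
  .catch((error) => console.error('Request failed:', error));
```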

Business Plan Limits

  • 40 projects
  • 5,000 keywords to track
  • 10,000 daily reports
  • 1,000,000 pages to crawl
  • 50,000 results per report
  • 5,000 keyword metrics updates per month
  • 2,000 SEO Ideas Units
  • 50 scheduled PDF reports

Enterprise Plan: Custom Solutions for Large Organizations

The Enterprise tier moves beyond Semrush’s standardized pricing to offer customized solutions for large organizations with complex needs.

Unlike the fixed-price plans, Enterprise solutions are tailored to each organization’s specific requirements and scale.

Semrush – Enterprise plan

The Semrush Enterprise platform is an entirely separate solution from the “core” Semrush platform. You get access to everything in the Business tier, but you also get a completely new dashboard with enterprise-level SEO and automation tools and capabilities.

Keyword winner/loser summary

You’ll also get access to vetted SEO experts, seamless document sharing functionality, and extensive reporting and automation features.

It’s designed for enterprise-level businesses (think Samsung, Salesforce, and SAP). This means it’s way beyond what the average person needs.

But for those with huge data, automation, and optimization requirements, Semrush Enterprise is an incredibly powerful platform.

Free Plan vs. Free Trial

Semrush’s free plan offers a solid introduction to the platform’s capabilities. You can access basic keyword research, site auditing, and competitive analysis features for free.

This makes it an excellent option for those just starting their SEO journey or wanting to test the platform before committing.

However:

The free plan comes with significant limitations in terms of the number of reports you can generate and the depth of data you can access.

You’re limited to:

  • 10 daily requests in many of the tools
  • 1 project
  • 100 URL crawls per month

Arguably, the most notable restriction is that you can only track 10 keywords. So you can’t monitor a full SEO campaign effectively.

That’s why the free trial of Semrush’s paid plans offers a better way to evaluate Semrush’s full capabilities.

For 7 days, you can access all features of your chosen plan, helping you make an informed decision about whether the investment makes sense for your needs.

Note: Test out the paid plan features for an extra week with a 14-day trial on a Semrush Pro subscription.


Other Semrush Pricing

Semrush offers a range of add-ons you can tag onto your subscription. These include local SEO packages, the .Trends suite for market research, and a social media management platform.

Here’s a breakdown of the add-ons and their pricing:

  • Extra users ($45-$100/month, depending on plan): Add extra users to your plan (with shared limits)
  • Local Essential/Advanced ($50-$60/month): Listing management, GBP optimization, and map rank tracker
  • .Trends ($289/month per user): Consumer trends and market research tools
  • Social Media Management ($19.99-$39.99/month): Social posting, tracking, and analytics
  • Agency Growth Kit ($69-$249/month): Lead management, CRM, and client portal
  • ImpactHero ($200/month): Buyer journey optimization

Semrush also has an extensive App Center. Here, you’ll find apps to help with pretty much every aspect of SEO and digital marketing.

Semrush Apps Collection

You can get free trials of many apps, and their prices vary.

Semrush Pricing Plans Compared to Competitors

Semrush is often a bit pricier than some of its competitors, at least at some of the plan levels.

But price isn’t everything, and it’s worth comparing the different platforms in detail to understand the value each one can provide for YOUR specific situation.

Semrush vs. Ahrefs Pricing

Semrush and Ahrefs have fairly similar pricing structures. Semrush is slightly more expensive in each of the three pricing brackets, but notably only by $0.95 in the middle tier (Guru/Standard).

Ahrefs – Pricing

On the face of it, the two options in all three cases are fairly similar:

  • Semrush offers 5, 15, and 40 projects, while Ahrefs offers 5, 20, and 50 projects
  • While Semrush lets you track 500, 1.5K, and 5K keywords, Ahrefs lets you track 750, 2K, and 5K
  • Semrush lets you audit 100K, 300K, and 1M URLs, while Ahrefs’ limits are 100K, 500K, and 1.5M

But it’s worth noting that Ahrefs’ cheapest (Lite) plan limits you to just 500 credits across various tools per month.

With a Semrush Pro subscription (the cheapest one Semrush offers), the limits are 3,000 reports/requests per day.

But the numbers aren’t everything. There are other differences between the two tools that you’ll need to factor in when making your choice.

For more on that, check out our full guide to Semrush vs. Ahrefs.


Semrush vs. Moz Pricing

Moz offers two cheaper pricing plans than both Semrush and Ahrefs. These are pretty limited (with the cheapest only letting you track 50 keywords per month).

But they’re still viable options for those on a budget that are just starting out with their first SEO tool.

Moz – Pricing

Moz does offer feature-rich plans at higher price points, but often with lower limits than Semrush plans.

For example, the most expensive Moz plan still only lets you track 25 sites (compared to 40) and 3,000 tracked keywords (vs. 5,000 on a Semrush Business plan).

It’s also worth noting that beyond the pricing plans, the two platforms are very different. For example, Semrush’s database has more than 26.4 billion keywords compared to Moz’s 1.25 billion.

To read more about the main differences between the two, check out this Semrush vs. Moz comparison.


How to Choose the Right Semrush Plan

The most basic way to choose between the different Semrush pricing plans is to consider your budget and your reporting needs.

Right Semrush Plan

If you have a limited budget, the free and Pro plans are going to be the obvious choice. But once you start needing larger reports or to track 1,500+ keywords, the Guru and Business plans are the ones to go for.

But let’s see which plans are best for which types of business and website owners.

Solopreneurs and Bloggers

For solopreneurs and blog owners, the Pro plan is usually going to be enough. You can manage up to 5 projects, and 500 keywords will be enough tracking capabilities for most small sites.

The reporting limits are generous enough too, and most beginners won’t max out on them.

But many people will be fine with the free Semrush plan—particularly if you have pretty limited keyword research needs and your site is smaller than 100 pages. It’ll give you a good feel for the platform’s core features before you sign up for a subscription.

You’ll want to upgrade to the Pro plan when you:

  • Track more than 10 keywords
  • Need daily position monitoring
  • Have a site that grows beyond 100 pages (for Site Audit crawls)
  • Want to analyze backlink opportunities

Note: Test out the Pro plan’s features with a 14-day free trial.


Small Business Owners

Small businesses typically find the sweet spot with the Guru plan. The additional keyword tracking and content marketing features make it ideal for growing sites with moderate content production workflows.

The Pro plan suits you if:

  • You manage a single business website
  • You need basic competitive analysis
  • Content creation isn’t your primary focus

The Guru plan becomes a better option when:

  • You manage multiple business websites, or multiple client sites
  • Content marketing is a key part of your strategy
  • You need access to historical data
  • You need to crawl up to 300K pages per month (Site Audit)

Medium-Sized Businesses and Ecommerce Stores

Mid-sized companies often benefit most from the Business plan—particularly if they’re managing multiple websites or serving many clients.

The Guru plan becomes essential for teams or individuals that:

  • Create 10+ pieces of content per month
  • Require advanced topic research tools
  • Send lots of reports to stakeholders or clients
  • Manage multiple sites or brands

Agencies

Agencies are going to be better off with the Business plan in most cases. The limits of the lower plans are just unlikely to be enough for agencies with many clients.

However, if your agency is just starting out, the Guru plan offers a cost-effective way to serve up to 15 clients with some powerful tools.

Note: If you need to increase any particular limit, you can contact the sales team or pay for more via your subscription dashboard.

Semrush – Subscription info


Start with Guru if you:

  • Serve up to 15 clients
  • Need white-label reports
  • Require content marketing tools
  • Want historical data access

Choose the Business plan when you:

  • Manage 15+ client accounts
  • Need API access
  • Require advanced white-labeling
  • Share reports across large teams

Semrush also offers the Agency Growth Kit. This starts at $69/month (on top of your subscription) for a CRM, client portals, and white-label PDF reports.

Semrush – CRM

But for those who want to boost their visibility on the Agency Partners platform or who need unlimited client portals, pricing increases to $149 and then $249 per month.

Enterprise Businesses

Enterprise organizations should consider the custom Enterprise tier. This platform was built from the ground up for enterprise-scale operations.

The platform offers a range of enterprise-specific solutions and features the core subscriptions don’t offer. You can read more about Semrush Enterprise here.

The Business plan is the best of the three core plans if your business isn’t quite ready for the enterprise offering. Its generous limits, integration capabilities, and API access mean it’s a cost-effective choice for many big brands and businesses.

Still Not Sure Which Plan to Choose?

Choosing the right Semrush plan ultimately depends on your specific marketing goals and resources.

To dive deeper into Semrush’s capabilities and make a more informed decision, check out our comprehensive Semrush review and complete guide to using Semrush effectively.

The post Semrush Pricing: How to Choose the Right Plan appeared first on Backlinko.


Scaling content creation without compromising quality (with template)

SEO is, in large part, about getting the right content in front of the right audience. When you’ve been doing that for a while, there comes a time when you want to scale content production. Scaling content creation means aiming to produce more content to reach new targets. While that’s a good idea, you need to find a way to scale while keeping the level of quality you’ve always had. Let’s go over how to scale your content production step by step, covering common problems and solutions.

What is content scaling?

Content scaling is about making your content process more efficient. The goal should be to produce more content without lowering the quality. First, examine every step of your content creation process — from brainstorming to research, editing, publishing, and reporting. Once you have the process mapped out, you can find ways to do those tasks faster and more predictably.

A well-scaled process helps you create a lot of content because you’re building a solid system rather than simply adding more articles. For instance, your content team could develop a checklist for reviewing articles, introduce a content calendar to improve planning, and set up clear tone-of-voice guidelines. These steps help you stay consistent and true to your brand — whether you produce one weekly article or dozens.

Why scaling content matters

Scaling content production can directly help your business. If you actively publish high-quality content on your site, search engines will see that your site is active, reliable, and trustworthy. By targeting the right audience with the right search intent and message, you can improve your search visibility and generate more traffic for your content.

In addition, producing content more consistently and following a plan can help you reach a bigger audience. More articles mean more opportunities to write about topics that interest your different audience groups. In the end, this will broaden your brand’s presence. You’ll have a bigger chance of people seeing you as a trusted source if you offer helpful insights and solutions to their problems.

All your content can help potential customers make decisions. This content is another way to address their concerns and answer questions. By doing this strategically, you can continue to engage your audience and nudge them closer to making that final decision, whether that’s a sale, an information request, or a newsletter signup.

Scaling your content production also supports your branding. When you create well-organized content over a longer period, you can support your brand voice and recognition. That reliability helps build trust and strengthens your reputation. 

The biggest challenges in scaling content

If you want to scale your content production, you must overcome several hurdles that, left unaddressed, will impact the quality and consistency of your content.

Quality control and consistency

When you produce more content, you need to make sure that every piece represents your brand well. However, catching errors or maintaining the proper tone becomes harder because you have more content to review. If you don’t do this well, there’s a risk that your articles will vary in tone or style. Without proper guidelines or a good editorial process, your content quality may suffer when you publish more and more.

For example, you can miss issues like tone, formatting, or factual errors without a standard editing checklist. If you do this for a while and people start to notice, they can form a different view of your brand. It would almost look like you don’t care about these issues. You need to set clear quality benchmarks and a solid review process. Consistent editing with fixed content rules helps everything you publish meet the same standards.

Handling different audience needs

Most sites write for several audience groups, not just one. Every segment has its own interests, problems, and ideas. But as you scale your output, you risk producing mainly generic articles that serve none of those groups well.

If you haven’t yet sorted your audience, do so and focus your content on these specific groups. As a result, your content will be more useful for the people in those groups.

Process difficulty and extra management work

More content means more parts to manage. Each article needs research, writing, review, checking, and then publishing. This is fine if you publish a few posts a month because you can handle these steps by hand. But growing your output complicates things when you face many deadlines, writers, or quality checks.

Complexity leads to bottlenecks. If you struggle with one thing, that might eventually slow down everything. Think of it like this: when you don’t scale your editorial process, you will eventually have a pile of articles that need approval. This grinds your publication flow to a halt. Develop a system that divides tasks into repeatable steps. Use content calendars and checklists to track progress and make managing projects easier. 

Balancing speed and thoughtfulness

Scaling content production can lead to pressure to cut corners to meet deadlines. When the speed of publication comes into play, there’s a high chance that content will become less developed. This shouldn’t happen. Every piece of content should be carefully planned and produced. Rushing only leads to content that lacks depth, accuracy, or clarity. 

Of course, this is easier said than done. You have to find ways to increase efficiency without sacrificing the quality of your content. Start by streamlining your process, breaking it up into smaller tasks. Set up a system that monitors quality while giving you enough room to be flexible.  

Building a repeatable content creation process

Scaling your content production reliably requires setting up a solid content process. That process should be easily repeatable and have clear tasks, which will help keep your team on track. 

Map the entire content workflow

Write down every phase of your content workflow, from conception through publication, and describe the tasks within each. This will help you understand where delays or errors creep in. Consider drawing a flow diagram or another visual. This list will act as your blueprint.

Create a content calendar

Use a content calendar to plan your publishing schedule. Proper planning helps you keep track of deadlines, even if they are for different outlets. Thanks to your content plan, your team can write content in advance and, hopefully, without stressing out about deadlines too much.

Develop detailed briefs and outlines

Content briefs are a great way to align writers — see below for an example. A brief like this should, at a minimum, include the subject, target audience, key messages, and keywords that the writer should target. Once approved, create an outline for the content and fill in the structure. A good content brief speeds up the writing process while ensuring the content is well targeted.

Implement a style guide

A style guide helps ground every piece of content in a consistent tone of voice and formatting. It should include rules for tone, punctuation, formatting, and whatever else makes sense to share. You can easily share this guide with anyone on your team, including freelancers.

Use checklists for each stage

You’ll find it easier to manage once you break the process down into small tasks. Make a checklist for tasks such as researching, writing, and editing. Having a proper checklist helps you make sure that you don’t forget anything. This could be checking facts, improving readability, or using proper SEO tactics. Your lists will help you scale your content production while maintaining quality output.

Standardize tools and platforms

Use well-known tools to manage tasks in your team. Think of project management tools like Jira or Asana, shared calendars in CoSchedule, Canva for visual designs, and document templates in Microsoft Office. Many companies use Google Docs to collaborate on documents. In those cases, standardizing on a shared set of Google Docs extensions makes the workflow easier to scale.

Write a good manual or checklist for these tools so that anyone — from in-house writers to external freelancers — follows the same steps. Standardization keeps the process manageable and helps you apply important SEO best practices consistently.

All of these things help your team routinely produce quality content. Making the process repeatable reduces the chance of errors and wasted time, so you can scale without losing what makes your content awesome. 

Strategies to scale without losing quality

Careful planning is one of the best ways to scale your content without lowering its quality. Another great option is to use clear methods to make your work more effective. 

Develop a strong content strategy and workflow 

As always, start with a solid plan that includes your goals, topics, and the audience you want to reach. Creating content for your audience is much easier when everyone truly understands who those people are. A good workflow avoids delays and helps people move from one task to another.

Use a detailed content calendar

We’ve discussed the importance of content calendars, and you really have to see these as your roadmap. A calendar shows all upcoming publications, deadlines, and the status of various projects. A good calendar keeps everyone up to date at all times and makes sure the work is nicely spread out. Good planning prevents missed deadlines.

Use template structures

Templates help you standardize your work, as they offer a reusable structure for common types of content. Each type of content can have its own structure to fill in. These templates help writers speed up their work while maintaining consistency across articles. 

Repurpose content thoughtfully

Look at what you already have and see how it can be adapted into a different form. For example, you can split a long-form article into several videos or a series of shorter posts. This strategy saves time while also delivering fresh material in new formats. Make sure to adapt the new content to the correct audience. 

Assign clear roles within your team 

Find out your team members’ strengths and have them do what they do best. A writer should handle the initial draft while an editor reviews the work. Your trusted subject matter expert should check the content for accuracy. Clear roles let people play to their strengths, which helps preserve content quality.

Maintaining high-quality content at scale

It isn’t easy to maintain content quality when scaling content production. To make the process more manageable, you should establish habits and use tools that help you make sure that every piece of content meets your standards. 

Follow your style guide

Setting up a good style guide keeps your writing consistent. Your style guide should include information on your content’s tone of voice, the terminology you can and can’t use, and how you structure and format it. Share this guide with your team.

Schedule periodic audits

Similarly, regularly review your existing content to see if it’s outdated or needs to adapt to changes in your brand messaging. This helps keep your older content relevant and accurate. 

Use tools when appropriate

Tools can help scale your content production. Even a tool like our Yoast SEO plugin can support your content work. Good content tools help with formatting, readability, and keyword placement, and some even help with broader on-page SEO.

Using Generative AI for scaling content output

Using AI to scale content production might seem like a good idea, but please be careful. Generative AI can definitely be a valuable tool for content processes. However, AI is not without issues and needs interaction from real people.

Human oversight makes sure that the output aligns with your brand’s voice and content standards. You can use generative AI as a starting point or a helpful assistant, but not as a complete replacement for your real writers. Your use of AI should have a clear process to bring the content up to your desired quality level.

Conclusion to scaling content production

Scaling up content production shouldn’t mean lower quality. Mostly, it’s about knowing the content process inside out. Once you have that, you can lay out the steps for everyone to follow. With a good process, you can meet your goals and still maintain the quality of the content. Be sure to set up content templates, calendars, and clear roles for your team. Make the adjustments and see how this can lead to better results. 

Bonus: Content brief template for SEO

Are you looking for a basic content brief template that helps scale your content production? Check out the one below:

Content brief section | Details
Title/headline suggestion | [Insert title]
Primary keyword | [Main keyword]
Secondary keywords | [Keyword 1], [Keyword 2]
Search intent | [Informational, commercial, transactional, etc.]
Audience persona | [If needed, description of audience persona]
Content objective | [What is the content meant to achieve]
Benchmark content | [URLs of best-in-class content about this topic]
Word count range | [Word count]
Tone and style guidelines | [Tone and style]
Outline/sections | Introduction; Main points/headings; Subheadings; Conclusion
SEO requirements | Meta title: [Title]; Meta description: [Description]; Header tags: H1, H2, H3; URL: [Proposed URL for content]
Call to action | [What do you want people to do/click on?]
Internal and external links | Internal: [Links]; External: [Links]
Visuals and multimedia | [List of visuals]
Examples/references | [Links to examples/references]
Deadline and submission details | [Deadline and submission instructions]

The post Scaling content creation without compromising quality (with template) appeared first on Yoast.


A guide to web crawlers: What you need to know

Understanding the difference between search bots and scrapers is crucial for SEO

Website crawlers fall into two categories: 

  • First-party bots, which you use to audit and optimize your own site.
  • Third-party bots, which crawl your site externally – sometimes to index your content (like Googlebot) and other times to extract data (like competitor scrapers).

This guide breaks down first-party crawlers that can improve your site’s technical SEO and third-party bots, exploring their impact and how to manage them effectively.

First-party crawlers: Mining insights from your own website

Crawlers can help you identify ways to improve your technical SEO. 

Enhancing your site’s technical foundation, architectural depth, and crawl efficiency is a long-term strategy for increasing search traffic.

Occasionally, you may uncover major issues – such as a robots.txt file blocking all search bots on a staging site that was left active after launch. 

Fixing such problems can lead to immediate improvements in search visibility.

Now, let’s explore some crawl-based technologies you can use.

Googlebot via Search Console

You don’t work in a Google data center, so you can’t launch Googlebot to crawl your own site. 

However, by verifying your site with Google Search Console (GSC), you can access Googlebot’s data and insights. (Follow Google’s guidance to set yourself up on the platform.)

GSC is free to use and provides valuable information – especially about page indexing. 

GSC page indexing

There’s also data on mobile-friendliness, structured data, and Core Web Vitals:

GSC Core Web Vitals

Technically, this is third-party data from Google, but only verified users can access it for their site. 

In practice, it functions much like the data from a crawl you run yourself.

Screaming Frog SEO Spider

Screaming Frog is a desktop application that runs locally on your machine to generate crawl data for your website. 

Screaming Frog also offers a log file analyzer, which is useful if you have access to server log files. For now, we’ll focus on the SEO Spider.

At $259 per year, it’s highly cost-effective compared to other tools that charge this much per month. 

However, because it runs locally, crawling stops if you turn off your computer – it doesn’t operate in the cloud. 

Still, the data it provides is fast, accurate, and ideal for those who want to dive deeper into technical SEO.

Screaming Frog main interface

From the main interface, you can quickly launch your own crawls. 

Once completed, export Internal > All data to an Excel-readable format and get comfortable handling and pivoting the data for deeper insights. 
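
If you work in Python, pandas can handle that pivoting for you. Here’s a minimal sketch, assuming the export is saved as internal_all.csv and contains Screaming Frog’s standard “Address,” “Status Code,” and “Indexability” columns (check your own export, as headers can vary by version):

```python
import pandas as pd

# Load the Screaming Frog "Internal > All" export
# (file and column names are assumptions -- check your own export's headers)
df = pd.read_csv("internal_all.csv")

# Count URLs by status code and indexability
summary = df.pivot_table(
    index="Status Code",
    columns="Indexability",
    values="Address",
    aggfunc="count",
    fill_value=0,
)
print(summary)

# Save non-200 pages as a quick fix list for your developers
issues = df[df["Status Code"] != 200][["Address", "Status Code"]]
issues.to_csv("crawl_issues.csv", index=False)
```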

Screaming Frog also offers many other useful export options.

Screaming Frog export options

It provides reports and exports for internal linking, redirects (including redirect chains), insecure content (mixed content), and more.

The drawback is it requires more hands-on management, and you’ll need to be comfortable working with data in Excel or Google Sheets to maximize its value.

Dig deeper: 4 of the best technical SEO tools

Ahrefs Site Audit

Ahrefs is a comprehensive cloud-based platform that includes a technical SEO crawler within its Site Audit module. 

To use it, set up a project, configure the crawl parameters, and launch the crawl to generate technical SEO insights.

Ahrefs Overview

Once the crawl is complete, you’ll see an overview that includes a technical SEO health rating (0-100) and highlights key issues. 

You can click on these issues for more details, and a helpful button appears as you dive deeper, explaining why certain fixes are necessary.

Ahrefs why and how to fix

Since Ahrefs runs in the cloud, your machine’s status doesn’t affect the crawl. It continues even if your PC or Mac is turned off. 

Compared to Screaming Frog, Ahrefs provides more guidance, making it easier to turn crawl data into actionable SEO insights. 

However, it’s less cost-effective. If you don’t need its additional features, like backlink data and keyword research, it may not be worth the expense.

Semrush Site Audit

Next is Semrush, another powerful cloud-based platform with a built-in technical SEO crawler. 

Like Ahrefs, it also provides backlink analysis and keyword research tools.

Semrush Site Audit

Semrush offers a technical SEO health rating, which improves as you fix site issues. Its crawl overview highlights errors and warnings.

As you explore, you’ll find explanations of why fixes are needed and how to implement them.

Semrush why and how to fix

Both Semrush and Ahrefs have robust site audit tools, making it easy to launch crawls, analyze data, and provide recommendations to developers. 

While both platforms are pricier than Screaming Frog, they excel at turning crawl data into actionable insights. 

Semrush is slightly more cost-effective than Ahrefs, making it a solid choice for those new to technical SEO.




Third-party crawlers: Bots that might visit your website

Earlier, we discussed how third parties might crawl your website for various reasons. 

But what are these external crawlers, and how can you identify them?

Googlebot

As mentioned, you can use Google Search Console to access some of Googlebot’s crawl data for your site. 

Without Googlebot crawling your site, there would be no data to analyze.

(You can learn more about Google’s common crawl bots in this Search Central documentation.)

Google’s most common crawlers are:

  • Googlebot Smartphone.
  • Googlebot Desktop.

Each crawls with a different device profile (mobile vs. desktop), but both contain “Googlebot/2.1” in their user-agent string.

If you analyze your server logs, you can isolate Googlebot traffic to see which areas of your site it crawls most frequently. 

This can help identify technical SEO issues, such as pages that Google isn’t crawling as expected. 

To analyze log files, you can create spreadsheets to process and pivot the data from raw .txt or .csv files. If that seems complex, Screaming Frog’s Log File Analyzer is a useful tool.
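
A short Python script can also replace the spreadsheet step if you prefer. This sketch assumes the common Apache/Nginx “combined” log format and a file named access.log (both assumptions; adjust the pattern and path for your server):

```python
import re
from collections import Counter

# Which paths does Googlebot request most often? (combined log format assumed)
LINE = re.compile(
    r'^\S+ \S+ \S+ \[[^\]]+\] "\S+ (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

path_hits = Counter()
with open("access.log") as log:
    for line in log:
        match = LINE.match(line)
        if match and "Googlebot" in match.group(2):  # group 2 = user agent
            path_hits[match.group(1)] += 1           # group 1 = URL path

for path, hits in path_hits.most_common(20):
    print(f"{hits:>6}  {path}")
```

Note that matching on the user-agent string alone trusts the client; the section on fake Googlebot traffic below explains how to verify it.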

In most cases, you shouldn’t block Googlebot, as this can negatively affect SEO. 

However, if Googlebot gets stuck in highly dynamic site architecture, you may need to block specific URLs via robots.txt. Use this carefully – overuse can harm your rankings.

Fake Googlebot traffic

Not all traffic claiming to be Googlebot is legitimate. 

Many crawlers and scrapers allow users to spoof user-agent strings, meaning they can disguise themselves as Googlebot to bypass crawl restrictions.

For example, Screaming Frog can be configured to impersonate Googlebot. 

However, many websites – especially those hosted on large cloud networks like AWS – can differentiate between real and fake Googlebot traffic. 

They do this by checking if the request comes from Google’s official IP ranges. 

If a request claims to be Googlebot but originates outside of those ranges, it’s likely fake.
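
Google also documents a DNS-based check you can script yourself: do a reverse DNS lookup on the requesting IP, confirm the hostname ends in googlebot.com or google.com, then forward-confirm that the hostname resolves back to the same IP. A minimal sketch (it needs live DNS access; production setups usually cache results or compare against Google’s published IP ranges instead):

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Reverse DNS lookup, then forward-confirmation, per Google's documented method."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP
        return ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        return False

# Example IPs: the first sits in Google's published Googlebot range,
# the second is a reserved documentation address
print(is_real_googlebot("66.249.66.1"))   # expected: True
print(is_real_googlebot("203.0.113.10"))  # expected: False
```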

Other search engines

In addition to Googlebot, other search engines may crawl your site. For example:

  • Bingbot (Microsoft Bing).
  • DuckDuckBot (DuckDuckGo).
  • YandexBot (Yandex, a Russian search engine, though not well-documented).
  • Baiduspider (Baidu, a popular search engine in China).

In your robots.txt file, you can create wildcard rules to disallow all search bots or specify rules for particular crawlers and directories.

However, keep in mind that robots.txt entries are directives, not commands – meaning they can be ignored.

Unlike redirects, which prevent a server from serving a resource, robots.txt is merely a strong signal requesting bots not to crawl certain areas.

Some crawlers may disregard these directives entirely.
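
You can sanity-check a rule set before deploying it with Python’s built-in robotparser. Here’s a small sketch using a hypothetical robots.txt that blocks every bot from /private/ and blocks Baiduspider entirely:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules
rules = """\
User-agent: *
Disallow: /private/

User-agent: Baiduspider
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
print(parser.can_fetch("Baiduspider", "https://example.com/blog/post"))   # False
```

This only tells you what a compliant bot should do; as noted above, nothing forces a scraper to obey.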

Screaming Frog’s Crawl Bot

Screaming Frog typically identifies itself with a user agent like Screaming Frog SEO Spider/21.4.

The “Screaming Frog SEO Spider” text is always included, followed by the version number.

However, Screaming Frog allows users to customize the user-agent string, meaning crawls can appear to be from Googlebot, Chrome, or another user-agent. 

This makes it difficult to block Screaming Frog crawls. 

While you can block user agents containing “Screaming Frog SEO Spider,” an operator can simply change the string.

If you suspect unauthorized crawling, you may need to identify and block the IP range instead. 

This requires server-side intervention from your web developer, as robots.txt cannot block IPs – especially since Screaming Frog can be configured to ignore robots.txt directives.

Be cautious, though. It might be your own SEO team conducting a crawl to check for technical SEO issues. 

Before blocking Screaming Frog, try to determine the source of the traffic, as it could be an internal employee gathering data.

Ahrefs Bot

Ahrefs has a crawl bot and a site audit bot for crawling.

  • When Ahrefs crawls the web for its own index, you’ll see traffic from AhrefsBot/7.0.
  • When an Ahrefs user runs a site audit, traffic will come from AhrefsSiteAudit/6.1.

Both bots respect robots.txt disallow rules, per Ahrefs’ documentation. 

If you don’t want your site to be crawled, you can block Ahrefs using robots.txt. 

Alternatively, your web developer can deny requests from user agents containing “AhrefsBot” or “AhrefsSiteAudit”.

Semrush Bot

Like Ahrefs, Semrush operates multiple crawlers with different user-agent strings. 

Be sure to review all available information to identify them properly.

The two most common user-agent strings you’ll encounter are:

  • SemrushBot: Semrush’s general web crawler, used to improve its index.
  • SiteAuditBot: Used when a Semrush user initiates a site audit.

Rogerbot, Dotbot, and other crawlers

Moz, another widely used cloud-based SEO platform, deploys Rogerbot to crawl websites for technical insights. 

Moz also operates Dotbot, a general web crawler. Both can be blocked via your robots.txt file if needed.

Another crawler you may encounter is MJ12Bot, used by the Majestic SEO platform. Typically, it’s nothing to worry about.

Non-SEO crawl bots

Not all crawlers are SEO-related. Many social platforms operate their own bots. 

Meta (Facebook’s parent company) runs multiple crawlers, while Twitter previously used Twitterbot – and it’s likely that X now deploys a similar, though less-documented, system.

Crawlers continuously scan the web for data. Some can benefit your site, while others should be monitored through server logs.

Understanding search bots, SEO crawlers and scrapers for technical SEO

Managing both first-party and third-party crawlers is essential for maintaining your website’s technical SEO.

Key takeaways

  • First-party crawlers (e.g., Screaming Frog, Ahrefs, Semrush) help audit and optimize your own site.
  • Googlebot insights via Search Console provide crucial data on indexation and performance.
  • Third-party crawlers (e.g., Bingbot, AhrefsBot, SemrushBot) crawl your site for search indexing or competitive analysis.
  • Managing bots via robots.txt and server logs can help control unwanted crawlers and improve crawl efficiency in specific cases.
  • Data handling skills are crucial for extracting meaningful insights from crawl reports and log files.

By balancing proactive auditing with strategic bot management, you can ensure your site remains well-optimized and efficiently crawled.


PPC budgeting in 2025: When to adjust, scale, and optimize with data


Budgeting for paid ad campaigns has long been a static process – set a monthly budget, monitor spending, and adjust incrementally as needed. 

This method works for industries with stable demand and predictable conversion rates but falls short in dynamic, competitive markets.

Still, static budgets aren’t obsolete. In industries with long sales cycles, consistent conversion trends, or strict financial planning – like B2B SaaS and healthcare – planned budgets remain essential.

The key isn’t choosing between static and dynamic budgeting; it’s knowing when and how to adjust PPC spend using data-driven signals.

The role of Smart Bidding and Performance Max in budgeting

Automation has changed our budgeting strategies, but it hasn’t eliminated the need for human oversight. 

While Google’s Smart Bidding and Performance Max (PMax) campaigns help optimize performance, they do not fully control budget allocation the way some advertisers may assume.

Smart Bidding: What it does (and doesn’t do) for budgeting

Smart Bidding (i.e., Target ROAS, Target CPA, Maximize Conversions, and Maximize Conversion Value) uses real-time auction signals to adjust bids but does not shift budgets between campaigns. 

If a campaign has an insufficient budget, smart bidding won’t automatically pull spend from another campaign; this still requires manual adjustments or automated budget rules.

To overcome the budget allocation limitations of Smart Bidding, use:

  • Portfolio bidding strategies: Setting bid strategies at the campaign level lets you use a common bidding approach (e.g., Target ROAS or Target CPA) across multiple campaigns. This enables more efficient spending across campaigns with similar goals without manual adjustments.
  • Shared budgets: Assigning a single budget across multiple campaigns ensures high-performing campaigns receive adequate funding while preventing overspending on lower-performing ones.

Dig deeper: How each Google Ads bid strategy influences campaign success

Performance Max: A black box for budget allocation?

PMax automates asset and bid optimization across multiple Google properties (Search, Display, YouTube, Discovery, etc.), but you don’t control which channel your budget goes to. 

Google’s algorithm decides how much to allocate to each network, which can sometimes result in excessive spend on lower-performing placements like Display rather than Search.

Instead of relying solely on PMax, run separate Search campaigns alongside it to ensure an adequate budget is allocated to high-intent traffic.

Dig deeper: How to make search and PMax campaigns complement each other

Balancing automation and control: Avoid these PPC budget pitfalls

While automation streamlines bidding, it can also lead to costly mistakes. 

Watch out for these common budget-wasting pitfalls and learn to stay in control.

Overspending on low-value traffic

Smart Bidding sometimes aggressively increases bids to meet a Target ROAS or Target CPA, which can inflate CPCs without increasing conversion volume.

Solution

  • Set bid caps when using Maximize Conversion Value to prevent excessive CPC increases.
  • Monitor search terms to ensure increased bids aren’t capturing low-intent queries.

Advanced tip

When setting a tCPA or tROAS, allow a 10-20% margin for flexibility to help Google’s algorithm optimize effectively.

For example, if your ideal tCPA is $100, setting it to $115 gives Google room to secure conversions that may exceed your target while still delivering strong performance. 

Since tCPA operates as an average, not every lead will cost the same amount.
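
To make the “average, not a per-lead cap” point concrete, here’s a trivial illustration with invented numbers:

```python
# tCPA is an average, not a per-lead cap (hypothetical lead costs)
lead_costs = [62, 88, 104, 131, 95]  # what five individual conversions cost
average_cpa = sum(lead_costs) / len(lead_costs)
print(f"Average CPA: ${average_cpa:.2f}")  # $96.00 -- under a $100 target
# Individual leads ranged from $62 to $131, yet the campaign still hit its target.
```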

Once you are consistently hitting your target, gradually lower the tCPA (or raise the tROAS) to improve budget efficiency without restricting conversions.

Underfunding efficient campaigns

If a campaign has a long conversion delay (e.g., B2B lead gen), Smart Bidding may incorrectly shift the budget elsewhere before enough data accumulates.

Solution

  • Extend conversion windows in Smart Bidding settings. The default is 30 days, but advertisers can adjust the window from one day up to 90 days.
  • Manually monitor lagging conversions and adjust budgets proactively.

Lack of budget control in PMax campaigns

Performance Max doesn’t allow advertisers to set separate budgets for Search, YouTube, and Display. 

As a result, Google may (advertiser sentiment is that they do) favor low-cost clicks from Display rather than higher-intent Search traffic.

Solution

  • Run branded and high-intent non-branded Search campaigns separately to control budget spend on direct-response traffic.
  • Use brand exclusions in PMax to prevent Google from serving brand search queries within PMax, ensuring that branded traffic remains in the dedicated Search campaign.
  • Apply negative keywords via account-level negatives. While PMax doesn’t allow campaign-level negatives, account-level negative keyword lists can help block irrelevant or redundant queries. You can apply a maximum of 100 negative keywords; Google has stated that it created this limit because PMax isn’t meant to be a heavily restricted campaign type.
  • Monitor your search impression share to identify when branded queries are slipping into PMax instead of the dedicated Search campaign, then adjust bid strategies and audience signals accordingly.
  • Use audience exclusions in PMax to prevent excessive Display spend on irrelevant audiences.

Advanced tip

Tools like Optmyzr can help advertisers determine how their budget is allocated in PMax with the PMax Channel Distribution feature. 

Although we may not have much control over the allocation, we can at least be aware of it. 

Dig deeper: How to manage a paid media budget: Allocation, risk and scaling

How to use first-party data to improve budget allocation

An underutilized strategy for improving budgeting is leveraging first-party data to allocate spend toward high-value audiences. 

As privacy restrictions tighten and tracking capabilities decline, it’s important to shift your focus from broad automated bidding to first-party audience targeting.

Use customer match to prioritize high-value audiences

Instead of spending equally across all users, advertisers can upload Customer Match lists (based on past purchasers, high-LTV customers, or CRM data) and adjust budgets accordingly.

Example

  • If historical data shows that repeat customers generate a higher ROAS than new users, more budget should be allocated to remarketing campaigns targeting Customer Match audiences.

Advanced tip

To maximize campaign efficiency, consider using value-based bidding (VBB) to ensure your budget prioritizes high-value conversions rather than just the volume of leads. 

By assigning different conversion values based on customer lifetime value (LTV), using Customer Match, GA4 insights, or CRM data, you can direct more spending toward audiences that generate the highest long-term revenue.
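
In practice, this often means mapping customer segments to the conversion values you report back to the ad platform. A minimal sketch with invented segment names and dollar amounts:

```python
# Hypothetical mapping from customer segment to reported conversion value
LTV_TIERS = {
    "repeat_purchaser": 450.00,   # high lifetime value -- worth bidding up
    "single_purchaser": 120.00,
    "newsletter_signup": 15.00,
}

def conversion_value(segment: str) -> float:
    """Value reported with a conversion so value-based bidding can weight it."""
    return LTV_TIERS.get(segment, 50.00)  # conservative fallback for unknowns

print(conversion_value("repeat_purchaser"))  # 450.0
```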

Changes to customer match lists

Google recently introduced two key updates to Customer Match lists that will impact how advertisers manage audience data.

To stay compliant and maximize audience targeting, be sure to regularly refresh your lists and align your data collection with Google’s updated policies.

Apply GA4 data for smarter budget scaling

Google Analytics 4 (GA4) provides insights into conversion paths, high-value audience segments, and multi-channel attribution. 

Instead of relying solely on Google Ads conversion tracking, use GA4 to determine which audience segments should receive higher budgets.

Best practice

  • Create custom lists/audiences around users with high engagement signals (repeat visits, add-to-cart actions, lead form interactions) and allocate more budget toward these users.
  • Create custom lists/audiences around low-intent users who bounce after viewing one page. To reduce wasted ad spend, decrease your bids or exclude them.

Dig deeper: How to leverage Google Analytics 4 and Google Ads for better audience targeting




Budget scaling strategies: When and how to increase PPC spend

Scaling your PPC campaigns requires a structured, gradual approach. 

Increasing budgets too aggressively can cause Smart Bidding to overcompensate, leading to inefficient scaling and missed revenue opportunities.

Incremental budget scaling

Instead of doubling your budget overnight, it is better to increase it gradually, in 10-20% increments. 

This gives Smart Bidding algorithms time to adjust without overspending or wasting budget.

This also gives you better control, since you can monitor performance changes caused by budget shifts more closely.

Example

  • If a campaign is hitting its conversion goals consistently, increase the budget by 15% per week while monitoring conversion trends (the sketch below shows how this compounds).
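
Here’s a quick sketch, with hypothetical numbers, of how those 15% weekly increases compound:

```python
# How a 15% weekly increase compounds from a $1,000/week starting budget
# (numbers are hypothetical)
budget = 1000.0
for week in range(1, 9):
    print(f"Week {week}: ${budget:,.2f}")
    budget *= 1.15  # 15% weekly increase
# Week 8 lands at roughly $2,660 -- more than 2.5x the start, reached gradually.
```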

Cross-campaign budget reallocation

Rather than increasing spend across the board, shift budget strategically between:

  • Branded campaigns (lower-funnel, high-converting).
  • Non-branded search campaigns (high-growth potential).
  • Remarketing campaigns (high-value repeat customers).

Dayparting for more efficient spend

Instead of distributing the budget equally across all hours, allocate more to high-converting time periods.

Example

  • If the lead volume is highest between 8 a.m. and 2 p.m., increase bids and budget during these hours.
  • If your business hours are from 12 p.m. to 10 p.m., lower your bids during the hours you aren’t operating to prevent unnecessary ad expenses.

Industry-specific budgeting approaches

As we all know, no two industries are the same, so the approach to budgeting should also be different. Here’s how different business models should think about budget allocation:

B2B lead generation

Budgeting for B2B lead generation requires a long-term view. 

Unlike ecommerce, where purchases can happen quickly, B2B sales cycles can range from a week to over a year, depending on the contract size and decision-making process. 

As such, budget pacing should be planned over months. Don’t make frequent (i.e., daily or weekly) adjustments that could cause instability in the account. 

Because the cycle is longer, conversions often take some time to materialize, so conversion delays should be considered when evaluating Smart Bidding performance. 

If budgets are adjusted too soon based on incomplete data, campaigns may be underfunded before the true impact of conversions is realized.

Dig deeper: Paid search for lead gen: Tips for new accounts with limited budgets

Ecommerce

Seasonality plays a large role in budgeting decisions for ecommerce brands. 

Aggressively increase budgets ahead of major sales events, like Black Friday, Cyber Monday, and holiday shopping, to capitalize on higher purchase intent. 

Reacting to performance mid-season will likely result in missed opportunities if the budget is exhausted too early. 

Also, rather than spreading spend evenly across all potential buyers, prioritize high-LTV customers using Customer Match lists and past purchase data. 

This ensures that ad spend is directed toward audiences likely to generate repeat purchases and higher average order values (AOVs).

Dig deeper: Lead gen vs. ecommerce: How to tailor your PPC strategies for success

Local businesses

Budget allocation for local businesses should be narrowly geo-targeted. 

Instead of distributing spend evenly across an entire service area (although you should have some presence in the area), analyze past geographic conversion data to determine which locations typically generate the highest return. 

The budget should then be allocated accordingly, ensuring that high-performing areas receive the majority of ad spend.

Another important factor is setting up call tracking. 

Since many conversions happen over the phone rather than through online forms, integrate call-tracking data to identify which campaigns generate high-quality leads. 

By analyzing call duration, lead quality, and customer inquiries, you can refine budget allocation to optimize for calls that convert into sales or appointments.

Dig deeper: 9 essential geotargeting tactics for Google Ads

Each industry requires a different budgeting approach tailored to its sales cycles, customer behavior, and conversion patterns. 

Understanding these nuances ensures that your PPC budgets are allocated strategically for maximum impact, whether it’s long-term pacing for B2B, seasonal surges for ecommerce, or localized targeting for service-based businesses.

A smarter approach to budgeting

Budgeting for your PPC campaigns doesn’t involve choosing between static and dynamic models; it involves strategically using both.

  • Smart Bidding and PMax improve efficiency but require human oversight.
  • First-party data should play a bigger role in spend allocation.
  • Budget scaling should be incremental and structured.
  • Industry-specific needs should dictate budget pacing strategies.

The best budgets are adaptable, data-driven, and aligned with long-term profitability rather than short-term spend fluctuations. 

Those who master this approach will gain a competitive advantage in an increasingly automated advertising landscape.


SEO prioritization: How to focus on what moves the needle


If you feel like you’re being pulled in different directions with your SEO program, you aren’t alone. 

How do you know where to focus first for the most impact? And when that’s done, what do you do next?

It can be challenging to decide which SEO tasks to prioritize because they all impact the end user in some way – but some more than others. This is where discernment comes into play.

This article will help you build a path to get your SEO program organized from point A to point B and figure out how to prioritize tasks to get ROI quicker.

Frameworks for identifying high-impact SEO opportunities

When every SEO task feels urgent, knowing where to focus first can make or break your strategy. These three frameworks can help you prioritize what moves the needle.

1. Technical SEO audit

A technical SEO audit is your roadmap for identifying and fixing the issues that directly impact search visibility and user experience. 

The right audit reveals the most urgent technical barriers to ranking – and helps you prioritize based on impact.

But not all audits are created equal. Here’s a breakdown of the different types:

Basic SEO audit

  • This is where automated software scans your site and flags common SEO issues. While the insights can be helpful, they come in a generic, one-size-fits-all report. 
  • This type of audit is ideal if you’re working with a tight budget or just want to get a basic overview before bringing in an expert. 
  • It’s never a bad idea, but it won’t provide an in-depth analysis.

Mid-level SEO audit

  • Here, you can expect a professional SEO specialist or vendor to go beyond automated reports and offer additional insights that software alone might miss. 
  • While these can pinpoint issues that require attention, they may not provide detailed solutions. 
  • This approach is useful when you need to identify potential problem areas but aren’t ready for a full-scale SEO strategy.

Comprehensive SEO audit

  • This is a full technical audit conducted by experienced technical SEOs. 
  • This deep dive involves top-tier tools, data analysis, and an in-depth website and SEO review by skilled analysts specializing in technical SEO and business strategy. 
  • Tools assist the process, but the real value comes from expert analysis, which makes it a time-intensive but highly valuable investment.

Knowing these key differences in audits can help you make an informed decision before you invest. 

Dig deeper: Technical SEO: Don’t rush the process

2. The Eisenhower Matrix

The Eisenhower Matrix is a powerful tool for prioritizing tasks by urgency and importance. 

Applying it to your SEO strategy helps you determine which tasks need immediate attention and which can wait.

To get started, divide tasks into four quadrants:

Quadrant 1: Urgent and important

  • These are the critical issues that directly impact rankings and user experience. 
  • For example, this could be fixing a slow site or a misconfigured robots.txt file that is blocking search engines from crawling and indexing key pages.
  • Whatever tasks you put in this category will be non-negotiable. Addressing these items can sometimes have an immediate impact on your ability to compete.

Quadrant 2: Important but not urgent

  • These will be the longer-term strategies that build sustainable growth.
  • For instance, maybe developing a long-term content strategy focused on topic authority and evergreen content falls here.
  • These efforts don’t require immediate attention but are essential for long-term SEO success.  

Quadrant 3: Urgent but not important

  • This bucket is for handling tasks that are time-sensitive but don’t significantly influence rankings or user experience.
  • This could be something like responding to a minor Google Search Console alert about a non-critical issue.
  • While these tasks may not have a high impact, taking care of them prevents minor issues from accumulating into big projects.

Quadrant 4: Neither urgent nor important

  • Anything that falls into this category is something you avoid. 
  • One example might be spending hours tweaking meta descriptions that already meet best practices without significant SEO gains.
  • These activities consume time and resources without delivering meaningful results.

Eisenhower matrix example

Using the Eisenhower Matrix helps your SEO by enhancing:

  • Clarity: Identify and fix what demands attention now versus what can wait.
  • Efficiency: Prioritize the highest ROI tasks without getting bogged down.
  • Focus: Stay aligned with business goals, eliminating distractions.

3. The Pareto Principle (80/20 Rule)

The Pareto Principle suggests that 80% of outcomes come from 20% of efforts. 

In SEO, focusing on the most impactful tasks helps you drive faster, more meaningful results without spreading yourself too thin.

Keyword targeting

It’s common for a small subset of your keywords to drive most organic traffic. 

Instead of spreading your efforts thin across all keywords, focus on optimizing the ones that deliver the most value.

  • Use SEO tools to identify the top-performing 20% of keywords that bring in most of your traffic and conversions (see the sketch after this list).
  • Prioritize pages that rank between Positions 5 and 20 for those high-value keywords. These are low-hanging fruit that can move up with improvements.
  • Expand content for high-value keywords by answering related questions and creating supporting content. 
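
As a rough sketch of that first step, the pandas snippet below ranks keywords by clicks and finds the cutoff where cumulative clicks reach 80%. The file name and column names are assumptions; match them to your own keyword export:

```python
import pandas as pd

# Rank keywords by clicks and find the smallest set driving ~80% of traffic
df = pd.read_csv("keywords.csv")  # expected columns: "keyword", "clicks"

df = df.sort_values("clicks", ascending=False).reset_index(drop=True)
df["cumulative_share"] = df["clicks"].cumsum() / df["clicks"].sum()

top = df[df["cumulative_share"] <= 0.80]
print(f"{len(top)} of {len(df)} keywords drive ~80% of clicks")
print(top[["keyword", "clicks"]].head(20))
```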

Content focus

Most of your website’s traffic and engagement likely comes from a handful of high-performing pages. 

Instead of endlessly creating new content, invest in improving the 20% of pages that already generate the most traffic and leads.

  • Identify your top 20% of pages by traffic and conversions using analytics tools.
  • Revamp those pages by updating outdated content to enhance optimization and engagement. 
  • Create supporting content to build topical authority around your best pages.

Technical fixes

Technical SEO can feel overwhelming because there’s always more to fix. But a small subset of technical issues typically has the most impact on site performance.

Focus on fixing the top 20% of technical issues that cause 80% of your performance problems.

Prioritize high-impact fixes like: 

  • Resolving crawl errors so search engines can access your site.
  • Improving page load speed for user experience and rankings.
  • Fixing broken links to avoid losing link equity and frustrating users.
  • Optimizing usability to retain visitors and improve your ability to compete in the search results.

Dig deeper: Prioritizing SEO strategies: Where to focus your efforts

Having a framework for approaching your SEO program helps you stay organized.

Within that framework, you must consider how you will execute both short-term wins and longer-term strategies.

Balancing long-term strategies with quick wins

To succeed in SEO, you must balance short-term wins with long-term growth.

Quick wins can show immediate improvements, but foundational efforts are what build lasting authority.

To achieve the best results, it’s important to devote resources to both.

Quick wins

Quick wins are tactical SEO tasks that can be implemented quickly to produce noticeable results. 

These tasks usually involve optimizing existing content or resolving certain technical issues. 

They may not require large investments of time or resources but can lead to meaningful improvements in rankings, traffic, or user experience.

What constitutes a quick win? 

  • Tasks that are simple to implement. 
  • Things that address known website performance issues. 
  • Fixes that improve both search engine visibility and user experience. 

Examples of SEO quick wins include:

  • Fixing technical errors, like resolving 404 pages, broken links, and crawl issues.
  • Improving site speed.
  • Optimizing existing content by adding internal links, updating outdated information, or including relevant keywords.

Quick wins are valuable because they deliver early signs of progress. This helps build momentum and gain stakeholder buy-in. 

However, relying solely on quick wins isn’t enough to achieve a sustainable SEO program. 

That’s where long-term strategies come in. 

Long-term strategies

Long-term strategies require more time and effort but are key to creating a strong foundation. 

These strategies help your website become more authoritative, trustworthy, and relevant in the eyes of both search engines and your audience.

Examples of long-term strategies include:

  • Content creation that targets important keywords and answers user questions in-depth. Try SEO siloing to build authority around a topic.
  • Earning backlinks through your high-quality content and partnerships. 
  • Refreshing top-performing content to make sure it remains evergreen and relevant. I recommend spending 50% of your content resources on maintaining older but high-performing content. 
  • Continuing education so you can stay ahead of the curve. Consider annual SEO training with additional learning opportunities throughout the year. Search evolves fast, and you want to be able to forecast what’s coming up so you can start working on it early. 

Foundational efforts don’t deliver instant results, but as your site’s authority grows, you’ll see compounding benefits with higher rankings, more traffic, and increased user trust.

Fast gains, lasting growth: Crafting a balanced SEO plan

A good SEO roadmap should include both short-term quick wins and long-term projects. But where to start? 

Here’s one scenario: You could focus 70% of your time on quick wins early on to show immediate results and 30% on long-term efforts. 

Over time, you might adjust the balance to a 50/50 split as your site becomes more stable and foundational work becomes a bigger priority.

Dig deeper: 3 quick SEO wins to kick-start growth next year

Focus on what matters most for lasting results

Prioritizing your SEO strategies is the key to driving meaningful results. 

SEO isn’t about doing everything at once. It’s about doing the right things at the right time. 

When you focus on high-impact tasks and continuously refine your approach, you’ll build a more competitive search engine presence that pays off for years to come.

Read more at Read More

X launches AI-powered tools that create ads, analyze campaigns

X launched two new features to help advertisers automate ad creation and analyze real-time ad campaign performance. The new features – Prefill with Grok and Analyze Campaign with Grok – are (as the names imply) powered by Grok, X’s AI assistant.

Prefill with Grok. Enter your website URL and Grok will generate ad copy, imagery, and a call-to-action headline. You can tweak as needed.

Analyze Campaign with Grok. Grok will analyze campaign data and offer insights and recommendations to optimize targeting and creative strategy.

What’s next. The rollout began Feb. 21. It will continue in phases, expanding to more advertisers.

Why we care. This move aims to streamline the ad creation process and make data-driven optimizations faster, cutting down on manual effort and potentially boosting campaign performance.

The announcement. Grok for Advertisers: Introducing New AI Tools for Brands.


MTA vs. MMM: Which marketing attribution model is right for you?


Measuring marketing effectiveness is essential for any business investing in multiple channels. 

Two popular approaches – multi-touch attribution and marketing mix modeling – help marketers understand which strategies drive results. 

This article tackles the key differences between each attribution method to help you determine which one best fits your business needs.

The growing need for smarter marketing attribution

With Google’s recent update to its open-source marketing mix model, Meridian, interest in marketing mix analysis and channel modeling has surged. 

While enterprise brands have long benefited from these insights, smaller businesses running multi-channel marketing can also gain value. 

Two leading methodologies have emerged to tackle this challenge: 

  • Multi-touch attribution (MTA).
  • Marketing mix modeling (MMM). 

Both aim to measure marketing effectiveness but differ significantly in methodology, scope, and application.

Every business investing in marketing needs to assess whether its efforts are paying off. 

SEO, email campaigns, search ads, and social media all demand time and budget. 

But without the right measurement approach, it’s difficult to know which channels truly drive results.

Many marketers rely on in-platform data, but this only provides a partial view due to differing attribution models and settings. 

Third-party attribution tools attempt to bridge the gap, but they often favor specific marketing channels and impose predefined attribution rules, which may not align with long-term business goals.

For businesses serious about optimizing their marketing, a customized approach is essential – one that fully leverages their own data while integrating additional insights. 

This is where MTA and MMM shine.

Dig deeper: 7 must-know marketing attribution definitions to avoid getting gamed

Understanding the basics

Multi-touch attribution

Multi-touch attribution is a digital-first methodology that tracks individual customer interactions across various touchpoints in their journey to purchase. 

It assigns credit to each marketing touchpoint based on its contribution to the final conversion. 

Operating at a granular, user-level scale, MTA collects data from cookies, device IDs, and other digital identifiers to create a detailed picture of the customer journey.

MTA is commonly supported by ad platforms like Google Ads, which offer different attribution settings – with data-driven attribution generally being the recommended choice.

However, first- and last-touch models are not considered part of MTA, as they only credit a single touchpoint.

Beyond in-platform attribution, most analytics tools also support multi-touch attribution. 

For SMBs with strong tracking and high data quality, these tools can be sufficient. 

However, taking attribution to the next level requires customizing your MTA, either by:

  • Using a tool that allows customization.
  • Building custom attribution reports, often in combination with a data warehouse.

A tailored MTA ensures attribution is aligned with your business and customer journey, leading to more accurate insights.

The need for a customized MTA becomes clear with the following example:

Imagine a user encounters two social touchpoints – an Instagram ad and a TikTok ad – before converting through a Google Search ad. 

A standard MTA might allocate 20% credit to each social channel for awareness and 60% to Google Search, assuming search played the most crucial role due to its intent-driven nature.

  • Instagram ad: 20%
  • TikTok ad: 20%
  • Google Search: 60%
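
Under the hood, rules-based models like this assign each touchpoint a weight, then normalize the weights so the credit sums to 100%. As a rough sketch of that arithmetic (the weights are illustrative, chosen to reproduce the split above):

    \[
    \text{credit}_i = \frac{w_i}{\sum_j w_j},
    \qquad
    w = (1,\ 1,\ 3) \;\Rightarrow\; \text{credit} = (20\%,\ 20\%,\ 60\%)
    \]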

You might conclude that increasing your Google Ads budget and investing more in search is the right move.

While this could work, it could also backfire – without a customized MTA, your decision-making may be flawed.

Let’s take a closer look at the user journey to see what might be wrong:

  • Instagram ad – Cold awareness: 50%
  • TikTok ad – Remarketing: 40%
  • Google Search – Branded search: 10%

Instead of Google Search being the primary driver, it could be that:

  • Instagram is generating initial awareness.
  • TikTok is handling remarketing.
  • Google is simply capturing conversions from users already familiar with your brand. 

In this case, increasing Google Ads spend wouldn’t necessarily drive more sales. It would just reinforce the final step while neglecting the earlier, more influential touchpoints.

With this in mind, MTA weightings can look completely different. 

Investing more in cold traffic and remarketing while minimizing spend on Google Search might be the smarter approach, as search doesn’t generate demand but rather supports the last step and defends your brand against competitors.

This example highlights why a customized MTA is essential. It allows you to tailor attribution to your specific strategy, funnel, and customer journey. 

However, if data quality is poor or customization is lacking, it can lead to inaccurate insights, poor decisions, and short-term thinking.

Marketing mix modeling

Marketing mix modeling, on the other hand, takes a top-down, aggregate approach. 

It analyzes historical marketing spend across channels along with external factors to assess their impact on business outcomes. 

Using advanced statistical techniques, MMM identifies correlations between marketing investments and results.

An effective marketing mix model incorporates both historical and current data, making it resilient to outliers and short-term fluctuations. 

Depending on the model, it also allows for the inclusion of seasonal trends, industry benchmarks, growth rates, and marketing volume. 

Additionally, MMM can account for brand awareness and loyalty in base sales, as well as measure incremental sales.

MTA vs. MMM: Key differences

MTA is a valuable tool for digital marketing teams that need immediate insights and real-time tracking to optimize campaigns quickly. 

Its granular data helps marketers refine conversion paths and personalize customer interactions. 

However, increasing privacy restrictions and the phase-out of third-party cookies make MTA more challenging to implement effectively. 

Additionally, its digital-first nature means it struggles to account for offline marketing efforts and may lead businesses to prioritize short-term conversions over long-term brand growth.

MMM, by contrast, provides a broader, privacy-friendly approach that captures both digital and offline marketing performance. 

It is particularly useful for long-term budget planning, helping businesses allocate resources effectively across multiple channels. 

However, its reliance on historical data and aggregate trends makes it less suited for rapid campaign adjustments. 

Companies that operate across both digital and traditional marketing channels may benefit from combining MTA’s real-time insights with MMM’s strategic guidance for a more balanced approach.

Dig deeper: How to evolve your PPC measurement strategy for a privacy-first future

Open-source marketing mix models

Open-source marketing mix models are widely used for several reasons. 

They are free, making them an attractive alternative to expensive enterprise tools. 

Another key advantage is transparency. Since these models can be reviewed, businesses are not reliant on “black box” solutions. 

Some of the most notable open-source models include:

  • Meridian from Google.
  • Robyn from Meta.

To determine which model best suits your needs, it’s helpful to experiment by uploading test datasets and exploring their functionalities. 

While these models share a common approach, they differ in customization depth and fine-tuning capabilities. 

In my experience, Meridian is the most advanced, offering deep integration with first-party, organic, and third-party data. However, its complexity may require a steeper learning curve. 

For a quicker setup, Robyn from Meta is a solid starting point.

Hybrid approach

As marketing measurement evolves, organizations increasingly adopt hybrid approaches that combine the strengths of both MTA and MMM. This unified framework aims to:

  • Leverage MTA’s granular digital insights for tactical optimization.
  • Use MMM for strategic planning and budget allocation.
  • Cross-validate findings between both methodologies.
  • Provide a more complete view of marketing effectiveness.

For digital-first companies, MTA is often the preferred starting point, offering real-time insights for rapid campaign adjustments.

In contrast, businesses investing heavily in traditional marketing tend to benefit more from MMM, as it:

  • Aligns with privacy regulations.
  • Accounts for external factors.
  • Delivers a holistic view of marketing performance.

A hybrid approach provides the best of both worlds – combining MTA’s agility with MMM’s long-term perspective.

While managing both requires additional resources, businesses implementing this strategy gain precise, channel-specific insights and a broader strategic understanding.

This dual approach is particularly valuable for organizations balancing short-term performance optimization with sustainable, long-term growth.

Boost your marketing performance with the right attribution model

Both MTA and MMM offer valuable insights into marketing effectiveness, but they serve different purposes and have distinct advantages.

As the marketing landscape becomes more complex and privacy-focused, it’s essential to assess your measurement needs and capabilities to determine the best approach – or a combination of both.

The future of marketing measurement likely lies in hybrid solutions that blend MTA’s granular insights with MMM’s strategic perspective while adapting to evolving privacy regulations and technological changes.

By integrating these methodologies, you’ll be better equipped to optimize marketing investments and drive long-term business growth.

How to Improve Largest Contentful Paint (LCP) in Under an Hour

Largest Contentful Paint (LCP) is one of Google’s three Core Web Vitals.

Like the other two (Cumulative Layout Shift and Interaction to Next Paint), its name doesn’t exactly make clear what it measures.

Lots of tools can show your LCP score and outline ways to improve it. But their tips are often generic and lack the detail you need to actually take action.

So, in this guide I’ll walk you through actionable steps to improve your LCP. I’ll separate them by:

  • Their potential impact
  • The effort required to make the fix
  • Which specific aspect of your LCP score they help with

But first, let’s talk about what LCP actually means for your website (jump to this part for the fixes).

What Does Largest Contentful Paint Even Mean?

Largest Contentful Paint measures how long it takes for the main content of your webpage to appear on your user’s screen—whether that’s a hero image, heading, or block of text.

It’s not the most intuitive phrase, so let’s break it down word by word:

  • Largest: The biggest piece of visible content on the screen. This could be a large image, a big headline, or any major element that stands out.
  • Contentful: It’s something that has actual content—like text or an image—and isn’t just a background or frame.
  • Paint: This refers to how your browser “draws” (or renders) that element on your screen.

Largest Contentful Paint

For example, imagine clicking a link to read a news article.

The page might load various elements quickly, like the header menu at the top and placeholders for ads.

But if the article text takes five seconds to show up, that’s a poor experience. That delay is what LCP measures.

When you think about LCP, think about your visitors. It’s the difference between someone seeing your main product image or headline right away versus waiting and possibly leaving.

A faster LCP generally means a better user experience. And a better experience means happier visitors who trust your site and want to hang around (and potentially buy from you).

Further reading: For more on how loading speed can affect your website experience and optimization, check out our full guide to page speed and SEO.


What Is a Good LCP Score?

Google provides three scoring ranges for LCP:

  • Good: 2.5 seconds or faster
  • Needs Improvement: 2.5 to 4 seconds
  • Poor: Longer than 4 seconds

Google's largest contentful paint guidelines

These benchmarks serve as useful guidelines, but your users’ actual experience matters most.

A visually rich photography portfolio might take longer to load but still satisfy visitors. Meanwhile, a simple text-based article that loads in three seconds might frustrate users who expect instant access.

So, focus on your audience’s expectations and behavior. Check your analytics to see if slower LCP correlates with higher bounce rates or lower conversion rates.

These numbers tell you more about your site’s real performance than any benchmark can.

If your conversion rate is 10x the industry average, it likely won’t make a massive dent in your bottom line if you improve your LCP score.

But if people aren’t staying long on your important pages, improving your LCP score could help boost your site’s performance. This, in turn, can lead to better results for your business.

How to Measure Your LCP Score

There are lots of tools you can use to measure your LCP. But you don’t want to just get your score.

You also want to learn these two things:

  • What your LCP element is
  • Which stage of your LCP is longest

Finding these two pieces of information is key for prioritizing which methods you should use to improve your LCP.

For example, you could spend hours minifying your code, inlining your CSS, and deferring JavaScript. But it won’t make much of a difference if your LCP element is a hero image you just haven’t optimized yet.

As for the stages, LCP is made up of four:

  1. Time to First Byte (TTFB)
  2. Resource load delay
  3. Resource load time
  4. Element render delay

Each stage is affected by different factors (and methods of optimization). So, if you can identify which stages of your LCP are taking the longest, you can prioritize your fixes accordingly.

Here are two ways to find this information.

Note: With many tools, you’ll get different LCP scores depending on whether you check the mobile or desktop version of your site. Optimizing for both helps improve your experience for all users.


Google PageSpeed Insights

Google’s PageSpeed Insights (PSI) is a popular choice if you want a simple, web-based report.

Just plug in your URL, and you’ll get a quick overview of your Core Web Vitals, including LCP.

Pagespeed Insights – Backlinko LCP

PSI is great if you’re not a big fan of digging around in complex dashboards. It gives you clear visuals and actionable tips without much fuss.

It also has a handy diagnostics section that tells you some of the main ways to reduce your LCP time. Just make sure you select the “LCP” option next to “Show audits relevant to.”

Pagespeed Insights – LCP Audits

Click the “Largest Contentful Paint element” option to see which element on that page is the LCP element.

Pagespeed Insights – LCP Element

It also shows you the breakdown (as a percentage) of each stage of your LCP. From the example above, you can see the vast majority (88%) of our LCP time comes from the render delay stage.

Knowing this lets us focus our efforts on the methods in the next section that specifically help reduce that stage of the LCP score.

Chrome DevTools

Chrome’s DevTools can give you detailed, real-time feedback on various aspects of your page’s performance.

It’s especially useful for testing changes on the fly, but it might feel a bit overwhelming if you’re completely new to web development.

Access it in Chrome on any webpage by right-clicking and selecting “Inspect.”

Backlinko post – Inspect

In the interface that appears, head to the “Performance” tab.

(You can select the three dots next to the cog icon and change where the dock goes—I find horizontal is best for analyzing LCP.)

This view shows your LCP score. If you hover over the “LCP element” underneath the score, you’ll see which part of the content is the largest contentful element.

Chrome – Performance LCP – By phase

Then, get a breakdown of the LCP stages by clicking the “Record and reload” button. This will run the performance checks again on the page, and you’ll see more information along with a waterfall chart.

Ignore that for now, and instead click the “LCP by phase” drop-down. This breaks the LCP down into its four constituent parts, showing the actual time for each stage along with a percentage.

Chrome – Performance LCP – By phase

As before, you can use this information to prioritize your optimization efforts and more effectively improve your LCP.
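
If you’re comfortable with a little JavaScript, you can also check LCP directly in the browser console using the web platform’s PerformanceObserver API. Here’s a minimal sketch that logs each LCP candidate the browser reports (run it early in the page load; property support can vary slightly by browser):

    <script>
      // Log each largest-contentful-paint candidate the browser reports
      const observer = new PerformanceObserver((list) => {
        for (const entry of list.getEntries()) {
          // entry.startTime is the time in ms; entry.element is the DOM node
          console.log('LCP candidate:', entry.startTime, entry.element);
        }
      });
      // buffered: true replays entries from before the observer started
      observer.observe({ type: 'largest-contentful-paint', buffered: true });
    </script>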

How to Improve Your LCP

You can improve your LCP in several ways, and some methods will help you more than others.

The table below sorts the methods by impact, also indicating the effort level each one requires and which stage of your LCP it’ll help reduce.

Method | Impact | Effort | Helps Reduce
Use a CDN | High | Low | Resource Load Time
Optimize Your Images | High | Medium | Resource Load Time
Use WordPress Plugins | High | Low | Potentially all stages
Implement Caching | High | Low | Time to First Byte
Use a Faster Host | High | Low | Time to First Byte
Minify Your Code | Medium | Low | Resource Load Time
Optimize Your Fonts | Medium | Medium | Resource Load Time
Avoid Lazy Loading Above the Fold | Medium | Low | Resource Load Delay
Remove Elements You Don’t Need | Medium | Medium | Element Render Delay
Use Defer/Async for JS | Medium | Medium | Element Render Delay
Inline Critical CSS | Medium | High | Element Render Delay
Switch to SSR | Medium | High | Element Render Delay
Preload Important Resources | Medium | Medium | Resource Load Delay

These values are all relative.

Your own skill level, your website’s setup, and your budget will affect how easy or cost-effective these changes are for you.

I’ve taken each method in isolation, as the relative impact of each fix may decrease as you implement each one.

For example, if you implement lots of these methods but don’t use a CDN, your LCP score will likely improve to the point that using a CDN might not make much difference to the score (although it may still improve your user experience).

Finally, a few of these might help reduce different stages of your LCP. As with every change you make to your website, there’s usually a bit of overlap in terms of what it’ll affect.

I’ll explain more of the nuances and who each fix is best suited to below.

Free Resource: Stay on top of your LCP optimization efforts with our free LCP progress tracker.



Use a CDN

Impact: High | Effort: Low | Helps Reduce: Resource Load Time

A Content Delivery Network (CDN) stores cached copies of your content on servers around the world. When people visit your site, they’re served files from the server closest to them.

That means faster load times for your users.

Implement a CDN

If you’re running a small local blog, you might not absolutely need a CDN. But if you have visitors from all over, a CDN can boost your LCP by reducing the travel time for your data.

This is most impactful for:

  • Websites with visitors from multiple regions
  • Sites with lots of large images or media files
  • Anyone wanting to improve global load times without lots of coding

How to Implement This

You can sign up for a CDN service like Cloudflare, KeyCDN, or StackPath. They’ll provide instructions for changing your domain’s settings to route traffic through their servers.

Once set up, the CDN will serve your website files to users from the server that’s physically located closest to them.

There are cheap and free options, but it can get expensive for larger sites with lots of traffic.

If you use WordPress or a similar content management system (CMS), there are often plugins that make the setup process even smoother.


Optimize Your Images

Impact: High | Effort: Medium | Helps Reduce: Resource Load Time

Large image files are a common reason for poor LCP scores. This is especially true if you use a large hero image at the top of your pages or blog posts.

By compressing images before uploading them, you reduce their file size to make them load faster.

This is most impactful for:

  • Sites with lots of large product or blog images
  • Photographers or ecommerce stores with high-res visuals
  • Anyone looking for a straightforward way to speed up load times

How to Implement This

You can optimize your images using online tools, and there are lots of free options. Or you can use plugins that auto-compress images when you upload them to your content management system.

Squoosh is a free tool that lets you tweak the optimization settings, choose a format to convert to, and resize the image:

Squoosh – Editor

To do this in bulk, you can also use a tool like TinyPNG:

TinyPNG – Homepage

Just keep an eye on quality—if you compress too much, your images might look blurry. But most of the time, you can shrink them a lot without anyone noticing.
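
If you manage your HTML directly, a hedged sketch of an optimized hero image might look like this: modern formats with a fallback, plus explicit dimensions so the browser can reserve space (the file names are placeholders):

    <picture>
      <!-- Serve smaller, modern formats where the browser supports them -->
      <source srcset="hero.avif" type="image/avif">
      <source srcset="hero.webp" type="image/webp">
      <!-- Fallback JPEG; width and height let the browser reserve space -->
      <img src="hero.jpg" alt="Hero image" width="1200" height="630">
    </picture>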

Pro tip: Beyond images, it’s usually best to avoid having a video above the fold. This can lead to poor LCP scores.



Use WordPress Plugins

Impact: High | Effort: Low | Helps Reduce: Potentially all stages

For many WordPress users, plugins are the easiest way to speed up your site and fix LCP issues with minimal effort. They can handle image optimization, caching, code minification, and more—all from a simple dashboard.

The caveat is that the best ones aren’t always free, so you’re often paying for convenience. But there are still some free options out there.

Another downside is the risk of plugin “bloat,” which can slow your site if you install too many or choose poorly optimized ones.

Compatibility issues may also pop up, especially if you try to use multiple optimization plugins at one time.

But as long as you don’t have hundreds of plugins, and check for compatibility, I find the benefits typically outweigh the downsides here.

Note: If you use a different CMS, like Shopify, there are likely apps or add-ons that can help with your LCP score.

Shopify app store – Page speed


This is most impactful for:

  • WordPress users without technical know-how
  • Anyone who wants a quick fix for multiple performance issues
  • Those willing to spend a bit of money to solve a lot of issues at once (although there are free options)

How to Implement This

There are lots of WordPress plugins that are great for improving your LCP in particular, and your page speed in general.

One example is WP Rocket. It’s a paid WordPress optimization plugin that does a lot of the things on this list for you.

Including:

  • Image optimization
  • Code minification
  • Preloading/prefetching resources
  • CDN implementation
  • Caching

WP Rocket – File Optimization

There are lots of customization options, making this plugin a quick and fairly easy way to improve your LCP.

Autoptimize is a free WordPress plugin that does a lot of the same things as WP Rocket.

It does lack a few features, like generating critical CSS and caching. But it’s a good starting point for beginners on a budget with a WordPress site.

Autoptimize – WP Plugin


Implement Caching

Impact: High | Effort: Low | Helps Reduce: Time to First Byte

Caching stores parts of your site on your user’s browser so it doesn’t have to request them from scratch every time they visit the site.

Activate browser caching

This can speed up your LCP because your server won’t need to work as hard to deliver the key page elements the next time the user visits.

Caching makes load time much quicker for second visits

Many hosting providers include caching by default.

You can also install plugins that handle caching for you.

This is most impactful for:

  • Sites with repeat visitors (e.g., blogs, online magazines)
  • Websites on platforms that generate pages dynamically (like WordPress)
  • Sites experiencing slow server response times

How to Implement This

If your host offers caching, enable it in your hosting dashboard. Otherwise, consider a caching plugin.

If you use a CDN, it already relies on caching to serve your content to users with faster load times.

Note: You only need to use one effective caching setup or plugin at a time. Using multiple can lead to no performance improvements at best, and various compatibility issues at worst.



Use a Faster Web Host

Impact: High | Effort: Low | Helps Reduce: Time to First Byte

Switching to a more powerful hosting plan or provider can make a big difference in how quickly your site’s main content loads.

That’s because your web host’s speed is going to have the largest impact on your Time to First Byte.

How does web hosting work?

This is often the simplest route if you don’t want to tinker with technical details. However, premium hosting can be expensive.

If you have a small site or a tight budget, you might find it hard to justify the cost for LCP gains alone. But for large businesses or sites that generate a lot of revenue, investing in better hosting can pay off.

Note: This is also unlikely to put a dent in your LCP if your host is already pretty quick. I’d generally only recommend considering this option if your Time to First Byte is exceptionally long. Or if you’re noticing other performance issues or extended periods of website downtime.


This is most impactful for:

  • High-traffic sites that need consistent speed
  • Businesses with a budget to invest in premium hosting
  • Sites that have outgrown their current hosting plan

How to Implement This

When upgrading your web host, look for:

  • Reliable uptime
  • Scalability
  • Good support
  • Security features
  • Robust backup options

Migrating your site can be as simple as using a migration plugin if you’re on WordPress, or asking your new host for help.

It’s usually fairly straightforward if you’re staying with your current host and just upgrading your plan. But moving hosts can be a little more effort-intensive.


Minify Your Code

Impact: Medium | Effort: Low | Helps Reduce: Resource Load Time

Minifying code involves stripping out anything “unnecessary,” like extra spaces or new lines, from your site’s HTML, CSS, and JavaScript files. This makes them smaller and faster to load.

Minifying code to reduce load times
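
To make the idea concrete, here’s a tiny illustrative before-and-after for a single CSS rule (the selector and values are made up):

    <style>
      /* Before minification: formatted for humans */
      .hero-title {
        font-size: 2rem;
        color: #222222;
      }
      /* After minification: the same rule in fewer bytes */
      .hero-title{font-size:2rem;color:#222}
    </style>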

If you’re not a developer, you can still do this using tools or plugins that automate the process (like WP Rocket mentioned above).

Just be sure to back up your site or test it in a staging environment. Sometimes, minification can cause layout or script issues.

This is most impactful for:

  • Sites with lots of CSS and JavaScript files
  • Pages that rely on multiple libraries or frameworks

How to Implement This

You can minify your code with free tools like Minifier:

Minifier – Homepage

If you use a CMS like WordPress, use plugins (e.g., WP Rocket, Autoptimize) that automatically shrink your CSS, JS, and HTML.

Here’s how it looks in the “File Optimization” screen of WP Rocket:

WP Rocket – Minify

Most plugins let you choose which files to minify, so if you see any issues, uncheck or exclude the problematic file and test again.

Alternatively, reach out to a developer to help with this instead.


Optimize Your Fonts

Impact: Medium | Effort: Medium | Helps Reduce: Resource Load Time

Fancy fonts can look great, but they can also slow down your page.

Custom fonts often have to be downloaded from a separate server. If you optimize or host them locally, you reduce delays that stop your text (like big headlines) from being visible.

You do want to maintain your site’s style, so it’s a balance between looking good and loading fast. Some sites solve this by using system fonts that don’t need extra downloads.

This is most impactful for:

  • Sites using multiple custom fonts or large font families
  • Design-heavy pages with fancy typography
  • Anyone noticing a “flash of invisible text” when pages load

How to Implement This

Hosting fonts locally is often faster than pulling them from external servers. If you use Google Fonts, you can download them and serve them from your own domain.
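
As a rough sketch, a locally hosted font might be declared like this (the font name and path are placeholders; font-display: swap tells the browser to show fallback text while the font downloads):

    <style>
      @font-face {
        font-family: "BrandSans"; /* placeholder name */
        src: url("/fonts/brandsans.woff2") format("woff2");
        font-weight: 400;
        font-display: swap; /* avoids a flash of invisible text */
      }
    </style>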

But honestly, this just won’t be necessary for most site owners. While it might reduce your LCP, it’s unlikely to be a massive gain and may not be worth the effort.

Alternatively, let a plugin handle font optimization for you. Minimize the number of font weights you use—if you only need bold and regular, don’t load the entire family.


Avoid Lazy Loading Above the Fold

Impact: Medium | Effort: Low | Helps Reduce: Resource Load Delay

Lazy loading is a feature that only loads images when you scroll down to them. In other words, images only load when they’re in the user’s “viewport” (on their screen).

Lazy loading images to improve page speed

It’s great for boosting page load time, and is typically regarded as a best practice for fast websites.

But if you lazy load images that are right at the top of your page, your visitors will see a blank space before anything else pops in. That can really hurt your LCP.

The idea behind lazy loading is to not load images the user doesn’t need to see yet. But when it’s the first image you want a user to see as soon as they land on your page, clearly you don’t want to delay loading at all.

So, it’s usually best to load above-the-fold content right away, then lazy load what’s below.

This is most impactful for:

  • Sites that lazy load everything by default
  • Above-the-fold areas with key images or banners
  • Pages where the main header image is crucial for user engagement

How to Implement This

Many lazy-loading tools let you exclude certain images. Find the settings or plugin option that specifies “above the fold” or “first contentful paint” images, and disable lazy loading for those.

In WP Rocket, you do that in the “Media” area:

WP Rocket – LazyLoad

If you’re not using a CMS like WordPress, just make sure the LCP image’s HTML looks like this, with either loading="eager" or no loading attribute (“eager” is the default):

Image loading – Eager

Rather than like this, with the loading="lazy" attribute:

Image loading – Lazy
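
In plain HTML, the difference between the two is just the loading attribute (the file names are placeholders):

    <!-- Above the fold: load immediately ("eager" is also the default) -->
    <img src="hero.jpg" alt="Hero" loading="eager">

    <!-- Below the fold: safe to lazy load -->
    <img src="footer-banner.jpg" alt="Banner" loading="lazy">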


Remove Elements You Don’t Need

Impact: Medium | Effort: Medium | Helps Reduce: Element Render Delay

Every script, image, or widget on your site adds to the time it takes for your page to fully load. So you need to think carefully about what appears above the fold.

If there’s a huge banner, multiple images, or extra code that doesn’t add real value, consider removing it or placing it below the fold.

Just make sure you don’t strip away elements that are crucial for your users or your brand message.

This is most impactful for:

  • Content-heavy sites filled with widgets or ads
  • Homepages stuffed with multiple banners, slideshows, or animations
  • Anyone looking to simplify their design without sacrificing core features

How to Implement This

Audit your site’s above-the-fold area and ask, “Does this element help my user right away?”

If not, move it below the fold (or remove it entirely).

Think about collapsing large sign-up forms or extra images.

Removing unnecessary scripts, like old tracking codes, can also help. To pinpoint snippets you might want to remove, look out for the “Reduce unused JavaScript” opportunity in PageSpeed Insights:

Pagespeed Insights – LCP Reduce JavaScript


Use Defer/Async for JS

Impact: Medium | Effort: Medium | Helps Reduce: Element Render Delay

JavaScript files can block the rendering of your page if they load first. By deferring or asynchronously loading scripts, you let your main content appear before any heavy scripts run.

This helps your LCP because the biggest chunk of your page shows up without waiting for all your JS to finish loading.

The main reason you’ll likely want to look into async and defer is if the tool you’re measuring your LCP with says you have render-blocking resources.

Like this:

Pagespeed Insights – Eliminate render blocking resources

Basically, without any attributes, the browser will attempt to download and then execute your JavaScript as it encounters it. This can lead to slower load times, and longer LCP times if it blocks the LCP element from loading.

With async, it won’t pause parsing (breaking down and analyzing) of the HTML during the download stage. But it still pauses as the script executes after downloading.

With defer, the browser doesn’t pause HTML parsing for the download or execution of your JavaScript. This can lead to lower LCP scores, but it means your JavaScript won’t execute until the browser has finished parsing the HTML.

Async vs defer JavaScript

You might need a developer’s help if you’re not sure which scripts to defer or load asynchronously, or how to do it.

Some optimization plugins for platforms like WordPress can also handle this for you.

This is most impactful for:

  • Sites that rely on several JavaScript libraries
  • Pages slowed down by loading scripts too early
  • Website owners looking for a middle-ground solution without full SSR (more on that below)

How to Implement This

If you’re on WordPress, look for an optimization plugin that includes deferring or async-loading scripts.

In custom setups, you’d add attributes like “defer” or “async” to your script tags in the HTML.

Script – Defer type
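
For reference, both variants are a single attribute on the script tag (the file paths are placeholders):

    <!-- defer: download in parallel, execute after HTML parsing finishes -->
    <script defer src="/js/main.js"></script>

    <!-- async: download in parallel, execute as soon as it arrives -->
    <script async src="/js/analytics.js"></script>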

Just make sure you don’t delay any critical scripts (like core functionality) too much.


Inline Critical CSS

Impact: Medium | Effort: High | Helps Reduce: Element Render Delay

Inlining CSS means putting small blocks of CSS code right into your HTML, so your page doesn’t need to fetch a separate file for that part.

It can speed up how quickly your main elements appear. But you can’t inline everything, or you’d end up with a massive HTML file that defeats the purpose.

This method can be helpful for critical (above-the-fold) styles, but it shouldn’t replace your entire stylesheet.

Plus, the Chrome team’s official guidance on this is that:

“In general, inlining your style sheet is only recommended if your style sheet is small since inlined content in the HTML cannot benefit from caching in subsequent page loads. If a style sheet is so large that it takes longer to load than the LCP resource, then it’s unlikely to be a good candidate for inlining.”


This is most impactful for:

  • Sites with a small amount of critical CSS for the header area
  • Minimalist designs that don’t rely on big external stylesheets
  • Anyone looking to shave off small load delays

How to Implement This

Identify the essential CSS you need to style your page’s top section, and place it directly in the HTML <head>. This can reduce the time it takes to render the crucial above-the-fold part.

Keep the rest of your CSS in external files to avoid bloating your HTML. Some performance plugins can automate this “critical CSS” approach for you.
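
Here’s a minimal sketch of that structure (the styles and file name are placeholders):

    <head>
      <!-- Critical above-the-fold styles, inlined for the first paint -->
      <style>
        header { background: #ffffff; }
        .hero-title { font-size: 2rem; }
      </style>
      <!-- The rest of the CSS stays external so it can be cached -->
      <link rel="stylesheet" href="/css/main.css">
    </head>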

Autoptimize offers an affordable option, and the feature is also baked into plugins like NitroPack and WP Rocket.

While there are also dedicated critical CSS plugins, I’d generally recommend going for a more feature-rich option for a bit of extra money (if you have the budget). You’ll typically get more value than spending $10 a month on one feature that may have limited impact on your LCP.


Switch to SSR

Impact: Medium | Effort: High | Helps Reduce: Element Render Delay

CSR (Client-Side Rendering) means your user’s browser does a lot of the work to build the page.

SSR (Server-Side Rendering) means most of the work happens before the page hits the user’s browser.

SSR can help LCP for sites heavy in JavaScript, because the biggest content is already “pre-built” for the user. But switching from CSR to SSR can be a big project if you’re not familiar with it.

For some sites, it’s overkill. For others, it’s the key to big performance gains.

Server side vs client side rendering

This is one method where you really need to weigh up the benefits and how they might apply to your specific situation:

  • Run a fairly standard blog, service website, or ecommerce store? Switching to SSR might bring noticeable performance gains.
  • Got a highly interactive web app? You might want to stick with CSR for a better user experience.

Generally, if you combine SSR with other methods like caching and a CDN, the performance benefits will outweigh the potential increase in server load.

This is most impactful for:

  • JavaScript-heavy web apps (e.g., React, Vue)
  • Sites noticing a significant delay before content appears
  • Advanced users or teams that can handle more complex architecture

How to Implement This

Switching from Client-Side Rendering to Server-Side Rendering (or a hybrid approach) typically involves using frameworks (like Next.js for React) that pre-render your content on the server.

This can speed up LCP since the browser receives a ready-made page. However, it’s a bigger project requiring code changes and a good understanding of your tech stack.

If you’re not comfortable with that, you might need to hire a developer or agency.


Preload Important Resources

Impact: Medium | Effort: Medium | Helps Reduce: Resource Load Delay

Preloading tells the browser which files it should grab or prepare in advance.

It can shave off a bit of loading time and help your main content appear slightly faster. For many small sites, these optimizations won’t create dramatic changes.

But on bigger sites or those with lots of images and unique fonts, it can make a difference.

This is most impactful for:

  • Sites that rely on off-site resources (e.g., fonts or images)
  • Those comfortable editing HTML headers or using plugins that can do this at scale

How to Implement This

You can preload fonts and images by adding special link tags in your site’s <head>. They tell the browser to grab or prepare certain resources before they’re actually needed.

You simply add rel="preload" to the <link> tag. Like this:

Link rel – Preload
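
For example, preloading a hero image and a font might look like this (the paths are placeholders; note that font preloads require the crossorigin attribute even when the font is self-hosted):

    <head>
      <!-- Preload the LCP hero image so the download starts right away -->
      <link rel="preload" href="/images/hero.webp" as="image">

      <!-- Preload a critical font; crossorigin is required for fonts -->
      <link rel="preload" href="/fonts/brandsans.woff2" as="font" type="font/woff2" crossorigin>
    </head>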

How much effort this requires depends on your specific setup and how many pages you want to deploy it on. But it’s a fairly simple process that can help reduce your LCP score.

Note: As with a lot of the other methods on this list, WordPress plugins can help here too.


Boost Your Rankings by Improving Your Page Experience

Improving your LCP is one way to boost your overall page experience for users.

In turn, this can actually end up having an impact on your rankings beyond Google’s page experience signals.

Check out our guide to user behavior and SEO to learn how the way your users behave on your website could potentially impact how Google ranks your site.


(It makes optimizing for factors like LCP and the other Core Web Vitals A LOT more important.)

The post How to Improve Largest Contentful Paint (LCP) in Under an Hour appeared first on Backlinko.
