Posts

Localized SEO for LLMs: How Best Practices Have Evolved

Large language models (LLMs) like ChatGPT, Perplexity, and Google’s AI Overviews are changing how people find local businesses. These systems don’t just crawl your website the way search engines do. They interpret language, infer meaning, and piece together your brand’s identity across the entire web. If your local visibility feels unstable, this shift is one of the biggest reasons.

Traditional local SEO tactics like Google Business Profile optimization, NAP consistency, and review generation still matter. But now you’re also optimizing for models that need richer context and more structured information. If those elements aren’t in place, you fade from LLM-generated answers even if your rankings look fine. When you’re focused on a smaller local audience, you need to know exactly what to prioritize.

Key Takeaways

  • LLMs reshape how local results appear by pulling from entities, schema, and high-trust signals, not just rankings.
  • Consistent information across the web gives AI models confidence when choosing which businesses to include in their answers.
  • Reviews, citations, structured data, and natural-language content help LLMs understand what you do and who you serve.
  • Traditional local SEO still drives visibility, but AI requires deeper clarity and stronger contextual signals.
  • Improving your entity strength helps you appear more often in both organic search and AI-generated summaries.

How LLMs Impact Local Search

Traditional local search results present options: maps, listings, and organic rankings. 

Search results for "Mechanic near Milwaukee."

LLMs don’t simply list choices. They generate an answer based on the clearest, strongest signals available. If your business isn’t sending those signals consistently, you don’t get included.

An AI overview for "Where can I find a good mechanic near Milwaukee?"

If your business information is inconsistent and your content is vague, the model is less likely to confidently associate you with a given search. That hurts visibility, even if your traditional rankings haven’t changed. As you can see above, the AI-generated response is often the first thing a searcher sees in Google, ahead of any organic listing. This doesn’t even account for the growing number of users turning to LLMs like ChatGPT directly to answer their queries, never using Google at all.

How LLMs Process Local Intent

LLMs don’t use the same proximity-driven weighting as Google’s local algorithm. They infer local relevance from patterns in language and structured signals.

They look for:

  • Reviews that mention service areas, neighborhoods, and staff names
  • Schema markup that defines your business type and location
  • Local mentions across directories, social platforms, and news sites
  • Content that addresses questions in a city-specific or neighborhood-specific way

If customers mention that you serve a specific district, region, or neighborhood, LLMs absorb that. If your structured data includes service areas or specific location attributes, LLMs factor that in. If your content references local problems or conditions tied to your field, LLMs use those cues to understand where you fit. 

This is important because LLMs don’t use GPS or IP location at the moment of search the way Google does. They rely on explicit location mentions and conversational context, supplemented at best by a rough, IP-derived location from the app, so their results are far less proximity-precise for the searcher.

These systems treat structured data as a source of truth. When it’s missing or incomplete, the model fills the gaps and often chooses competitors with stronger signals.

Why Local SEO Still Matters in an AI-Driven World of Search

Local SEO is still foundational. LLMs still need data from Google Business Profiles, reviews, NAP citations, and on-site content to understand your business. 

NAP info from the Better Business Bureau.

These elements supply the contextual foundation that AI relies on.

The biggest difference is the level of consistency required. If your business description changes across platforms or your NAP details don’t match, AI models sense uncertainty. And uncertainty keeps you out of high-value generative answers. If a user asks an LLM a more specific branded query about you, a lack of detail may mean the model serves outdated or incorrect information about your business.

Local SEO gives you structure and stability. AI gives you new visibility opportunities. Both matter now, and both improve each other when done right.

Best Practices for Localized SEO for LLMs

To strengthen your visibility in both search engines and AI-generated results, your strategy has to support clarity, context, and entity-level consistency. These best practices help LLMs understand who you are and where you belong in local conversations.

Focus on Specific Audience Needs For Your Target Areas

Generic local pages aren’t as effective as they used to be. LLMs prefer businesses that demonstrate real understanding of the communities they serve.

Write content that reflects:

  • Neighborhood-specific issues
  • Local climate or seasonal challenges
  • Regulations or processes unique to your region
  • Cultural or demographic details

If you’re a roofing company in Phoenix, talk about extreme heat and tile-roof repair. If you’re a dentist in Chicago, reference neighborhood landmarks and common questions patients in that area ask.

The more local and grounded your content feels, the easier it is for AI models to match your business to real local intent.

Phrase and Structure Content In Ways Easy For LLMs to Parse

LLMs work best with content that is structured clearly. That includes:

  • Straightforward headers
  • Short sections
  • Natural-language FAQs
  • Sentences that mirror how people ask questions

Consumers type full questions, so answer full questions.

Instead of writing “Austin HVAC services,” address:
“What’s the fastest way to fix an AC unit that stops working in Austin’s summer heat?”

Google results for "What's the fastest way to fix an AC unit thtat stops working in Austin's summer heat?"

LLMs understand and reuse content that leans into conversational patterns. The more your structure supports extraction, the more likely the model is to include your business in summaries.
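
One optional way to reinforce that question-and-answer structure for machines is FAQPage markup. The snippet below is a minimal, hypothetical sketch built around the Austin example; the question and answer text are placeholders you would swap for your own copy.

{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What’s the fastest way to fix an AC unit that stops working in Austin’s summer heat?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Check the breaker and thermostat first, then call a licensed Austin HVAC technician for same-day service."
      }
    }
  ]
}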

Emphasize Your Localized E-E-A-T Markers

LLMs evaluate credibility through experience, expertise, authority, and trust signals, just as humans do.

Strengthen your E-E-A-T through:

  • Case details tied to real neighborhoods
  • Expert commentary from team members
  • Author bios that reflect credentials
  • Community involvement or partnerships
  • Reviews that speak to specific outcomes

LLMs treat these details as proof you know what you’re talking about. When they appear consistently across your web presence, your business feels more trustworthy to AI and more likely to be recommended.

Use Entity-Based Markup

Schema markup is one of the clearest ways to communicate your identity to AI. LocalBusiness schema, service area definitions, department structures, product or service attributes—all of it helps LLMs recognize your entity as distinct and legitimate.

An example of schema markup.


The more complete your markup is, the stronger your entity becomes. And strong entities show up more often in AI answers.
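
For reference, here is a minimal sketch of LocalBusiness markup in JSON-LD with a service area defined. Every business detail below is a placeholder; you would expand it with your own departments, attributes, and profile links.

{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "url": "https://www.example.com",
  "telephone": "+1-512-555-0134",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Austin",
    "addressRegion": "TX",
    "postalCode": "78701",
    "addressCountry": "US"
  },
  "areaServed": [
    { "@type": "City", "name": "Austin" },
    { "@type": "City", "name": "Round Rock" }
  ],
  "sameAs": [
    "https://www.facebook.com/exampleplumbing",
    "https://www.yelp.com/biz/example-plumbing-austin"
  ]
}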

Spread and Standardize Your Brand Presence Online

LLMs analyze your entire digital footprint, not just your site. They compare how consistently your brand appears across:

  • Social platforms
  • Industry directories
  • Local organizations
  • Review sites
  • News or community publications

If your name, address, phone number, hours, or business description differ between platforms, AI detects inconsistency and becomes less confident referencing you. It’s also important to keep more subjective factors, like your brand voice and value propositions, consistent across these platforms.

One thing you may not be aware of: ChatGPT draws on Bing’s index, so Bing Places is one area where you should prioritize building your presence. Your visibility in ChatGPT won’t necessarily mirror how Bing displays results in its own search engine, but it uses that underlying data. Platforms like Apple Maps, Google Maps, and Waze are also priorities for getting your NAP info listed.

Standardization builds authority. Authority increases visibility.

Use Localized Content Styles Like Comparison Guides and FAQs

LLMs excel at interpreting content formats that break complex ideas into digestible pieces.

Comparison guides, cost breakdowns, neighborhood-specific FAQs, and troubleshooting explainers all translate extremely well into AI-generated answers. These formats help the model understand your business with precision.

A comparison between two plumbing services.

If your content mirrors the structure of how people search, AI can more easily extract, reuse, and reference your insights.

Internal Linking Still Matters

Internal linking builds clarity, something AI depends on. It shows which concepts relate to each other and which topics matter most.

Connect:

  • Service pages to related location pages
  • Blog posts to the services they support
  • Local FAQs to broader category content

Strong internal linking helps LLMs follow the path of your expertise and understand your authority in context.

Tracking Results in the LLM Era

Rankings matter, but they no longer tell the full story. To understand your AI visibility, track:

  • Branded search growth
  • Google Search Console impressions
  • Referral traffic from AI tools (see the sketch after this list)
  • Increases in unlinked brand mentions
  • Review volume and review language trends
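
If you want a rough, do-it-yourself view of that AI referral traffic before investing in a tool, the sketch below shows one way to bucket it. It assumes you have exported a sessions-by-source report (for example, from GA4) to a CSV with source and sessions columns; the file name, column names, and list of AI referrer domains are all assumptions to adapt.

import pandas as pd

# Hypothetical export: one row per traffic source with a session count
df = pd.read_csv("sessions_by_source.csv")  # assumed columns: source, sessions

# Example referrer domains for AI assistants; extend as needed
ai_referrers = ["chatgpt.com", "perplexity.ai", "gemini.google.com", "copilot.microsoft.com"]

df["is_ai_referral"] = df["source"].apply(
    lambda s: any(domain in str(s).lower() for domain in ai_referrers)
)

# Compare AI-driven sessions against everything else
print(df.groupby("is_ai_referral")["sessions"].sum())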

This is easier with the advent of dedicated AI visibility tools like Profound. 

The Profound Interface.

The goal here is to have a method to reveal whether LLMs are pulling your business into their summaries, even when clicks don’t occur.

As zero-click results grow, these new metrics become essential.

FAQs

What is local SEO for LLMs?

It’s the process of optimizing your business so LLMs can recognize and surface you for local queries.

How do I optimize my listings for AI-generated results?

Start with accurate NAP data, strong schema, and content written in natural language that reflects how locals ask questions.

What signals do LLMs use to determine local relevance?

Entities, schema markup, citations, review language, and contextual signals such as landmarks or neighborhoods.

Do reviews impact LLM-driven searches?

Yes. The language inside reviews helps AI understand your services and your location.

Conclusion

LLMs are rewriting the rules of local discovery, but strong local SEO still supplies the signals these models depend on. When your entity is clear, your citations are consistent, and your content reflects the real needs of your community, AI systems can understand your business with confidence.

These same principles sit at the core of both effective LLM SEO and modern local SEO strategy. When you strengthen your entity, refine your citations, and create content grounded in real local intent, you improve your visibility everywhere—organic rankings, map results, and AI-generated answers alike.


AI search is growing, but SEO fundamentals still drive most traffic


Generative AI is everywhere right now. It dominates conference agendas, fills LinkedIn feeds, and is reshaping how many businesses think about organic search. 

Brands are racing to optimize for AI Overviews, build vector embeddings, map semantic clusters, and rework content models around LLMs.

What gets far less attention is a basic reality: for most websites, AI platforms still drive a small share of overall traffic. 

AI search is growing, no question. 

But in most cases, total referral sessions from all LLM platforms combined amount to only about 2% to 3% of the organic traffic Google alone delivers.

AI referral sessions vs Google organic clicks

Despite that gap, many teams are spending more time chasing AI strategies than fixing simple, high-impact SEO fundamentals that continue to drive measurable results. 

Instead of improving what matters most today, they are overinvesting in the future while underperforming in the present.

This article examines how a narrow focus on AI can obscure proven SEO tactics and highlights practical examples and real-world data showing how those fundamentals still move the needle today.

1. Quick SEO wins are still delivering outsized gains

In an era where everyone is obsessed with things like vector embeddings and semantic relationships, it’s easy to forget that small updates can have a big impact. 

For example, title tags are still one of the simplest and most effective SEO levers to pull. 

And they are often one of the on-page elements that most websites get wrong, either by targeting the wrong keywords, not including variations, or targeting nothing at all.

Just a few weeks ago, a client saw a win by simply adding “& [keyword]” to the existing title tag on their homepage. Nothing else was changed.

Keyword rankings shot up, as did clicks and impressions for queries containing that keyword.

Results - Updating existing title tags
Results - Updating existing title tags Oct-Nov 2025

This was all achieved simply by changing the title tag on one page. 

Couple that with other tactics, such as on-page copy edits, internal linking, and backlinking across multiple pages, and growth will continue. 

It may seem basic, but it still works. 

And if you only focus on advanced GEO strategies, you may overlook simple tactics that provide immediate, observable impact. 

2. Content freshness and authority still matter for competitive keywords

Another tactic that has faded from view with the rise of AI is what’s often called the skyscraper technique. 

It involves identifying a set of keywords and the pages that already rank for them, then publishing a materially stronger version designed to outperform the existing results.

It’s true that the web is saturated with content on similar topics, especially for keywords visible in most research tools.

But when a site has sufficient authority, a clear right to win, and content freshness, this approach can still be highly effective.

I’ve seen this work repeatedly. 

Here’s Google Search Console data from a recent article we published for a client on a popular, long-standing topic with many competing pages already ranking. 

The post climbed to No. 2 almost immediately and began generating net-new clicks and impressions.

Results - Skyscraper content

Why did it work? 

The site has strong authority, and much of the content ranking ahead of it was outdated and stale.

If you’re hesitant to publish the thousandth article on an established topic, that hesitation is understandable. 

This approach won’t work for every site. But ignoring it entirely can mean passing up clear, high-confidence wins like these.



3. User experience remains a critical conversion lever

Hype around AI-driven shopping experiences has led some teams to believe traditional website optimization is becoming obsolete. 

There is a growing assumption that AI assistants will soon handle most interactions or that users will convert directly within AI platforms without ever reaching a website.

Some of that future is beginning to take shape, particularly for ecommerce brands experimenting with features like Instant Checkout in ChatGPT.

But many websites are not selling products. 

And even for those that are, most brands still receive a significant volume of traffic from traditional search and continue to rely on calls to action and on-page signals to drive conversions.

It also makes little difference how a user arrives – via organic search, paid search, AI referrals, or direct visits. 

A fast site, a strong user experience, and a clear conversion funnel remain essential.

There are also clear performance gains tied to optimizing these elements. 

Here are the results we recently achieved for a client following a simple CTR test:

Results - CTR test

Brands that continue to invest in user experience and conversion rate optimization will outperform those that do not. 

That gap is likely to widen the longer teams wait for AI to fully replace the conversion funnel.

AI is reshaping search, but what works still matters

There is no dispute that AI is reshaping the search landscape. 

It’s changing user behavior, influencing SERPs, and complicating attribution models. 

The bigger risk for many businesses, however, is not underestimating AI but overcorrecting for it.

Traditional organic search remains the primary traffic source for most websites, and SEO fundamentals still deliver when executed well. 

  • Quick wins are real. 
  • Higher-quality content continues to be rewarded. 
  • User experience optimization shows no signs of becoming irrelevant. 

These are just a few examples of tactics that remain effective today.

Importantly, these efforts do not operate in isolation. 

Improving a website’s fundamentals can strengthen organic visibility while also supporting paid search performance and LLM visibility.

Staying informed about AI developments and planning for what’s ahead is essential. 

It should not come at the expense of the strategies that are currently driving measurable growth.


Google expands Performance Max channel reporting to MCCs


Google appears to be rolling out the Performance Max Channel Performance report at the MCC level, giving agencies and large advertisers a long-awaited view of channel-level performance across multiple accounts.

What’s new: The Channel Performance report, previously limited to individual accounts, is now surfacing in some manager (MCC) accounts. Google had previously confirmed the feature was coming, but this marks one of the first confirmed sightings in live environments.

Why we care. MCC-level visibility allows agencies to analyze how Performance Max allocates spend and drives results across channels—Search, Display, YouTube, Discover, Gmail, and Shopping—without logging into each account individually. That’s a major efficiency gain for teams managing large portfolios.

What to watch. When and how quickly the feature becomes available across all MCCs, and whether Google expands the report with deeper metrics or export options.

First seen. This update was first picked up by Mike Ryan, head of Ecommerce Insights at Smarter Ecommerce, who recently published a guide on How to use Google’s Channel Performance reports.

Bottom line. MCC-level Channel Performance reporting signals another step toward making Performance Max less of a black box—especially for agencies that need cross-account insight at scale.


Why Google is deleting reviews at record levels


In 2025, Google is removing reviews at unprecedented rates – and it is not accidental.

Our industry analysis of 60,000 Google Business Profiles shows that deletions are being driven by a mix of:

  • Automated moderation.
  • Industry-wide risk factors.
  • Increased enforcement against incentivized reviews.
  • Local regulatory pressure.

Together, these forces have significant implications for businesses and local search visibility.

Review deletions are rising globally

Weekly deleted reviews - Jan to Jul 2025

Data collected from tens of thousands of Google Business Profile listings across multiple countries by GMBapi.com show a sharp increase in deleted reviews between January and July 2025. 

The surge began accelerating toward the end of Q1 and gained momentum mid-year, with a growing share of monitored locations experiencing at least one review removal in a given week.

This is not limited to negative feedback. 

While one-star reviews continue to be taken down, five-star reviews now account for a sizable share of deletions. 

That pattern suggests Google is applying stricter enforcement, including on positive reviews, as it works to maintain authenticity and trust. 

More recently, Google has begun asking members of its Local Guide community whether businesses are incentivizing reviews, likely in response to AI-driven flags for suspicious activity.

Dig deeper: Google’s review deletions: Why 5-star reviews are disappearing

Not all industries are treated the same

Review deletion patterns vary significantly by business category.

Restaurants account for the highest volume of deleted reviews, followed by home services, brick-and-mortar retail, and construction. 

These categories generate large volumes of reviews, and removals occur across both recent and older submissions. 

That distribution points to ongoing enforcement, not isolated cleanup efforts.

By contrast, medical services, beauty, and professional services see fewer deletions overall. 

However, closer analysis reveals distinct and consistent patterns within those categories.

What review ratings reveal about industry bias

Top 10 meta categories: Deleted review rating mix

Looking at deleted reviews as a share of total removals within each category reveals distinct moderation patterns.

In restaurants and general retail, deleted reviews are relatively evenly distributed across one- to five-star ratings. 

By contrast, medical services and home services show a strong skew toward five-star review deletions, with far fewer removals in the middle of the rating spectrum. 

That imbalance suggests positive reviews in higher-risk or regulated categories face closer scrutiny, likely tied to concerns around trust, safety, and compliance.

These differences do not appear to stem from manual, category-specific policy decisions. 

Instead, they reflect how Google’s automated systems adjust enforcement based on perceived industry risk.

Dig deeper: 7 local SEO wins you get from keyword-rich Google reviews



Timing matters: Early vs. retroactive deletions

The age of a review plays a significant role in when it is removed.

In medical and home services, a large share of deleted reviews disappear within the first six months after posting. 

That timing points to early intervention by automated systems evaluating language, reviewer behavior, and other risk signals.

Restaurants and brick-and-mortar retail show a different pattern. 

Many deleted reviews in these categories are more than two years old, suggesting retroactive enforcement as detection systems improve or new suspicious patterns emerge. 

It may also reflect efforts to refresh older review profiles.

For businesses, this means reviews can disappear long after they are posted, often without warning.

Geography adds further complexity

Industry alone does not tell the full story. Location matters.

Top 10 meta categories by deleted reviews (stacked by rating)

In English-speaking markets such as the U.S., UK, Canada, and Australia, deleted reviews skew heavily toward five-star ratings. 

That trend aligns with increased AI-driven moderation aimed at reducing review spam and incentivized positive feedback.

Germany stands apart. 

Analysis of thousands of German business listings shows a higher share of deleted reviews are low-rated, and most are removed within weeks of posting. 

This pattern aligns with Germany’s strict defamation laws, which permit businesses to legally challenge negative reviews and require platforms to take prompt action upon notification.

In short:

  • AI-driven enforcement dominates in many English-speaking markets.
  • Legal takedowns play a much larger role in Germany.

What this means for local SEO and small business owners

The rise in review deletions creates two primary challenges.

  • Trust erosion: When legitimate reviews, whether positive or negative, disappear without explanation, confidence in review platforms begins to weaken.
  • Data distortion: Deleted reviews affect star ratings, performance benchmarks, and conversion signals that businesses rely on for local SEO and reputation management.

For SEO practitioners, small businesses, and multi-location brands, review monitoring is no longer optional. 

Understanding when, where, and which reviews are removed is now as important as generating them.

Dig deeper: Why Google reviews will power up your local SEO

The forces reshaping review visibility

Three developments are shaping review visibility:

  • More automated moderation, with AI evaluating reviews in real time and retroactively.
  • Greater legal influence in regions with strict defamation laws.
  • Increased reliance on third-party monitoring tools as businesses seek independent records of review deletion activity.

As moderation becomes more automated and more influenced by local law, sentiment alone will not guarantee review visibility. 

In local SEO, reviews – especially recent ones with detailed context – remain a critical authority signal for both users and search engines.

Staying ahead now means not only collecting new reviews, but also closely tracking and understanding removals. 

Reputation management increasingly requires attention on both fronts.


Image SEO for multimodal AI

Decoding the machine gaze: Image SEO for multimodal AI

For the past decade, image SEO was largely a matter of technical hygiene:

  • Compressing JPEGs to appease impatient visitors.
  • Writing alt text for accessibility.
  • Implementing lazy loading to keep LCP scores in the green. 

While these practices remain foundational to a healthy site, the rise of large multimodal models such as ChatGPT and Gemini has introduced new possibilities and challenges.

Multimodal search embeds content types into a shared vector space. 

We are now optimizing for the “machine gaze.” 

Generative search makes most content machine-readable by segmenting media into chunks and extracting text from visuals through optical character recognition (OCR). 

Images must be legible to the machine eye. 

If an AI cannot parse the text on product packaging due to low contrast or hallucinates details because of poor resolution, that is a serious problem.

This article deconstructs the machine gaze, shifting the focus from loading speed to machine readability.

Technical hygiene still matters

Before optimizing for machine comprehension, we must respect the gatekeeper: performance. 

Images are a double-edged sword. 

They drive engagement but are often the primary cause of layout instability and slow speeds. 

The standard for “good enough” has moved beyond WebP. 

Once the asset loads, the real work begins.

Dig deeper: How multimodal discovery is redefining SEO in the AI era

Designing for the machine eye: Pixel-level readability

To large language models (LLMs), images, audio, and video are sources of structured data. 

They use a process called visual tokenization to break an image into a grid of patches, or visual tokens, converting raw pixels into a sequence of vectors.

This unified modeling allows AI to process “a picture of a [image token] on a table” as a single coherent sentence.

These systems rely on OCR to extract text directly from visuals. 

This is where quality becomes a ranking factor.

If an image is heavily compressed with lossy artifacts, the resulting visual tokens become noisy.

Poor resolution can cause the model to misinterpret those tokens, leading to hallucinations in which the AI confidently describes objects or text that do not actually exist because the “visual words” were unclear.

Reframing alt text as grounding

For large language models, alt text serves a new function: grounding. 

It acts as a semantic signpost that forces the model to resolve ambiguous visual tokens, helping confirm its interpretation of an image.

As Zhang, Zhu, and Tambe noted:

  • “By inserting text tokens near relevant visual patches, we create semantic signposts that reveal true content-based cross-modal attention scores, guiding the model.” 

Tip: By describing the physical aspects of the image – the lighting, the layout, and the text on the object – you provide the high-quality training data that helps the machine eye correlate visual tokens with text tokens.

The OCR failure points audit

Search agents like Google Lens and Gemini use OCR to read ingredients, instructions, and features directly from images. 

They can then answer complex user queries. 

As a result, image SEO now extends to physical packaging.

Current labeling regulations – FDA 21 CFR 101.2 and EU 1169/2011 – allow type sizes as small as 4.5 pt to 6 pt, or 0.9 mm, on compact packaging. 

  • “In case of packaging or containers the largest surface of which has an area of less than 80 cm², the x-height of the font size referred to in paragraph 2 shall be equal to or greater than 0.9 mm.” 

While this satisfies the human eye, it fails the machine gaze. 

The minimum pixel resolution required for OCR-readable text is far higher. 

Character height should be at least 30 pixels. 

Low contrast is also an issue. Contrast should reach 40 grayscale values. 

Be wary of stylized fonts, which can cause OCR systems to mistake a lowercase “l” for a “1” or a “b” for an “8.”
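
If you want a quick pre-flight check against those thresholds, the sketch below is one rough way to do it with Pillow and NumPy. The file name and the crop box around the on-pack text are hypothetical, and max-minus-min is only a crude stand-in for a full contrast audit.

import numpy as np
from PIL import Image

# Load a hypothetical packaging shot and convert it to grayscale
img = Image.open("packaging-shot.jpg").convert("L")

# Crop a hypothetical box (left, top, right, bottom) around the label text
text_region = img.crop((420, 310, 980, 360))

pixels = np.asarray(text_region)
contrast = int(pixels.max()) - int(pixels.min())

print(f"Text region height: {text_region.height}px (characters should be at least 30px tall)")
print(f"Grayscale contrast: {contrast} (aim for at least 40)")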

Beyond contrast, reflective finishes create additional problems. 

Glossy packaging reflects light, producing glare that obscures text. 

Packaging should be treated as a machine-readability feature.

If an AI cannot parse a packaging photo because of glare or a script font, it may hallucinate information or, worse, omit the product entirely.

Originality as a proxy for experience and effort

Originality can feel like a subjective creative trait, but it can be quantified as a measurable data point.

Original images act as a canonical signal. 

The Google Cloud Vision API includes a feature called WebDetection, which returns lists of fullMatchingImages – exact duplicates found across the web – and pagesWithMatchingImages. 

If your URL has the earliest index date for a unique set of visual tokens (i.e., a specific product angle), Google credits your page as the origin of that visual information, boosting its “experience” score.
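
To spot-check this for a single asset, a minimal sketch using the google-cloud-vision Python client might look like the following; the image URL is a placeholder, and you would still need to compare index dates yourself.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder URL for one of your product images
image = vision.Image()
image.source.image_uri = "https://www.example.com/images/blue-leather-watch.jpg"

response = client.web_detection(image=image)
web = response.web_detection

# Exact duplicates of this image found elsewhere on the web
for match in web.full_matching_images:
    print("Full match:", match.url)

# Pages that embed a matching image
for page in web.pages_with_matching_images:
    print("Page with match:", page.url)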

Dig deeper: Visual content and SEO: How to use images and videos



The co-occurrence audit

AI identifies every object in an image and uses their relationships to infer attributes about a brand, price point, and target audience. 

This makes product adjacency a ranking signal. To evaluate it, you need to audit your visual entities.

You can test this using tools such as the Google Vision API. 

For a systematic audit of an entire media library, you need to pull the raw JSON using the OBJECT_LOCALIZATION feature. 

The API returns object labels such as “watch,” “plastic bag” and “disposable cup.”

Google provides this example, where the API returns the following information for the objects in the image:

  • Bicycle wheel: mid /m/01bqk0, score 0.89648587, bounds (0.32076266, 0.78941387), (0.43812272, 0.78941387), (0.43812272, 0.97331065), (0.32076266, 0.97331065)
  • Bicycle: mid /m/0199g, score 0.886761, bounds (0.312, 0.6616471), (0.638353, 0.6616471), (0.638353, 0.9705882), (0.312, 0.9705882)
  • Bicycle wheel: mid /m/01bqk0, score 0.6345275, bounds (0.5125398, 0.760708), (0.6256646, 0.760708), (0.6256646, 0.94601655), (0.5125398, 0.94601655)

Good to know: mid contains a machine-generated identifier (MID) corresponding to a label’s Google Knowledge Graph entry. 

The API does not know whether this context is good or bad. 

You do, so check whether the visual neighbors are telling the same story as your price tag.
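
A minimal version of that audit loop, assuming the google-cloud-vision Python client and a local product photo, might look like this; from here you would review the labels for dissonant neighbors.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Read a hypothetical product photo from disk
with open("product-photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.object_localization(image=image)

# Each detected object carries a label, a Knowledge Graph MID, and a confidence score
for obj in response.localized_object_annotations:
    print(f"{obj.name}  mid={obj.mid}  score={obj.score:.2f}")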

Lord Leathercraft blue leather watch band

By photographing a blue leather watch next to a vintage brass compass and a warm wood-grain surface, Lord Leathercraft engineers a specific semantic signal: heritage exploration. 

The co-occurrence of analog mechanics, aged metal, and tactile suede suggests a persona of timeless adventure and old-world sophistication.

Photograph that same watch next to a neon energy drink and a plastic digital stopwatch, and the narrative shifts through dissonance. 

The visual context now signals mass-market utility, diluting the entity’s perceived value.

Dig deeper: How to make products machine-readable for multimodal AI search

Quantifying emotional resonance

Beyond objects, these models are increasingly adept at reading sentiment. 

APIs, such as Google Cloud Vision, can quantify emotional attributes by assigning confidence scores to emotions like “joy,” “sorrow,” and “surprise” detected in human faces. 

This creates a new optimization vector: emotional alignment. 

If you are selling fun summer outfits, but the models appear moody or neutral – a common trope in high-fashion photography – the AI may de-prioritize the image for that query because the visual sentiment conflicts with search intent.

For a quick spot check without writing code, use Google Cloud Vision’s live drag-and-drop demo to review the four primary emotions: joy, sorrow, anger, and surprise. 

For positive intents, such as “happy family dinner,” you want the joy attribute to register as VERY_LIKELY.

If it reads POSSIBLE or UNLIKELY, the signal is too weak for the machine to confidently index the image as happy.

For a more rigorous audit:

  • Run a batch of images through the API (see the sketch after this list).
  • Look specifically at the faceAnnotations object in the JSON response by sending a FACE_DETECTION feature request. 
  • Review the likelihood fields. 
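
Here is a minimal sketch of that batch pass, assuming the google-cloud-vision Python client and a couple of hypothetical file names; it also skips faces below the detectionConfidence floor covered in the benchmarks further down.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

for path in ["hero-shot.jpg", "lifestyle-1.jpg"]:  # hypothetical file names
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Ignore faces the model can barely see; their sentiment values are noise
        if face.detection_confidence < 0.60:
            continue
        print(
            path,
            "joy:", vision.Likelihood(face.joy_likelihood).name,
            "confidence:", round(face.detection_confidence, 2),
        )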

The API returns these values as enums or fixed categories. 

This example comes directly from the official documentation:

          "rollAngle": 1.5912293,
          "panAngle": -22.01964,
          "tiltAngle": -1.4997566,
          "detectionConfidence": 0.9310801,
          "landmarkingConfidence": 0.5775582,
          "joyLikelihood": "VERY_LIKELY",
          "sorrowLikelihood": "VERY_UNLIKELY",
          "angerLikelihood": "VERY_UNLIKELY",
          "surpriseLikelihood": "VERY_UNLIKELY",
          "underExposedLikelihood": "VERY_UNLIKELY",
          "blurredLikelihood": "VERY_UNLIKELY",
          "headwearLikelihood": "POSSIBLE"

The API grades emotion on a fixed scale. 

The goal is to move primary images from POSSIBLE to LIKELY or VERY_LIKELY for the target emotion.

  • UNKNOWN (data gap).
  • VERY_UNLIKELY (strong negative signal).
  • UNLIKELY.
  • POSSIBLE (neutral or ambiguous).
  • LIKELY.
  • VERY_LIKELY (strong positive signal – target this).

Use these benchmarks

You cannot optimize for emotional resonance if the machine can barely see the human. 

If detectionConfidence is below 0.60, the AI is struggling to identify a face. 

As a result, any emotion readings tied to that face are statistically unreliable noise.

  • 0.90+ (Ideal): High-definition, front-facing, well-lit. The AI is certain. Trust the sentiment score.
  • 0.70-0.89 (Acceptable): Good enough for background faces or secondary lifestyle shots.
  • < 0.60 (Failure): The face is likely too small, blurry, side-profile, or blocked by shadows or sunglasses. 

While Google documentation does not provide this guidance, and Microsoft offers limited access to its Azure AI Face service, Amazon Rekognition documentation notes that

  • “[A] lower threshold (e.g., 80%) might suffice for identifying family members in photos.”

Closing the semantic gap between pixels and meaning

Treat visual assets with the same editorial rigor and strategic intent as primary content. 

The semantic gap between image and text is disappearing. 

Images are processed as part of the language sequence.

The quality, clarity, and semantic accuracy of the pixels themselves now matter as much as the keywords on the page.


How to build search visibility before demand exists


Discovery now happens before search demand is visible in Google.

In 2026, interest forms across social feeds, communities, and AI-generated answers – long before it shows up as keyword search volume. 

By the time demand appears in SEO tools, the opportunity to shape how a concept is understood has already passed.

This creates a problem for how search marketing research is typically done. 

Keyword tools, search volume, and Google Trends are lagging indicators. 

They reveal what people cared about yesterday, not what they are starting to explore now. 

In a landscape shaped by AI Overviews, social SERPs, and shrinking organic real estate, arriving late means competing inside narratives already defined by someone else.

Exploding Topics sits upstream of this shift. 

It helps surface emerging themes, behaviors, and conversations while they are still forming – before they harden into keywords, content clusters, and product categories. 

Used properly, it is not just a trend tool. It is a way to plan SEO, content, digital PR, and social-led search proactively.

This article breaks down how to use Exploding Topics to identify future entities, validate them through social search, and build search visibility before demand peaks.

Use Exploding Topics Trend Analytics to identify future entities – not just topics

Most marketers who use Exploding Topics already understand its value for content ideation, and we will cover that. 

But its bigger opportunity is identifying future entities – concepts that search engines and AI systems will soon recognize as distinct “things,” not just keyword variations.

This matters because modern search no longer operates purely on keywords. 

Google’s AI Overviews, ChatGPT, and other LLM-powered systems organize information around entities and relationships. 

Once an entity is established, the narrative around it hardens. 

Arrive late, and you are competing inside a story that has already been defined. 

Exploding Topics gives you visibility early enough to act before that happens.

Example: Weighted sleep masks

In Exploding Topics, you might notice “weighted sleep mask” rising steadily. 

Search volume remains low, and most keyword tools understate its importance. 

At a glance, it looks like a niche product trend that is easy to ignore.

Look closer, and the signals are stronger:

  • The phrase is consistent and repeatable.
  • Adjacent topics are rising alongside it, including deep pressure sleep, anxiety sleep tools, and vagus nerve stimulation.
  • Questions that signal intent are increasing.
  • Early discussion focuses on understanding the concept, not just buying a product.

This is the point where something shifts from being a product with an adjective to a named solution. In other words, it is becoming an entity.

The traditional play

Most brands wait until:

  • Search demand becomes obvious, so they act in December 2025 rather than July 2025.
  • Competitors launch dedicated product pages.
  • Affiliates and publishers surface “best” and “vs.” content.

Only then do they create:

  • A category page.
  • A “What is a weighted sleep mask?” article or social-search activation.
  • SEO content designed to chase presence, such as FAQs, SERP features, and rankings.

By this point, the entity already exists, and the story around it has largely been written by someone else. 

In this case, NodPod is clearly dominating the entity.

Acting earlier, while the entity is forming

Using Exploding Topics well means acting earlier, while the entity is still being defined. Instead of starting with a product page, you:

  • Publish a clear, authoritative explanation of what a weighted sleep mask is.
  • Explain why deep pressure can help with sleep and anxiety.
  • Address who it is for – and who it is not.
  • Create supporting content that adds context, such as comparisons with weighted blankets or safety considerations.

This work can be done quickly and at scale through reactive PR and social search activations. 

You are not optimizing for keywords yet. 

You are teaching social algorithms, search engines, and AI systems what the concept means and associating your brand with that explanation from the start.

This is how brands can win at search in 2026 and beyond. 

This early, proactive approach:

  • Helps search systems understand new concepts faster.
  • Increases the chance your framing is reused in AI-generated answers.
  • Positions your brand as the authority on the entity – not just a seller within the conversation.

Dig deeper: Beyond Google: How to put a total search strategy together

Validate emerging entities through social search

Identifying an emerging entity is only the first step. 

The real risk is not being early to a conversation. It is being early to something that never takes off.

This is where many SEO teams stall. 

They wait for search volume and arrive too late, publish on instinct and hope demand follows, or freeze under uncertainty and do nothing.

There is a better middle ground: validate emerging entities through social search research and activation tests before scaling them into owned SEO and on-site experiences.

Exploding Topics is straightforward. It shows what might matter. Social platforms tell you whether your audience actually cares.

How social search becomes your validation layer

Once Exploding Topics surfaces a potential emerging entity, the next step is not Keyword Planner. 

It is native search across platforms such as TikTok, Reddit, and YouTube, using either built-in trend tools or basic platform search.

You are looking for signals like:

  • Multiple creators independently explaining the same concept.
  • Comment sections filled with questions such as “Does this actually work?” or “Is this safe?”
  • Repeated framing, metaphors, or demonstrations.
  • Early how-to or comparison content, even if production quality is low.

These signals point to intent. 

Curiosity is turning into understanding. 

Historically, this phase has always preceded measurable search demand.

Revisiting the weighted sleep mask example

After spotting “weighted sleep mask” in Exploding Topics, you might search for it on TikTok.

What you want to see is a lack of heavy brand advertising. 

Mature ecommerce pushes or TikTok Shop funnels suggest the market is already established. 

Instead, look for creators – not brand channels – testing products, discussing solutions, and exploring the underlying problem.

  • Focus on videos that explain pains, needs, and motivations, such as why pressure may help with anxiety. 
  • Check the comments for comparisons to other solutions. 
  • Look for questions raised in videos and comment threads.

Tools like Buzzabout.AI can help do this at scale through topic analysis and AI-assisted research.

These signals answer two critical questions:

  • Are people actively trying to understand this concept?
  • What language, framing, and objections are forming before SEO data exists?

That is validation.

Rethinking how SEO strategy gets built

This is where search strategy shifts. 

Instead of asking, “Is there enough volume to justify content creation?” the better question is, “Is there enough curiosity to justify building authority early?”

If social signals are weak:

  • Pause.
  • De-risk by testing with creators outside your owned channels.
  • Avoid heavy investment in content that takes months to rank.

If signals are strong:

  • Scale with confidence.
  • Work with creators and activate brand channels.
  • Invest in entity pages, hubs, FAQs, comparisons, and PLP optimization.

In this model, fast-moving social platforms become the testing layer.

SEO is not the experiment; it’s the compounding layer.

Dig deeper: Social and UGC: The trust engines powering search everywhere



Editorial digital PR that earns links and LLM citations

Most digital PR still works backward.

  • A trend reaches mainstream awareness.
  • Journalists write about it.
  • Brands scramble to comment.
  • PR teams try to extract links from a story that already exists. 

The result is short-term coverage, diluted impact, and little lasting search advantage.

Exploding Topics makes it possible to reverse that dynamic by surfacing editorial narratives before they are obvious and positioning your brand as one of the sources that helps define them.

In 2026, this matters more than ever. 

Links still matter, but they are no longer the only outcome that counts. 

Brand mentions, explanations, and citations increasingly feed the systems behind AI Overviews, ChatGPT, Perplexity, and other LLM-driven discovery experiences.

Why early narratives outperform reactive PR

When a topic is everywhere, journalists are aggregating. When a topic is emerging, they are still asking questions.

Exploding Topics surfaces concepts at the stage where:

  • There is no consensus narrative.
  • Definitions are inconsistent.
  • Journalists are looking for clarity, not quotes.
  • “What is this?” stories have not yet been written.

This is the point where brands can move from commenting on a conversation to shaping it.

From trend-jacker to narrative owner

Instead of pitching “our brand’s take on X,” you lead with early signals you are seeing, why a concept is emerging now, and what it suggests about consumer behavior or the market.

The difference is subtle but important.

You are no longer reacting to coverage that already exists. 

You are creating the framing that journalists, publishers, and, eventually, AI systems reuse. 

LLMs do not learn from rankings alone. 

They learn from editorial context, repeated explanations, and how trusted publications describe and define emerging concepts over time.

Done consistently, this approach compounds. 

As your brand becomes associated with spotting and explaining emerging narratives early, you move from reactive commentary to trusted source. 

Journalists begin to recognize where useful insight comes from, and that trust carries into more established coverage later on. You are no longer pitching for inclusion. 

Your perspective is actively sought out.

The result is early narrative ownership and stronger access when mainstream coverage follows.

An editorial window before mainstream coverage

Before “weighted sleep mask” became a crowded ecommerce term in early 2025, there was a clear editorial window.

Journalists had not yet published stories asking:

  • “What is a weighted sleep mask?”
  • “Are weighted sleep masks safe?”
  • “Do they actually work for anxiety?” 

That was the opportunity.

A PR-led approach at this stage includes:

  • Supplying journalists with expert explanations of deep pressure and sleep.
  • Sharing early insight into why the product category is emerging.
  • Contextualizing it alongside weighted blankets and other anxiety tools.

The result is not just coverage. It connects PR to search, curiosity, and discovery by helping define the concept itself. 

That earns links, builds brand mentions, and signals authority around emerging entities that LLMs are more likely to cite and summarize over time.

Dig deeper: Why PR is becoming more essential for AI search visibility

Content roadmaps and briefs that don’t rely on search volume

Search volume is a poor starting point for content briefing.

It reflects interest only after a topic is established, language has stabilized, and the SERP is already crowded. 

Used as a primary input, it pushes teams to chase demand instead of building authority. 

That is why so many brands end up rewriting the same “What is X?” post year after year.

Better briefs start upstream. 

They use Exploding Topics to spot what is forming and social search to understand how people are trying to make sense of it.

Reframing the briefing process

The core shift is moving away from briefs built around keywords and volumes and toward briefs built around audience intent.

That means focusing on three things:

  • Problems people are beginning to articulate.
  • Concepts that are not yet clearly defined or are actively debated.
  • Language that is inconsistent, emotional, or exploratory.

When content is approached this way, the objective changes. 

It is no longer “create X to rank for Y.” 

It becomes “explain X so the audience does not experience Y.” 

That shift matters.

Designing content that compounds instead of expiring

The goal for SEO content teams in 2026 and beyond should be to brief content that defines a concept clearly. That includes:

  • Connecting it to adjacent ideas.
  • Comparing it to established solutions.
  • Answering questions within conversations that are still forming.

This does not always require written content. 

The same work can happen through social search activations or digital PR.

Approached this way, content grows into demand rather than chasing it.

Instead of being rewritten every time search volume changes, it evolves through updates, expansion, and, where possible, stronger internal linking. 

As interest grows, the content does not need replacing. It needs refining. 

This is the type of material AI and LLMs tend to reference – timely, clear, explanatory, and grounded in real questions.

Publication isn’t the end

Publishing and waiting for content to rank is no longer the end of the brief.

Teams need a clear plan for distribution and reuse.

For emerging topics, that means contributing insight in relevant Reddit threads, Discord communities, niche forums, and creator comment sections. 

Not to drop links, but to answer questions, share explanations, and test framing in public. 

Those conversations feed back into the content itself, improving clarity and increasing the likelihood that your explanation is the one others repeat.

With a social search activation approach, brands can scale messaging quickly by working with partners who interpret and distribute the brief in their own voice. 

When this works, SEO content stops being static and starts acting like a living reference point – one that contributes to culture and builds lasting brand recognition.

Dig deeper: Beyond SERP visibility: 7 success criteria for organic search in 2026

Where this leaves SEO in 2026

Search demand does not appear fully formed. 

It develops across social platforms, communities, and AI-driven discovery long before it registers as keyword volume.

  • Exploding Topics helps surface what is emerging. 
  • Social search shows whether people are trying to understand it. 
  • Digital PR shapes how those ideas are defined and cited. 
  • SEO compounds that work by reinforcing narratives that are already taking shape, rather than trying to test or invent them after the fact.

In this model, SEO is the layer that turns early insight and clear explanation into durable visibility across Google, social platforms, and AI-generated answers.

Search no longer starts on Google. The teams that act on that reality will influence what people search for next.


What Is LLMs.txt? & Do You Need One?

Most site owners don’t realize how much of their content large language models (LLMs) already gather. ChatGPT, Claude, and Gemini pull from publicly available pages unless you tell them otherwise. That’s where LLMs.txt for SEO comes into the picture.

LLMs.txt gives you a straightforward way to tell AI crawlers how your content can be used. It doesn’t change rankings, but it adds a layer of control over model training, something that wasn’t available before.

This matters as AI-generated answers take up more and more real estate in search results. Your content may feed those answers unless you explicitly opt out. LLMs.txt provides clear rules for what’s allowed and what isn’t, giving you leverage in a space that has grown quickly without much input from site owners.

Whether you allow or restrict access, having LLMs.txt in place sets a baseline for managing how your content appears in AI-driven experiences.

Key Takeaways

  • LLMs.txt lets you control how AI crawlers such as GPTBot, ClaudeBot, and Google-Extended use your content for model training.
  • It functions similarly to robots.txt but focuses on AI data usage rather than traditional crawling and indexing.
  • Major LLM providers are rapidly adopting LLMs.txt, creating a clearer standard for consent.
  • Allowing access may strengthen your presence in AI-generated answers; blocking access protects proprietary material.
  • LLMs.txt doesn’t impact rankings now, but it helps define your position in emerging AI search ecosystems. 

What is LLMs.txt?

LLMs.txt is a simple text file you place at the root of your domain to signal how AI crawlers can interact with your content. If robots.txt guides search engine crawlers, LLMs.txt guides LLM crawlers. Its goal is to define whether your public content becomes part of training datasets used by models such as GPT-4, Claude, or Gemini.

LLMs.txt files.

Here’s what the file controls:

  • Access permissions for each AI crawler
  • Whether specific content can be used for training
  • How your site participates in AI-generated answers
  • Transparent documentation of your data-sharing rules

This protocol exists because AI companies gather training data at scale. Your content may already appear in datasets unless you explicitly opt out. LLMs.txt adds a consent layer that didn’t previously exist, giving you a direct way to express boundaries.

OpenAI, Anthropic, and Google introduced support for LLMs.txt in response to rising concerns around ownership and unauthorized data use. Adoption isn’t universal yet, but momentum is growing quickly as more organizations ask for clarity around AI access.

LLMs.txt isn’t replacing robots.txt because the two files handle different responsibilities. Robots.txt manages crawling for search engines, while LLMs.txt manages training permissions for AI models. Together, they help you protect your content, define visibility rules, and prepare for a future where AI-driven search continues to expand.

Why is LLMs.txt a Priority Now?

Model developers gather massive datasets, and most of that comes from publicly accessible content. When OpenAI introduced GPTBot in 2023, it also introduced a pathway for websites to opt out. Google followed with Google-Extended, allowing publishers to restrict their content from AI training. Anthropic and others soon implemented similar mechanisms.

This shift matters for one reason: your content may already be part of the AI ecosystem unless you explicitly say otherwise.

LLMs.txt is becoming a standard because site owners want clarity. Until recently, there was no formal way to express whether your content could be repurposed inside model training pipelines. Now you can define that choice with a single file.

There’s another angle to this. Generative search tools increasingly rely on trained data to produce answers. If you block AI crawlers, your content may not appear in those outputs. If you allow access, your content becomes eligible for reference in conversational responses, something closely tied to how brands approach LLM SEO strategies.

Neither approach is right for everyone. Some companies want tighter content control. Others want stronger visibility in AI-driven areas. LLMs.txt helps you set a position instead of defaulting into one.

As AI-generated search becomes more prominent, the importance of LLMs.txt grows. You can adjust your directives over time, but having the file in place keeps you in control of how your content is used today.

How LLMs.txt Works

LLMs.txt is a plain text file located at the root of your domain. AI crawlers that support the protocol read it to understand which parts of your content they can use. You set the rules, upload the file once, and update it anytime your strategy evolves.

Where it Lives

LLMs.txt must be placed at:

yoursite.com/llms.txt

This mirrors the structure of robots.txt and keeps things predictable for crawlers. Every supported AI bot checks this exact location to find your rules. It must be in the root directory to work correctly; subfolders won’t register.

Robots.txt structure.


The file is intentionally public. Anyone can view it by navigating directly to the URL. This transparency allows AI companies, researchers, and compliance teams to see your stated preferences.
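
Once the file is live, a quick sanity check is to request it the same way a crawler would. The sketch below assumes the Python requests library and a placeholder domain.

import requests

# Placeholder domain; swap in your own
resp = requests.get("https://yoursite.com/llms.txt", timeout=10)

print(resp.status_code)   # expect 200 once the file is published at the root
print(resp.text[:500])    # preview the directives you uploaded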

What You Can Control

Inside LLMs.txt, you specify allow or disallow directives for individual AI crawlers. Example:

# Block OpenAI's training crawler from the entire site
User-agent: GPTBot
Disallow: /

# Allow Google's AI training crawler everywhere
User-agent: Google-Extended
Allow: /

You can grant universal permissions or block everything. The file gives you fine-grained control over how your public content flows into AI training datasets.

Current LLMs That Respect It

Several major AI crawlers already check LLMs.txt automatically:

  • GPTBot (OpenAI) — supports opt-in and opt-out training rules
  • Google-Extended — used for Google’s generative AI systems
  • ClaudeBot (Anthropic) — honors site-level directives
  • CCBot (Common Crawl) — contributes to datasets used by many models
  • PerplexityBot — early adopter in 2024

Support varies across the industry, but the direction is clear: more crawlers are aligning around LLMs.txt as a standardized method for training consent.

LLMs.txt vs Robots.txt: What’s the Difference?

Robots.txt and LLMs.txt serve complementary but distinct purposes.

Robots.txt controls how traditional search engine crawlers access and index your content. Its focus is SEO: discoverability, crawl budgets, and how pages appear in search results.

Robots.txt example.

LLMs.txt, in contrast, governs how AI models may use your content for training. These directives tell model crawlers whether they can read, store, and learn from your pages.

Here’s how they differ:

  • Different crawlers: Googlebot and Bingbot follow robots.txt; GPTBot, ClaudeBot, and Google-Extended read LLMs.txt.
  • Different outcomes: Robots.txt influences rankings and indexing. LLMs.txt influences how your content appears in generative AI systems.
  • Different risks and rewards: Robots.txt affects search visibility. LLMs.txt affects brand exposure inside AI-generated answers — and your control over proprietary content.

Both files are becoming foundational as search shifts toward blended AI and traditional results. You’ll likely need each one working together as AI-driven discovery expands.

Should You Use LLMs.txt for SEO?

LLMs.txt doesn’t provide a direct ranking benefit today. Search engines don’t interpret it for SEO purposes. Still, it influences how your content participates in generative results, and that matters.

Allowing AI crawlers gives models more context to work with, improving the odds that your content appears in synthesized answers. Blocking crawlers protects proprietary or sensitive content but removes you from those AI-based touchpoints.

Your approach depends on your goals. Brands focused on reach often allow access. Brands focused on exclusivity or IP protection typically restrict it.

LLMs.txt also pairs well with thoughtful LLM optimization work. Content structured for clarity, strong signals, and contextual relevance helps models interpret your material more accurately. LLMs.txt simply defines whether they’re allowed to learn from it.

“LLMs.txt doesn’t shift rankings today, but it sets early rules for how your content interacts with AI systems. Think of it like robots.txt in its early years: small now, foundational later,” explains Anna Holmquist, Senior SEO Manager at NP Digital.

Who Actually Needs LLMs.txt?

Some websites benefit more than others from adopting LLMs.txt early.

  • Content-heavy sites
    Publishers, educators, and documentation libraries often prefer structure around how their content is reused by AI systems.
  • Brands with proprietary material
    If your revenue depends on premium reports, gated content, or specialized datasets, LLMs.txt offers a necessary layer of protection.
  • SEOs planning for AI search
    As generative results become more common, brands want control over how content feeds into those answer engines. LLMs.txt helps set boundaries while still supporting visibility.
  • Industries with compliance requirements
    Healthcare, finance, and legal organizations often need strict data-handling rules. Blocking AI crawlers becomes part of their governance approach.

LLMs.txt doesn’t lock you into a long-term decision. You can update it as AI search evolves.

How To Set Up an LLMs.txt File

Setting up an LLMs.txt file is simple. Here’s the process. If you’d rather not build it by hand, there are tools and generators that can create the file for you.

LLMs.txt generator in action.

Source

1. Create the File

Open a plain text editor and create a new file called llms.txt.

Add a comment at the top for clarity:

# LLMs.txt — AI crawler access rules

2. Add Bot Directives

Define which crawlers can read and train on your content. For example:

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Allow: /

You can open or close access globally:

User-agent: *
Disallow: /

or:

User-agent: *
Allow: /

3. Upload to Your Root Directory

Place the file at:

yoursite.com/llms.txt

This location is required for crawlers to detect it. Subfolders won’t work.

4. Monitor AI Crawler Activity

Check your server logs to confirm activity from:

  • GPTBot
  • ClaudeBot
  • Google-Extended
  • PerplexityBot
  • CCBot

This helps you verify whether your directives are working as expected.
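If you’re comfortable with the command line, a quick check might look like this, assuming an Apache- or Nginx-style access log at a typical path (adjust the log path for your server):

grep -iE 'gptbot|claudebot|google-extended|perplexitybot|ccbot' /var/log/nginx/access.log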

AI crawler activity.

Source

FAQs

What is LLMs.txt?

It’s a file that tells AI crawlers whether they can train on your content. It’s similar to robots.txt but designed specifically for LLMs.

Does ChatGPT use LLMs.txt?

Yes. OpenAI’s GPTBot checks LLMs.txt and follows the rules you specify.

How do I create an LLMs.txt file?

Create a plain text file, add crawler rules, and upload it to your site’s root directory. Use the examples above to set your directives.

Conclusion

LLMs.txt gives publishers a way to define how their content interacts with AI training systems. As AI-generated search expands, having explicit rules helps protect your work while giving you control over how your brand appears inside model-generated answers.

This file pairs naturally with stronger LLM SEO strategies as you shape how your content is discovered in AI-driven environments. And if you’re already improving your content structure for model comprehension, LLMs.txt fits neatly beside ongoing LLM optimization efforts.

If you need help setting up LLMs.txt or planning for AI search visibility, my team at NP Digital can guide you.


AI Search for E-commerce: Optimize Product Feeds for Visibility

AI is reshaping how people shop online. Search isn’t just about keywords anymore. Tools like Google’s AI Overviews, ChatGPT shopping features, and Perplexity product recommendations analyze huge amounts of product data to decide what to show users. That shift means e-commerce brands need to rethink the way their product information is structured.

If you want visibility in these AI-powered shopping journeys, your product data has to be clean, complete, and enriched. AI models lean heavily on structured feeds, trusted marketplaces, and high-quality product attributes to understand exactly what you sell.

That’s why AI search for e-commerce matters right now. Brands that optimize their feeds will show up in conversational queries, comparison results, and visual search responses. Brands that don’t will struggle to appear even if they’ve done traditional SEO well.

This foundation will help you give AI systems the clarity they need to recommend your products with confidence.

Key Takeaways

  • AI search engines rely heavily on structured product feed data instead of just site content to understand and surface products.
  • Clean, complete feeds lead to higher visibility across Google Shopping, ChatGPT shopping research, Perplexity results, and other LLMs.
  • Strong titles, enriched attributes, and quality images make it easier for AI systems to match your products to real user needs.
  • Brands with clear, structured product data will outperform competitors in AI-driven shopping experiences.

How AI Search Is Reshaping Product Discovery

AI is changing the way customers find products long before they reach your website. Instead of typing traditional keywords, shoppers now describe what they want in plain language:
“lightweight waterproof hiking boots,”
“a gift for a 12-year-old who loves science,”
“a mid-century floor lamp under $150.”

AI systems interpret these natural-language queries using semantic understanding instead of exact keyword matches. That shift affects everything from Google Shopping listings to ChatGPT’s built-in shopping tools. It also impacts how AI-driven platforms rank your products when answering conversational or comparison-based queries.

Shopping results in ChatGPT.

Source: RetailTouchPoints

If you’ve been following the evolution of AI in e-commerce, you already know AI is moving deeper into product search, recommendation, and personalization. But behind the scenes, the link between your product data and AI visibility is tightening.

AI models rely on structured, trustworthy data sources, including product feeds, schema markup, and marketplace listings. If your feed lacks attributes or clarity, AI can’t confidently connect your product to a user’s need, even if your website is strong.

Optimizing your feed is no longer a backend task. It’s a visibility strategy.

What Is a Product Feed (and Why AI Cares About It)

A product feed is a structured data file that contains detailed information about every item you sell. It includes attributes like product title, description, brand, size, color, price, availability, GTIN, and more. Platforms such as Google Shopping, Meta, Amazon, and TikTok Shops rely on these feeds to understand your inventory and decide when to show your products.

AI systems depend on the same structure. Instead of scanning pages manually, they pull product details from feeds because the information is cleaner, more complete, and easier to interpret at scale.

If your feed includes rich attributes, AI can match your items to complex user queries. When attributes are missing or titles are vague, your products become invisible in AI-driven discovery, regardless of how strong your website content might be.

This is why optimizing product feeds is a priority for e-commerce brands right now. Clean, enriched feeds increase your visibility across AI-powered shopping experiences and visual search tools like Google Lens.

A product feed for E-commerce.

Source

Your product feed is no longer just for ads; it’s a core input for AI search.
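To make that concrete, here’s a rough sketch of what a single feed entry might contain. The attribute names follow common Google Shopping feed conventions, and the values are placeholders:

id: SKU-10423
title: Women’s Waterproof Trail Running Shoes – Lightweight, Breathable, Blue
description: Breathable mesh upper, waterproof membrane, and cushioned sole for wet-weather trail runs.
brand: ExampleBrand
price: 89.99 USD
availability: in stock
gtin: 0012345678905
color: Blue
size: 8
google_product_category: Apparel & Accessories > Shoes

The more of these fields you fill in consistently, the easier it is for AI systems to match the product to specific queries.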

What AI Needs From Your Product Feed (Titles, Attributes, Images)

AI systems don’t guess what your products are; they analyze the data you provide. These are the elements that matter most.

Titles and Descriptions

AI models prefer natural, descriptive, human-sounding titles. Short, vague titles like “Running Shoes” don’t give AI enough context. But a title such as:

“Women’s Waterproof Trail Running Shoes – Lightweight, Breathable, Blue”

instantly signals the audience, category, and key benefits.

Descriptions should reinforce the title and add details that help AI understand use cases, materials, fit, and core value.

Avoid keyword stuffing. Ambiguous, keyword-stuffed listings give AI systems less to work with, making them less likely to be referenced.

Product Attributes

AI engines rely heavily on structured attributes such as:

  • Size
  • Color
  • Material
  • Fit
  • Style
  • GTIN/MPN
  • Age range
  • Intended use

Missing attributes = missing visibility.

Attributes help AI refine products when users ask things like:
“Show me a size 8,”
“Only vegan options,”
“Something in walnut or dark wood.”

The more complete your attributes, the better your likelihood of appearing in those filtered results.

Product Images and Alt Text

AI increasingly “reads” images using vision models. Google Lens, Pinterest Lens, and multimodal AI systems analyze colors, textures, shapes, and packaging.

Clear, high-resolution images paired with alt text provide two inputs: visual interpretation and descriptive language.

Example alt text:
“Women’s waterproof trail running shoe with rubber sole, breathable mesh upper, and reinforced toe cap in blue.”

Examples of trail running shoes for women.

Visual clarity improves both AI understanding and user experience.

Steps To Optimize Product Feeds for AI Visibility

Here’s the practical workflow to upgrade your product feed for AI search visibility.

1. Audit Your Current Product Feed

Start with a complete audit using tools like Google Merchant Center, Feedonomics, or GoDataFeed. Look for:

  • Missing GTINs or invalid identifiers
  • Weak or vague product titles
  • Incomplete attributes
  • Duplicate listings
  • Mismatched availability or pricing
  • Blank fields or generic descriptions

AI search systems penalize incomplete or ambiguous data.

Google Merchant Center's interface.

Source

2. Improve Title and Description Relevance

Use a clear structure:

Brand + Category + Key Attributes + Value Proposition

Examples:

  • “Nike Men’s Running Shoes – Cushioned, Lightweight, Black”
  • “Organic Cotton Baby Pajamas – Soft, Breathable, Unisex”
  • “Mid-Century Floor Lamp – Walnut, LED Compatible, 60” Height”

Descriptions should expand on the title, adding details AI can use to match queries.

Avoid fluff. Focus on clarity.

3. Enhance Structured Attributes

Fill out every attribute you have access to, even optional ones. AI uses these to match long-tail, specific user needs.

Add custom labels for:

  • Best sellers
  • Seasonal items
  • High margin
  • Clearance
  • New arrivals

Custom labels help you manage bidding, targeting, and segmentation across Shopping and Performance Max campaigns.
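In the feed itself, these usually map to the custom label attributes (custom_label_0 through custom_label_4 in Google’s feed specification). For example, with hypothetical values:

custom_label_0: best_seller
custom_label_1: summer_2025
custom_label_2: high_margin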

Custom labels for Google Shopping campaigns.

Source

4. Optimize for Rich Results & Visual Search

Include product schema markup on all product pages, especially:

  • Product
  • Review
  • Price
  • Availability

AI search engines treat structured schema as a trust signal.
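Here’s a minimal JSON-LD sketch of Product markup using schema.org types. The values are placeholders; expand it with whatever fields your catalog supports:

{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Women's Waterproof Trail Running Shoes",
  "image": "https://example.com/images/trail-shoe-blue.jpg",
  "description": "Lightweight, breathable trail running shoe with a waterproof membrane.",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "gtin13": "0012345678905",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "132" }
}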

Also include descriptive alt text on all product images to support accessibility and AI interpretation.

Example results for Blue Hiking Shoes for women.

5. Set Up Feed Rules and Automations

Automate cleanup tasks such as:

  • Adding missing colors to titles
  • Appending product type or material
  • Standardizing capitalization
  • Populating missing attributes with known defaults
  • Flagging products with incomplete data

Automation keeps your feed consistent as your catalog changes.

How AI Assistants Use Product Data

AI shopping assistants are rapidly changing how customers discover and compare products. 

To generate these answers, AI systems pull from:

  • Merchant Center feeds
  • Structured schema markup
  • Marketplace listings
  • Verified product databases
  • High-quality product images
  • Trusted review sources

This creates a composite understanding of your product beyond just what your site says about it.

If you’ve explored the role of AI shopping assistants, you’ve likely seen how quickly they recommend products based on attributes like size, color, performance, ratings, and price. Those signals come directly from your feed and structured product data.

Brands with richer data sets see higher inclusion rates in:

  • Comparison lists
  • “Top choices” summaries
  • Product match queries
  • Visual search results
  • Conversational shopping recommendations

AI shopping results.

Source

AI systems don’t guess. They promote products they can understand clearly and ignore the rest.

Common Mistakes That Hurt AI Visibility

Most feed problems fall into a few categories, and each one reduces visibility in AI search engines.

1. Vague or Duplicated Titles

Titles like “Running Shoes” or “LED Lamp” provide no usable context. AI deprioritizes these compared to richer alternatives.

2. Missing Key Attributes

Many merchants skip fields like size, color, material, GTIN, or gender. AI relies heavily on these attributes when matching products to specific user requests.

3. Keyword-Stuffed or Fluffy Descriptions

Descriptions should be informative, not bloated. AI models prefer specific phrasing over repetitive keywords.

4. Inconsistent Pricing or Availability

If your feed shows “in stock” but your page says “out of stock,” AI systems flag inconsistencies and may reduce your visibility.

5. Low-Quality Images or Missing Alt Text

Visual AI models need clarity. Poor images or missing alt text make your product harder to classify.

Fixing these issues has a measurable impact on how often your products appear in AI-driven recommendations.

FAQs

What is AI e-commerce?

AI e-commerce refers to using artificial intelligence to improve product discovery, recommendations, personalization, and automation throughout the online shopping experience.

How is AI changing e-commerce?

AI is shifting product discovery toward natural-language search, visual identification, and conversational shopping assistants. Brands now need structured, enriched product data to stay visible.

How do you optimize a product feed for AI search?

Create clear titles, use complete attributes, include schema markup, strengthen product images, and use automation to maintain consistency. A detailed feed helps AI understand your products accurately.

Conclusion

Brands that invest in structured data, enriched attributes, and clear product information will outperform competitors as AI-driven shopping grows.

Feed optimization also strengthens your broader search strategy. The same structured data powering AI engines aligns with strong AI in e-commerce practices, and the same clarity helps conversational systems recommend your products more confidently.

Visibility in AI search isn’t random. It comes from data quality. And improving that data is one of the highest-impact steps an e-commerce brand can take today.


What is a redirect? Types, how to set them up, and impact on SEO 

Ever clicked a link and landed on a “Page Not Found” error? Redirects prevent that. They send visitors and search engines to the right page automatically. Redirects are crucial for both SEO and user experience. For SEO, they preserve link equity and keep your rankings intact. They also improve the user experience, because no one likes dead ends. 

Key takeaways

  • A redirect automatically sends users and search engines from one URL to another, preventing errors like ‘Page Not Found.’
  • Redirects are crucial for SEO and user experience, preserving link equity and maintaining rankings.
  • Different types of redirects exist: 301 for permanent moves and 302 for temporary ones.
  • Avoid client-side redirects, such as meta refresh or JavaScript, as they can harm SEO.
  • Use Yoast SEO Premium to easily set up and manage redirects on your site.

What is a redirect? 

A redirect is a method that automatically sends users and search engines from one URL to another. For example, if you delete a page, a redirect can send visitors to a new or related page instead of a 404 error. 

How redirects work

  1. A user or search engine requests a URL (e.g., yoursite.com/page-old).
  2. The server responds with a redirect instruction.
  3. The browser or search engine follows the redirect to the new URL (e.g., yoursite.com/page-new).

Redirects can point to any URL, even on a different domain. 
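At the HTTP level, that exchange looks roughly like this (simplified):

GET /page-old HTTP/1.1
Host: yoursite.com

HTTP/1.1 301 Moved Permanently
Location: https://yoursite.com/page-new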

Why redirects matter 

Redirects keep your website running smoothly. Without them, visitors hit dead ends, links break, and search engines get lost. They’re more than technical fixes: they protect your traffic, preserve rankings, and make sure users land where they’re supposed to. Whether you’re moving a page, fixing a typo in a URL, or removing old content, redirects make sure that nothing gets left behind. 

When to use a redirect 

Use redirects in these scenarios: 

  1. Deleted pages: Redirect to a similar page to preserve traffic. 
  2. Domain changes: Redirect the old domain to the new one. 
  3. HTTP→HTTPS: Redirect insecure URLs to secure ones. 
  4. URL restructuring: Redirect old URLs to new ones (e.g., /blog/post → /articles/post). 
  5. Temporary changes: Use a 302 for A/B tests or maintenance pages. 

Types of redirects 

There are various types of redirects, each serving a distinct purpose. Some are permanent, some are temporary, and some you should avoid altogether. Here’s what you need to know to pick the right one. 

Not all redirects work the same way. A 301 redirect tells search engines a page has moved permanently, while a 302 redirect signals a temporary change. Client-side redirects, like meta refresh or JavaScript, exist because they’re sometimes the only option on restrictive hosting platforms or static sites, but they often create more problems than they solve. Below, we break down each type, explain when to use it, and discuss its implications for your SEO. 

Redirect types at a glance 

Redirect type | Use case | When to use | Browser impact | SEO impact | SEO risk
--- | --- | --- | --- | --- | ---
301 | Permanent move | Deleted pages, domain changes, HTTP→HTTPS | Cached forever | Passes (almost) all link equity | None if used correctly
302 | Temporary move | A/B testing, maintenance pages | Not cached | May not pass link equity | Can dilute SEO if used long-term
307 | Temporary move (strict) | API calls, temporary content shifts | Not cached | Search engines may ignore | High if misused
308 | Permanent move (strict) | Rare; use 301 instead | Cached forever | Passes link equity | None
Meta Refresh | Client-side redirect | Avoid where possible | Slow, not cached | Unreliable | High (hurts UX/SEO)
JavaScript | Client-side redirect | Avoid where possible | Slow, not cached | Unreliable | High (hurts UX/SEO)

301 redirects: Permanent moves 

A 301 redirect tells browsers and search engines that a page has moved permanently. Use it when: 

  • You delete a page and want to send visitors to a similar one.
  • You change your domain name.
  • You switch from HTTP to HTTPS.

SEO impact: 301 redirects pass virtually all link equity to the new URL. But never redirect to irrelevant pages, as this can confuse users and hurt SEO. For example, redirecting a deleted blog post about “best running shoes” to your homepage instead of a similar post about running gear wastes link equity and frustrates visitors. 

Example HTTP header

HTTP/1.1 301 Moved Permanently 
Location: https://example.com/new-page

302 redirects: Temporary moves 

A 302 redirect tells browsers and search engines that a move is temporary. Use it for: 

  • A/B testing different versions of a page.
  • Temporary promotions or sales pages.
  • Maintenance pages.

SEO impact: 302 redirects typically don’t pass ranking power like 301s. Google treats them as temporary, so they may not preserve SEO value. For permanent moves, always use a 301 to ensure link equity transfers smoothly. 
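Example HTTP header (mirroring the 301 example above):

HTTP/1.1 302 Found 
Location: https://example.com/new-page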

Examples of when to use a 301 and 302 redirect:  

Example 1: Temporary out-of-stock product (302): An online store redirects example.com/red-sneakers to example.com/blue-sneakers while red sneakers are restocked. A 302 redirect keeps the original URL alive for future use. 

Example 2: A permanent domain change (301): A company moves from old-site.com to new-site.com. A 301 redirect makes sure visitors and search engines land on the new domain while preserving SEO rankings. 

307 and 308 redirects: Strict rules 

These redirects follow HTTP rules more strictly than 301 or 302: 

  1. Same method: If a browser sends a POST request, the redirect must also use POST. 
  2. Caching
    • 307: Never cached (temporary). 
    • 308: Always cached (permanent). 

When to use them

  • 307: For temporary redirects where you must keep the same HTTP method (e.g., forms or API calls). 
  • 308: Almost never; use a 301 instead. 

For most sites: Stick with 301 (permanent) or 302 (temporary). These are for specific technical cases only. 
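For reference, their response headers follow the same pattern as the 301 and 302 examples above:

HTTP/1.1 307 Temporary Redirect 
Location: https://example.com/temporary-page

HTTP/1.1 308 Permanent Redirect 
Location: https://example.com/new-page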

What to know about client-side redirects:

Client-side redirects, such as meta refresh or JavaScript, execute within the browser instead of on the server. They’re rarely the right choice, but here’s why you might encounter them: 

  • Meta refresh: An HTML tag that redirects after a delay (e.g., “You’ll be redirected in 5 seconds…”).
  • JavaScript redirects: Code that changes the URL after the page loads. Examples of both are shown below.
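Here’s roughly what each one looks like (illustrative snippets only):

<meta http-equiv="refresh" content="5; url=https://example.com/new-page">

<script>
  // Changes the URL after the page has already loaded
  window.location.href = "https://example.com/new-page";
</script>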

Why should you avoid them? 

  • Slow: The browser must load the page first, then redirect.
  • Unreliable: Search engines may ignore them, hurting SEO.
  • Bad UX: Users see a flash of the original page before redirecting.
  • Security risks: JavaScript redirects can be exploited for phishing. 

When they’re used (despite the risks): 

  • Shared hosting with no server access. 
  • Legacy systems or static HTML sites.
  • Ad tracking or A/B testing tools.

Stick with server-side redirects (301/302) whenever possible. If you must use a client-side redirect, test it thoroughly and monitor for SEO issues. 

How redirects impact SEO 

Redirects do more than just send users to a new URL. They shape how search engines crawl, index, and rank your site. A well-planned redirect preserves traffic and rankings. A sloppy one can break both. Here’s what you need to know about their impact. 

Ranking power 

301 redirects pass most of the link equity from the old URL to the new one. This helps maintain your rankings. 302 redirects may not pass ranking power, especially if used long-term. 

Crawl budget 

Too many redirects can slow down how quickly search engines crawl your site. Avoid redirect chains (A→B→C) to save crawl budget. 

User experience 

Redirects prevent 404 errors and keep users engaged. A smooth redirect experience can reduce bounce rates. 

Common redirect mistakes 

Redirects seem simple, but small errors can cause big problems. Here are the most common mistakes and how to avoid them. 

Redirect chains 

A redirect chain happens when one URL redirects to another, which redirects to another, and so on. For example:  

  • old-page → new-page → updated-page → final-page

Why it’s bad

  • Slows down the user experience. 
  • Wastes crawl budget, as search engines may stop following the chain before reaching the final URL. 
  • Dilutes ranking power with each hop. 

How to fix it

  • Map old URLs directly to their final destination. 
  • Use tools like Screaming Frog to find and fix chains. 

Redirect loops 

A redirect loop sends users and search engines in circles. For example:  

  • page-A → page-B → page-A → page-B...

Why it’s bad

  • Users see an error page (e.g., “Too many redirects”). 
  • Search engines can’t access the content, so it won’t rank. 

How to fix it

  • Check your redirect rules for conflicts. 
  • Test redirects with a tool like Redirect Path (Chrome extension) or curl -v in the terminal. 

Using 302s for permanent moves 

A 302 redirect is meant for temporary changes, but many sites use it for permanent moves. For example: 

  • Redirecting old-product to new-product with a 302 and leaving it for years. 

Why it’s bad

  • Search engines may not pass link equity to the new URL. 
  • The old URL might stay in search results longer than intended. 

How to fix it

  • Use a 301 for permanent moves. 
  • If you accidentally used a 302, switch it to a 301 as soon as possible. 

Redirecting to irrelevant pages 

Redirecting a page to unrelated content confuses users and search engines. For example: 

  • Redirecting a blog post about “best running shoes” to the homepage or a page about “kitchen appliances”. 

Why it’s bad

  • Users land on content they didn’t expect, increasing bounce rates. 
  • Search engines may ignore the redirect or penalize it for being manipulative. 
  • Wastes ranking power that could have been passed to a relevant page. 

How to fix it

  • Always redirect to the most relevant page available. 
  • If no relevant page exists, let the old URL return a 404 or 410 error instead. 

Ignoring internal links after redirects 

After setting up a redirect, many sites forget to update internal links. For example: 

  • Redirecting old-page to new-page but keeping links to old-page in the site’s navigation or blog posts. 

Why it’s bad

  • Internal links to the old URL force users and search engines through the redirect, slowing down the experience. 
  • Wastes crawl budget and dilutes ranking power. 

How to fix it

  • Update all internal links to point directly to the new URL. 
  • Use a tool like Screaming Frog to find and fix outdated links. 

Not testing redirects 

Assuming redirects work without testing can lead to surprises. For example: 

  • Setting up a redirect but not checking if it sends users to the right place. 
  • Missing errors like 404s or redirect loops. 

Why it’s bad

  • Broken redirects frustrate users and hurt SEO. 
  • Search engines may drop pages from the index if they can’t access them. 

How to fix it

  • Test every redirect manually or with a tool. 
  • Check Google Search Console for crawl errors after implementing redirects. 

Redirecting everything to the homepage 

When a page is deleted, some sites redirect all traffic to the homepage. For example: 

  • Redirecting old-blog-post to example.com instead of a relevant blog post. 

Why it’s bad

  • Confuses users who expected specific content. 
  • Search engines may see this as a “soft 404” and ignore the redirect. 
  • Wastes ranking power that could have been passed to a relevant page. 

How to fix it

  • Redirect to the most relevant page available. 
  • If no relevant page exists, return a 404 or 410 error. 

Forgetting to update sitemaps 

After setting up redirects, many sites forget to update their XML sitemaps. For example: 

  • Keeping the old URL in the sitemap while redirecting it to a new URL. 

Why it’s bad

  • Sends mixed signals to search engines. 
  • Wastes crawl budget on outdated URLs. 

How to fix it

  • Remove old URLs from the sitemap. 
  • Add the new URLs to help search engines discover them faster. 

Using redirects for thin or duplicate content 

Some sites use redirects to hide thin or duplicate content. For example, redirecting multiple low-quality pages to a single high-quality page to “clean up” the site. 

Why it’s bad

  • Search engines may see this as manipulative. 
  • Doesn’t address the root problem, which is low-quality content. 

How to fix it

  • Improve or consolidate content instead of redirecting. 
  • Use canonical tags if duplicate content is unavoidable. 

Not monitoring redirects over time 

Redirects aren’t a set-it-and-forget-it task. For example: 

  • Setting up a redirect and never checking if it’s still needed or working. 

Why it’s bad

  • Redirects can break over time (e.g., due to site updates or server changes). 
  • Unnecessary redirects waste crawl budget. 

How to fix it

  • Audit redirects regularly (e.g., every 6 months). 
  • Remove redirects that are no longer needed. 

How to set up a redirect 

Setting up redirects isn’t complicated, but the steps vary depending on your platform. Below, you’ll find straightforward instructions for the most common setups, whether you’re using WordPress, Apache, Nginx, or Cloudflare.  

Pick the method that matches your setup and follow along. If you’re unsure which to use, start with the platform you’re most comfortable with. 

WordPress (using Yoast SEO Premium) 

Yoast SEO Premium makes it easy to set up redirects, especially when you delete or move content. Here’s how to do it: 

Option 1: Manual redirects 

  1. Go to Yoast SEO → Redirects in your WordPress dashboard. 
  2. Enter the old URL (the one you want to redirect from). 
  3. Enter the new URL (the one you want to redirect to). 
  4. Select the redirect type: 
  • 301 (Permanent): For deleted or permanently moved pages. 
  • 302 (Found): For short-term changes. 
  5. Click Add Redirect.

Manually redirecting a URL in Yoast’s redirect manager.

Option 2: Automatic redirects when deleting content 

Yoast SEO can create redirects automatically when you delete a post or page. Here’s how: 

  1. Go to Posts or Pages in your WordPress dashboard. 
  2. Find the post or page you want to delete and click Trash. 
  3. Yoast SEO will show a pop-up asking what you’d like to do with the deleted content. You’ll see two options: 
    • Redirect to another URL: Enter a new URL to send visitors to. 
    • Return a 410 Content Deleted header: Inform search engines that the page is permanently deleted and should be removed from their index. 
  4. Select your preferred option and confirm. 

This feature saves time and ensures visitors land on the right page. No manual setup required. 


Apache (.htaccess file) 

Apache uses the .htaccess file to manage redirects. If your site runs on Apache, this is the simplest way to set them up. Add the rules below to your .htaccess file, ensuring it is located in the root directory of your site. 

Add these lines to your .htaccess file: 

# 301 Redirect 
Redirect 301 /old-page.html /new-page.html
# 302 Redirect 
Redirect 302 /temporary-page.html /new-page.html

Nginx (server config) 

Nginx handles redirects in the server configuration file. If your site runs on Nginx, add these rules to your server block and then reload the service to apply the changes. 

Add this to your server configuration: 

# 301 Redirect 
server { 
    listen 80; 
    server_name example.com; 
    return 301 https://example.com$request_uri; 
}
# 302 Redirect 
server { 
    listen 80; 
    server_name example.com; 
    location = /old-page { 
        return 302 /new-page; 
    } 
}

Cloudflare (page rules) 

Cloudflare allows you to set up redirects without modifying server files. Create a page rule to forward traffic from one URL to another, without requiring any coding. Simply enter the old and new URLs, select the redirect type, and click Save. 

  1. Go to Rules → Page Rules. 
  2. Enter the old URL (e.g., example.com/old-page). 
  3. Select Forwarding URL and choose 301 or 302. 
  4. Enter the new URL (e.g., https://example.com/new-page). 

Troubleshooting redirects 

Redirects don’t always work as expected. A typo, a cached page, or a conflicting rule can break them, or worse, create loops that frustrate users and search engines. Below are the most common issues and how to fix them.  

If something’s not working, start with the basics: check for errors, test thoroughly, and clear your cache. The solutions are usually simpler than they seem. 

Why isn’t my redirect working? 

  • Check for typos: Ensure the URLs are correct. 
  • Clear your cache: Browsers cache 301 redirects aggressively. 
  • Test with curl: Run curl -v http://yoursite.com/old-url to see the HTTP headers. 

Can redirects hurt SEO? 

Yes, if you: 

  • Create redirect chains (A→B→C) 
  • Use 302s for permanent moves 
  • Redirect to irrelevant pages 

How do I find broken redirects? 

  • Use Google Search Console → Coverage report. 
  • Use Screaming Frog to crawl your site for 404s and redirects. 

What’s the difference between a 301 and 308 redirect? 

  • 301: Most common for permanent moves. Broad browser support. 
  • 308: Strict permanent redirect. Rarely used. Same SEO impact as 301. 

What is a proxy redirect? 

A proxy redirect keeps the URL the same in the browser but fetches content from a different location. Used for load balancing or A/B testing. Avoid for SEO, as search engines may not follow them. 

Conclusion about redirects

Redirects are a simple but powerful tool. A redirect automatically sends users and search engines from one URL to another. As a result, they keep your site running smoothly and preserve SEO value and ranking power. Remember: 

  • Use 301 redirects for permanent moves. 
  • Use 302 redirects for temporary changes. 
  • Avoid client-side redirects, such as meta refresh or JavaScript. 

Need help? Try Yoast SEO Premium’s redirect manager.  

The post What is a redirect? Types, how to set them up, and impact on SEO  appeared first on Yoast.


Micro Influencer Marketing: How Small Creators Drive Results

Influencer marketing works because people trust people more than they trust brands. 

When a creator shares a product they actually use, their audience pays attention and often takes action. That’s the core of effective influencer marketing.

Micro-influencer marketing takes that idea and runs with it. 

Creators with smaller, focused followings tend to have stronger, more personal relationships with their audience. Because their content feels real, their recommendations feel trusted. 

Consequently, their engagement rates often outperform even the largest accounts.

For brands, that means efficient ad spend and high-quality interactions, which makes campaign testing simple. Forget buying reach for the sake of reach. You’re tapping right into tight-knit communities that already trust the creator’s voice.

This guide breaks down how to find the right micro-influencers and turn those relationships into measurable results.

Key Takeaways

  • Micro-influencer marketing works because smaller creators have tight, trusting communities that take their recommendations seriously.
  • Partnering with micro-influencers gives brands a steady stream of authentic user-generated content (UGC) that fills your content pipeline.
  • Storytelling typically beats straight product promotion. When creators share a problem and naturally introduce your brand as the solution, engagement and credibility jump.
  • Sponsored posts perform best when creators stay in their own voice. Give them a clear angle and not a script.
  • Tools like CreatorIQ, Upfluence, and Instagram’s Creator Marketplace make it easier to find micro-influencers whose audiences match your target customer.

What Are Micro-Influencers, and Why Should You Use Them?

Micro-influencers are creators with a smaller but highly focused following, usually between 10,000 and 50,000 followers. 

They sit in the sweet spot of influence. 

They’re big enough to have reach but small enough to maintain real trust. Their audience knows them in a way that feels personal and believes in their recommendations.

This is where micro-influencer marketing stands apart from traditional social media marketing and celebrity partnerships. Instead of paying for broad visibility, you’re tapping into communities built on genuine connection.

Recent data backs this up. A study from HypeAuditor shows that micro-influencers consistently outperform larger creators in:

  • Engagement rate: About four times higher than branded accounts
  • Comment quality: More real conversations, fewer bots
  • Conversion intent: Followers view them as trusted peers, not spokespeople

Our own data backs up the value of micro-influencers, too. 

In NP Digital’s analysis of 2,808 influencer campaigns, micro-influencers delivered the highest return on investment (ROI) of any tier, even though this dataset defines “micro” more broadly (1,000–100,000 followers). 

"ROI of Influencer Marketing” comparing return on investment across four influencer tiers.

The pattern is the same: Smaller, more connected creators are more than capable of outperforming larger accounts.

With micro-influencers, you’re not buying reach for vanity metrics. You’re investing in creators whose audiences take action.

Micro-influencers also bring niche expertise.

Be it fitness, skincare, gaming, parenting, or finance, they understand their community’s pain points and how to speak to them. That makes your partnership feel organic.

If you’re looking to build brand trust or reach niche audiences, micro-influencer marketing might be a better fit than chasing accounts with millions of followers.

How to Find Micro-Influencers for Your Brand

Finding the right micro-influencer matters as much as the content they create. 

You’re looking for creators whose audience matches your own. That means demographics, interests, tone, and the problems they help people solve. 

Where to begin? 

It starts with understanding your customer. Once you know who you’re trying to reach, you can identify creators who already have their attention.

Thankfully, there are several reliable platforms that turn influencer hunting into a science:

  • All-in-one powerhouses: Tools like Aspire, Upfluence, and CreatorIQ act as powerful search engines. They let you filter creators by niche, location, follower range, engagement rate, and detailed audience demographics.
  • Platform-specific: Don’t forget Instagram’s own Creator Marketplace. It’s especially valuable for campaigns tied to Reels or broader Instagram marketing efforts.

Upfluence streamlines the vetting process by showing how closely a creator matches your campaign criteria and letting you accept or reject applicants with a single click.

The Upfluence influencer application interface. The card shows a creator profile (@danishworld) with a profile photo, short bio, and three recent content thumbnails. A “98% match” badge appears in the top right, indicating strong alignment with the brand’s criteria.

(Image Source)

CreatorIQ makes discovery simple by letting you filter creators by platform, engagement rate, audience demographics, and content style so you can quickly spot micro-influencers who actually fit your brand.

The CreatorIQ discovery dashboard showing filters for finding influencers.

(Image Source)

If your audience spends time on multiple platforms, like YouTube Shorts or TikTok, try cross-platform tools like HypeAuditor or Influence.co. They let you compare creators across channels and keep your campaigns consistent. (If TikTok is part of your plan, here’s a deeper dive into TikTok marketing.)

When evaluating micro-influencers, look at more than follower count. Keep these metrics in mind, too:

  • Engagement quality: Comments, saves, and shares
  • Audience relevance: Do their followers match your target?
  • Content style: Does it align with your brand’s tone and values?
  • Consistency: Active creators deliver stronger results

After narrowing your list, reach out with a clear pitch. Be sure to leave space for creative freedom. Micro-influencer marketing works best when they can speak to their audience in their own authentic voice.

How Micro-Influencers Can Help Power Your Marketing Campaigns

Micro-influencers shine when you plug them into real campaigns vs. one-off posts. They do the heavy lifting, sparking awareness and directly driving product demand, keeping your brand in front of the right people. Their audiences trust them, and that trust moves fast. 

The next sections break down how to use that momentum.

1. Use Campaign-Specific Hashtags

Campaign-specific hashtags make it easy for micro-influencers and their audiences to rally around your brand. They give you a single thread that connects posts and user-generated content (UGC) in one place.

Start by creating a hashtag that’s simple and tied to a clear idea, not just your brand name. Then invite a group of micro-influencers to use it in their posts, Reels, and Stories as they share your product in real-life settings.

A branded hashtag can work when real people actually use it, though. 

LaCroix’s #livelacroix tag is a great example. Search it on Instagram or TikTok, and you’ll see the same pattern play out over and over again: micro-influencers showing how the product fits naturally into their routines.

Instagram’s hashtag results page for #livelacroix, showing a “For you” feed with a Meta AI summary at the top and a 3×3 grid of posts featuring LaCroix sparkling water.

On Instagram (above), the tag pulls up everything from fridge restocks to quick taste tests in the car. 

These aren’t creators with crazy big audiences, but their engagement is strong because the posts feel personal. 

Even better, the hashtag travels across platforms. Here’s what it looks like on TikTok.

TikTok search results for “Livelacroix”, displaying top videos. Thumbnails feature creators holding different LaCroix cans inside their cars, demonstrating taste tests or casual product demos.

Among those showing up in the grid is local food creator @zwhoeats (19,000 followers), who posts casual reviews and flavor rankings using the same tag. His videos pull in thousands of views because his audience trusts his take on everyday products.

TikTok creator @zwhoeats’s profile. It shows the creator’s username, 19.1K followers, and 603.5K likes. The bio highlights local food content in Fort Worth, Texas.

This is the real power of a campaign-specific hashtag. 

It gives micro-influencers a simple way to plug your brand into content they’re already making. And from it grows a discoverable trail of posts you can reshare and build upon. 

2. Leverage User-Generated Content

User-generated content may be the “ace in the hole” for your next micro-influencer campaign.

Rather than rely only on polished brand assets, you show real people using your product in real situations. And that’s what convinces others to try it.

Micro-influencers are perfect UGC engines. They already create content that their followers trust, so you tap into a steady stream of authentic content when you partner with them.

A great example comes from I and Love and You, the pet food brand. Its open Influencer Ambassador Program is built specifically for micro-influencers—everyday pet owners and small creators who share honest moments with their pets. 

The three steps of the “I and Love and You” influencer ambassador program. Step 1 (“Apply”) includes a photo of a woman sitting on a porch with her dog and a bag of pet food. Step 2 (“Complete Foodie Missions”) shows a cat sniffing a pouch of “I and Love and You” treats. Step 3 (“Reward”) features two dogs holding chew treats in their mouths next to a branded product bag. Each step includes a short description below the image.
Section titled “What Are the Perks?” displaying six benefit icons with short descriptions for members of the “I and Love and You” influencer ambassador program

Through this program, the brand activated hundreds of micro-influencers, generating countless posts and impressions. 

The content all looks and feels like real life, because it is. There aren’t any studio shoots, no forced scripts. As you can see from the Instagram grid below, it’s just UGC created by people their audience already trusts.

Instagram hashtag page for #iandloveandyou, showing a 3×4 grid of pet-related posts featuring cats, pet owners, and various “I and Love and You” cat food products.

This is the playbook. Collaborate with micro-influencers who already share the kind of content your customers want to see, let them create in their own style, and then amplify the best pieces. 

UGC not only builds social proof but fills your content pipeline with assets that outperform polished brand creative.

3. Create Sponsored Posts

Sponsored posts work well with micro-influencers because their audiences already trust them. 

The key is letting creators build content that fits their tone and the way their audience naturally engages.

Take this Candy Cloud example from TikTok. 

TikTok video screenshot showing a Candy Cloud barista struggling to make a skinny latte while wearing a black “Candy Cloud” T-shirt. Text on the video reads, “When you lied on your resume and someone orders a skinny latte.”

Instead of a polished product shot, the creator filmed a chaotic behind-the-counter moment with a joke about messing up a “skinny latte.” It’s tagged as a paid partnership, but the vibe is unmistakably them. 

That’s the lesson: Sponsored posts feel credible when they look like the creator’s regular content. 

Give micro-influencers room to shoot in their own style and let the authenticity do the heavy lifting. 

When you do that, sponsored posts feel like genuine recommendations instead of ads competing for attention.

4. Tell a Story With Your Promotion

Storytelling is where micro-influencer marketing really shines. 

Facts and features are forgettable. Stories, though? They stick. 

When creators show why a product fits into their life (not just what it is), people pay attention.

I learned this firsthand years ago when I was growing my blog. My posts were solid, but traffic wasn’t moving. Once I started weaving in small stories—real struggles, lessons, wins—engagement spiked and readers stayed longer. 

The content didn’t change much. But the connection did.

The same principle applies to micro-influencer marketing campaigns. Instead of asking for a straight product shot, encourage creators to wrap your brand into a moment that feels true to them. 

Maybe it’s a “day in the life,” a behind-the-scenes routine, a quick before-and-after, or a personal challenge they’re solving.

For example, the TikTok post below works because the creator, @bianca.montalvo, sets up a relatable travel problem—pricy roaming fees. She then folds Airalo, an eSIM platform, in as the natural solution, turning her tip into a simple, effective story her audience can follow.

TikTok video screenshot featuring creator Bianca Montalvo standing in front of a Paris-style street background with the text “Travel Tips From an Airline Employee – Part 11” above her.

These are chapters from the creator’s life where your product naturally fits. And because micro-influencers are already tight with their followers, that story feels authentic. 

How to Track Influencer Campaigns

Tracking your influencer marketing campaigns isn’t complicated once you know what to look for. 

Start by measuring performance on the platform itself. Instagram’s Insights, TikTok’s Analytics, and YouTube’s Creator Studio all show reach, engagement, audience demographics, and which posts actually drove action. 

These numbers help you understand which creators and formats are worth repeating.

For deeper reporting, some of the platforms we mentioned earlier—Aspire, Upfluence, and CreatorIQ—let you track creators, pull in content automatically, monitor hashtag performance, and calculate cost per engagement or cost per acquisition across campaigns. 

If you’re running a mix of organic and paid micro-influencer content, these tools give you one place to compare everything.
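As a quick sanity check, you can also run the math yourself: cost per engagement is simply total spend divided by total engagements. With hypothetical numbers, $1,500 spent across three creators who generate 6,000 combined likes, comments, saves, and shares works out to about $0.25 per engagement.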

You should also tag your links with UTM parameters so you can see traffic and conversions inside Google Analytics. 
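A tagged link might look something like this (the values are placeholders; the important part is keeping your naming consistent across creators):

https://example.com/product-page?utm_source=instagram&utm_medium=influencer&utm_campaign=spring_launch&utm_content=creator_handle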

The goal is to track the pieces that show real impact: saves, shares, comments, website clicks, and sales. That way, you know exactly which micro-influencers are moving the needle and where to invest next.

FAQs

What is a micro-influencer?

A micro-influencer is a creator with roughly 10,000 to 50,000 followers (though it’s sometimes defined as 10,000–100,000 or in other ranges). These creators tend to have highly focused, highly engaged audiences. They’re big enough to create impact but small enough to maintain real trust with their community. 

Does micro-influencer marketing work?

Yes. Micro-influencers often outperform larger creators in engagement, conversions, and cost efficiency. Their followers view them as peers, which leads to stronger recommendations and higher intent. 

Where to find micro-influencers?

You can find micro-influencers through platforms like Aspire, Upfluence, and Instagram’s own Creator Marketplace. If your audience is active across platforms, tools like HypeAuditor can help you compare creators on Instagram, TikTok, and YouTube. 

Conclusion

Driving more sales and landing more customers is a grind.

That’s especially true in today’s world, where every niche and subset of that niche has a competitor.

There are countless businesses, just like mine and just like yours. 

Investing in micro-influencer marketing can be a way to stand out. They get your brand in front of people who actually care.

Their audiences know them and pay attention when they recommend something.

Start small. Build a list of creators who already speak to your target customer. 

Look for strong engagement and content that aligns with your brand. Then plug them into your broader influencer marketing strategy. UGC, sponsored posts, campaign hashtags, and simple storytelling all work well at the micro level.

If you stay consistent and treat these creators like true partners, you’ll see the impact quickly.
