Stuck in Positions 70–90: The Trust Signal Recovery Framework
A 90-day roadmap to diagnose low-trust indexing patterns and rebuild the confidence Google needs to move your pages out of the visibility pit—without a massive backlink budget.
What This Framework Solves
You’ve fixed Core Web Vitals. Your crawl is clean. Schema validates. Yet your pages hover in positions 70–90 for months, accumulating impressions but almost no clicks. This pattern isn’t bad luck—it’s a trust threshold problem.
Google can crawl your site, but it’s not confident enough to index broadly or rank prominently. Google’s documentation confirms that crawling and indexing are separate processes, and that Google may crawl a URL and still decide not to index it based on quality and value assessments rather than technical errors 1, 2.
At Iriscale, we’ve analyzed hundreds of sites trapped in the 70–90 band. The pattern is consistent: Google sees enough value to test your pages (impressions) but not enough trust to rank them competitively (clicks). Google spokespeople have repeatedly clarified that many ranking shortfalls aren’t penalties—the algorithm simply doesn’t see enough value or trust signals to rank a page higher, even if the site looks “technically perfect” 3.
This guide shows you how to (1) confirm whether you’re dealing with trust-signal weakness versus penalties, (2) pinpoint “crawled – currently not indexed” root causes common to low-authority sites, (3) decide migration vs. recovery using a decision tree, and (4) execute a 90-day roadmap powered by Iriscale’s Keyword Repository, Search Ranking diagnostics, and Content Architecture Generator.
Step 1: Diagnose the 70–90 pattern as a trust threshold, not a burned domain
Positions 70–90 represent the algorithm’s compromise: “eligible enough to test” (impressions) but not trusted enough to win competitive auctions consistently (rank). Google’s “How Search Works” documentation describes a multi-stage process where systems evaluate relevance and quality before ranking results 5, 6. When pages sit in 70–90, they’re not failing basic requirements—they’re not surpassing the quality and trust thresholds that top results demonstrate.
Two misconceptions that block progress
Misconception 1: “Our domain is burned.”
In most cases, there is no permanent “domain burn.” Google generally prefers ignoring low-value signals rather than punishing sites. Ranking stagnation is frequently explained by insufficient value, unclear purpose, or weak trust indicators—not a hidden penalty 3.
Misconception 2: “If the technical audit is perfect, ranking should follow.”
Google has explicitly downplayed over-focusing on certain technical “perfection” myths (e.g., HTML validity) while emphasizing outcomes: usefulness, clarity, and signals that show you’re a reliable source 7, 8.
Real examples from our data
B2B glossary trap: A SaaS site published 250 glossary pages. They were crawlable and got impressions, but most sat at positions 60–90. The issue wasn’t rendering or sitemap coverage—each page repeated generic definitions and didn’t demonstrate experience, examples, or original perspective. Result: Google chose not to rank them prominently, and some ended up “crawled – currently not indexed” as the site expanded.
Local service pages: A multi-location services site created near-duplicate city pages. They got discovered, crawled, and occasionally indexed, but rankings stalled. Google was likely clustering similar pages and suppressing thin variants rather than “penalizing” the domain—consistent with Google’s separation of crawl and index decisions 1.
What to do next
Treat 70–90 as a threshold problem: your job is to raise trust and value signals until Google can confidently index and rank. Start measuring “trust symptoms” (indexation selectivity, duplicated templates, weak entity signals) rather than hunting for phantom penalties.
Use Iriscale’s Search Ranking diagnostics to segment pages stuck in 70–90 versus those improving, and correlate with indexation status patterns. This data shows you where to focus recovery efforts.
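If you want to prototype this segmentation yourself before reaching for tooling, the core logic is simple to sketch from a raw Search Console performance export (one row per page per week). This is a minimal illustration, not Iriscale's implementation; the field names, the five-position trend cutoff, and the sample rows are assumptions.

```python
# Segment pages into a "stuck in 70-90" cohort vs. an "improving" cohort,
# given weekly average-position rows from a GSC performance export.
from statistics import mean

def segment_pages(rows):
    """rows: dicts with 'page', 'week', 'position'. Returns (stuck, improving)."""
    by_page = {}
    for r in rows:
        by_page.setdefault(r["page"], []).append((r["week"], r["position"]))
    stuck, improving = [], []
    for page, series in by_page.items():
        series.sort()                      # oldest week first
        positions = [p for _, p in series]
        avg = mean(positions)
        trend = positions[0] - positions[-1]  # positive = moving up the SERP
        if 70 <= avg <= 90 and trend < 5:
            stuck.append(page)             # hovering in the band, no real movement
        elif trend >= 5:
            improving.append(page)
    return stuck, improving

rows = [
    {"page": "/glossary/a", "week": 1, "position": 82},
    {"page": "/glossary/a", "week": 2, "position": 84},
    {"page": "/guide/x", "week": 1, "position": 75},
    {"page": "/guide/x", "week": 2, "position": 61},
]
stuck, improving = segment_pages(rows)
```

The useful output is not the labels themselves but what the stuck cohort shares: template, intent, or link depth.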
Step 2: Separate “crawled – not indexed” and low trust from real penalties
Before you rebuild anything, rule out penalties and misconfigurations—otherwise you’ll waste 90 days polishing pages Google can’t (or won’t) show.
What “Crawled – currently not indexed” actually means
Google crawled the URL but decided not to index it. This is explicitly not a guarantee of indexing, and it is often driven by perceived quality, duplication, or insufficient unique value 1, 2. Industry analyses align: it’s not necessarily an error; it’s frequently a quality selection outcome 9.
John Mueller has suggested that some non-indexation is normal—often tied to overall site quality rather than a single technical fault 4.
What a real penalty looks like (and what it doesn’t)
Google’s penalty and reconsideration ecosystem exists, but many “stuck” sites are not penalized. A manual action is typically visible in Search Console; algorithmic demotions correlate with spam patterns or major quality failures. Google spokespeople have noted that a single violation doesn’t always trigger a manual action, and that Google may ignore problematic links rather than penalize sites 10, 11.
Real examples from our analysis
False penalty panic: A publisher saw rankings decline and assumed a penalty. Search Console showed no manual actions; server logs showed stable crawling; the real issue was a surge of low-value tag pages and internal search pages that diluted perceived quality. Fixing indexation controls and consolidating thin pages improved crawl efficiency and indexation selection.
Actual constraint, not penalty: A marketplace blocked essential category pages via robots.txt “to reduce crawl.” Google couldn’t fetch content reliably, and indexation dropped. This wasn’t a penalty; it was self-inflicted accessibility loss—consistent with Google crawling and indexing guidance 1.
What to do next
If Search Console shows no manual action, assume quality and trust selection first, not punishment. Validate access: rendering, robots, canonicals, and server stability before rewriting content.
At Iriscale, we recommend using Search Ranking diagnostics to identify patterns across affected pages. Track whether the 70–90 cohort shares common templates, internal linking depth, or content structures—this reveals the trust gap faster than manual audits.
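The “validate access” step above can be partially automated. The sketch below checks a page’s HTML for the three self-inflicted blockers mentioned (robots blocking, noindex, canonical pointing elsewhere); in practice you would fetch the HTML and robots.txt yourself and use a proper HTML parser, so treat the regex parsing and the sample inputs as simplifying assumptions.

```python
# Flag self-inflicted accessibility problems on a single URL.
import re

def access_issues(url, html, robots_disallows=()):
    issues = []
    # robots_disallows: URL prefixes your robots.txt blocks (resolved beforehand)
    if any(url.startswith(prefix) for prefix in robots_disallows):
        issues.append("blocked-by-robots")
    # noindex robots meta tag
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', html, re.I):
        issues.append("noindex-meta")
    # canonical tag pointing at a different URL
    m = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
    if m and m.group(1).rstrip("/") != url.rstrip("/"):
        issues.append("canonical-points-elsewhere")
    return issues
```

Run this across the 70–90 cohort first: any page that returns issues is a configuration fix, not a content fix.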
Step 3: Find the root cause of “crawled – not indexed” on low-authority sites
Low-authority sites get less benefit of the doubt. When Google must choose what to store and show, it prioritizes pages with clear purpose, distinct value, and strong signals of trustworthiness and helpfulness 8, 5. If your site produces many similar pages, Google may crawl them but decline to index a large share—especially if the site doesn’t demonstrate strong E-E-A-T patterns (experience, expertise, authoritativeness, trustworthiness) emphasized across quality guidance 12, 13.
The most common root causes we see at Iriscale
1. Template duplication at scale
Programmatic pages with swapped city names, thin “best X in Y” lists, or category pages with minimal unique content.
2. Weak internal discovery and prioritization
Google crawls but doesn’t see strong internal signals that a page is important. Internal linking is a trust and prioritization mechanic—especially when external signals are limited. This aligns with Google’s emphasis on discoverability and site structure in starter guidance 14.
3. Unclear topical focus (content sprawl)
Publishing everything for everyone: guides, news, definitions, and affiliate pages without a coherent architecture. Google’s helpful content guidance stresses people-first, purpose-driven creation 8.
4. Low “proof” content (no experience signals)
No first-hand photos, no data, no named experts, no methodology, no editorial policy—common on small teams moving fast.
Case study: Travel agency stuck 70–90 with 85% “crawled – currently not indexed”
A travel agency launched hundreds of destination and package pages quickly. In Search Console, 85% of URLs showed “crawled – currently not indexed,” while the indexed set averaged positions 70–90 for non-brand queries. Technical checks were clean: no noindex, no robots blocking, acceptable performance.
The real issues were (a) near-identical templates across destinations, (b) thin itinerary details, (c) weak internal linking (deep pages more than 4 clicks from the homepage), and (d) no trust reinforcement (no clear “who we are,” author or editor standards, or real trip expertise).
After consolidating destination clusters, adding first-hand itinerary content, and improving internal hub linking, indexation began to rise and rankings moved into the 20–40 band within the recovery window. This is consistent with Google’s crawl and index separation and helpful-content emphasis 1, 8.
What to do next
If non-indexation is >40–50% on a small site, treat it as a sitewide trust and value selection problem, not a single-URL fix. Google notes some non-indexation is normal, but extreme ratios are a red flag 4.
Prioritize reducing “duplicate-by-template” inventory and strengthening internal hubs before publishing more. Use Iriscale’s Content Architecture Generator to design hub-and-spoke structures that consolidate credibility and reduce orphaned pages.
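To see whether non-indexation clusters by template (as in the travel-agency case), you can compute the ratio per URL section from a flattened page-indexing export. A sketch, assuming (url, status) pairs and using the first path segment as a stand-in for “template”; both are simplifications of a real audit.

```python
# Non-indexed ratio per URL template from a GSC Page indexing export.
from collections import Counter, defaultdict
from urllib.parse import urlparse

def non_indexed_ratio(pages):
    """pages: list of (url, status) pairs; status is a GSC coverage label."""
    by_template = defaultdict(Counter)
    for url, status in pages:
        template = urlparse(url).path.strip("/").split("/")[0] or "(root)"
        by_template[template][status == "Crawled - currently not indexed"] += 1
    return {
        template: round(counts[True] / (counts[True] + counts[False]), 2)
        for template, counts in by_template.items()
    }

pages = [
    ("https://site.com/city/austin", "Crawled - currently not indexed"),
    ("https://site.com/city/boston", "Crawled - currently not indexed"),
    ("https://site.com/city/denver", "Indexed"),
    ("https://site.com/guides/visas", "Indexed"),
]
ratios = non_indexed_ratio(pages)
```

A template sitting above the ~0.4–0.5 line is your consolidation candidate; a template near zero is your proof that Google will index you when the value is clear.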
Step 4: Rebuild trust signals without buying backlinks (E-E-A-T + behavior + architecture)
You can’t “audit” your way into trust. You earn it by making it easy for Google (and users) to verify who you are, what you know, and why your pages deserve index space and prominent rankings. Google’s helpful content guidance and broader ranking systems documentation support this direction: systems aim to reward useful, people-first content and demote content that feels made for search engines first 8, 6.
Analyses of Google’s trust-related patents and commentary highlight that user behavior and trusted references can function as confidence signals 15, 16. You don’t need a giant backlink budget to improve the signals that typically correlate with trust.
Five budget-friendly trust builders (that work especially well in the 70–90 trap)
1. Evidence upgrades (“Experience” signals)
Add first-hand details: original photos, “what we tested,” pricing snapshots with dates, itinerary constraints, and FAQs based on customer conversations.
- Example: Travel agency adds “What we learned running this route weekly” sections and real traveler constraints (visa, weather, transfer time).
- Example: B2B site adds annotated screenshots, implementation pitfalls, and “tested on” version notes.
2. Author and editorial transparency
Not fluff bios—verifiable credentials, editorial policy, and revision notes. Danny Sullivan has repeatedly emphasized user trust and transparency as central to search quality expectations 17.
- Example: “Reviewed by” subject matter expert with a real profile page and contact route.
- Example: Publish a corrections policy and show “last reviewed” for YMYL-adjacent pages. This aligns with E-E-A-T emphasis 12, 13.
3. Internal linking as trust routing
Create hubs that consolidate credibility: a destination hub linking to subpages, or a product-led pillar linking to implementation guides. This clarifies importance and reduces orphaned pages. This is supported by SEO starter guidance on structure 14.
4. Content consolidation instead of expansion
Merge 10 thin pages into 1 definitive guide. Delete or noindex “supporting noise” (thin tags, faceted duplicates). Google may ignore low-value pages rather than “penalize,” but they still dilute perceived site quality 18.
5. Branded-query and demand capture (without link buying)
If trust is shaky, branded demand can help stabilize performance. This is supported by branded-search patent coverage and discussion 19. Tactics: YouTube explainers, partner webinars, newsletters—aimed at generating direct and brand searches and repeat visits, not links.
How Iriscale helps here
At Iriscale, we built tools specifically to support trust recovery without a massive backlink budget:
- Keyword Repository: Identify “trust-easy” keywords (lower ambiguity, clearer intent) to win early and build performance history.
- Search Ranking diagnostics: Segment keywords and pages stuck 70–90 versus those improving, and correlate with indexation status patterns.
- Content Architecture Generator: Design hub-and-spoke structures to reduce duplication and improve internal signal flow.
What to do next
Spend your first trust budget on evidence + consolidation, not new URLs. Build hubs that make your best pages impossible to ignore internally.
Step 5: Migration vs. recovery decision tree (and why migrations often backfire for low-trust sites)
When rankings stall, teams often propose a domain migration, CMS switch, or “fresh start.” But Google’s systems evaluate sites over time; wiping URLs, changing structures, and redirecting at scale introduces risk—especially when you haven’t fixed the underlying trust and value issues. If the site is already struggling to get indexed, a migration can temporarily reduce signals further. This is consistent with Google’s emphasis that crawling and indexing are selective and that quality matters 5, 8.
Use this decision tree
Choose Recovery (usually) if:
- No manual actions in Search Console. Penalty absence points to quality selection 3.
- You see widespread “crawled – currently not indexed” and template duplication.
- Your best pages still get impressions; Google is testing you.
Choose Migration (rare) if:
- The domain has an unresolvable legacy problem (e.g., legal or brand constraints), or the architecture is so broken it cannot be corrected without changing URL design.
- You can commit to preserving content quality, mapping redirects cleanly, and controlling index bloat from day one.
Real examples from our analysis
Recovery wins: The travel agency above considered migrating to a new domain. Instead, they consolidated destination pages into fewer, deeper hubs, strengthened internal linking, and improved author trust. Indexation rose because the site offered fewer but stronger candidates for the index.
Migration hurts: A content site migrated from subdomain to root with thousands of thin pages intact. Google re-crawled everything, saw the same low-value patterns, and “crawled – not indexed” increased post-migration—plus rankings reset. The team spent months fixing what could have been fixed before the move.
What to do next
Treat migration as an amplifier of your current quality—good or bad—not a reset button. If you can’t clearly describe the trust problem today, migrating will not remove it tomorrow.
At Iriscale, we recommend using the Content Architecture Generator to map your current structure and identify consolidation opportunities before considering migration. This shows you whether the problem is fixable in place.
Step 6: The 90-day Google Trust Signal Recovery Roadmap (no big backlink budget required)
This roadmap assumes you have Search Console access, basic crawling capability, and content resources to update key templates and pages. It is designed to improve indexation selectivity, sitewide quality, and rank confidence in a measurable way.
Days 1–15: Triage + measurement baseline
Goals: Confirm it’s trust and quality selection, not access or penalty; build a prioritized URL set.
- Pull Search Console exports: Page indexing statuses, performance by query and page, and URL inspection samples 2.
- Classify URLs into: must-index, nice-to-index, should-not-index (thin, duplicates, utility).
- Benchmark indexation ratio; remember some non-indexation is normal, but extreme non-indexation is a strategy red flag 4.
- Use Iriscale Search Ranking diagnostics to tag pages with stable 70–90 averages versus rising pages and identify shared patterns (template, intent, depth).
Example milestone: Reduce “unknowns” from 1,000 URLs to 100 priority URLs and 10 templates to fix.
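The must-index / nice-to-index / should-not-index classification can be encoded as a small rule function so the triage is repeatable. One possible rule set is sketched below; the thresholds, the thin-page word count, and the utility-URL patterns are illustrative assumptions you should tune to your own site.

```python
# Triage a URL into one of the three Days 1-15 buckets.
def triage(url, clicks, impressions, word_count):
    # Utility/noise patterns: thin tags, internal search, pagination, print views
    utility = ("/tag/", "/search", "?page=", "/print/")
    if any(p in url for p in utility) or word_count < 150:
        return "should-not-index"
    # Pages already earning clicks or meaningful impressions are priorities
    if clicks > 0 or impressions >= 100:
        return "must-index"
    return "nice-to-index"
```

Applied over the full export, this turns “1,000 unknowns” into three actionable lists in one pass.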
Days 16–45: Inventory reduction + architecture rebuild
Goals: Reduce low-value index candidates and strengthen internal signals.
- Consolidate duplicate sets (city or service variants) into hubs.
- Add internal links from high-traffic pages to priority hubs (navigation + contextual).
- Apply canonical or noindex where appropriate. This aligns with reducing low-quality signals that can drag site perception 18.
- Use Iriscale Content Architecture Generator to design: 3–6 hubs, each with 5–12 spokes, and define internal anchor rules.
Example from our data: Travel agency cuts 600 near-duplicate package URLs down to 120 strengthened pages plus hub pages. Result: crawl focuses on fewer URLs; indexation improves as Google sees clearer “best candidates.”
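Before shipping the rebuilt architecture, it helps to verify the planned link graph: every priority page reachable from the homepage within the click-depth budget, and no orphans. A minimal breadth-first sketch, using a hypothetical page set from the travel example:

```python
# Compute click depth from the homepage over a planned internal link graph.
from collections import deque

def click_depths(links, start="/"):
    """links: dict page -> list of internally linked pages. BFS from homepage."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

links = {
    "/": ["/destinations"],
    "/destinations": ["/destinations/japan"],
    "/destinations/japan": ["/destinations/japan/tokyo-7-days"],
}
depths = click_depths(links)
# Any priority page absent from `depths` is an orphan in the plan.
orphans = [p for p in ["/destinations/japan/osaka-5-days"] if p not in depths]
```

Pages deeper than 3–4 clicks, or missing from `depths` entirely, are where hub navigation and contextual links should go first.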
Days 46–75: Evidence upgrades (E-E-A-T execution sprint)
Goals: Increase “experience” and trust verification without links.
- Add real-world proof: images, pricing methodology, itinerary constraints, “who this is for,” and expert review lines. This is consistent with E-E-A-T emphasis 12, 13.
- Publish author pages + editorial policy; add sitewide About and Contact clarity. This supports trust and transparency emphasis 17.
- Use Iriscale Keyword Repository to target “trust-easy” queries: narrow intent, clear format, high match between your evidence and the need.
Example milestone: 30 priority pages upgraded with proof blocks + reviewed-by, and linked into hubs.
Days 76–90: Indexation nudges + performance iteration
Goals: Help Google reassess improved pages and iterate based on results.
- Re-submit key URLs via URL inspection (sparingly) and ensure sitemaps reflect only index-worthy URLs. This is consistent with Mueller discouraging “force indexing” as a strategy at scale 4.
- Monitor: “crawled – not indexed” trendlines, impressions-to-click movement, and average position drift.
- Expand what works: if one hub moves from 70–90 to 20–40, replicate the template across the next hub cluster.
Example from our data: A B2B help center moves from scattered articles to a structured “Getting Started” hub. Even without new backlinks, internal routing and consolidation drive broader indexing and more stable rankings. This aligns with architecture and helpfulness principles 14, 8.
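The “sitemaps reflect only index-worthy URLs” step is easy to enforce mechanically: regenerate the sitemap from the must-index list alone rather than from the CMS’s full URL dump. A sketch using the standard sitemap schema; the URL is hypothetical.

```python
# Build a sitemap containing only index-worthy URLs.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    root = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

xml = build_sitemap(["https://site.com/hub/japan"])
```

Pairing this with the triage buckets from Days 1–15 keeps thin and duplicate URLs out of the sitemap automatically as the inventory changes.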
What to do next
Your fastest win is usually less content, better connected, not more content. Measure weekly: indexing status mix + rank distribution changes, not just traffic.
At Iriscale, we built the Search Ranking diagnostics dashboard specifically to track these metrics week-over-week. You can see which hubs are moving, which templates are stuck, and where to focus your next sprint.
Recovery Checklist (copy into your project tracker)
- Confirm “not a penalty”: Check Manual Actions; validate no sitewide noindex or robots issues.
- Export GSC indexing statuses and segment by template and type.
- Set targets: e.g., reduce crawled-not-indexed from 85% → <50% in 60 days.
- Cull index bloat: Noindex or merge thin tags, internal search, duplicative parameter URLs.
- Consolidate: Merge thin variants into hubs; redirect or canonicalize duplicates.
- Rebuild internal linking: Add hub navigation + contextual links from strongest pages.
- E-E-A-T upgrades: Author pages, editorial policy, proof sections, expert review where relevant.
- Iriscale workflow:
  - Keyword Repository → pick “trust-easy” clusters
  - Search Ranking diagnostics → monitor 70–90 cohorts
  - Content Architecture Generator → publish hub-and-spoke plan
- 90-day reporting: Track (1) indexation ratio, (2) count of pages in top 50, (3) % queries with rising trend.
Optional: Create a one-page “Trust Recovery Scorecard” and share it weekly with stakeholders (indexing mix + hub progress + wins).
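If you build that scorecard yourself, the three reporting metrics from the checklist reduce to a few lines. The sketch below assumes simple weekly inputs (index statuses, page positions, and per-query position pairs); the sample values are illustrative.

```python
# Weekly Trust Recovery Scorecard: the three checklist metrics.
def scorecard(index_statuses, positions, query_trends):
    indexed = sum(1 for s in index_statuses if s == "Indexed")
    return {
        "indexation_ratio": round(indexed / len(index_statuses), 2),
        "pages_in_top_50": sum(1 for p in positions if p <= 50),
        # query_trends: (previous position, current position); lower = rising
        "pct_rising_queries": round(
            100 * sum(1 for prev, cur in query_trends if cur < prev) / len(query_trends)
        ),
    }

week = scorecard(
    index_statuses=["Indexed", "Indexed", "Crawled - currently not indexed"],
    positions=[34, 48, 77],
    query_trends=[(82, 61), (75, 78)],
)
```

Tracking these three numbers week-over-week is what makes the 90-day window measurable rather than anecdotal.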
Related Questions
If Google crawls my pages, why won’t it index them?
Because crawling is just retrieval; indexing is a selection decision. Google can fetch your URL and still decide it doesn’t add enough unique value to store and serve in results 1, 2. For low-authority sites, thin templates, duplication, and unclear purpose often lead to “crawled – currently not indexed” even when nothing is “broken” 9.
Is it normal that some pages aren’t indexed?
Yes. Google has indicated it can be normal for a portion of a site’s pages not to be indexed, often depending on overall site quality and content uniqueness 4. The warning sign is not “some pages”; it’s when most pages are excluded and the indexed set can’t escape positions 70–90—suggesting Google doesn’t trust your inventory enough to prioritize it.
How do I know whether I’m dealing with a penalty?
Start with Search Console: manual actions are surfaced there. If there’s no manual action, you’re usually looking at algorithmic quality selection, not punishment. This is consistent with Google spokesperson commentary that non-ranking often isn’t a penalty 3. Also note Google may ignore spammy signals and links rather than penalize, which is why “penalty hunting” often leads nowhere 11.
Can I fix trust issues without building new backlinks?
Often, yes—especially when the issue is indexation selectivity and template-level thinness. Consolidation, internal linking, clearer architecture, and evidence-based content upgrades can raise perceived usefulness and trustworthiness 8, 14. Backlinks can help, but they’re not the only lever—particularly when your current inventory isn’t “index-worthy” yet.
Should I migrate to a new domain to “reset” trust?
Usually not. If your underlying issue is thin or duplicative content or unclear value, a migration simply moves the same problem to a new place—and adds crawling, redirect, and re-evaluation risk. This is consistent with Google’s selective crawl and index processes and quality emphasis 5, 8. Fix trust signals first; migrate only for structural necessity.
Put the framework into motion with Iriscale
If your pages are trapped in positions 70–90 and Search Console is full of “crawled – currently not indexed,” Iriscale can help you move from guesswork to a controlled recovery plan.
We built Iriscale specifically to solve this problem. Use the Keyword Repository to find winnable clusters, Search Ranking diagnostics to isolate the 70–90 cohort, and the Content Architecture Generator to rebuild hubs that concentrate trust and relevance.
See how Iriscale’s trust recovery tools work → Get a demo
Calculate your tool consolidation savings → ROI Calculator
Related Guides
- Content Architecture Generator: Building hub-and-spoke structures that Google actually indexes
- Search Ranking Diagnostics: How to segment plateaus vs. true declines
- Keyword Repository Playbook: Finding “trust-easy” keywords for low-authority sites
Sources
[1] Google Search Docs — Crawling and indexing: https://developers.google.com/search/docs/crawling-indexing
[2] Google Search Console Help Thread — “Page is not indexed: Crawled - currently not indexed”: https://support.google.com/webmasters/thread/248401570/page-is-not-indexed-crawled-currently-not-indexed?hl=en
[3] John Mueller explains why a site may not rank despite good SEO: https://www.improvemysearchranking.com/john-mueller-explains-why-a-site-may-not-rank-despite-good-seo/
[4] Google: It’s Normal for Pages of a Site to Not Be Indexed: https://www.searchenginejournal.com/google-not-indexing-site/416717/
[5] Google Search Docs — How Search Works: https://developers.google.com/search/docs/fundamentals/how-search-works
[6] Google Search Docs — Ranking systems guide: https://developers.google.com/search/docs/appearance/ranking-systems-guide
[7] John Mueller debunks misconceptions about website quality: https://readable.com/blog/john-mueller-debunks-misconceptions-about-website-quality-on-linkedin/
[8] Google Search Docs — Creating helpful, reliable, people-first content: https://developers.google.com/search/docs/fundamentals/creating-helpful-content
[9] Onely — How to fix “Crawled - currently not indexed”: https://www.onely.com/blog/how-to-fix-crawled-currently-not-indexed-in-google-search-console/
[10] John Mueller: One violation will not result in a manual action: https://www.searchenginejournal.com/googles-john-mueller-one-violation-will-not-result-in-a-manual-action/344712/
[11] Google answers if outbound links pass poor signals: https://www.searchenginejournal.com/google-answers-if-outbound-links-pass-poor-signals/571687/
[12] Google and how to increase your website trust rank (E-E-A-T overview): https://deepwpseo.com/google-and-how-to-increase-your-website-trust-rank-for-the-site/
[13] E-E-A-T guide for more trust and top rankings: https://www.seo-kreativ.de/en/blog/e-e-a-t-guide-for-more-trust-and-top-rankings/
[14] Google SEO Starter Guide: https://developers.google.com/search/docs/fundamentals/seo-starter-guide
[15] Google’s trust ranking patent & user behavior as a signal: https://www.searchenginejournal.com/googles-trust-ranking-patent-shows-how-user-behavior-is-a-signal/550203/
[16] Google patent US9195944B1: https://patents.google.com/patent/US9195944B1/en
[17] Danny Sullivan on trust and transparency (X): https://twitter.com/dannysullivan/status/1578448010815500290
[18] Google low-quality signals: https://www.searchenginejournal.com/google-low-quality-signals/306277/
[19] Google’s branded search patent for ranking results: https://www.searchenginejournal.com/googles-branded-search-patent-for-ranking-search-results/524083/