SEO Audit Checklist: How Two Lines in robots.txt Cost 180 Leads a Month
Last fall, the owner of a suburban auto service center outside Moscow reached out in a panic. His five-year-old site used to bring in a steady 60 leads every single week from organic search. By September, that number had collapsed to just ten. He had already killed his paid ads to save money and couldn't afford a new agency. He simply wanted to know: What the heck happened?
We jumped in, opened Google Search Console, and saw something jaw-dropping—4,200 pages marked “Crawled — currently not indexed.” The site had once held around 800 pages in the index. This wasn’t a server crash, a penalty, or a mysterious algorithm apocalypse. It was a classic robots.txt SEO disaster that had been quietly building for months until a Yandex update finally exposed it.
What follows is the exact five-step SEO audit process we run in the first few hours of every emergency case like this. Eighty percent of the problems we uncover are embarrassingly basic—yet fixing them delivers the fastest, most dramatic results.
The Call That Changed Everything for a Five-Year-Old Auto Service Site
The client’s story is painfully common. Stable traffic for years, then a sudden drop that no one could explain. He assumed the worst—maybe sanctions, maybe a competitor attack, maybe the search engines just hated him. In reality, the culprit was sitting in plain sight inside one tiny file most site owners never open after launch.
We treat every sudden traffic crash the same way: start with the fundamentals that actually control whether search engines can even see your pages. Fancy keyword research and link building come later—much later.
Why Your First Move in Any SEO Audit Must Be robots.txt and Indexing
Everyone loves starting with PageSpeed Insights because the colorful graphs look impressive in reports. But beautiful scores rarely explain why leads suddenly vanish. The only question that matters in hour one is simple: Is the search engine even allowed to see the site?
In this auto service case, we opened the robots.txt file and found exactly two extra lines that had been added back in April:
```text
Disallow: /uslugi/
Disallow: /ceny/
```
Someone had thrown them in during a pricing-page redesign, thinking they’d temporarily block old URLs until the new ones launched. The new pages never went live, but the Disallow directives stayed. For six straight months, both Google and Yandex dutifully obeyed and removed the very pages that had been generating the majority of service requests.
Finding the problem took thirty minutes. Removing the two lines took two minutes. Ten days later, the blocked pages started trickling back into the index.
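If you want to run the same check yourself, a short script does it in seconds. The sketch below is a minimal version using Python's standard-library robots.txt parser; the domain and the list of revenue pages are placeholders you would swap for your own.

```python
# Minimal robots.txt check: are the pages that earn leads still crawlable?
# The domain and paths below are placeholders, not a real client's URLs.
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"                        # placeholder domain
MONEY_PAGES = ["/", "/uslugi/", "/ceny/", "/contacts/"]  # pages that earn leads

parser = RobotFileParser(f"{SITE}/robots.txt")
parser.read()

for path in MONEY_PAGES:
    allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
    print(f"{'OK' if allowed else 'BLOCKED':8} {path}")
```

Run it against Googlebot, and against any other crawler that matters for your market, before you trust a single other metric.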
Duplicate Content: The Silent Traffic Thief Hiding in Plain Sight
Once indexing looks healthy, the next step is checking whether the site is accidentally competing against itself. Duplicate content is boring to fix, yet it consistently delivers some of the highest ROI we see.
Common culprits include:
- The same URL loading with and without a trailing slash
- www versus non-www versions
- http versus https
- Filter and sort parameters turning every combination into its own indexed page (?sort=price&color=red)
- Pagination without proper rel=canonical tags
We once audited a furniture e-commerce store that had 17,000 pages in the index but only 900 actual products. Search engines had no idea which of the eighteen near-identical versions of “Albert sofa” should rank. We cleaned it up with canonical tags and strategic noindex rules. Within two months, organic traffic climbed 34%—without publishing a single new article.
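Before writing canonical tags, it helps to see how many variants of each URL are actually competing with each other. Here is a rough sketch of that grouping step; the list of ignored parameters is an assumption, since real stores keep parameters that genuinely change the content.

```python
# Group URL variants (slash, www, protocol, filter params) that render
# the same content, so you can pick one canonical per group.
from collections import defaultdict
from urllib.parse import urlsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"sort", "color", "utm_source", "utm_medium"}  # assumed noise

def normalize(url: str) -> str:
    parts = urlsplit(url.lower())
    host = parts.netloc.removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    query = sorted((k, v) for k, v in parse_qsl(parts.query)
                   if k not in IGNORED_PARAMS)
    return f"{host}{path}" + (f"?{urlencode(query)}" if query else "")

# A handful of variants that all render the same product page
urls = [
    "http://www.example.com/sofa-albert/",
    "https://example.com/sofa-albert",
    "https://example.com/sofa-albert?sort=price&color=red",
]

groups = defaultdict(list)
for u in urls:
    groups[normalize(u)].append(u)

for canonical, variants in groups.items():
    print(f"{canonical}: {len(variants)} indexed variants")
```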
Title Tags and Meta Descriptions That Actually Help (Not Hurt)
Next we export every page with Screaming Frog or any crawler and sort by title tag. Three patterns almost always appear, and all three hurt rankings and clicks.
First, 60–80% of pages share the exact same title—often something generic like “Home — Company Name.” This happens when a website builder template was never updated.
Second, titles exist but feel robotic and template-driven: “Buy [Category] in Moscow — Cheap, Fast Delivery.” They blend into every other result and give search engines zero reason to prefer one site over another.
Third, titles are stuffed with keywords and stretch past 180 characters. Search engines truncate them, so the snippet turns into gibberish.
A strong title is short (under 60 characters), describes the page’s actual purpose, includes one primary query, and hints at the benefit. Meta descriptions aren’t for SEO points—they exist purely to boost click-through rate. Write them like a helpful human would.
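A crawler export makes all three patterns easy to catch automatically. The sketch below assumes a CSV with address and title columns (adjust the file name and column names to whatever your crawler actually produces) and flags duplicates, missing titles, and titles past the length budget.

```python
# Scan a crawl export for the three common title problems:
# duplicates, missing titles, and titles long enough to be truncated.
import csv
from collections import Counter

MAX_LEN = 60  # rough character budget before truncation risk

# crawl_export.csv and its column names are assumptions; match your own export
with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    rows = [(r["Address"], r["Title 1"].strip()) for r in csv.DictReader(f)]

counts = Counter(title for _, title in rows)

for url, title in rows:
    problems = []
    if not title:
        problems.append("missing title")
    elif counts[title] > 1:
        problems.append(f"duplicate x{counts[title]}")
    if len(title) > MAX_LEN:
        problems.append(f"too long ({len(title)} chars)")
    if problems:
        print(f"{url}: {', '.join(problems)}")
```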
Real Site Speed: Forget PageSpeed Scores—Focus on Core Web Vitals
Yes, speed matters, but not the way most people test it. The synthetic PageSpeed score on a single page is a toy. What actually moves the needle are the real-user metrics inside Core Web Vitals:
- LCP (Largest Contentful Paint) — under 2.5 seconds
- INP (Interaction to Next Paint) — under 200 ms
- CLS (Cumulative Layout Shift) — under 0.1
You can see these numbers directly in Search Console under Core Web Vitals for actual visitors across the entire site. In nine out of ten audits, the culprit is oversized images that never got converted to WebP plus a pile of third-party tracking scripts. Fix them in a single day and the improvement in user experience (and rankings) is immediate.
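Since oversized images are the usual offender, a quick pass over a page's image weights is often all the diagnosis you need. This sketch assumes the requests and beautifulsoup4 packages are installed; the homepage URL and the 200 KB budget are illustrative choices, not official thresholds, and it only looks at img tags (CSS background images need a separate check).

```python
# List images on a page that are heavier than a weight budget
# or not served as WebP.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/"  # placeholder homepage
BUDGET_KB = 200                # assumed per-image weight budget

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img", src=True):
    src = urljoin(PAGE, img["src"])
    head = requests.head(src, allow_redirects=True, timeout=10)
    size_kb = int(head.headers.get("Content-Length", 0)) / 1024
    fmt = head.headers.get("Content-Type", "unknown")
    if size_kb > BUDGET_KB or "webp" not in fmt:
        print(f"{size_kb:6.0f} KB  {fmt:<12}  {src}")
```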
The Forgotten sitemap.xml That Search Engines Stopped Trusting
While you’re in the files, check /sitemap.xml. If it hasn’t been updated since 2022, you have a problem. Stale sitemaps often contain hundreds of outdated URLs, omit whole new sections, and point to 404 pages that waste crawl budget.
The fix is straightforward: make the sitemap generate automatically from your CMS and refresh with every new publish. After robots.txt, this is usually the second quickest win we implement.
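The sanity check is just as easy to script. The sketch below (again assuming requests is installed, with a placeholder domain) pulls every loc entry from the sitemap and reports any URL that no longer answers with a 200.

```python
# Validate a sitemap: fetch every <loc> URL and report non-200 responses
# that waste crawl budget.
import xml.etree.ElementTree as ET

import requests

SITEMAP = "https://example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

broken = []
for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        broken.append((status, url))

print(f"{len(urls)} URLs in sitemap, {len(broken)} not returning 200")
for status, url in broken:
    print(status, url)
```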
If you’re seeing similar indexing issues, it’s worth reviewing a deeper guide on fixing indexing problems in WordPress — it often uncovers hidden technical blockers.
What Happened After We Fixed the Auto Service Site in Four Days
The full list for this client was straightforward but devastating:
- Two rogue Disallow lines in robots.txt
- 1,200 duplicate pages from filters
- 70% of pages sharing the identical title “Auto Service in Reutov”
- A sitemap two years out of date
- LCP of 5.8 seconds on the homepage thanks to a 4 MB background image
We wrapped the work in four days. Six weeks later Search Console showed 700 pages returning to the index. Two months after that, the site was back to 52 leads per week—without spending a single ruble on advertising. The cost per lead effectively dropped to zero.
The Real Secret to SEO Success: Fix the Basics Before Anything Else

An SEO audit isn’t about chasing shiny keywords or building links. It’s about making sure your own website isn’t quietly sabotaging itself. The correct order is always the same: indexing (robots.txt first) → duplicate content → title tags and meta descriptions → Core Web Vitals → sitemap.xml → internal linking.
Follow this sequence and you’ll catch eighty percent of the problems in the first two hours. Most of them fall into the “How did we ever miss this?” category.
The most expensive mistake we see is hiring an SEO agency for “promotion” before the fundamentals are solid. Paying for links and content while robots.txt is busy removing your most important pages from the index is the fastest way to burn money and trust.
If your traffic has mysteriously disappeared, start with the basics. Open your robots.txt right now. You might be surprised what you find.
Source: Pavel