Editor’s note (April 2026): This article is part of Blog Herald’s editorial archive. Originally published in 2005, it has been reviewed and updated to ensure accuracy and relevance for today’s readers.
There's a particular kind of nostalgia in revisiting how we measured ourselves as bloggers in the early 2000s. Back then, link counts were the currency of credibility. Who was linking to you? How many blogs cited your post? These questions mattered enormously — not just to the ego, but to the craft of building an audience before social media existed to do it for you.
Back in 2005, a Blog Herald roundup compared six competing tools: Blogpulse, Feedster, Bloglines, Technorati, PubSub, and IceRocket. Reading it two decades later, what strikes me isn't the technical details — most of those tools are long gone — but the deeper anxiety running underneath. Bloggers were already worried about dependency. Worried about which platforms to trust. Worried about which numbers actually meant anything.
Those concerns haven’t aged a day.
The landscape that existed — and what replaced it
In 2005, Technorati was the canonical authority on blog influence. It ranked blogs by incoming links, provided an “authority score,” and became the default reference when anyone wanted to understand a blog’s reach. The roundup from that era was already noting its decline — slower results, missed links, a once-elegant interface growing unwieldy. Feedster was delivering stronger raw link counts but lacked the contextual tools. Blogpulse had the most thoughtful suite of analysis features. Pubsub had fascinating data wrapped in a confusing interface. Each tool was partial, fallible, and quietly competing for a position that none of them would ultimately hold.
By the early 2010s, Technorati had shifted its focus away from blog search entirely, pivoting toward advertising and content marketing before eventually shutting down as a search tool in 2014. Feedster and Blogpulse followed similar trajectories into irrelevance. The infrastructure that had been built to serve the blog ecosystem was dismantled, piece by piece, as the money moved elsewhere.
What filled the vacuum wasn’t a single platform — it was a professional toolkit. Ahrefs, Moz, and Semrush gradually shifted from niche SEO instruments to the standard infrastructure for anyone serious about understanding who links to them and why. Google Search Console, which launched in 2006 as Google Webmaster Tools, became the only truly authoritative source for understanding how Google itself sees your backlink profile. The measurement got more rigorous. The stakes got higher.
What the old tools were really measuring
Here’s what’s easy to miss when you look back at that 2005 comparison: those tools weren’t really measuring influence in any stable sense. They were measuring activity within a relatively closed ecosystem. The blogosphere, at that point, was still a defined community — a finite network of sites linking to each other, trackable with RSS-based indexing.
The links that mattered in 2005 were editorial signals: another blogger found your post worth citing, so they linked to it. That’s a genuinely meaningful signal. It’s also why Technorati had cultural weight that no equivalent tool carries today. When your Technorati authority score climbed, it meant something in the community.
Today, backlink profiles are far more complex and far more manipulated. Link-building has become an industry in its own right, complete with outreach templates, guest post marketplaces, and link farms that inflate numbers without adding any genuine signal. The tools have responded by adding layers of quality scoring — Ahrefs has its Domain Rating, Moz its Domain Authority, Semrush its Authority Score — each attempting to separate the meaningful links from the noise. None of them fully succeeds. The signal-to-noise problem is one of the defining challenges of modern SEO.
The platform dependency problem, still unsolved
Reading the 2005 review again, what stands out is how much the author’s workflow depended on tools he didn’t control. He was checking multiple platforms daily, relying on Feedster for link counts and Blogpulse for comparative analysis, knowing that either could change their indexing methodology or disappear entirely.
That fragility hasn’t gone away — it’s just moved. Today’s bloggers and content creators have the same dependency relationship with Google Search Console, with Ahrefs’ crawl index, with whatever algorithm Semrush is using to calculate authority. When Google updates its search quality guidelines, backlink profiles that looked strong can weaken overnight. When a major SEO tool changes its scoring model, rankings shift without any corresponding change to the actual content.
The lesson the old tools taught us — that no single platform’s numbers should be treated as ground truth — is still worth remembering. Backlinks remain one of Google’s most important ranking factors, but the way you measure and interpret them requires judgment that no tool can fully substitute for.
What a thoughtful approach to link tracking looks like now
If you’re a blogger or content creator trying to understand your backlink profile today, the honest answer is that you need multiple sources of signal — not because any single tool is incompetent, but because each has different blind spots.
Google Search Console gives you the most accurate picture of what Google has actually indexed and credited. It’s limited to your own site, it doesn’t offer competitive comparison, and it can be slow to update — but the data is as close to authoritative as you’ll get for SEO purposes.
Ahrefs and Semrush are useful for competitive analysis: understanding who links to similar sites, spotting content that attracts links in your niche, tracking whether your link profile is growing or stagnating. Their absolute numbers shouldn’t be taken literally — crawl indices are always incomplete — but the relative comparisons are informative. A 2024 analysis by Ahrefs found that referring domain count correlates more strongly with organic traffic than raw backlink counts, which is a meaningful methodological note for anyone building a tracking practice.
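The referring-domains distinction is easy to operationalize yourself. As a minimal sketch — assuming a hypothetical backlink export with a `source_url` column, which is not any particular tool's actual format — here is how collapsing raw backlinks down to unique referring domains might look:

```python
from urllib.parse import urlparse
import csv
import io

def referring_domains(rows):
    """Collapse raw backlink rows to the set of unique referring domains."""
    domains = set()
    for row in rows:
        host = urlparse(row["source_url"]).netloc.lower()
        if host.startswith("www."):  # treat www and bare domain as one
            host = host[4:]
        if host:
            domains.add(host)
    return domains

# Toy export: five raw backlinks, but only three distinct domains
sample_csv = """source_url,target_url
https://www.exampleblog.com/post-1,https://mysite.com/a
https://exampleblog.com/post-2,https://mysite.com/a
https://newsletter.example.net/issue-9,https://mysite.com/b
https://dir.example.org/listing,https://mysite.com/a
https://dir.example.org/listing-2,https://mysite.com/b
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))
print(len(rows))                     # 5 raw backlinks
print(len(referring_domains(rows)))  # 3 referring domains
```

The gap between the two numbers is the point: a hundred links from one directory count once, which is why tracking the domain-level figure gives a less gameable trend line.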
For most independent bloggers, the practical takeaway is simpler than all of this: focus on the links you can verify have driven real traffic or real relationships. A link from a newsletter with a small but engaged readership is worth more than a hundred links from directories nobody visits. That was true in 2005, and it’s true now.
The enduring value of being cited
What made the 2005 link tracking ecosystem meaningful wasn’t the tools — it was the underlying culture. Bloggers linked to each other because they found each other’s work worth recommending. The tools were just trying to make that network of recommendations legible.
The culture of genuine editorial citation is still alive, even if it’s harder to see. Newsletters cite their sources. Substacks link to the reporting that inspired them. Independent bloggers still build reputations through the quality of what they write and the credibility of who references them.
The tools for measuring this have grown more sophisticated, more commercial, and more gameable. But the underlying signal — did someone find your work worth pointing to? — hasn’t changed. That’s still what you’re trying to earn. Everything else is just how you track it.
