Algorithmic Authority and Why It Can’t Be Trusted

Nov 21, 2009

Can algorithmic tools, like Google’s PageRank, really determine or predict the authority of a blog? Find out why these tools aren’t necessarily better than human rankings, and why they might even be worse.

This post is in response to Frank Pasquale’s “Assessing Algorithmic Authority” which was in turn in response to Clay Shirky’s “A Speculative Post on the Idea of Algorithmic Authority.” I attempted to leave this in comment form on Frank’s post, but kept getting errors. So lucky you; you get to enjoy my ranting today.

What I specifically want to address is:

“Now the question becomes: are these algorithmic authorities any worse than the corporate goliaths they are displacing?”

As someone deeply embedded in the digital publishing world, whose own business and clients’ businesses are deeply affected by these algorithms, I would argue they are.

Here’s why:

Algorithmic “Authority” Status Can Change Almost Daily

Algorithms that determine a person’s, or publication’s, authority status (or pretend to, at least) are ever-changing. They aren’t static metrics. They don’t depend on thoughtful analysis by people qualified to judge that authority; not that the staff of those “corporate goliaths” were necessarily qualified either.

They rely on surface-level measurements (such as the number of backlinks a site or article has), and those factors change all the time, especially when people are working to game the system.
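To make that concrete, here’s a toy sketch of the kind of link-counting math at issue, in the spirit of the published PageRank formula. This is a simplified power iteration over a made-up five-page web (all page names are hypothetical), not Google’s actual production algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Score pages purely by link structure.

    links maps each page to the list of pages it links out to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            # Pages with no outlinks spread their rank evenly.
            targets = outlinks or pages
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# A hypothetical web where "expert" has earned a few organic links.
web = {
    "expert": [],
    "reader_blog": ["expert"],
    "colleague": ["expert"],
    "newcomer": [],
    "directory": ["expert", "newcomer"],
}
for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Nothing in that calculation knows anything about expertise, accuracy, or track record. The score is a function of the link graph and nothing else, which is exactly why it shifts whenever the links, or the formula’s tuning, shift.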

What doesn’t change as frequently, however, is the actual authority of a source, author, or publication. True authority is something that tends to grow over time, not something routinely displaced because a corporate entity changes its measurement standards on a whim.

Measurements of Algorithmic Authority are Easily Manipulated

Unfortunately, algorithms aren’t perfect. They can look at data. But they can’t control the humans behind that data. 

That’s why, almost as soon as we see obvious algorithm changes from players like Google, there are people looking for ways to exploit them and game the system. At this point, internet marketers have turned algorithm-chasing into sport.
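Using the toy ranker sketched above, gaming the metric is almost embarrassingly easy: publish a handful of throwaway pages that all point at whatever you want boosted. Real link schemes are more elaborate, but the principle is the same (again, all names here are hypothetical):

```python
# "gamer" contributes nothing of substance, but buys links
# from four throwaway pages aimed squarely at the metric.
gamed_web = dict(web)
gamed_web["gamer"] = []
for i in range(4):
    gamed_web[f"spam{i}"] = ["gamer"]

scores = pagerank(gamed_web)
print(f"expert: {scores['expert']:.3f}")
print(f"gamer:  {scores['gamer']:.3f}")
# With enough purchased links, "gamer" overtakes "expert";
# the ranker can't tell an earned link from a bought one.
```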

Algorithms Aren’t Independent of Human Input

These companies (namely Google, since their PageRank algorithm was used as an example) have been known to override algorithmic assessments at will if you don’t act in accordance with the rules they enforce as the self-appointed internet police.

For example, it’s well-known that they’ll eliminate or decrease your PageRank if you use an advertising model they don’t approve of, such as selling links.

Excuse the tangent, but a few words on Google’s manual actions related to their algorithm and paid links.

There are legitimate reasons in online publishing to sell links that don’t involve any “black hat” intent to manipulate search rankings.

That said, there are two problems on Google’s end. First, private link sale advertisements are direct competition for their own link-based contextual ad network. And second, their algorithms weren’t set up in a way that could adequately distinguish advertisements from other links. Their own over-reliance on backlinks as a metric led to both an increased demand for text link ads and an increased crackdown on them.

If you use this kind of ad model without following their “rules,” they treat you like a spammer or scam artist no matter how relevant or transparent the ads on your site might be. In other words, they make no distinction between legitimate, relevant ads that offer value and true spam.

While I’ve been heavily involved in the online publishing, webmaster, and search engine optimization (SEO) communities for years, the bulk of my clients happen to be small businesses and independent professionals (generally deep experts and true authorities in their fields). This is where I take issue with Google’s largely self-serving manual actions overriding their algorithm. 

The problem is that rather than fixing that over-reliance on backlinks as a ranking metric, they created a system where online publishers were expected to change the default behavior of how links on the web work (by including a “nofollow” attribute on any link that might violate their guidelines) to account for Google’s flaws.

They put the onus on publishers of all sizes and HTML experience levels to tell their bots which links to follow and crawl, and which not to.
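For anyone unfamiliar with the mechanics: the fix Google asks for is a rel="nofollow" attribute on each paid link’s anchor tag. Here’s a rough sketch, using Python’s standard html.parser, of the kind of audit a publisher is left doing; the sponsored hosts and markup are made up for illustration:

```python
from html.parser import HTMLParser

# Hypothetical: the publisher has to track which outbound
# links were paid placements; Google's bots won't do it.
SPONSORED_HOSTS = {"sponsor.example.com", "ads.example.net"}

class PaidLinkAuditor(HTMLParser):
    """Flag anchors to sponsored hosts that lack rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rel = (attrs.get("rel") or "").lower().split()
        host = href.split("//")[-1].split("/")[0]
        if host in SPONSORED_HOSTS and "nofollow" not in rel:
            self.flagged.append(href)

page = (
    '<p><a href="http://sponsor.example.com/widget">Paid review</a> and '
    '<a href="http://ads.example.net/deal" rel="nofollow">another ad</a>.</p>'
)
auditor = PaidLinkAuditor()
auditor.feed(page)
for href in auditor.flagged:
    print(f'missing rel="nofollow": {href}')
```

Trivial for anyone who writes code; but the publishers in question were often exactly the people least equipped to write, or even read, something like this.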

Now, those who routinely game the system could find ways to work around this and make it less obvious when links were sponsored. But what about those smaller publishers, often the true authorities even if their general popularity didn’t reflect that?

Independent researchers, academics, longstanding industry professionals, entrepreneurs… if they’re slapped with a manual penalty for doing something Google’s guidelines disapprove of (not even necessarily selling sponsored links), are they really any less authoritative in their areas of expertise?

Of course not. 

So in this case, not only are the algorithms themselves deeply flawed and incapable of distinguishing real authority from popularity, but human intervention manages to muck things up even more. Just because you don’t see human interference with algorithmic outputs doesn’t mean that interference doesn’t exist.

All of that said, I’d be disappointed to see this tool (PageRank), or those like it, factored into anything authority-related. It’s not a legitimate measure of authority. It never has been. And it never will be.

(Update: As of 2016, Google no longer updates Toolbar PageRank at all — meaning, while they can still rank sites behind the scenes, you cannot see the actual PageRank of any site publicly. Even before this, updates had become increasingly infrequent — about once per year toward the end — so the public numbers were highly inaccurate anyway. If you come across a tool or rankings list promising to show PageRanks for sites, it is woefully outdated. On the surface, this is a good thing. But I can’t help but remain concerned over PageRank simply becoming more of a hidden metric in an effort to stop marketers and SEO people from exploiting its obvious flaws.)

The inaccuracies of authority and influence ranking tools and algorithms aren’t new. They’ve been discussed in depth for quite some time, following, for example, the “best” list craze that relied on them to paint a false picture of influence in the blogging world.

Popularity doesn’t determine influence. And algorithms don’t determine, or accurately calculate, authority. The best we can do for now is be cautious about where we place our own trust. That, and avoid perpetuating these myths: stop amplifying subjective, faulty evaluations of ourselves and our clients by seeking validation through algorithms and the ranking lists that employ them.

2 Comments

  1. Well put response.

  2. Jennifer:

    Are you going to keep posting? Good stuff in here…

    Cheers,

    Robert
