Today I’m responding to Frank Pasquale’s “Assessing Algorithmic Authority.” That, in turn, responded to Clay Shirky’s “A Speculative Post on the Idea of Algorithmic Authority.” I attempted to leave this as a comment on Frank’s post. I kept getting errors. So lucky you. You get to enjoy my ranting.
What I want to address:
“Now the question becomes: are these algorithmic authorities any worse than the corporate goliaths they are displacing?”
I’m someone deeply embedded in the digital publishing world. My business, and my clients’ businesses, are affected by these algorithms. And I would argue they are worse.
Here’s why.
Algorithmic Authority Can Change Almost Daily
Algorithms that determine the authority status of a person or publication (or pretend to) often change. They don’t remain static or depend on thoughtful analysis by people qualified to judge authority.
Instead, they rely on surface-level measurements such as the number of backlinks to a page.
These factors change all the time.
Actual authority doesn’t change as frequently. And true authority tends to grow over time. A corporate entity changing measurement standards cannot displace that true, earned authority.
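To make the contrast concrete, here is a toy sketch in Python. The sites, numbers, and formulas are entirely made up (this is not any real search engine’s algorithm); it only shows how a score built on surface-level link counts can reshuffle who looks “authoritative” the moment the weighting changes, while nothing about the sites’ actual expertise moves at all:

```python
# A toy illustration (not any real search engine's formula) of how an
# "authority" score built on surface-level link counts can reorder sites
# when the weighting changes, even though the sites themselves did not change.

sites = {
    # hypothetical sites: (total_backlinks, referring_domains)
    "deep-expert-blog.example": (120, 90),
    "big-aggregator.example":   (5000, 40),
}

def score_v1(backlinks, domains):
    # Version 1 of the imaginary algorithm: raw backlink volume dominates.
    return backlinks + 10 * domains

def score_v2(backlinks, domains):
    # Version 2: same inputs, but referring-domain diversity now dominates.
    return 0.1 * backlinks + 100 * domains

for name, (backlinks, domains) in sites.items():
    print(name, score_v1(backlinks, domains), round(score_v2(backlinks, domains)))

# deep-expert-blog.example: 1020 under v1, 9012 under v2
# big-aggregator.example:   5400 under v1, 4500 under v2
# The "most authoritative" site flips between versions overnight.
```

Neither site changed between the two versions. Only the measurement did.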
Measurements of Algorithmic Authority Are Easily Manipulated
Unfortunately, algorithms aren’t perfect. They look at data. But they don’t control the humans behind that data.
Almost as soon as we see obvious algorithm changes from players like Google, people find ways to exploit them.
By this point, internet marketers treat algorithm-chasing as sport.
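Sticking with the toy formula from the earlier sketch: because the score only counts links, anyone who can manufacture links can manufacture “authority.” The numbers below are invented purely for illustration:

```python
# Continuing the toy formula above: because the score only counts links,
# anyone who can manufacture links can manufacture "authority".
# All numbers here are invented for illustration.

def backlink_score(backlinks, referring_domains):
    return backlinks + 10 * referring_domains  # same toy formula as score_v1 above

honest_expert = backlink_score(120, 90)                # links earned over years
link_buyer    = backlink_score(120 + 3000, 90 + 300)   # plus a purchased link network

print(honest_expert, link_buyer)  # 1020 vs 7020: the buyer now "outranks" the expert
```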
Algorithms Aren’t Free From Human Input
Google’s PageRank algorithm featured prominently in the post above, so let’s focus on that example.
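For anyone unfamiliar with it, here is a minimal sketch of the published PageRank idea: a power iteration over the link graph. This is only an illustration of the core concept; Google’s production ranking system layers many more signals on top and is not public:

```python
# A minimal sketch of the published PageRank idea: a power iteration over a
# link graph. Google's production system layers many more signals on top and
# is not public; this only shows that the score is derived from link structure.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page with no outlinks: spread its rank evenly.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical three-page web: the scores depend entirely on who links to whom.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))
```

The point for this discussion is simply that the score is computed from link structure alone, and that computation can then be overridden by hand.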
Google can override those algorithmic assessments by applying penalties. They can even de-index a website outright if you don’t follow the rules they enforce as the web’s self-appointed police.
For instance, Google can eliminate or reduce your site’s PageRank if you use an advertising model they don’t approve of, like selling links directly.
Excuse the tangent, but a few words on Google’s manual actions related to their algorithm and paid links:
Legitimate reasons exist in online publishing to sell links that don’t involve any “black hat” intent to manipulate search rankings.
That said, two problems exist on Google’s end.
First, privately sold link advertisements compete directly with their own link-based contextual ad network (AdWords / AdSense).
Second, their algorithms can’t adequately distinguish advertisements from other links. They over-rely on backlinks as a metric. That over-reliance led to both increased demand for text link ads and an increased crackdown on them.
If you use this kind of ad model without following their rules, they treat you like a scam artist. It doesn’t matter how relevant or transparent the ads on your site are. It doesn’t matter if your content is the most authoritative on its subject. They make no distinction between relevant ads that offer value and search engine spam.
While I’ve worked in the online publishing, webmaster, and search engine optimization (SEO) communities for years, the bulk of my clients are small businesses and independent professionals. They’re often true experts in their fields.
Because of this, I take issue with Google’s largely self-serving manual actions overriding their algorithm.
Rather than fixing their over-reliance on backlinks, Google created a system where online publishers are expected to change the default behavior of links on the web.
This means including a “nofollow” attribute on any link that might violate their guidelines. Publishers can no longer simply link where they want without risking a penalty. They now have to tell Google whether each link, or type of link, on their site should be crawled.
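To show what that chore actually looks like, here is a rough sketch, using only Python’s standard library, of the kind of link audit publishers are now expected to run on their own markup. The domain names are hypothetical:

```python
# A rough sketch of the chore this policy pushes onto publishers: scanning
# your own markup for outbound links that lack rel="nofollow".
# Domain names are hypothetical; uses only the standard library.

from html.parser import HTMLParser
from urllib.parse import urlparse

MY_DOMAIN = "example-publisher.com"

class LinkAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        rel = (attrs.get("rel") or "").lower()
        host = urlparse(href).netloc
        # Flag external links that do not carry rel="nofollow".
        if host and MY_DOMAIN not in host and "nofollow" not in rel:
            self.flagged.append(href)

page = '''
<p>See our <a href="https://example-sponsor.com/deal">sponsor</a> and this
<a href="https://example-citation.org/paper" rel="nofollow">reference</a>.</p>
'''

auditor = LinkAuditor()
auditor.feed(page)
print("Links you may need to mark nofollow:", auditor.flagged)
```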
Problem: They Put the Onus on Publishers
Publishers of all sizes and experience levels now have to tell Google’s bots which links to follow and crawl in their code, and which not to. So publishers are expected to help overcome Google’s design flaws.
Now, those who routinely game the system can find ways to work around this. They just make it less obvious when links are sponsored. As an online publisher managing over a dozen sites at any given time, I’ve seen more than my share of these requests. And while I don’t accept them, I’ve seen how Google’s policies can hurt transparency more than they help.
But what about smaller publishers, who are often true authorities even when popularity metrics fail to show it?
Independent researchers, academics, industry professionals, entrepreneurs… If slapped with a manual penalty for doing something Google disapproves of — not necessarily selling links — are they really less authoritative in their area of expertise?
Of course not.
Algorithms are deeply flawed and incapable of distinguishing real authority from popularity. Human intervention mucks things up even more. And just because you don’t see human interference with algorithmic outputs doesn’t mean that interference doesn’t exist.
Tools like PageRank should not be factored into anything authority-related. They can never be a legitimate measure of authority.
Algorithmic Authority in 2016: Update on Google PageRank
Google no longer updates Toolbar PageRank. That means that while they can still use PageRank behind the scenes, you can no longer see your score publicly. Even before this change, updates were infrequent, roughly once per year toward the end. If you come across a tool or rankings list showing PageRank scores for sites, they are woefully outdated. On the surface, this is a good thing. But I’m still concerned that PageRank is simply becoming a hidden metric, meant to stop marketers and SEO people from exploiting its obvious flaws without the underlying flaws being fixed.
The inaccuracies of authority and influence ranking tools and algorithms aren’t new. They’ve been discussed for quite some time, for example in the wake of the “best” list craze that relied on them to paint a false picture of influence in blogging.
Popularity doesn’t determine influence. And algorithms don’t determine, or accurately calculate, authority.
For now, let’s be cautious about where we place our trust. Let’s stop amplifying subjective and faulty evaluations on behalf of ourselves and our clients. And maybe let’s stop seeking validation through algorithms and the ranking lists that employ them in the first place.