Who decides what is quality content?
In a post-Panda world, low-quality content no longer cuts the mustard. It's the high-quality stuff that Google now requires when deciding where to rank your website.
This refers both to the content on your website and to the content on the websites linking to you.
Quality web content can be defined along the following terms:
- It attracts and persuades a person to link to it for no other reason than to cite the information.
- It attracts and persuades a person to share it on social media for no other reason than to show the information to others.
It is the intent within these two statements that is fundamental.
It is human action or reaction that defines what is or isn’t quality. Action regardless of any other factor than the content itself.
This is what we would call social proof, or more accurately human proof. By its very nature it cannot be manipulated in the same way a search engine algorithm can be.
To accurately quantify web content, the content itself must be the sole influencer.
The human signal remains pure and is the product of each individual who makes the decision to create each social signal they produce.
It can of course be mimicked by software designed to act like humans, although such software is unable to replicate the nuance and detail which human social signals produce. We see this in software designed to inflate social media accounts such as Twitter, Pinterest etc. Although they do provide some signal, it is one of very low quality and easily identified.
The human social signal can also be bought. In large offshore setups, low-paid workers toil in Internet factories creating social signals to order, so that clients can gain an advantage in search engine rankings. Similar offshore factories already exist to build links in forums, blog comments and so on.
Thus, the human social signal is noisy and contaminated, and if Google is not able to identify and isolate the manipulated social signal, then Google's search engine rankings can be manipulated regardless of the quality of the content.
We may be seeing evidence that Google has factored in a way to quantify the veracity of the social signal. I have only anecdotal evidence, and this is an educated guess, but it is the only way to determine whether the social signal is worth listening to. Google also gives us clues about the direction in which its search engine is heading.
It has gone beyond Google simply listening to the social signal: all social signals must now be quantified if they are to be of any use, and authorship is one way to do this. If Google knows who wrote the article and who is responding to the article, it can in some way more accurately predict the quality of the social signal.
For example, if an SEO agency writes an article and someone else calls it “Awesome”, that is a signal. But if Google then works out that the person works for the SEO agency, or constantly calls the content which the SEO agency produces “Awesome”, then the signal needs to be quantified to be accurate.
I have noticed this behavior on Twitter and it feeds into my study of the tribal psychology that exists on Twitter and social media. A certain website will release an article, its immediate employees respond in an unnaturally hyped up fashion, the approbation cascades down to partner companies and individuals seeking attention and validation from association.
Sometimes the content is excellent, sometimes it is mediocre and yet the same applause emanates from the same individuals creating a never ending stream of hype. It is only when the content is viewed by dispassionate readers that we are able to assess its true quality and quantify it. Therefore if Google were to determine the quality of the content it would have to apply a filter to those who express relentless, sycophantic adoration.
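To make the idea in the last few paragraphs concrete, here is a minimal sketch of how such affiliation-aware weighting might look. Everything in it is hypothetical: the `Author` type, the discount factors and the praise threshold are invented purely for illustration, and nothing is known about how (or whether) Google actually does this.

```python
from dataclasses import dataclass

@dataclass
class Author:
    name: str
    employer: str

def signal_weight(author, publisher, praise_counts):
    """Return a 0.0-1.0 trust weight for one social signal.

    praise_counts maps (author name, publisher) to how many times
    that author has previously praised that publisher's content.
    All weights below are illustrative guesses, not known values.
    """
    weight = 1.0
    # In-tribe signal: staff praising their own employer's content
    # carries little independent proof.
    if author.employer == publisher:
        weight *= 0.1
    # Serial cheerleaders who praise the same publisher relentlessly
    # are discounted too.
    if praise_counts.get((author.name, publisher), 0) > 5:
        weight *= 0.2
    return weight

staffer = Author("Alice", "AcmeSEO")
outsider = Author("Bob", "OtherCo")
history = {("Bob", "AcmeSEO"): 12}

signal_weight(staffer, "AcmeSEO", {})                      # 0.1: employee praise
signal_weight(outsider, "AcmeSEO", history)                # 0.2: relentless praise
signal_weight(Author("Cara", "Indie"), "AcmeSEO", history) # 1.0: dispassionate reader
```

The point of the sketch is simply that the dispassionate outsider's "Awesome" counts for far more than the same word from an employee or a habitual fan.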
If Google knows who is initiating the social signal, it will be able to build an algorithm around the data it knows about that signal. If it knows the author (whether or not the author has given their data to the Google database voluntarily), then the company can correctly quantify the social signal, and even the link signal if it comes from a website or webpage solely in the control of the author.
Therefore, we may be seeing a way in which Google has accurately determined what counts as quality content and what counts as quality links.
It may even determine that if the author is not in their database their signal cannot be correctly quantified and must be treated accordingly.
This may already be happening with the data Google has in its database from its G+ system. It may be that this so-called "social network" is not a social network at all, but more of a way for Google to acquire an accurate human social signal in relation to web content.
In conclusion, it is essential that websites continue to produce high-quality content, as judged by viewers who are independent of any benefit that may come to the website hosting the content.
The content must be judged by the reaction of those outside of the tribe if it is to be regarded as high quality or not.
You cannot build natural links.
You cannot build organic links.
You can only build content that attracts, that persuades that can be promoted.
But wait, the whole SEO industry keeps banging on about how you must build natural this and organic that.
Understand this, “we are not discussing peaches”.
And I love a good peach.
Quite simply, and I hate to say this as I have a lot of mates who do the natural-organic link-build shuffle (and I even do it myself): the reason it exists is marketing speak, and oh how marketers love to create jargon.
In other words, it’s guff.
And it’s very effective marketing guff.
As an industry we need to label complex, machine-based processes with fluffy, easy-to-use, pseudo-hippy terms like "organic". This helps us sell a service that is imperfect, and hide the fact that we don't totally know what the hell is going on because Google won't tell us. Of course you wouldn't think that if you read most SEO blogs, as most present pure guesswork as ironclad fact.
Google will hint that we need to build fields of lush, verdant organic links without a tinge of blackhat (another guff word) pesticide.
The problem with this is the language that marketing people use. But Google is run by engineers, and what we are dealing with is a highly engineered, efficient machine.
Get it into your heads.
Google is a machine.
What the hell is organic about that?
Building natural links is an oxymoron.
Is it better to suggest we need to build links that the machine "thinks" are natural?
Machines are process-led: they repeat tasks, they create patterns. Not that you could possibly reverse engineer the Google algo. It is one of the most incredible things man has created; I could throw out some mind-blowing stats here, but you probably already know them.
The point is:
We build content for people.
We try to build content for the machine.
Time for us to try to stop doing both.
Web content should be designed to make people gasp, not for a bot to judge with ones or zeroes.
“I say old chap, what an absolute bounder.”
If you have been gripped by the Leveson Inquiry into Media Ethics as I have, you will have noticed that the lead counsel, Robert Jay Q.C., has a certain way of speaking and asking questions.
Yesterday a meme started on Twitter querying pop lyricists in the style of Robert Jay.
The purpose of this post is to test a new curation tool that I will be using when offering the Web Content News Feed service. Currently it's going out on Linkbait Coaching, but I will be offering it as a standalone product to help with content creation ideas, ultimately leading to more links and attention.