A New Google Algorithm based on Factual Trust?
According to a recent article in New Scientist, Google is developing a new ranking algorithm based on Knowledge-Based Trust, i.e. the factual accuracy of a page, rather than the strength of the links pointing to it.

Google's current algorithm treats the strength of backlinks as a strong ranking signal: as more popular and authoritative sites link to a page, its organic rank improves.

However, over the years we have seen plenty of low-quality sites rank for competitive keywords and spoil the user experience. A website carrying incorrect or false information can climb the search results simply because more and more people link to it.

Knowledge Based Trust

Google is reportedly planning to use the concept of Knowledge-Based Trust in place of the backlink signal. The concept comes from a research paper by a team of Google researchers.

The paper's abstract says:

"We propose a new approach that relies on endogenous signals, namely, the correctness of factual information provided by the source. A source that has few false facts is considered to be trustworthy. The facts are automatically extracted from each source by information extraction methods commonly used to construct knowledge bases. We propose a way to distinguish errors made in the extraction process from factual errors in the web source per se, by using joint inference in a novel multi-layer probabilistic model. We call the trustworthiness score we computed Knowledge-Based Trust (KBT)."

What this essentially means is that Google would check the factual claims on a page against the Knowledge Vault, a huge database of facts about the world that Google has been assembling for years.
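To make the idea concrete, here is a toy sketch of what a KBT-style score might look like. The real model in the paper uses joint inference in a multi-layer probabilistic model to separate extraction errors from genuine factual errors; this simplified version just scores a source by the fraction of its extracted (subject, predicate, value) facts that agree with a reference knowledge base. All names and data below are invented for illustration.

```python
# Toy stand-in for the Knowledge Vault: (subject, predicate) -> true value.
# Entries are made up for this example.
KNOWLEDGE_BASE = {
    ("Eiffel Tower", "located_in"): "Paris",
    ("Sachin Tendulkar", "sport"): "cricket",
    ("Mount Everest", "height_m"): "8849",
}

def kbt_score(extracted_facts):
    """Return the fraction of a source's facts that match the knowledge base.

    Facts whose (subject, predicate) pair is unknown to the base are
    skipped rather than counted as wrong; returns None if nothing
    could be checked.
    """
    checked = correct = 0
    for subject, predicate, value in extracted_facts:
        expected = KNOWLEDGE_BASE.get((subject, predicate))
        if expected is None:
            continue  # no ground truth available; ignore this fact
        checked += 1
        if value == expected:
            correct += 1
    return correct / checked if checked else None

# Facts "extracted" from a hypothetical web page: two correct, one wrong.
page_facts = [
    ("Eiffel Tower", "located_in", "Berlin"),   # wrong
    ("Sachin Tendulkar", "sport", "cricket"),   # correct
    ("Mount Everest", "height_m", "8849"),      # correct
]

print(kbt_score(page_facts))  # 2 of 3 checkable facts correct
```

Note how a page is only penalised for facts the knowledge base can actually verify; handling facts the base does not know about (and distinguishing bad extraction from bad sources) is exactly where the paper's probabilistic model does the heavy lifting.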

So, if the New Scientist report is correct, Google will treat the authority and accuracy of a page's information as a strong ranking signal in its algorithm.

According to the report, this would help weed out the garbage websites that rank purely on the strength of their backlinks.

Now the big question is: will it really solve the problem?

For example, if I write an article on Sachin Tendulkar, chances are that every fact I use about him will be correct, and the same will be true of almost every other article written about him. How, then, will Google decide which article to rank higher in the search results?

I believe factual correctness will be one of several ranking signals rather than a replacement for backlinks. Besides, one of the biggest challenges for a search engine is separating quality pages from non-quality pages.

And by quality I mean the quality of the writing itself, which is highly subjective. That can never be fully captured by the factual correctness of the information, which is objective in nature.

What are your thoughts around this?