Google is upgrading its famous search engine to make it harder for sites peddling fake news and other hoaxes to reach its top search results.
As it turns out, filtering out the fake is no small job — keeping fake news from dominating search results requires “more structural” changes to the way searching works.
Changes like refining the algorithm that determines which search results appear on top. Google did not offer much detail on the technical side of the change, beyond noting that it would “help surface more authoritative pages and demote low-quality content.”
And Google is relying on more than the algorithm. A blog post by Google’s parent, Alphabet, outlined the human side of its efforts, such as training evaluators — the people who vet search results — to better spot low-quality results and making it easier to flag fake news.
Ben Gomes, vice president of engineering for Google, says 0.25 percent of daily searches return “offensive or clearly misleading” content. But that 0.25 percent has a way of getting into the news cycle — as it did last year, for example, when a white supremacist site was featured prominently in search results about the Holocaust.
The changes to the search algorithm are designed so that “issues similar to the Holocaust denial results” are less likely to appear, Google said.
Google has also started adding a “Fact Check” feature to some results. That feature essentially rates whether the presented claims are true, false or just full of “truthiness.” To make that determination, Google works with over 100 fact-checking organizations (similar to a move by Facebook, which also employs the efforts of a few dozen fact-checking groups to keep fake news off its news feed).
And of course, Google users are part of Google’s plan — the company believes its users can be counted on to report “unexpected, inaccurate or offensive” results that show up in autocompleted searches and featured snippets. The work of the evaluators will, in the long term, also help Google vet results with software, as their feedback will train the algorithm to steer away from sites that push “misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories,” the company said.