A man holds his smartphone which displays the Google home page in this file photo.

© Regis Duvignau / Reuters

Every year, Google makes hundreds of changes to improve the computer code of its search engine, but in an attempt to combat the scourge of fake news and offensive content, its engineers are beginning to collect data from a new source: humans.

"It's become very apparent that a small set of queries in our daily traffic (around 0.25 per cent), have been returning offensive or clearly misleading content," writes Ben Gomes, vice-president of engineering for Google, in a blog post outlining some policy changes that will seek more user feedback in an effort to clean up some of the scandals related to automatically generated sections of its search results.

Google's troubles with offensive content have been popping up with more frequency in recent months. In October, 2016, users noticed Google would sometimes autocomplete the phrase "are jews …" with the word "evil." After a public outcry, the company made changes to remove the offending lines, adding more algorithmic scrutiny to so-called "sensitive" topics. But even after the fixes, its search engine still regularly turns up offensive results.

For example, right now, users who type "are black" into a Google search bar might see autocomplete suggestions such as "are black people smart," which leads to a search page topped by a story about the offensiveness of that autocomplete suggestion, followed by a Fox News article claiming a DNA connection to intelligence and a fourth article with the headline: "Black people aren't human." That last article is from an organization called National Vanguard, which is identified as a U.S. neo-Nazi white nationalist splinter group by the Southern Poverty Law Centre.

To combat the problem, Google is giving regular users a new "report" button on its search-bar autocomplete feature so people can more easily alert Google to problematic results. A similar button will be added to the "featured snippets" section of its results pages. Autocomplete and featured snippets – previews of search results – have both been the subject of controversies that involved the promotion of conspiracy theories, fake news and racist slurs on the hugely popular website.

After Tuesday, a user who spots an offensive autocomplete result will be able to flag it for Google's engineers to review.

But even these high-profile anecdotes don't capture the scale of the problem Google faces. The company doesn't say how many searches a day it processes; it simply says it processes "trillions" of search requests a year. So while one-quarter of 1 per cent of bad content might be a good result for almost any other enterprise, Google could be responding to many billions of user requests a year with these "low quality" results. Small for Google is still a potential avalanche of unwelcome content for users.

Mr. Gomes explained that content promoting hate is also being given the lowest possible search weighting, and increased importance will be given to "high-quality" sources of information, particularly on sensitive topics. The process of sifting through search results involves a mix of algorithmic and human-curation efforts.

For instance, Google has seen posts containing Holocaust-denying falsehoods ranking high in its searches – an absurd condition when there is excellent scholarship and documentation of the horrors of the Holocaust available online.

Google is also releasing more details about its human "raters" – a hand-picked group of 10,000 users who already give Google feedback on the hundreds of tests it runs to improve search results. In addition to this extra monitoring, Mr. Gomes and Google believe making feedback tools easier to find for the rest of the general public could expand the effectiveness of this human-curation effort.

Mr. Gomes and Pandu Nayak (a Google research fellow in search quality) said on Monday that some of Google's problems come from users trying to game the system to gain a higher ranking for their content (which can lead to more ad dollars, among other effects). The company blog described "low-quality 'content farms,' hidden text and other deceptive practices" among the tactics. In that environment, Google's challenge is to guard against abuse of the new feedback buttons. For instance, if Google guaranteed that flagging content would remove a search result, unethical users could wield a "banhammer" to block content they didn't like or to favour their own content.

"There is likely to be [helpful] signal in there, even through all the noise through abuse," Mr. Nayak says. "We don't expect the problem will completely disappear."

Content problems with Google's featured snippets may be even more serious. According to the search-engine-optimization service MozCast, about 15 per cent of Google searches currently return a result that includes a featured snippet, which on a standard results page just looks like a text box – one of many results – off to the right side. However, if a user searches with one of Google's voice-assistant or smart-home products and a snippet is returned, the context of the other results on a Web page is missing, and the service reads the snippet aloud as if it were the one true answer.

In recent months, users have posted videos of a "smart device" responding with answers sponsored by racist or conspiratorial sites, such as false claims that former president Barack Obama was plotting a coup, or allegations that four U.S. presidents had been members of the Ku Klux Klan (there is little evidence to suggest any U.S. president was an active or former KKK member).

"There are people who are writing all kinds of things on the Web," says Mr. Gomes, who added that one issue Google is having is finding high-quality sources to promote in place of some of the more explicit content. "Journalists are not covering some of these conspiracy theories."

As Mr. Gomes points out in his blog post, it is both a strength and a limitation of Google's service that the company doesn't create its own content.

The trouble is that some users go to Google to find others who agree with the latest reality-challenged statement from their preferred political leader … Other users who find the same counterfactual content might assume the world – or Google – has gone mad.

"The content that appears in these features is generated algorithmically and is a reflection of what people are searching for and what's available on the Web. This can sometimes lead to results that are unexpected," Mr. Gomes writes.

The changes began rolling out in the U.S. and international markets today.
