Officials at tech behemoth Google have repeatedly claimed that the company doesn’t manipulate search results, improperly curate results, or keep “blacklists” of sites that it either buries or bans altogether.
But a new report from The Wall Street Journal involving extensive analysis and research found that in reality, the company does all three of those things.
The paper notes:
Twenty years ago, Google founders began building a goliath on the premise that its search algorithms could do a better job combing the web for useful information than humans. Google executives have said repeatedly—in private meetings with outside groups and in congressional testimony—that the algorithms are objective and essentially autonomous, unsullied by human biases or business considerations.
The company even claims on its blog, “We do not use human curation to collect or arrange the results on a page.” At the same time, Google says it cannot divulge exactly how its search algorithms work because it is locked in a high-stakes conflict with entities that want to game the system for profit.
However, that claim of innocence is not accurate, the WSJ’s study found. In fact, over the years, Google researchers and engineers have retooled and manipulated search results to a far greater degree than company officials have admitted.
The investigation noted that the manipulative actions have very often occurred in response to pressure from companies, various authoritarian governments (like China), and outside interest groups.
“They have increased sharply since the 2016 election and the rise of online misinformation,” the paper reported.
Through more than 100 interviews and some of the paper’s own testing and analysis of Google search results, the WSJ found:
— Google engineers often make adjustments, outside the public’s view, to the information the company layers on top of basic search returns, including auto-complete suggestions, “knowledge panel” boxes, and “featured snippets.” In addition, news results are not subject to the same company policies that limit what engineers are permitted to alter or remove altogether.
— Though the company has publicly denied that it keeps blacklists of certain sites that have been removed from search results, Google does, indeed, have such lists. In fact, as Natural News reported in July, Google’s vice president of government affairs and public policy, Karan Bhatia, appeared to have lied under oath to Congress when he denied such lists exist.
— In the auto-complete function, Google engineers have adjusted the algorithm to “weed out more incendiary suggestions for controversial topics” such as abortion and illegal immigration, the WSJ reported. In practice, this removes such suggestions altogether.
— Internally, Google employees and founders Sergey Brin and Larry Page have disagreed about how extensively the company should filter search results. Currently, employees are permitted to seek revisions to specific ‘controversial’ search results for topics including autism and vaccinations.
The results of the Journal’s investigation contradict one of the company’s primary defenses against global regulators who are increasingly concerned about how the search behemoth uses its power (or abuses it): “That the company doesn’t exert editorial control over what it shows users,” the paper reported.
Mostly, regulators have voiced concerns about alleged anticompetitive practices, anti-trust issues, online misinformation, and, of course, Left-wing political bias.
The algorithms the search giant uses are anything but autonomous. In fact, they are regularly altered by executives and engineers who aim to deliver what they consider the most ‘relevant’ results to users. That alone suggests tampering.
At the same time, the manipulation is done to satisfy the demands of powerful interests, including governments like China’s, all in an effort to protect the parent company’s $30 billion in annual profits.
So the next time you use Google, think about this: Depending on the subject matter you searched for, you’re probably not getting everything you could be getting. There’s always Good Gopher.