But Google has around 92 percent of global search market share. It effectively is online search.
Dark patterns are all too common online in general, and January wasn’t the first time people accused Google of deploying them. In June 2018, a blistering report from the Norwegian Consumer Council found that Google and Facebook both used specific interface choices to strip away user privacy at almost every turn. The study details how both platforms implemented the least privacy-friendly options by default, consistently “nudged” users toward giving away more of their data, and more. It paints a portrait of a system designed to befuddle users into complacency.
That confusion reached its apex a few months later, when an Associated Press investigation found that disabling Location History on your smartphone did not, in fact, stop Google from collecting your location in all instances. Shutting off that data spigot altogether required digging through the settings on an Android smartphone. It took eight taps to reach, assuming you knew exactly where to go—and Google didn’t exactly provide road signs. In May of this year, Arizona attorney general Mark Brnovich sued Google under the state’s Consumer Fraud Act, alleging “widespread and systemic use of deceptive and unfair business practices to obtain information about the location of its users.” Even a privacy-focused Google software engineer didn’t understand how location controls worked, according to recently unsealed court documents from the case first reported by the Arizona Mirror. “Speaking as a user, WTF?” reads the chat log.
“The attorney general filing this lawsuit appears to have mischaracterized our services,” another Google spokesperson, Jose Castaneda, said. “We have always built privacy features into our products and provided robust controls for location data. We look forward to setting the record straight.” Castaneda also called the employee communications surfaced in the court documents “cherry-picked published extracts,” which “state clearly that the team’s goal was to ‘Reduce confusion around Location History Settings.’”
Google has taken steps in recent years to give users more control over how long it keeps the data it collects. A feature added in 2019 lets you set your “Web & App Activity” to delete automatically after three or 18 months, and this summer Google made auto-deletion the default for even more data categories on new accounts. It has also made privacy settings adjustable directly from within search, meaning you have to dig less to find them, and introduced Incognito Mode to YouTube and Google Maps.
“We are unequivocally committed to providing prominent, transparent and clear privacy controls, and we continue to raise the bar, with improvements like making auto-delete the default for our core activity settings,” Google said in its statement.
Critics say that the company has not gone far enough. “We are aware that Google has made a number of minor improvements,” says Gro Mette Moen, acting digital policy director of the Norwegian Consumer Council. “However, as far as we have seen, none of these changes address the main issue: Consumers are still led to accept a large amount of tracking.”
They’re also led to accept a large amount of, well, Google. A detailed investigation by the Markup last month found that, across the 15,000 queries it examined, nearly half of the first page of mobile search results was designed to keep users on Google rather than direct them to another website. Those results consisted of both Google’s own properties and “direct answers,” the snippets Google pulls from outside sites to display right in the results. Google has called the Markup’s methodology “flawed and misleading,” arguing that it relies on a “non-representative” set of samples. “Providing feedback links, helping people reformulate queries or explore topics, and presenting quick facts is not designed to preference Google,” the company said in its statement. “These features are fundamentally in the interest of users, which we validate through a rigorous testing process.”