Considering the monopoly and clout that Google enjoys, it must be held accountable for promoting such stereotypes and values, experts say.

If there’s a question you don’t know the answer to, more often than not, you’ll turn to Google. The search engine’s dominance of the market, and the data pool at its disposal, are almost unparalleled – which makes it all the more alarming when it feeds into dangerous and misogynistic values.

This is what seems to have happened when, as pointed out by Twitter user @AHappyChipmunk, someone opened Google Maps and typed “bitches near me.” The results showed addresses of girls’ schools, women’s and girls’ hostels and PGs, and women’s clothing shops. This is shocking not only because ‘bitches’ is often used as a derogatory term for women, but also because search results listing schools put minors at risk.

The tweet caused much outrage and alarm among people, who ran the search themselves and posted screenshots of the real-time results that they got.

When TNM ran a search of the same phrase on Google, the search results were the same.

Why would Google show these results?

Pranesh Prakash, a fellow at the Centre for Internet and Society, points out that this is telling of the fact that Google knows ‘bitches’ can be slang for women. “For many years now, Google has been trying to understand and search for what you meant, rather than what you may have typed,” he explains. “However, if you enclose this search phrase in quotes, you will not get the same results, because then it will look for the exact phrase.”

He also notes that if you were to look for “girls near me”, you would get similar search results on Google Maps, though more specific to women’s accommodations.

In contrast, if you were to Google “bitch near me”, you would not get Google Maps results. “I think that’s because Google understands that ‘bitches’ is more likely to mean women than the singular ‘bitch’ is,” Pranesh says.

Tech giants need to be held accountable

While Pranesh thinks the search results are not ‘dangerous’ per se – in the sense that a potential abuser is more likely to Google ‘girls schools’ than ‘bitches near me’ – he agrees that these results are telling of the misogynistic language in use online.

However, is it okay for tech giants to wash their hands of responsibility because this is the language that their algorithm picked up? Nayantara R, who works with the Internet Democracy Project in Bengaluru, says, “This reminds me of the book Algorithms of Oppression: How Search Engines Reinforce Racism.” The book challenges the idea that platforms like Google are level playing fields for different ideas and ideologies. Author Safiya Umoja Noble argues that data discrimination, private interests in promoting certain sites, and the monopoly of a few online search engines result in biased search algorithms that discriminate against women of colour while painting favourable portraits of whiteness.

One such bias came to light in July last year, when many people, including Congress MP Shashi Tharoor, pointed out that a Google search for ‘south Indian masala’ led to pictures of skimpily clad women, while ‘north Indian masala’ led to photos of spices and dishes. At the time, Google had maintained that this wasn’t its fault, and that its search worked by learning from the keywords people use and the results they click on to predict what people are searching for.

However, Nayantara asserts that considering the monopoly and clout that Google enjoys, it must be held accountable for promoting such stereotypes and values. “Given Google search’s dominance, it does make sense to point out instances like this to the company. This does not mean that Google is misogynistic, but the algorithmic decision that is taken by someone at Google to rank certain pages which promote certain values before others does have a social impact. All technology is a result of how it is used. Accountability is required there,” she argues.

A conversation about what search results mean

Nayantara also points out that apart from accountability, there needs to be a conversation about what these search results mean. “Search results do embed political values. However, people need to understand that if they Google something and some pages are ranked before others, it is not a reflection of the truth, or of right and wrong, but a reflection of an opinion,” she says.

“We need popular perception to wrap its head around what a search engine is, and that there are alternatives to Google. We also need more algorithmic diversity in search results,” she adds.