
Google exacerbates the Internet's sexism: How the tech giant's revenue model leads to an unequal Internet

Compare the image search results for "boy crawling" and "girl crawling." In the first you see cute babies; in the second you see scantily clad women in seductive poses. Similarly, compare "boy at work" with "girl at work," or "schoolboy" with "schoolgirl." Researchers at MIT have discovered that this is, in part, because of a "hidden sexism within language." But that isn't the whole story.

This sexist representation of women is also the result of Google's extremely profitable revenue model. As Zeynep Tufekci, associate professor at the School of Information and Library Science at the University of North Carolina at Chapel Hill, reminds us, "It's not the intent or the statements people in technology make that matter, it's the structures and business models they're building."

With 81% of the global search market and 4.5 billion searches conducted every day, Google influences not just what internet content is visible but also what content is created. Google made its 89.5 billion dollars in 2016 in large part because companies find it a valuable advertising tool; in helping you find what you are looking for, Google helps companies find you.

Conduct an image search (images.google.com) for "Amazon." Do the same for "Cherokee." The results, dominated by the e-commerce company and the car brand respectively, tell us something important about how the internet works and who is more likely to benefit: companies over communities and products over people.

Companies pay Google for ads within search results, but they also spend time and money on search engine optimization (SEO). SEO involves designing web pages and online content in a way that increases their visibility within search engines. Companies hope to get a return on these investments through increased sales of their products and services. FCA, the maker of Cherokee cars, has a financial incentive to care about search visibility, while the people of the Cherokee tribe do not. And while the Brazilian and Peruvian governments have successfully halted Amazon Inc.'s use of the .amazon domain extension, the river barely surfaces in initial search results.

In today's most important information channel, women are defined by what of them can be sold and consumed. Tufekci has pointed out that "The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows." Within this model, women surface as products before they surface as people. The female body is commoditized and sexualized as a product, and from it Google earns revenue: 43 cents per click for "girls" compared with 23 cents for "boys."

Not only is the current internet unequal, but artificial intelligence is programmatically encoding this discrimination into our future. Artificial intelligence uses large datasets, such as all the images labelled "girl" on the internet, and learns to label future images based on common characteristics found in that initial data set. Machines are learning that girls are sexualized women at the same time as they are learning that boys are young male children. We don't know the full impact of AI, but it is already being built into the technology we trust and depend on, and if the past is any indication, it won't respect us all equally.
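To make that mechanism concrete, here is a minimal, illustrative Python sketch of how a naive model picks up whatever skew is present in its training labels. The captions and labels below are hypothetical stand-ins for data harvested from image-search results, not real data, and the keyword-matching "classifier" is deliberately simplistic; real systems use far more sophisticated models, but the underlying dynamic is the same: the model learns the skew of its examples, not the ordinary meaning of the words.

```python
from collections import Counter, defaultdict

# Hypothetical, illustrative training data standing in for captions scraped
# from image-search results. The skew in these labels is invented for the
# example but mirrors the pattern described above: "girl" examples dominated
# by images of adult women, "boy" examples dominated by images of children.
training_data = [
    ("baby crawling on carpet", "boy"),
    ("toddler playing with blocks", "boy"),
    ("child laughing at school", "boy"),
    ("woman posing in lingerie", "girl"),
    ("model in swimsuit on beach", "girl"),
    ("young woman in seductive pose", "girl"),
]

# "Training": count which words co-occur with each label.
word_counts = defaultdict(Counter)
for caption, label in training_data:
    for word in caption.lower().split():
        word_counts[label][word] += 1

def predict(caption):
    """Naive scoring: return the label whose training captions share the most words."""
    scores = {
        label: sum(counts[word] for word in caption.lower().split())
        for label, counts in word_counts.items()
    }
    return max(scores, key=scores.get)

# The model reproduces the skew of its training set: a caption about an adult
# woman scores as "girl", while a caption about a child scores as "boy".
print(predict("woman on beach"))  # -> girl
print(predict("child playing"))   # -> boy
```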

This product-over-people prioritization is an example of what Rahul Bhargava, research scientist at the MIT Center for Civic Media, has identified as a problem not with machine learning but with its teachers and textbooks. By using a data set heavily influenced by the commercial-first internet, we are teaching machines to treat people as products. An important part of the solution lies with MIT researcher Joy Buolamwini's reminder that "It is possible to create full-spectrum training sets that reflect a richer portrait of humanity." By using international training materials we can start to teach machines to see the world both as it is and as it should be, instead of reflecting the world and culture that is currently online.
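As an illustration, here is a minimal Python sketch of one way such curation could be approached: capping how many examples any single (label, source) bucket contributes to a training set, so that commercially dominant content cannot drown out everything else. The field names ("label", "source", "path") and the bucketing scheme are assumptions chosen for the example, not a description of any particular production pipeline.

```python
import random
from collections import defaultdict

def balanced_sample(examples, per_bucket, seed=0):
    """Cap how many examples each (label, source) bucket contributes.

    examples: iterable of dicts with at least "label" and "source" keys,
              e.g. {"label": "girl", "source": "family_albums", "path": "..."}
    per_bucket: maximum number of examples kept per (label, source) pair.
    """
    buckets = defaultdict(list)
    for ex in examples:
        buckets[(ex["label"], ex["source"])].append(ex)

    rng = random.Random(seed)
    curated = []
    for bucket in buckets.values():
        rng.shuffle(bucket)                # keep a random subset of each bucket
        curated.extend(bucket[:per_bucket])
    rng.shuffle(curated)                   # mix the buckets together
    return curated

# Hypothetical harvested examples: commercial imagery vastly outnumbers the rest.
examples = (
    [{"label": "girl", "source": "stock_photo_ads", "path": f"ad_{i}.jpg"} for i in range(900)]
    + [{"label": "girl", "source": "family_albums", "path": f"fam_{i}.jpg"} for i in range(50)]
    + [{"label": "girl", "source": "school_archives", "path": f"sch_{i}.jpg"} for i in range(50)]
)

curated = balanced_sample(examples, per_bucket=50)
# Each source now contributes at most 50 examples, instead of ads dominating 9 to 1.
print(len(curated))  # 150
```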

We should take the pervasive presence of sexism within search technology as evidence that companies need to be cognisant of sexism in the initial design process, asking about and pursuing the implications of their products as possible amplifiers and embedders of what Boston College professor Gerald Kane calls "our base instincts." As Silicon Valley erupts with accusations of sexual discrimination, we can see this as another reason why corporate diversity matters. However unintentionally, technology built primarily by men disproportionately hurts women. Whether or not this could have been anticipated, it is not yet considered an important problem by the leaders of the companies that continue to profit from it, only 11% of whom are women.

An additional and often overlooked part of the solution is the opportunity for each individual to make and share creative, non-commercial content online: drawings, paintings, poems. The system has become impossible to opt out of; you and your tribe will be represented online whether or not you contribute to that representation yourself. Using content and information as a weapon, women and underrepresented populations can shape the way machines see them. By creating content and, like those who profit from the female body, learning to structure it for search engines, women can self-define and compete with the interests working against our equal representation.
