Why do people call Silicon Valley racist?

After the killing of George Floyd, the Black Lives Matter movement also put pressure on Silicon Valley's tech companies, right down to the DNA of their software: the program code. Google wants to quietly remove racist terms from familiar computer jargon. Specifically, this affects the code behind the Chrome browser, in which common terms such as "blacklist" and "whitelist" are to be replaced by "blocklist" and "allowlist". The colors black and white have so far symbolized the trustworthiness of websites: a blacklist contains pages that are classified as dangerous and blocked for the browser user. Such lists can also serve as a kind of parental control in the browser.
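The renaming changes only identifiers, not behavior: a blocklist check works exactly as a blacklist check did. A minimal sketch of such a filter, with hypothetical names and data (not Chrome's actual code), might look like this:

```python
# Hypothetical sketch of the renamed list check: a host is refused if it
# appears on the blocklist, unless the allowlist explicitly overrides it.

BLOCKLIST = {"malware.example", "phishing.example"}   # formerly "blacklist"
ALLOWLIST = {"trusted.example"}                       # formerly "whitelist"

def is_blocked(host: str) -> bool:
    """Return True if the browser should refuse to load this host."""
    if host in ALLOWLIST:
        return False
    return host in BLOCKLIST

print(is_blocked("malware.example"))   # True
print(is_blocked("trusted.example"))   # False
```

The logic is identical before and after the rename; only the words programmers read every day change.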

Another example from programming terminology is the pair "master" and "slave", which hierarchically regulates access to shared data: a device defined as the "master" is given priority in the network and controls all other ("slave") devices. When a Google developer asked for these terms to be removed, Nat Friedman, CEO of GitHub, also joined in. GitHub is the most important place on the net for sharing software code and editing it collaboratively.
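Here, too, the hierarchy itself stays identical; only the names change. Projects that dropped the old terminology often switched to "primary" and "replica". A hypothetical sketch of such a rename:

```python
# Invented illustration of the terminology change: the device hierarchy
# is unchanged, only the identifiers are renamed
# ("master"/"slave" -> "primary"/"replica").

class Replica:                 # formerly: Slave
    def execute(self, command: str) -> str:
        return f"executed {command}"

class Primary:                 # formerly: Master
    def __init__(self):
        self.replicas = []     # formerly: self.slaves

    def attach(self, replica: Replica) -> None:
        self.replicas.append(replica)

    def broadcast(self, command: str) -> list:
        # The primary has priority and controls all attached devices.
        return [r.execute(command) for r in self.replicas]

p = Primary()
p.attach(Replica())
print(p.broadcast("sync"))     # ['executed sync']
```

Because such names are spread across codebases, configuration files and documentation, renaming them is a mechanical but far-reaching change.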

Everyday racism is often invisible

Are such renamings effective signals or mere symbolism? Linguist Susan Arndt endorses Google's move. "People often don't know that certain terms are racist - but that doesn't make them any less racist," says the professor of Anglophone literature. Color symbolism and racism are historically intertwined: "In order to build up a system of enslavement, you needed a narrative of why that was okay. At that point, different races were invented and Christian color symbolism was adopted to distinguish between good and evil." This color symbolism still shapes language today: in German, fare dodging is "riding black", pessimism is "seeing black", and there is the black market - the color serves as a synonym for bad, evil or illegal. Over time, such terms become entrenched in everyday language. Google has the chance to change something with immediate effect, says Susan Arndt.

Above all else, Silicon Valley is male and white

Language creates reality - there is hardly an area in which this principle applies as literally as in program code. This language is a command to act, a law for the machine that allows no deviations. Anyone who writes the code of modern infrastructure in Silicon Valley usually not only belongs to a well-paid elite, but also helps shape the options for action in a digitized society. The technology industry itself is still dominated by white men: according to Google's diversity report for this year, 14.2 percent of the people working in a technical role at the company are women, and 2.4 percent of technical employees are Black. Black residents made up 13.4 percent of the total US population last year.

For Achim Rettinger, professor of computational linguistics at the University of Trier, discriminatory terms in code are problematic when they are also used in everyday language. "Code is primarily accessible only to the programmer and therefore affects only the programmer's reality, not the end user's. However, there are many common terms such as 'blacklist' that should be avoided, since they are at least in frequent use within the industry." In his opinion, however, racist language is not a problem specific to the software industry.

Technology is not neutral

Linguistic sensitivity is particularly important to Rettinger when it comes to machine learning. "When these methods are used in software tools such as digital language assistants, recommendation systems or for creating content, the software's actions are learned from a large number of records of human behavior. The software thus adopts discriminatory behavior and reinforces it." This could be observed directly in several chatbots that were trained on chats with users and adopted their abusive behavior. Rettinger also points to areas in which algorithms can further entrench unconscious discrimination: dark-skinned people, for example, are more likely to be recommended rental apartments than condominiums online, because that matches the historical data the algorithm was fed.
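The mechanism Rettinger describes can be shown with a toy example. The sketch below uses entirely invented data: a trivial recommender that suggests whatever a group was shown most often in the past will simply reproduce the skew in its training records.

```python
# Toy illustration (invented data, not a real recommender) of how a
# system trained on historical records reproduces the bias in them.
from collections import Counter

# Hypothetical interaction log: (group, listing_type) pairs. Group "A"
# was historically shown mostly rentals, group "B" mostly condos.
history = ([("A", "rental")] * 90 + [("A", "condo")] * 10
           + [("B", "rental")] * 40 + [("B", "condo")] * 60)

def recommend(group: str) -> str:
    """Recommend the listing type this group saw most often in the past."""
    counts = Counter(listing for g, listing in history if g == group)
    return counts.most_common(1)[0][0]

print(recommend("A"))  # rental -- the historical skew is reproduced
print(recommend("B"))  # condo
```

Nothing in the code mentions any group attribute besides the label itself, yet the output differs by group, because the training data did. Real recommendation systems are far more complex, but the feedback loop is the same in principle.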

A study by the Institute for Technology Assessment at the Karlsruhe Institute of Technology (KIT) recently concluded that algorithms are even more susceptible to discriminatory behavior than humans. This so-called algorithmic bias shows up in processes such as automated face recognition, which is supposed to identify faces in photos or videos by comparing them against image databases. The programs have a significantly higher error rate for dark-skinned people. As a result, ethnic minorities could be falsely suspected even more frequently than they already are. This is one of the reasons why IBM, Microsoft and Amazon recently announced that they would stop selling facial recognition software to US police for the time being.

Real commitment or image cultivation?

Efforts to remove offensive language from program code have existed in software development for years. As early as 2014, Drupal, a free content management system for websites, banned master-slave terminology from its code. The commitment of Google and GitHub could now give the movement more reach: both publish code on the Internet as open source, so everyone can access it, and it forms the basis for many other websites and private applications.

However, the reports about language changes in code have met with outrage among many African Americans. Ian M., a Silicon Valley software developer who does not want to give his full last name, told the SZ that the move is "silly and hypocritical" and shows a lack of understanding of the deeper problems in society. "To me as a Black person, this is something only a white person could imagine would make a difference." The industry faces plenty of real problems, from the working conditions of its employees to urgently needed data protection rules for the users of its products. And Google, as a market-driven company, is not in the business of activism, Ian M. adds - especially not when that activism primarily serves to polish its image in the current climate.