Data orientalism: on the algorithmic construction of the non-Western other
Dan M. Kotliar 1,2

© Springer Nature B.V. 2020

Abstract

Research on algorithms tends to focus on American companies and on the effects their algorithms have on Western users, while such algorithms are in fact developed in diverse geographical locations and used in highly varied socio-cultural contexts. That is, the spatial trajectories through which algorithms operate, and the distances and differences between the people who develop such algorithms and the users those algorithms affect, remain overlooked. Moreover, while the power of big data algorithms has recently been compared to colonialism (Couldry and Mejias 2019), the move from the colonial gaze (Yegenoglu 1998) to the algorithmic gaze (Graham 2010) has yet to be fully discussed. This article aims to fill these gaps by exploring attempts to algorithmically conceptualize "the Other." Based on the case study of an Israeli user-profiling company and its attempts to sell its services to East Asian corporations, I show that the algorithmic gaze—algorithms' ability to characterize, conceptualize, and affect users—stems from a complex combination of opposing-but-complementary perspectives: it is simultaneously a continuation of the colonial gaze and its complete opposite. The ways algorithms are programmed to see the Other, the ways algorithmic categories are named to depict the Other, and the ways the people who design such algorithms describe and understand the Other are all distinct but deeply interrelated factors in how algorithms "see." I accordingly argue that the story of algorithms is an intercultural one, and that the power of algorithms perpetually flows back and forth—between East and West, South and North.

Keywords: Algorithms · Algorithmic power · Culture · Data · Data colonialism · Racism
* Dan M. Kotliar [email protected]
1 Department of Communication, Stanford University, 450 Jane Stanford Way, Stanford, CA 94305-2050, USA

2 Department of Sociology and Anthropology, The Hebrew University of Jerusalem, Jerusalem, Israel
Theory and Society
Our lives are increasingly becoming algorithmic. Algorithms automatically extract our data, categorize us, profile us, and have deep and often tangible effects on our lives (Zuboff 2019; Couldry and Mejias 2019; Eubanks 2018; Noble 2018; O'Neil 2016). Our "algorithmic profiles" affect the content we see online, our chances of getting a job or a loan, our relationships with our friends, colleagues, and bosses, and even how we express and understand ourselves (Karakayali et al. 2018). That is, our life chances are increasingly shaped by the algorithmic gaze. We are all perpetually seen by this gaze, and we are deeply influenced by it. But who is "we"? While in the last decade the power of algorithms has been thoroughly theorized (Bucher 2018; Gillespie 2016; Neff et al. 2012; Zuboff 2019; Couldry and Mejias 2019; Cheney-Lippold 2011; Beer 2009), and while the social consequences of algorithms have be