This is a short essay I wrote as an assessment for the course Issues and Concepts in Digital Society, part of the MSc in Digital Society. It is more formal than my usual blog posts, but I think it is a useful reflection, so I decided to share it here.
The term “socio-technical” is a constant presence in the literature about technology and society, in both academic and popular contexts. It has been widely disseminated because it condenses multiple ideas, making it useful in the analysis of technology and society, even if it has not always been rigorously defined. This essay will first describe the development of its main ideas based on a historical review by Grint and Woolgar (2013), and then present how the term is used in current analyses. That should contribute to an understanding of its use by digital sociologists.
The socio-technical systems approach was created in the context of organizational theory. Focusing on the outcome of the relationship between machines and the social aspects of the workplace, this approach held that machines merely constrained human action, as opposed to the deterministic idea that they determined the production of workers. It maintained a clear distinction between technology, understood as something objective, and the collective of workers, the social.
A critique of that perspective comes from the constructivist social shaping approaches, developed mostly within the sociology of scientific knowledge and industrial sociology. They focus on the political circumstances of the production and consumption of technology, especially the “constraints” it presents to users. In this view, technologies should not be assumed to be stable entities with fixed and determinate “users”, because the processes of design, development, manufacture, and consumption are socially constructed. One of these approaches, actor-network theory, blurs the distinction between human and non-human actors, treating them as equal entities in contingent networks involved in processes of negotiation that result in the development and stabilization of different technologies. This approach does not account for any influential structural powers outside the network, in stark contrast with socio-technical alignments.
Socio-technical alignments focus on the significance of the fit between technology and society. One of its most important theorists, Habermas, argues that technology can be mistakenly perceived as having autonomy whenever general social norms become conflated with or reduced to the norms constructed by technologists and their supporters. By creating this “technocratic consciousness”, those groups conceal their interests behind an atmosphere of objective necessity for, and inevitability of, technological advancement. Through that deceit, they determine the function, direction, and pace of technological and social developments (Held, 1980 in Grint and Woolgar, 2013). Discourse is also the focus of anti-essentialism, which proposes that discourses cause disagreements in the analysis of technology’s capacities by presenting those capacities as objective reflections of the truth rather than as interpreted representations of a truth.
The idea of a socio-technical perspective, thus, has a diverse history and a broad range of foci. This intricacy is conducive to its application without an explicit commitment to any one specific approach: the term “socio-technical” is routinely summoned to imply an awareness that society and technology are distinct but interdependent, each influencing the agents, processes, and consequences of the other. In its vagueness, it captures a complexity that makes it intuitively useful in the analysis of technological products, frequently preceding the mobilization of clearer sociological theories and concepts.
An exemplary case study is Safiya Umoja Noble’s Algorithms of Oppression: How search engines reinforce racism (2018). In this work, she investigates the socio-technical aspects of Google’s search engine results, informed by race and gender studies. In one analysis, a query for the single adjective “beautiful” returned only images of white women. A constructivist interpretation suggests this output is the result of negotiation processes between users and the platform, interpreting beauty in the context of a society that embodies it as female and evaluates it by white standards. These processes produce the pool of possible results available for search and the results that are consumed, in the form of clicks, by users who make such a query. But Noble also notes that those outputs are additionally shaped by the values and norms of the search company’s commercial partners and advertisers. Through the “technocratic consciousness”, the economic interests of the technologists who control the platform are concealed but still expressed in the form of results that the user can perceive as objective.
The same perceived objectivity of technology is at work when people request facial plastic surgery motivated by the output of their mobile phone self-portraits, as reported by Belluz (2018). The device’s lens can make a nose look up to 30% larger relative to the rest of the face, depending on the distance at which the phone is held at the moment of capture. An anti-essentialist interpretation of this phenomenon could raise interesting research questions on whether users see the photographs as an interpretation of their faces or as a reflection of them. Or, more poignant still: could it be that users are simply willing to change their faces to achieve the desired interpretation, regardless of how their faces actually look without the mediation?
Throughout its existence across disciplines, the term “socio-technical” has been used to analyze the agents, processes, and consequences involved in the complex relations of technology and society. The examples in this essay should illustrate how different approaches underlie current analyses of everyday technology in very relevant ways.
References
Belluz, J., 2018. ‘Selfie face distortion is driving people to get nose jobs’, Vox, 21 Jun. Available at: https://www.vox.com/science-and-health/2018/3/1/17059566/plastic-surgery-selfie-distortion (Accessed: 25 October 2018).
Grint, K. and Woolgar, S., 2013. ‘Theories of technology’, in The machine at work: Technology, work and organization. John Wiley & Sons.
Noble, S.U., 2018. Algorithms of Oppression: How search engines reinforce racism. NYU Press.