What do digital sociologists mean by “socio-technical”?

This is a short essay I wrote as an assessment for the course Issues and Concepts in Digital Society, part of the MSc in Digital Society. It is more formal than my usual blog posts, but I think it is a useful reflection, so I decided to share it here.

The term “socio-technical” is a constant presence in the literature on technology and society, in both academic and popular contexts. It has been widely disseminated because it condenses multiple ideas, making it useful in the analysis of technology and society, even if it has not always been rigorously defined. This essay first traces the development of its main ideas, drawing on a historical review by Grint and Woolgar (2013), and then presents how the term is used in current analyses. That should contribute to an understanding of its use by digital sociologists.

The socio-technical systems approach originated in organizational theory. Focusing on the outcome of the relationship between machines and the social aspects of the workplace, this approach held that appliances merely constrained human action, as opposed to the deterministic idea that they determined workers’ production. It maintained a clear distinction between technology, understood as something objective, and the collective of workers, the social.

A critique of that perspective came from the constructivist social shaping approaches, developed mostly within the sociology of scientific knowledge and industrial sociology. They focus on the political circumstances of the production and consumption of technology, especially the “constraints” technologies present to users. On this view, technologies should not be assumed to be stable entities with fixed and determinate ‘users’, because the processes of design, development, manufacture, and consumption are socially constructed. One of these approaches, actor-network theory, blurs the distinction between human and non-human actors, treating them as equal entities in contingent networks whose processes of negotiation result in the development and stabilization of different technologies. This approach does not account for any influential structural powers outside the network, in stark contrast with socio-technical alignments.

Socio-technical alignments focus on the significance of the fit between technology and society. One of its most important theorists, Habermas, argues that technology can be mistakenly perceived as autonomous whenever general social norms become conflated with or reduced to the norms constructed by technologists and their supporters. By creating this “technocratic consciousness”, those groups conceal their interests behind an atmosphere of objective necessity and the inevitability of technological advancement. Through that deceit, they determine the function, direction, and pace of technological and social developments (Held, 1980 in Grint and Woolgar, 2013). Discourse is also the focus of anti-essentialism, which proposes that discourses cause disagreements in the analysis of technology’s capacities by presenting those capacities as objective reflections of the truth, rather than interpreted representations of a truth.

The idea of a socio-technical perspective, thus, has a diverse history and ample foci. This intricacy is conducive to its application without an explicit commitment to one specific approach: the term “socio-technical” is routinely summoned to imply an awareness that society and technology are distinct but dependent, mutually influencing each other’s agents, processes, and consequences. In its vagueness, it captures a complexity that makes it intuitively useful in the analysis of technological products, frequently preceding the mobilization of clearer sociological theories and concepts.

An exemplary case study is Safiya Umoja Noble’s Algorithms of Oppression: How Search Engines Reinforce Racism (2018). In this work, informed by race and gender studies, she investigates the socio-technical aspects of Google’s search engine results. In one analysis, a query for the single adjective “beautiful” returned only images of white women. A constructivist interpretation suggests this output is the result of negotiation processes between users and platform, interpreting beauty in the context of a society that embodies it as female and evaluates it by white standards. This produces both the pool of possible results available for search and the results that are consumed, in the form of clicks, by users who make such a query. But Noble also notes that those outputs are additionally shaped by the values and norms of the search company’s commercial partners and advertisers. Through the “technocratic consciousness”, the economic interests of the technologists who control the platform are concealed but still expressed in the form of results that the user may perceive as objective.

The same perceived objectivity of technology is at work when people request facial plastic surgery motivated by the output of their mobile phone self-portraits, as reported by Belluz (2018). Depending on the distance at which the device is held at the moment of capture, its lens can make a nose look up to 30% larger relative to the rest of the face. An anti-essentialist interpretation of this phenomenon could raise interesting research questions on whether users see the photographs as an interpretation of their faces or a reflection of them. Or, more poignantly still: could it be that users are simply willing to change their faces to achieve the desired interpretation, regardless of how their faces actually look without the mediation?

Throughout its existence across disciplines, the term “socio-technical” has framed analyses of the agents, processes, and consequences involved in the complex relations between technology and society. The examples in this essay should illustrate how different approaches underlie current analyses of everyday technology in very relevant ways.



Belluz, J., 2018. ‘Selfie face distortion is driving people to get nose jobs’, Vox, 21 June. Available at: https://www.vox.com/science-and-health/2018/3/1/17059566/plastic-surgery-selfie-distortion (Accessed: 25 October 2018).

Grint, K. and Woolgar, S., 2013. ‘Theories of technology’, in The machine at work: Technology, work and organization. John Wiley & Sons.

Noble, S.U., 2018. Algorithms of Oppression: How search engines reinforce racism. NYU Press.

Digital identity crisis

YouTube and Spotify know when I am missing Hong Kong (even though that is a Chinese music video in Shanghai; you can’t control emotional associations)

For the last two and a half years, I lived in a country that was not my own. I moved to Hong Kong on short notice and did not know much about the place before arriving. As so many people do, I started using technologies to find my way around. At first I relied on the usual “neutral” apps: I wouldn’t step out of my apartment without Google Maps and would only know where to eat if Foursquare directed me to a vegetarian-friendly shop.

As the months passed, I also had to move my digital presence to my new physical place. To my surprise, it was very cumbersome to change my Google, Apple, and Spotify accounts from Brazil to Hong Kong. But it was necessary, both to add my new local forms of payment and to get access to local apps. Netflix did not require such a maneuver, but my options for consumption on the platform were automatically changed. Of course, in due time, I forgot what I had lost and started to enjoy the Asian content that was new to me.

In order to keep up with local news, I started following Hong Kong media on Twitter, then Chinese and Asian media more broadly, as traveling took me to those places and made me more invested in their realities. Around the same time, local social scientists started creeping into my feed, as did local photographers on Instagram, both because the platform suggested them to me and because I was getting more and more attached to images of specific roads, neon signs, bamboo scaffolding, and all sorts of mundane daily details that make a place dear to us. Slowly but surely, as I became accustomed to those sights and recognised them, they started becoming part of my identity.

My support systems for living in the 852* had by then changed almost completely: I checked the weather using the Hong Kong Observatory’s app; the Hong Kong Public Libraries’ app was something I couldn’t live without; Foodpanda and Deliveroo fed me; HSBC Hong Kong’s app made me anxious. Not to sound offensive, but was I not, at least digitally, a bit of a Hongkonger?

Thousands of tiny, (mostly) perfectly rational and practical decisions ended up amounting to some sort of (quite emotional) identity shift. This was reinforced by locals interpreting my ethnically ambiguous face as Chinese, only to be corrected by my reaction, which was usually just looking very surprised and confused.

Now I am a Brazilian who lives in Edinburgh and has to decide what to do with this digital Hongkonger self. There is no playbook for that. Some things are not a matter of choice, so I find myself once again struggling with Google, Apple, and Spotify. But what about the Facebook page of my former running community; the Instagram account of the vegan restaurant in my neighbourhood that was pretty much my dining room; the local charity from which I adopted my two lovely cats? If I keep all of these, am I keeping my Hongkonger digital self alive? Would that stop me from moving on and developing a more current digital identity? Killing her feels like killing a part of myself that I actually kinda like. But honestly, can I even move on, when different social media platforms still recommend Hong Kong content to me and Google Photos keeps bringing up memories? Is it up to me?

I just don’t know what to do with my(digital)self.

*852 is Hong Kong’s area code and how locals sometimes refer to the place. You know, just something we locals do.

