
Week 2: Alexa – friend or foe?

Reading Bayne’s article on the ‘teacherbot’ project (2015) brought to mind some conversations I had with my family during the Christmas holidays around the topic of Amazon Alexa. It had been a recent addition to one household, and the way in which its owners interacted with it attracted the attention of their guests and provoked some debate surrounding the value and utility of such new(ish) technologies and the principles which might govern their use. Something about these conversations made me feel uneasy and slightly disappointed, and I later concluded that it was the one-sided, cynical, even hostile attitude towards the technology which had been unanimously shared. During the discussion I began to foresee the kind of responses I might be confronted with upon starting this course and attempting to explain its content and justify my choosing it as a subject worthy of study.

During the conversations, two main objections were made to Alexa, one of which was specific to the way in which the guests had witnessed it being used by its owners, and the other of which concerned the value of this and other similar technologies. Apparently Alexa’s owners had developed the habit of speaking to it (her?) curtly and mockingly, commanding it to play music without so much as a ‘please’, calling it stupid if it encountered any problems in carrying out the command, and telling it to shut up when it grew tiresome. This interaction openly took place in front of the guests, who included a young child in the early stages of learning to speak. I could sympathise with the objections of his parents in that there was a risk that the child would go on to imitate this undesirable model of interaction, or may simply be confused at the sight of his grandfather barking orders at an invisible entity. What I was less convinced by was the suggestion that conventional standards of politeness need be applied when interacting verbally with a machine.

The other objection to Alexa’s existence represented a common reaction to the introduction of ‘smart’ technologies in general, calling into question the need for an ever-increasing repertoire of gadgets to perform tasks so basic that any inconvenience involved could be deemed negligible. This was an understandable resistance to the current trend towards automation and the possible implications of this phenomenon, albeit resting on the assumption that technology use should only be accepted and encouraged if it meets a genuine need; i.e. if it performs a task that could not easily be performed without it. At this point I interjected that reference to many of the technologies already in widespread use nullifies this argument.

While I partially agreed with the objections raised, I would have also liked to introduce a different perspective, which I chose not to do at the time in the interest of preserving familial harmony and Christmas cheer, but which I later discussed with my mum. Regarding the nature of the interaction with Alexa, I firstly admitted to having behaved in a similar way towards Siri, and then suggested that rather than judging this interaction in the same way as a human-human interaction, perhaps it would be more helpful to frame it in terms of an experiment in playing at the boundaries of the human-machine distinction, in testing those boundaries and orienting oneself to a new form of interaction involving a non-human interlocutor. Much like what I interpreted to be a core aim of the teacherbot project outlined in Bayne’s paper (2015).

I am attracted to the proposal which Bayne (2015) puts forward as an alternative to the wholehearted resistance or embrace of technology, drawing on a post-humanist conceptualisation of contemporary reality as an ‘assemblage’ or ‘entanglement’ of the human and non-human, challenging both the notion of technology as a threat to ‘desirable humanity’ and the anthropocentric, instrumentalist view of technology as serving human needs. This may not provide much consolation to the parents who were concerned about the unduly negative influence of a specific human-machine encounter on their child’s social development, but it could form the basis of a more balanced and generative consideration of how technology’s increasing presence stands to influence our cultural practices and assumptions, and what our reaction to this – be it resistance, embrace or compromise – might teach us about ourselves.

 

Bayne, S. (2015). Teacherbot: interventions in automated teaching. Teaching in Higher Education, 20(4), 455–467.

 

1 reply to “Week 2: Alexa – friend or foe?”

  1. Michael Gallagher says:

    Hello there Jemima. I enjoyed all your posts from this last week but am most keen to explore this one a bit with you. First of all, as always, good writing. These are a pleasure to read. The aesthetics of academic writing is highly undervalued, so I appreciate you taking the time to craft your engagement with the core concepts from the course in a fluid narrative.

    Secondly, in response to the engagements with Siri and Alexa that you witnessed, this is indeed a fascinating bit, this social interaction with technology and how it can immediately restructure basic social practices. We see this in the curtness you describe, the modelling of behaviour that would be deemed inappropriate had it been between two humans. Yet the exchange was decidedly social. An interesting phenomenon for sure, and perhaps one that teeters a bit into uncanny valley territory (https://www.theguardian.com/commentisfree/2015/nov/13/robots-human-uncanny-valley). Perhaps the human-esque characteristics of these applications (with human voices) lend themselves to this sort of behaviour.

    “it would be more helpful to frame it in terms of an experiment in playing at the boundaries of the human-machine distinction, in testing those boundaries and orienting oneself to a new form of interaction involving a non-human interlocutor. Much like what I interpreted to be a core aim of the teacherbot project outlined in Sian’s paper.”

    Indeed. Sian was very much ahead of the curve here with this paper. We have toyed with the use of bots at the university for other projects, but it is still largely seen through the lens of efficiency or time savings. All of these are well and good, but none that we have seen so far gets to what Sian is suggesting here: that we have the capacity to engage in new forms of interaction and learning. It is those boundaries that we should be exploring, so I appreciate you going there.

    “I am attracted to the proposal which Sian puts forward as an alternative to the wholehearted resistance or embrace of technology, drawing on a post-humanist conceptualisation of contemporary reality as an ‘assemblage’ or ‘entanglement’ of the human and non-human, challenging both the notion of technology as a threat to ‘desirable humanity’ and the anthropocentric, instrumentalist view of technology as serving human needs. “

    Agreed, and again Sian has us all converted to this position here at Edinburgh. We needn’t see all of this technology through a deterministic lens, or through the opposite: as merely a tool to service a human need. Technology entangles with practice, with sociocultural environments, and so on, to create intersections. We need to keep an eye on these, as they in turn structure much of what is possible in education, particularly an education like the MScDE that is so reliant on technology. So I encourage you to consider what these assemblages might be for you, Jemima, how they might be structured in meaningful ways, and how these structures might depart from what we have traditionally known education to be.

    Excellent work all around!
