Coloniality of Data Reflection

Note: These were taken from a course reflection. Question 2 pertains directly to my Futures Project; I’ve included Question 1 for context.

  1. How did you understand data and/or coloniality before you started the course? How has this understanding changed?

    Before this course, I did have an acute awareness of data as something created or imposed on the world, rather than harvested empirically from it. But most of this critique of traditional approaches still came from data scientists—information designers, critical coders, data feminists, etc.—who did not acknowledge the possibility of refusal, only of ways to mitigate risks and improve representation.
    Now, I see data not merely as a form of knowledge production with a capacity for harm because of its flattened representations, but as a form of knowledge production intimately tangled with and implicated in Western philosophies which actively erase and do violence to other ways of knowing. In other words, data is harmful not just because it is limited but because it limits, suppressing alternate ways of being and knowing and rendering its subjects always-already determined in particular, and often deadly, ways. Through the course readings I learned to contextualize these harms within historical colonialism and present-day coloniality, and I have since begun to notice them in conversations across the rest of my life and work. Discussions around trending movements like “responsible AI” (note the term, which implies that AI is, by default, irresponsible) tend to focus on risk mitigation. This presupposes harm—there is never a question of whether risk is acceptable, only how much and to what extent—meaning that, in these systems, the harms to vulnerable populations are predetermined.

  2. How might you incorporate the learning and discussions from this course into future study, thinking and practice? This may be in relation to your time at EFI and/or your work or life, in general.

    I plan to incorporate this learning into my Futures Project in a very direct and immediate sense. My creative dissertation explores the gap between personal identity and the versions of ourselves encoded into data, through the use of a large language model trained on my own writing—all the text I’ve produced and retained over my life, from creative and academic writing to emails, journals, and personal reflection. Before this course, the primary tension I wanted to negotiate was between the individual and the model as an actant and an other: first establishing how our data fail to represent us, then, with the notion of representative “data identities”, questioning what it is we can learn from that data by seeing it as something beyond ourselves, with unique capacities that make it dangerous but also possibly beautiful and/or revelatory.
    I do not see this as entirely opposed to the ethos of the course. The data work I’m trying to pursue treats personal data as one very limited form of knowledge among many, and as an experiential/relational exchange between human and nonhuman assemblages rather than something deterministic which claims to accurately/empirically represent a subject. My engagement with this course has helped me to pin down and articulate the importance of that approach. I cannot view my encounter with my model as a procurement of knowledge, but rather as an encounter with another way of knowing: how does this model know me, and how is that different from the ways in which I know myself? How might we learn from each other—not in the sense of an exchange of information but in a reciprocal, dialogic, earnest engagement of entities vastly different from one another and yet inextricably linked? In short, I want to engage with data the way I engaged with the art in the Rising Tide exhibition: with ears open to what it has to say, but without any presumption that I can comprehend/conquer/utilize that utterance.
    However, this course has also shown me that I need to be skeptical even of these good intentions. I want to believe that data can be used artistically and experientially rather than reductively, and I plan to try. But I can’t ignore the harm it has wreaked, and continues to wreak, on the people, places, and things it attempts to determine, or that those harms are an intrinsic part of the philosophies guiding data science and practice. Finding other ways forward in data will mean actively rejecting those traditions wherever they may perpetuate violence, and I’ll have to be incredibly careful to keep my project on track within an academic and social context which maintains that data must be produced and knowledge must be utilized.
