Computer security is the product of a series of subversions. Just as policing evolves in response to crime, so computer systems evolve in response to threats. When it comes to malware, the code is clever, but the social engineering is more persistent, and more depressing. Malware designers evolve their products through testing and tooling to find out what works best. What wording makes more victims click on a ransomware link? Do people trust social media messages more than emails? Increasingly, malware evolves in response to human social and economic systems, and to organisational and geo-political priorities, rather than to technical vulnerabilities.
Technological histories exist, recording major developments – the first worm, the first rootkit, the first ransomware, and so on. The social history is more dispersed and has yet to be written. Elements of a theory of malware could focus on: where was it developed, for what purpose, by whom, and how? How have victim and perpetrator characteristics evolved with the systems they use? What ethical and political questions does it raise? For example, do we have the right to destroy self-replicating computer programs, and how should malware be preserved for future study? My sense is that the history of malware sits somewhat apart from other strands of crime historiography.
We would need to begin with a basic taxonomy. Here is my first sketch of the social history of malware. First stage: technically focused malware, often created to prove a point, or by accident, like the first computer worm. Second stage: payload malware, which delivers code that does something else, such as modifying a computer system or locking you out of it. Third stage: organisational malware, which becomes an object of economic and political activity with a high division of labour. Stuxnet and NotPetya encode geo-political priorities; markets like Dark0de trade and systematise malware.
Methods should set out the relevant fields. Studying evolving malware means studying the evolving computer security industry, its capacity for threat detection and horizon analysis, and the back and forth between malware operators and industry personnel. Constructs to be examined include changes in how costs are calculated and understood (from direct economic losses to reputational damage), lay perceptions of malware, and popular discourse and cultural representations of the topic. There is an opportunity for theoretical innovation here, for example in considering QAnon as a memetic virus.