
16 Days of Activism – Ending Digital Violence Against All Women and Girls

[Image: 16 Days of Activism banner with illustrations of women on computers. The Digital Skills Programme logo is in the top left corner.]
Today, the 25th of November, is International Day for the Elimination of Violence against Women. This marks the start of 16 Days of Activism – a campaign to end gender-based violence. 

Over half of all women and girls worldwide have experienced some form of harassment online. The United Nations’ theme for 16 Days of Activism in 2025 is “UNiTE to End Digital Violence Against All Women and Girls”. At the University of Edinburgh, our Digital Safety work includes raising awareness and sharing support resources for those affected by ‘technology-facilitated violence against women and girls’.

Over the coming 16 days we will highlight resources on Instagram (uoedigiskills) and the Digital Skills, Design and Training blog for anyone who experiences or supports someone experiencing gender-based violence in the digital realm. Watch this space for updates as we highlight facts, tools, and support contacts to create safer and kinder online spaces.

 

16 Days of Activism

Day 1

We are kicking things off with our existing collection of resources in the Online Gender-Based Violence section of the Digital Safety, Wellbeing and Citizenship hub. We encourage you to share these widely to help create safer and kinder online spaces.

Day 2

What is ‘technology-facilitated gender-based violence’? The term refers to violence against women or girls that is made possible, aided, or amplified through the use of technology. It can take many forms, including doxing, online (sexual) harassment, image-based abuse, and more. We will introduce these in more detail next week, highlighting what you can do to protect yourself and take action.

Day 3

Here are some tips on what you can do for a digital world where technology-facilitated gender-based violence is a thing of the past:
🫂 Support – initiatives and organisations advocating for better regulation of digital spaces
💡 Innovate – better solutions in the tech world, centring women and girls as developers and consumers
🦻 Listen – to friends, colleagues, and students who experience technology-facilitated abuse. Make them feel heard and believed. Support them in a way that is right for them, and point out where they can get professional help.

Day 4

Technology-facilitated violence against women and girls often starts earlier than you may think! Recognising problematic behaviour towards you and (re)acting early can help you stay on top of a distressing situation and get the help you need.

Here are some red flags in online conduct to look out for:
🚩 Threats to share your private photos without consent (for example as blackmail)
🚩 Controlling the chats and contacts you may have, for example, on your phone
🚩 Harassing and abusive comments or DMs
🚩 Fake or doctored images of you appearing online
🚩 Impersonation, exclusion, or smear campaigns in online groups

If you or someone you know is affected, head to the ‘Need Help with Your Digital Safety?’ section on our Digital Safety, Wellbeing and Citizenship hub for tailored support contacts.

Day 5

Digital abuse can have real-life impact. It is a common misconception that ‘just’ because it happens online, or on your phone, the body stays unharmed. Our bodies can still go into panic or emergency mode, and experiencing digital gender-based violence can have serious negative effects on your mental health.

Often it takes more than just turning off the device. Intersectional feminist charity Chayn has collected a variety of grounding exercises you can do immediately when everything becomes overwhelming. Head to the link in our bio to learn how to ground yourself through breathing, engaging your senses, making yourself laugh or something else.

This is not necessarily a substitute for long-term mental health support and you may still want to talk to a friend or professional. But in the moment, taking a breath, and focusing your mind and body can help you take the next steps calmly.

Day 6

To get you started, here are some grounding exercise ideas for you – free and possible to do in most places:

🌳 Breathe slowly. Perhaps tell yourself ‘in’ and ‘out’ as you do so in the cycle.

🌳 Listen to or sing your favourite song. What are the melody or lyrics like? How does it make you feel?

🌳 Pick a colour and find as many things with that colour as possible, wherever you are.

🌳 If possible, put your feet on the ground or sit down. Imagine roots form between the ground and your feet or seat.

🌳 Imagine your ideal place – one that is safe. Where is it? What does the light there look like? What would you do there first?

🌳 Clench, then release your fist.

🌳 Draw an abstract doodle or scribble. Can you make out shapes in there?

🌳 Pick a category, and list an item that fits it for each letter of the alphabet.

Day 7

We are spending the second week of 16 Days of Activism deep-diving into some specific forms of technology-facilitated gender-based violence (TFGBV). We are starting off with doxing.

Doxing is when someone obtains your personal, identifiable details and publishes them online without your knowledge or consent. Examples of this information include your full name, email or home address, other contact details, bank statements, and other important documents.

Doxing is often used as a way to intimidate, control or seek revenge. The details can be used to spam digitally or by mail, facilitate (sexual) harassment, or cause financial harm. Be aware that a threat to share your information already counts as doxing and should be taken seriously!

Is doxing illegal? It is not specifically covered by law in Scotland, but the Scottish Women’s Rights Centre writes that doxing may be illegal, as it can fall under offences such as threatening and abusive behaviour, stalking, abusive behaviour towards a partner or ex-partner, or improper use of a public electronic communications network.

Day 8

What can you do about doxing? If you or someone you support has been doxed, you may wish to:

🗣️ Report it to the police

🗣️ Speak to the Equally Safe Team or use the Report + Support tool at the University of Edinburgh

🗣️ Review and invoke reporting mechanisms of the relevant social media platform

🗣️ Invoke the right to erasure (also known as ‘the right to be forgotten’)

🗣️ Protect images or videos by invoking copyright law

🗣️ Apply for a protective order with the help of a solicitor

Day 9

Online sexual harassment is unwanted and inappropriate behaviour of a sexual nature conducted through digital platforms or devices. It can include messages, images, or comments that make someone feel uncomfortable or threatened. Importantly, as with all forms of (sexual) harassment, what matters is how the survivor feels, not whether the perpetrator intended to harass.

Online sexual harassment is serious and affects over 1 in 5 women in the UK.

Let’s create a safer digital space for everyone. Here are some steps you can take to protect yourself and others:
🔏 Strengthen Privacy Settings: Regularly update your privacy settings on social media to control who can see your posts and personal information.
📝 Document Everything: Keep records of any harassment or bullying. Screenshots and saved messages can be crucial if you need to report the behaviour.
🚫 Report and Block: Use the platform’s tools to report abusive behaviour and block the offenders. Most social media sites have protocols to handle harassment.

Day 10

Cyberflashing is the unsolicited sending of explicit images via digital platforms or devices.

This may happen on social media platforms or dating sites. Often features like AirDrop are used to send pictures in public, which makes it difficult to identify the sender. Preview functions mean a small version of the image may already be shown to the survivor before they accept it.

The aim of cyberflashing may be to intimidate, embarrass, distress, or for gratification of the sender without regard for the survivor’s feelings.

Let’s take action to protect ourselves and each other:
❗ Adjust privacy settings: Limit who can send you messages or images on social media and other apps. Many platforms like dating apps also allow you to filter or block unwanted content.

❗ Familiarise yourself with your Bluetooth and Wi-Fi settings. You currently cannot turn AirDrop previews off, but you can limit drops to your contacts or turn receiving off entirely.

❗ If you receive an unsolicited explicit image, you can report it to the platform and, if you wish, to the authorities. If this happens on public transport, you can contact the British Transport Police directly by calling 0800 40 50 40, texting 61016, or using their online reporting form.

Day 11

Image-based abuse is similar to cyberflashing and doxing. It means publishing intimate images or videos of you on digital platforms or devices without your consent.

This includes any images or videos showing intimate acts, private body parts, or people wearing underwear. Some of the images may not even be real, but generated with the help of AI or photo-editing software. We will cover so-called ‘deepfakes’ in a future post.

Whether real or not, distributing private images of you, or threatening to do so, without your consent is wrong and illegal. In Scotland, image-based abuse is covered in the Abusive Behaviour and Sexual Harm (Scotland) Act 2016.

The Scottish Government, together with gender equality organisations, published an interactive campaign to highlight different ways image-based abuse can play out. Head to the Digital Skills, Design and Training blog for more info (link in our bio).

Know what to do when you or someone you support experiences image-based abuse:
🫴 The Equally Safe team (UoE) provide in-person or online help for GBV survivors
🫴 Report to the Revenge Porn Helpline
🫴 Victim Support Scotland and other organisations can help you navigate legal processes for reporting

Day 12

What do you do when an intimate image of you is uploaded online against your consent?

🌻 Firstly, take a deep breath! This situation can be scary, and it is important that you look after your wellbeing first. Maybe a person you trust can help you through the next steps.
🗣️ Social media and online platforms usually have mechanisms to report sexual harassment and image-based abuse. You can request that derogatory and intimate content be deleted. Check the relevant platform’s reporting policy for specific details.
📝 In reality, while these mechanisms are a great step, they do not always work, for a variety of reasons. In that case you can escalate further with a ‘takedown letter’. This is a letter or email in which you formally request that the platform remove the content. It makes the process more efficient, creates a paper trail, and can support potential future (legal) proceedings. You can also always decide to contact university support or the police.

Day 13

Artificial Intelligence (AI), and especially its image-generation capabilities, has had a big impact on image-based abuse and online sexual harassment. So-called ‘deepfakes’ are images, recordings, or videos that have been altered to depict something that never happened.

Deepfakes existed before generative AI, but, as a Humane Intelligence report on image-based abuse in the age of AI puts it: “generative AI lowers the barrier and scales up the creation of abusive images by facilitating the creation of more realistic images in just a few clicks” (1). The scale is considerable; studies estimate that over 90% of generated deepfakes are of a non-consensual, intimate nature (2).

As image-based abuse usually aims to distress and incite fear in the survivor, and others seeing the images will likely not know they are fake, the impact is hardly distinguishable from that of ‘real’ pictures.

Generative AI systems are often coded to block the creation of sexualised images, but these safeguards can be circumvented with targeted prompts and are not always as effective as needed. What counts as an intimate image also differs between cultures, and safety filters may simply not ‘understand’ that a requested image could cause harm.

Here’s what you can do to keep you and others safe:

📲 Train media literacy, and critically assess media sources

🗣️ Report derogatory images, emphasising their distressing effect, regardless of whether they are deepfakes or not

🤔 When generating images, consider who you could (even without intending to) hurt or endanger due to their religious or cultural beliefs.

Day 14

Tool sharing! We have gathered digital tools that can help students and universities prevent and react to technology-facilitated gender-based violence:

Reporting support AI by Chayn

Intersectional feminist organisation Chayn has launched an AI-powered support tool. With the information you share the tool can help you formulate a takedown letter or email. See our previous post for more info on what that is. The tool lists a range of reporting features for popular online platforms and, through guided prompts, helps you to draft and send a takedown letter to the relevant recipient.

 

The eSafety Guide from the Australian eSafety Commissioner

The guide lists hundreds of digital platforms, games and apps, giving an overview of what they are, how they are used, and relevant links on how to secure your data and report someone.

Day 15

Even more tools:

GenderSAFE Institutional Self-Assessment Tool by UniSAFE

This interactive self-assessment tool was developed by researchers across Europe to help universities identify their strengths and gaps in making their campuses safer for women and girls. Designed for self-development, it has been used by higher education institutions to identify areas for learning and strategy planning.

 

Spot the deep fake game by the New York State Office for the Prevention of Domestic Violence

This educational tool helps test your ability to recognise a deepfake image. Over several rounds you identify the deepfakes, after which the tool shows how to tell they are fake. While this is a solid start, it is no guarantee that you will always be able to spot deepfakes. Image-alteration software improves constantly, so make sure to stay alert.
