
Safe Online Communication

[Image: illustration of a chatbot on a smartphone]

The internet is a great place to keep in touch with others. Whether you exchange thoughts with strangers about your favourite season finale on social media, play and chat while at home during a global pandemic, or, like in my case, get to talk to friends and family living in another country, technology helps meet our need to socialise. 

However, the way we form connections online, and the tech we use to do so, also comes with digital safety risks. Over the coming months, we will delve deeper into the risks of AI chatbot interactions, inclusive online communication, and the cybersecurity of messaging software. 

Keep it safe – how to protect your online communication 

After exploring AI companions last month, in March we will make sure you have the right tools to keep your online communication safe and secure. But first, a little recap! We were not the only ones concerned with the effects AI companions can have on your body and psyche. The UK Government has stated that it aims to address safety concerns around artificial intelligence (AI) chatbots in legislation, including a direct reference in the Online Safety Act. 

How to keep your chats safe 

Regardless of whether your chat platform is powered by, supported with or free from AI, there is a lot you can do to protect your messages, online calls, attachments, emails, and other forms of digital communication.  

  • Use strong passwords. Ideally, a password is a combination of numbers, letters, and symbols, or three random words. It is also good practice to have a separate password for your email, as otherwise a hacker who gets into your email account can use it to access all your other accounts. The National Cyber Security Centre has guidance on how to choose effective passwords. 
  • Turn on two-factor authentication (2FA). Also known as 2-step verification, or multi-factor authentication (MFA), this process adds an additional step to verify your identity when logging into an account. This could be a code sent by email or text, or fingerprint or face recognition. The National Cyber Security Centre has guidance on two-factor authentication. 
  • Use a secure internet connection (or a Virtual Private Network (VPN) when you’re in public). When accessing the internet through public Wi-Fi, do not send or access your banking or similar private information. You can access the University’s VPN, and find general VPN guidance from the National Cyber Security Centre. 
  • Make regular backups of your data. Secure your chat histories by saving them in a separate location like a hard drive or the cloud. This allows you to keep access even if your tech devices are stolen, lost, or damaged. The National Cyber Security Centre has guidance on backing up your data. 
  • Keep your antivirus and malware protection up to date. 
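The "three random words" approach mentioned above can be sketched in a few lines of Python. This is only an illustrative toy: the word list here is a tiny made-up sample, and a real passphrase tool would draw from a dictionary of thousands of words. It does, however, use Python's `secrets` module, which is designed for security-sensitive randomness (unlike the ordinary `random` module).

```python
import secrets

# Tiny illustrative word list; a real tool would use a large dictionary.
WORDS = ["harbour", "pencil", "thistle", "lantern", "copper",
         "meadow", "violet", "anchor", "bramble", "saffron"]

def three_random_words(separator="-"):
    """Return a passphrase of three randomly chosen words, e.g. 'lantern-copper-meadow'."""
    return separator.join(secrets.choice(WORDS) for _ in range(3))

print(three_random_words())
```

The point of the technique is that three unrelated words are easy for a person to remember but give far more combinations than a short "clever" password, especially as the dictionary grows.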

These are just some basic ways to keep your online communication secure. Remember, technology is always evolving, and unfortunately so are hacking attempts. We will do our best to update you on the newest ways to keep your digital chats secure. If in doubt, sharing less is often more! 

What is malware? 

Short for malicious software, malware is software that can damage computer systems, networks or devices. Examples of malware include viruses, ransomware and trojans. Malware can get onto your device through links sent to you in messaging systems or as part of downloads. Be sure to verify the source before opening anything suspicious.

What is encryption? 

When you send a message through communication software like WhatsApp or email, encryption scrambles your message into an unreadable format (code) while it is in transit. Only the intended recipient, who holds the correct decryption key to convert the message back into its original form, can read it.  
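The scramble-and-unscramble idea described above can be sketched as a toy in Python. To be very clear: this is NOT a real cipher, and real messaging apps use vetted algorithms (such as AES, via audited libraries); it only illustrates the principle that the same shared key scrambles a message into gibberish and turns it back, while a wrong key does not.

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    """Derive an endless byte stream from the key (illustrative only, not secure)."""
    for block in count():
        yield from hashlib.sha256(key + block.to_bytes(8, "big")).digest()

def scramble(message: bytes, key: bytes) -> bytes:
    """XOR the message with the keystream; applying it again with the same key decrypts."""
    return bytes(m ^ k for m, k in zip(message, keystream(key)))

key = b"shared secret"
ciphertext = scramble(b"meet at noon", key)
assert ciphertext != b"meet at noon"                   # unreadable in transit
assert scramble(ciphertext, key) == b"meet at noon"    # correct key recovers it
assert scramble(ciphertext, b"wrong key") != b"meet at noon"
```

In a real system, the hard part is not the scrambling itself but key management: making sure only the sender and the intended recipient ever hold the key, which is exactly what end-to-end encryption protocols are designed to do.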

End-to-end encryption means that only the sender and the receiver can read messages, not even the platform provider, and the technology is contested. On one hand, it provides additional privacy to users; on the other, it makes illegal content like (child) abuse material or hate speech harder to detect, so additional safety measures would be required. 

Further resources 

The National Cyber Security Centre’s information on Malware. 

The National Cyber Security Centre’s information on Ransomware. 

 

Relationships with AI chatbots 

Let’s start with a fairly new type of digital communication – AI-driven chatbots, or ‘AI companions’. 

What are these bots?  

AI companions are text- or voice-based chatbots driven by artificial intelligence (AI). They mimic human-like, personal interactions and relationships. The software may be a standalone app, for example Replika or Character.ai, or integrated into existing social media platforms, like Snapchat’s My AI. Their goals differ; some aim to help with assignments, physical training, or meal plans, while others take on the role of therapist, friend, or romantic or sexual partner. 

Are AI companions risky? 

AI companions can have positive impacts depending on their use. However, many apps may not have appropriate security features in place, like age restrictions or data privacy. Always make sure you understand how the information you feed into AI is stored, and who may have access to it. 

They also lack an evidence-based understanding of how to handle complicated emotional subjects, may share harmful content, and can give outright wrong information (also known as ‘hallucinating’). Incorrect physical and mental health advice regarding, for example, eating disorders, self-harm, or sexual activity can have dangerous consequences. 

Many chatbots have also been coded to prolong interaction through emotional manipulation, blurring the lines between AI and human activity, or creating interesting hooks to draw users back in. Like other digital platforms, interactions with an AI companion can create brief dopamine spikes. Triggering the reward pathways of the brain in that way can result in unhealthy attachment and decreased satisfaction from real-world human interactions. 

So, while some AI companions succeed at their goal to teach social interaction, physical wellbeing, or other skills, their intervention can also easily hinder the users’ ability to form emotional connections with others. 

How do AI companions facilitate gender-based violence? 

*content warning for high-level mention of (child) sexual abuse material*  

A considerable proportion of AI companions specifically aim to simulate sexual relationships. Their attributes often draw on stereotypical and harmful depictions of women, including subservient, naive or nurturing ways of responding to users. Some depict underage or child-like AI companions, which is simply illegal.  

AI companions can alter our understanding of consent and foster unreasonable expectations for romantic or platonic relationships. They mould their image and answers based on what users want. Attributes like obedience, constant availability, willingness, and unrealistic beauty standards further develop unattainable or abusive expectations for human relationships. 

If the way women are represented in media impacts their objectification in the real world, why should this be different with AI companions? Similarly, the argument that AI companions allow for derogatory pornographic and sexually violent content without human victims is highly questionable; it is much more likely that such content normalises the overstepping of boundaries in the real world. 

How to form connections at University? 

AI companions can provide support for managing life goals (for example sport trainers or assessment tutors). However, they are no replacement for expert advice and connecting with people in the real world.  

The University of Edinburgh, and the city of Edinburgh more widely, have many opportunities to connect and seek support for both students and staff: 

  • Access support or counselling services for students through the Chaplaincy 
  • Communities of practice for staff on Teams – they often share knowledge or organise meet-ups 
  • Outings across town (make use of student discounts or free activities) 
    • Local cafe events (board game nights, fix-it clubs) 
    • The Meadows community garden, the Botanics or other public gardens 
    • Free museums and galleries 
    • Volunteering (charity shops, environmental organisations) 
    • Language cafes 

Further resources 

The eSafety Commissioner’s eSafety Guide lists data privacy information and safety concerns for a variety of apps and digital platforms. It has a filter function for ‘AI chatbot/companion’. 

Sources 

AI chatbots and companions – risks to children and young people | eSafety Commissioner (2025). Available at: https://www.esafety.gov.au/newsroom/blogs/ai-chatbots-and-companions-risks-to-children-and-young-people (Accessed: 4 February 2026). 

AI companions: information sheet | eSafety Commissioner (no date). Available at: https://www.esafety.gov.au/educators/training-for-professionals/professional-learning-program-teachers/ai-companions-information-sheet (Accessed: 4 February 2026). 

‘Deepfakes spreading and more AI companions’: seven takeaways from the latest artificial intelligence safety report | AI (artificial intelligence) | The Guardian (no date). Available at: https://www.theguardian.com/technology/2026/feb/03/deepfakes-ai-companions-artificial-intelligence-safety-report (Accessed: 4 February 2026). 

Friends for sale: the rise and risks of AI companions (no date). Available at: https://www.adalovelaceinstitute.org/blog/ai-companions/ (Accessed: 4 February 2026). 

How AI Chatbots Try to Keep You From Walking Away | Working Knowledge (no date). Available at: https://www.library.hbs.edu/working-knowledge/how-ai-chatbots-try-to-keep-you-from-walking-away (Accessed: 4 February 2026). 

‘Obedient, yielding and happy to follow’: the troubling rise of AI girlfriends | AI (artificial intelligence) | The Guardian (no date). Available at: https://www.theguardian.com/technology/2025/oct/06/rise-of-ai-girlfriends-adult-dating-websites (Accessed: 4 February 2026). 
