Digital Research Conference 2026 at EFI: From Big Questions to Responsible Practice

[Image: conference stage set with grey lounges and a large screen above, displaying "Digital Research Conference".]

Summary

On 26 February 2026, Digital Research Services held the Digital Research Conference at the Edinburgh Futures Institute (EFI). The day mixed short talks, panel discussions, posters, and an expo—focused on how digital tools (especially AI) are changing research, and how to use them responsibly.

On Thursday, 26 February 2026, the University of Edinburgh’s Digital Research Services welcomed colleagues from across disciplines to the Edinburgh Futures Institute (EFI) for the Second Annual Digital Research Conference—a day that made one thing clear: digital research is no longer a niche specialism, but the connective tissue running through modern scholarship.

From the moment attendees arrived—badges on, coffee in the atrium—the tone was set: practical, collaborative, and openly curious. EFI’s main space was transformed into a relaxed “studio” setting: sofas and low tables on stage, large projection screens above, plants framing the discussion area, and a steady flow of questions from the audience. The format worked: it invited conversation rather than performance, and it suited the conference’s central theme—how we do rigorous, ethical research when the tools (and risks) are evolving quickly.

A day built around the full digital research lifecycle

The agenda moved briskly through the day, reflecting the breadth of digital research support and practice—from methods and tooling, to integrity and governance, to sustainability. Sessions spanned:

  • Interdisciplinary Digital Research: From Humanities to Medicine
  • AI in Research: Promise, Pitfalls & Practice
  • Keynote on organising knowledge and challenge-led research
  • Digital Research Ambassadors Programme and panel discussion
  • Ethics, Security & Integrity in Digital Research
  • Poster browsing and Digital Research Expo
  • Green Digital Research Practices & Sustainability
  • Networking reception

That structure mattered. Rather than treating “AI” as a standalone topic, the conference repeatedly positioned it inside the wider ecosystem of research design, data stewardship, reproducibility, collaboration, and impact.

Interdisciplinarity in practice, not in slogan

A recurring thread across sessions was that interdisciplinarity is hard work—but worth it. Slides captured this candidly through reflections on the realities of cross-disciplinary collaboration: terminology clashes, different ideas of what “good evidence” looks like, and the challenge of translating methods between fields. Yet those same reflections also highlighted what interdisciplinary teams gain: clearer framing, better questions, and a more grounded sense of what technology can and can’t do in context.

The Digital Research Ambassadors segment reinforced this message: impact often happens when people take methods learned in one context and adapt them respectfully—pairing technical capability with domain insight and lived experience.

AI in research: acceleration—with new responsibilities

If there was a single “energy centre” to the day, it was the set of talks exploring AI’s role in research workflows. The mood was neither hype nor fear—it was pragmatic: what can we do now, what should we not do, and how do we stay methodologically honest?

Several presentations tackled AI as a research assistant rather than an authority. One memorable framing described LLMs as a “panel of fallible coders”—useful, but requiring supervision, validation, and scepticism. Rather than aiming for blind automation, speakers emphasised:

  • treating models as assistive tools (structure, drafting, code translation, documentation),
  • maintaining human oversight,
  • and avoiding the use of sensitive data in systems not designed for it.

Another practical theme was the rise of rapid prototyping—moving from a “what if?” idea to bespoke tools (e.g., text extraction pipelines, categorisers, analysis helpers). The promise is real: barriers to building research software are falling. The risk is equally real: it becomes easier to generate results faster than we can meaningfully verify them.

Sensitive data, open science, and the “research paradox”

One of the most thought-provoking sections of the conference explored a problem many researchers will recognise immediately:

  • Rigorous science requires transparency and reproducibility
  • But in many domains, raw data cannot be shared

This tension came into sharp focus in work on animal health and disease modelling—where understanding movement networks and outbreak spread depends on highly sensitive datasets. Slides asked bluntly: what counts as sensitive data? Commercial information, location records, identifiable movement patterns, and datasets restricted under GDPR-like constraints were all part of the discussion.

Researchers shared approaches that try to resolve this, including anonymisation via simulation and synthetic datasets designed to preserve key structural properties (movement frequency, regional density, network structure) while removing traceability. The emphasis wasn’t just technical—it was ethical: how to enable scrutiny and reuse without exposing individuals, farms, organisations, or communities to harm.

Ethics, privacy, and integrity: governance as research method

The Ethics, Security & Integrity session made it clear that governance is not an afterthought—it’s part of method design.

Talks explored privacy as a legal and social concept (from “the right to be let alone” to modern human-rights frameworks), and highlighted real limitations in current regulatory systems—especially in automated public administration, where power asymmetries and algorithmic opacity can make rights difficult to exercise in practice.

A particularly striking contribution addressed auditing research integrity when data must remain private, proposing “zero-knowledge” style governance approaches: auditability without direct inspection of underlying sensitive datasets. The point wasn’t jargon—it was a response to a growing reality in digital research: we increasingly need mechanisms for trustworthy verification in environments where sharing everything openly is impossible.

The expo atmosphere: skills, support, and community

Away from the stage, the conference also had a strong “hands-on support” feel. Photos from the expo spaces showed stalls and screens advertising training and guidance—including the Digital Skills Programme and Digital Curation Centre presence—alongside friendly conversation and the kind of informal troubleshooting that often becomes the most valuable part of a conference day.

That balance—between high-level questions and practical help—felt like a signature strength of Digital Research Services: not just presenting ideas, but enabling people to apply them.

What attendees took away

Across disciplines and sessions, a few shared takeaways stood out:

  1. Reflexivity matters as much as replicability. Scaling computation can amplify hidden assumptions; good research still requires interpretive care.
  2. AI can accelerate research, but not outsource responsibility. Supervision, validation, and documentation remain central.
  3. Privacy and openness must be engineered together. Synthetic data, simulation, and privacy-preserving audits are becoming core research skills.
  4. Interdisciplinary work needs translation infrastructure. Shared definitions, agreed standards, and time to negotiate meaning are not “overhead”—they’re the work.

Closing note

Held in EFI’s deliberately collaborative environment, the Digital Research Conference felt like a snapshot of where the University—and the sector—now sits: enthusiastic about what digital methods make possible, but increasingly serious about robustness, safety, and sustainability.

The message from the day wasn’t “use more technology.” It was: use technology well—openly where you can, carefully where you must, and always with methods you can explain.

Thank you, Digital Research Services, for hosting this very special event.
