In this workshop, we focused on law, regulation and policy in relation to the media and PSM, asking:

  • What are the key legal and regulatory challenges for public service media in relation to artificial intelligence?
  • What are the open questions and what research and innovation is needed?
  • What difficulties are PSM legal teams facing on the ground?

Four main themes emerged from our conversations:

Laws and Regulation Workshop Output Graphic

The legal and regulatory environment around AI and data is changing. The European Commission (EC) has recently proposed a draft AI Regulation, and a new Data Act is expected by December 2021. In the UK, the government is consulting on a new National Data Strategy and data protection regime. Like all institutions, PSM will have to adapt.

PSM are committed to respecting fundamental rights (e.g. to privacy, freedom of expression and information) in their use of AI and must ensure their use of AI – and that of third parties they interact with – helps them fulfil their public service remits. Organisations such as the BBC have already developed AI strategies, but a rapidly shifting environment means there is a continuing need to adapt. The following themes emerged from our discussions:

  1. Interpreting draft AI regulations and understanding their impact on PSM

  • It is unclear which uses of AI in media might be considered high-risk, under which conditions and in relation to which technologies (e.g. biometrics, recommender systems, voice)

  • Research could be designed to work out what positive interventions might be needed to balance out the current ‘negative’ risk-based approach

  • New forms of partnership and mechanisms for collaboration between PSM could be designed for researching and using AI, both to build capacity and to address potential requirements to share with third parties.

  • There is a need to reflect specifically on the role of public sector obligations, and to consider whether and which elements of AI development by PSM are, or should be, part of their public service activities.

  • Sandboxing may be a fruitful avenue for creating candid and experimental spaces to proactively explore these issues

  2. Bringing end users/audiences into the conversation

  • Audiences and citizens are often overlooked in this space

  • Research on and with audiences/users can help us understand perceptions of AI and provide input for AI solutions

  • There is a need to look beyond PSM and into the supply chain when contracting with commercial AI companies: audiences may not be aware of, or able to consent to, their data or content being used, and commercial companies may outsource parts of their pipeline to unethical third parties

  • There may be an important role here for Human-Computer Interaction in intelligible interface design, both for practitioners (e.g. PSM journalists) and for end users

  3. Ensuring AI legislation is workable/usable in practice and not overly onerous on industry

  • AI regulation and definitions can often be so complex that AI innovation and use in PSM are limited on the ground

  • Procurement and contracting processes around AI need to be suited to PSM needs so as not to delay or stifle innovation

  • PSM are all having to define terminology and creatively devise new rules to deal with machine learning models, data and algorithms; they need ways to share knowledge and best practice, and to create leverage when dealing with big tech players

  4. Dealing with the power imbalance between AI tech giants and PSM

  • PSM hold a different position from commercial operators and have obligations around equality and transparency that can be difficult to impose when contracting with private companies

  • There is an asymmetric relationship between big tech platforms, which hold the technical skills, and PSM, which hold large quantities of data (often from users)

  • Contracting with tech giants creates the possibility of value exchange, which needs to be explored further, including non-financial notions of value

  • There is a need to consider what can be done to help ensure legislation doesn’t unfairly impact PSM compared to commercial organisations

Post-it notes of workshop outcomes

Next steps:

Building a network

We will continue to bring researchers into conversation with industry to build a network of experts interested in ensuring AI works in the public interest in media and journalism. If you want to be kept in the loop, please contact: Bronwyn.jones@ed.ac.uk

Constructing a research agenda

We will take the insights from this workshop and build a mission-driven research agenda – identifying funding opportunities and scoping out work packages that address these pressing issues of societal significance.