In this blog, Dr Ine Steenmans and Prof Joanna Chataway, from the UCL Department of Science, Technology, Engineering and Public Policy (UCL STEaPP), offer their initial position on the discussions at the first UCL Digital Ethics Forum.
Dr Ine Steenmans is a Research Associate in ‘Foresight and Futures’. She joined UCL in early 2017, and her work aims to develop UCL’s role as a global centre of excellence in foresight for public policy.
Prof Joanna Chataway is Head of the UCL Department of Science, Technology, Engineering and Public Policy (UCL STEaPP).
This is a wonderful initiative and very timely. Four observations are offered here for a UCL digital ethics vision. These reflect ongoing work within the Department of Science, Technology, Engineering and Public Policy (STEaPP) and could be areas of interest for further exploration in the Digital Ethics Forum, or could support the UCL digital strategy report. The first two points are more substantive in flavour and capture some of the issues likely to be faced in developing a vision connecting ethics and digital policy. The last two are more procedural and reflect thoughts on the ways we might work to engage with these issues.
A. Challenging the logic of neutrality
Thinking about the ethical dimensions of technology policy has sometimes framed technology as neutral. The challenge has therefore sometimes been framed as one of using good judgment and evidence to decide how technologies should be used. When it comes to the development of policy to guide their use, a similar logic has applied. STEaPP research and policy engagement aims to go beyond that logic, exploring how social and physical technologies are co-created and the role that public policy plays in that co-creation.
B. Beyond neutrality: Moving towards socio-technical knowledge for policy
Whilst the use of evidence for policy development and regulation of technologies sounds harmless (and good judgment and policy related to technology use are surely needed), many of us have thought for a long time that the narrative around technologies as neutral artefacts is hugely inadequate as a way of describing the complex ways in which humans and technologies shape each other. If that has always been true to some extent, new developments in AI, the internet of things and other aspects of digital innovation make it all the more so. The pace, scale and scope of these transformations make the formulation of policies that engage with these issues particularly difficult. Throughout their development, technologies embody ethical and normative decisions, and they are subject to path dependencies that influence the direction of innovation from the early stages of development. Our thinking about ethics and policy for digital technologies therefore has to come at all stages of development and from multiple angles.
Within STEaPP we have a ‘Digital Technologies Policy Laboratory’ that works collaboratively, through a number of projects, with the policy community nationally and internationally to develop an understanding of the complex ways in which the social and the technical are interwoven, and shape each other, at all stages of technical development and use. The aim is to be better able to respond to the challenges and opportunities of emerging technologies.
C. Interdisciplinarity
To understand how social and physical technologies shape each other, we need deep interdisciplinarity. We need to work across departments, subjects and intellectual traditions, drawing on different skills, competencies and insights. We have such huge potential to do that at UCL, and the Digital Ethics Forum is a great initiative to facilitate some of these conversations.
D. Knowledge from practice
A digital ethics vision will also need to draw on many different types of knowledge: not just that which we have here at UCL and in other universities, but also policy practitioner knowledge and the knowledge of those working directly in technology development. There are a number of reasons for this. The type of knowledge produced in different domains differs, and the means by which knowledge is generated also vary significantly. Not all knowledge is codified, for example. Tacit knowledge and learning by doing and using are very powerful factors in the development of digital technologies. Academic research is enhanced by acknowledging and drawing on this variety of knowledge sources.
There are multiple research projects and work streams at STEaPP that explicitly explore the practical ways by which knowledge is generated not just for practice, but also with practice. Examples include a project with the insurance sector exploring the changing nature of long-term risk from increasing digital connectivity; a project in partnership with regulators on the development of personalised medicine; a project with homeowners on the issues of informed consent when using digital devices; a project with law enforcement agencies on issues of gender bias within digital devices; and a project with a standards body on governance frameworks for managing data protection. STEaPP also hosts part of the national ‘PETRAS’ hub, which brings together researchers, technology practitioners and policy officials for collaborative reflection on issues of privacy, ethics, trust, reliability, acceptability, and security of the Internet of Things.