ThinkBIG Workshop (Day 1)

Ethical and Social Challenges posed by Artificial Intelligence


Date/Time: 25th June 2018, 9:30am – 6:00pm

Location: Cumberland Lodge, The Great Park, Windsor SL4 2HP

Organisers: Christopher Burr and Nello Cristianini


The rapid adoption of AI and data technologies in everyday life has created new opportunities, but also risks that were not previously considered. Among them are concerns about privacy, fairness, transparency, and accountability, which arise from devolving many consequential decisions to machines. There are also concerns about public opinion, employment, and individual autonomy and wellbeing, particularly when those decisions are made on the basis of personal data.

In this workshop we will review the main challenges, with an eye to identifying positive contributions that we can make, whether technical or conceptual. Among the questions that will be asked and debated are: “can machine decisions ever be unbiased?”, “are we entitled to an explanation?”, “how do we embrace the benefits of AI without also accepting the risks?”, and “how can policy makers regulate a fast-moving technical field?”


Programme

  • 9:20 – 9:50: Coffee & Registration
  • 9:50 – 10:20: Nello Cristianini (Professor of Artificial Intelligence, University of Bristol) — “Introduction – Artificial Intelligence and Media”
  • 10:20 – 11:00: James Pennebaker (Regents Centennial Professor of Psychology, University of Texas) — “Using everyday words to understand people’s psychological states”
  • 11:00 – 11:40: David Stillwell @david_stillwell (Lecturer in Big Data Analytics and Quantitative Social Science, Judge Business School, University of Cambridge) — “How should organisations use data-driven psychological predictions ethically?”
  • 11:40 – 12:00: Coffee Break
  • 12:00 – 12:40: Karen Yeung (Interdisciplinary Professorial Fellow in Law, Ethics and Informatics, Birmingham Law School & School of Computer Science) — “Algorithmic Government: Towards a New Public Analytics?”: In this paper, I will argue that we are witnessing the emergence of a ‘New Public Analytics’ (NPA) within public administration: a reform movement within the public sector that seeks to harness the power of data in order to render the delivery of public services more efficient, accurate and timely. In particular, I will suggest that this incipient movement can be understood as the conceptual and ideological successor to the ‘New Public Management’ in the administration of government, in so far as both movements share a common concern to drive improvements in public administration by mimicking techniques that have proved successful in the commercial sphere. However, based on a pilot study of selected sites in which British government administrators are seeking to embrace these techniques, I will suggest that at present NPA in UK public administration remains very much at the stage of aspiration rather than achievement, with realisation and implementation lagging far behind the data-driven vision, and a wide gap between rhetoric and reality.
  • 12:40 – 13:20: Lina Dencik @LinaDencik (Senior Lecturer and Co-Founder of the Data Justice Lab, School of Journalism, Media and Culture, Cardiff University) — “A social justice approach to datafication”
  • 13:20 – 14:20: Lunch
  • 14:20 – 14:50: Teresa Scantamburlo (Postdoctoral Researcher in Data Studies, University of Bristol) — “Machine Justice”
  • 14:50 – 15:20: Christopher Burr (Postdoctoral Researcher in Data Studies, University of Bristol) — “Interactions Between Intelligent Software Agents and Human Users”
  • 15:20 – 16:00: Karina Vold (Research Associate, Leverhulme Centre for the Future of Intelligence, University of Cambridge) — “AI and us – who should be nudging whom?” [co-authored with Huw Price (Bertrand Russell Professor of Philosophy, University of Cambridge)]: About a decade ago, the concept of ‘nudging’ was popularised by behavioural scientists and economists including Richard Thaler and Cass Sunstein (2008). Nudge theory proposes ways in which behaviour can be indirectly influenced by altering the environment, or choice architecture, usually to trigger a desired behavioural outcome by exploiting our natural cognitive biases. The idea is, in a sense, nothing new: advertisers have long known that by hijacking our attention through pictures and words they can influence our decision-making. What is new, however, is the variety of ways in which this can now be done, for example by manipulating our search results, through suggestive search engines, purchasing recommendations, targeted advertisements, and even by integrating advertising into our social media feeds. Moreover, by using algorithms that operate on big data analysis, governments, corporations, and other institutions now have the capacity to tailor nudges to each individual. These technologies raise a host of thorny new ethical questions about consent, privacy and manipulation. In this talk I will examine the ethics of the nudging effects of AI systems on human behaviour (e.g. the influence of recommendations), as well as how humans might in turn nudge these AI systems to achieve more desirable outcomes.
  • 16:00 – 16:20: Coffee Break
  • 16:20 – 17:10: Panel Discussion (1) with James Ladyman (Professor of Philosophy, University of Bristol), James Pennebaker, David Stillwell, and Karina Vold (Chair: Christopher Burr) — “Artificial Intelligence and the Individual”
  • 17:10 – 18:00: Panel Discussion (2) with Andrew Charlesworth (Reader in IT and Law, University of Bristol), Lina Dencik, Marcello Pelillo (Professor of Computer Science, University of Venice), and Sana Kharegani (Office for Artificial Intelligence) (Chair: Teresa Scantamburlo) — “Artificial Intelligence and Society”


Registration details

Registration is now closed. If you would like to be notified of any tickets that become available, please email chris.burr@bristol.ac.uk.


Additional (Optional) Workshop:

A second workshop, on ‘Digital Humanities and Computational Social Sciences’ (26th June 2018), is also being run separately by the thinkBIG project. Please visit http://thinkbig.blogs.bristol.ac.uk/dh-css-workshop/ for further information.


How to reach Cumberland Lodge

Cumberland Lodge is located in Windsor Great Park, an ancient royal hunting forest and part of the Crown Estate (see on Google maps).

By air:
Cumberland Lodge is about a 20-minute taxi ride from Heathrow Airport. From Gatwick, take a train into London, travel via Waterloo to Egham (the closest station to Cumberland Lodge), and continue from there by taxi.

By rail:
From London Waterloo to Egham the journey time is approximately 37 minutes. From Reading and the west there is a direct service to Egham (Reading–Waterloo line). Egham station is 3.5 miles, or about 10 minutes by taxi, from Cumberland Lodge.

By road:
Cumberland Lodge can be reached via the M25, from the west via the M40 & M4, and from the south via the M3 & A30. More details are available on the Cumberland Lodge website.

Contact information:
Cumberland Lodge — email: enquiries@cumberlandlodge.ac.uk; phone: 01784 432316