As part of our mission for the ThinkBIG project, we want to engage in a conversation with the public about the role of AI in a data-driven society. Below are some examples of work aimed at stimulating this conversation.
In January, Charlotte was selected to take part in Bristol’s first Creative Reactions science-meets-art event for the Pint of Science Festival 2017. She was paired with Professor Nello Cristianini of the Faculty of Engineering at the University of Bristol to explore the rise of Artificial Intelligence and what it means for global societies.
During their meetings, they discussed how AI is not, at present, the fembot or Terminator of sci-fi movies; it exists in more abstract forms, namely sophisticated algorithms embedded in our everyday digital infrastructure.
However, answering the questions ‘Can we create AI?’ and ‘How intelligent can we go?’ currently lacks a critical dimension – a comprehensive mapping of AI’s implications for societies and a global infrastructure for its regulation. The social contracts that underpin our societies’ functioning are codified in our systems of law. But even as newer architectures of existence (and behaviour modification) overwrite the old, the checks and ethical guidelines of a digital age have not caught up.
Moreover, whilst “it might look like highly adaptive behaviour, that feels intelligent to us,” the agent has “no internal representation of why it does what it does”. Crucially, therefore, without an internalised code of ethics, AI implementation risks a blindness quite different from the blindfold of Justice’s impartiality.
This series of works plays with themes of blindness, erasure and legibility. I explore these notions via the shrouded figure of Justice, with the familiar figurative representation slowly re-written by digital circuitry. A binary system is inherently dual in its possibilities: one, zero, on, off. Here, the technological forms a ghostly, shadow double that both reassures and disturbs with its dystopian as well as utopian possibilities.