AI-2022 Forty-second SGAI International Conference on Artificial Intelligence
CAMBRIDGE, ENGLAND 13-15 DECEMBER 2022


Workshops

The first day of the conference, Tuesday 13th December, comprises a range of workshops. These events will be especially valuable to delegates who are currently considering introducing new AI technologies into their own organisations.

There will be four half-day workshops, and delegates are free to choose any combination of sessions to attend. The programme of workshops is shown below. Note that the first session starts at 11 a.m. to reduce the need for delegates to stay in Cambridge the previous night. There is a lunch break from 12.30 to 13.15, and refreshment breaks from 14.45 to 15.15 and from 16.45 to 17.00.

Workshops organiser: Professor Adrian Hopgood, University of Portsmouth, UK


Sessions 1 and 2 - Stream 1 (11.00-12.30 and 13.15-14.45 Lubbock Room)

To be announced

Sessions 1 and 2 - Stream 2 (11.00-12.30 and 13.15-14.45 Peterhouse Lecture Theatre)

User Preferences in Intelligent Systems

Chairs:
Prof. Juan Augusto and Dr Mark Springett, Middlesex University

Details to follow.


Sessions 3 and 4 - Stream 1 (15.15-16.45 and 17.00-18.30 Lubbock Room)

To be announced

Sessions 3 and 4 - Stream 2 (15.15-16.45 and 17.00-18.30 Peterhouse Lecture Theatre)

Explainable AI

Chairs:
Dr Mercedes Arguello Casteleiro, University of Southampton, and Dr Anne Liret, BT

Explainable AI (XAI) aims to enhance machine learning (ML) techniques in order to produce more explainable ML models that human users can understand and appropriately trust.

Part 1 - Dr Mercedes Arguello Casteleiro, University of Southampton
Deep Learning algorithms are considered black boxes: close examination by humans does not reveal the features used to generate a prediction. This part of the workshop will focus on explainable AI for Deep Learning in domains with abundant unlabelled text, such as biomedicine, and will demonstrate how to provide predictions (outcomes) with accompanying justifications (outcome explanations). The approach presented belongs to the emerging field of explainable active learning (XAL), which combines active learning (AL) with local explanations.
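
As a rough illustration of the general idea (and not material from the workshop itself), the sketch below combines uncertainty-based active learning with a simple coefficient-based local explanation on a small text-classification task. The dataset (20 Newsgroups), the scikit-learn model, and the explanation method are illustrative assumptions only, not the speakers' approach.

# A minimal sketch of explainable active learning (XAL) on text data.
# Assumptions (not from the workshop): scikit-learn, uncertainty sampling as the
# active-learning strategy, and a coefficient-based local explanation standing in
# for a dedicated local explainer.
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Small binary text task standing in for a domain with abundant unlabelled text.
data = fetch_20newsgroups(subset="train",
                          categories=["sci.med", "sci.space"],
                          remove=("headers", "footers", "quotes"))
vec = TfidfVectorizer(max_features=5000, stop_words="english")
X, y = vec.fit_transform(data.data), np.array(data.target)
feature_names = np.array(vec.get_feature_names_out())

rng = np.random.default_rng(0)
labelled = list(rng.choice(len(y), size=20, replace=False))   # tiny seed set
pool = [i for i in range(len(y)) if i not in set(labelled)]

clf = LogisticRegression(max_iter=1000)
for _ in range(5):  # a few active-learning rounds
    clf.fit(X[labelled], y[labelled])
    # Uncertainty sampling: query the pool examples closest to p = 0.5.
    proba = clf.predict_proba(X[pool])[:, 1]
    query = [pool[i] for i in np.argsort(np.abs(proba - 0.5))[:10]]
    labelled += query            # oracle labels (here: the ground truth in y)
    pool = [i for i in pool if i not in set(query)]

def explain(i, k=5):
    """Outcome plus a local justification: the k terms in document i that
    contribute most to the predicted class (coefficient * tf-idf value)."""
    x = X[i].toarray().ravel()
    pred = clf.predict(X[i])[0]
    sign = 1 if pred == 1 else -1
    contrib = sign * clf.coef_[0] * x
    top = np.argsort(contrib)[::-1][:k]
    return data.target_names[pred], list(zip(feature_names[top], contrib[top]))

label, justification = explain(pool[0])
print("prediction:", label)
print("justification:", justification)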

Part 2 - Dr Anne Liret, BT
TBC

Part 3 - Dr Frederic Stahl, Dr Christoph Tholen, and Dr Mattis Wolf, DFKI: German Research Center for Artificial Intelligence
TBC

