As digital technology permeates every area of modern life, we risk becoming over-dependent on complex systems that operate opaquely and may exhibit emergent properties that adversely affect their users or their wider environment. This is particularly true as developers increasingly rely on AI and ML techniques to define system behaviour when the problem space is too complex or poorly understood for human developers to specify that behaviour explicitly. In other words, we are tackling incompletely understood problems by developing systems whose behaviour and wider impact are, by necessity, also incompletely understood. This trend, largely enabled by an abundance of data harvested from (e.g.) mobile devices, sensors and social media, is radically changing how systems are developed and used. We need a new approach to software engineering that: (i) places greater emphasis on making explicit the risks of unintended behaviour in innovative new software products, whether arising from limitations in our understanding of the envisioned product's behaviour or from misuse, and (ii) actively supports explainability of the behaviour exhibited by the running system. Twenty20Insight is an interdisciplinary project bringing together academic experts in Software Engineering (SE), Requirements Engineering (RE), Design Thinking and ML to help system stakeholders and developers understand and reason about the impact of intelligent systems on the world in which they operate.
QuantUn’s contribution is a novel technique for explicitly quantifying uncertainty to support decision-making in self-adaptive systems (SAS). The project uses Bayesian surprises as the basis for quantitative analysis that measures degrees of uncertainty and deviations of a SAS from its expected behaviour. When a surprise occurs, the system may either adapt accordingly or flag that an abnormal situation is happening.
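To make the idea of a Bayesian surprise concrete, the following is a minimal illustrative sketch, not the project's actual method. It assumes a simple Beta-Bernoulli model of a monitored property (e.g. the success rate of a service the SAS depends on) and measures surprise as the KL divergence between the posterior and prior beliefs after observing a window of outcomes; the `THRESHOLD` value is a hypothetical tuning parameter.

```python
import math

def digamma(x):
    """Digamma function via recurrence plus an asymptotic expansion."""
    r = 0.0
    while x < 6:
        r -= 1.0 / x
        x += 1
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1/12 - f * (1/120 - f / 252))

def kl_beta(a1, b1, a2, b2):
    """KL divergence KL(Beta(a1, b1) || Beta(a2, b2)) in nats."""
    lg = math.lgamma
    ln_beta = lambda a, b: lg(a) + lg(b) - lg(a + b)
    return (ln_beta(a2, b2) - ln_beta(a1, b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 - a1 + b2 - b1) * digamma(a1 + b1))

def bayesian_surprise(prior_a, prior_b, successes, failures):
    """Surprise = KL(posterior || prior) after a Beta-Bernoulli update."""
    post_a, post_b = prior_a + successes, prior_b + failures
    return kl_beta(post_a, post_b, prior_a, prior_b)

# Prior belief: the service succeeds about 90% of the time -> Beta(9, 1).
prior_a, prior_b = 9.0, 1.0

# A window of observations matching expectations -> small surprise.
low = bayesian_surprise(prior_a, prior_b, successes=18, failures=2)
# A window with many failures -> large surprise: the SAS could adapt
# (e.g. switch to a backup service) or flag an abnormal situation.
high = bayesian_surprise(prior_a, prior_b, successes=5, failures=15)

THRESHOLD = 1.0  # illustrative surprise threshold, in nats
print(low < THRESHOLD, high > THRESHOLD)  # -> True True
```

Using KL(posterior || prior) means surprise is large only when observations substantially shift the system's beliefs, so routine noise within expected behaviour does not trigger spurious adaptations.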