II Ethical Subproject

Description
With regard to solidarity, there is a deep-rooted notion that solidarity presupposes at least some degree of insecurity. The ethical subproject takes up this idea by interrogating existing normative discourses on the relationship between solidarity and degrees of certainty.

  • The first task addresses the epistemological question of the extent to which acts of giving in solidarity are challenged by the promise of new, AI-generated degrees of certainty.
  • A second task is to elaborate in detail the extent to which the clinical use of AI calls trust in institutions into question, trust being itself a prerequisite for solidarity.
  • The third task is to examine the changing forms of individual and collective controllability in times of the clinical use of AI. We already have a more or less sharp, culturally handed-down idea of individual possibilities of control (such as the right not to know, claims to transparency, and questions of responsibility and liability), which are necessary prerequisites not only for making free decisions, but also for deciding under which conditions giving in solidarity is an expression of individual freedom. An important question will be how these forms of controllability are challenged when they must cope not only with degrees of uncertainty, but also with the idea of a growing corpus of (postulated) certainty. While this is complex in itself, matters become even more complicated once we consider modes of collective controllability and their embedding in more or less sharp concepts of spatiality and temporality.

Responsible

Dr. Matthias Braun

Project Manager (Ethical Subproject)

Friedrich-Alexander-Universität Erlangen-Nürnberg
Chair of Systematic Theology II (Ethics)

Max Tretter

Research Assistant (Ethical Subproject)

Friedrich-Alexander-Universität Erlangen-Nürnberg
Chair of Systematic Theology II (Ethics)