1
van Kersbergen K, Tinggaard Svendsen G. Social trust and public digitalization. AI & SOCIETY 2022. [DOI: 10.1007/s00146-022-01570-4]
2
Salmi J. A democratic way of controlling artificial general intelligence. AI & SOCIETY 2022. [DOI: 10.1007/s00146-022-01426-x]
Abstract
The problem of controlling an artificial general intelligence (AGI) has fascinated both scientists and science-fiction writers for centuries. Today that problem is becoming more pressing, because the moment when a superhuman intelligence exists among us lies within the foreseeable future: current average estimates place it before 2060, and some as early as 2040. The arrival of the first AGI might trigger a series of events unlike anything seen before: the rapid development of even more powerful AGIs by the AGIs themselves. This has wide-ranging implications for society, and it must therefore be studied well before it happens. In this paper we discuss the problem of limiting the risks posed by the advent of AGIs. In a thought experiment, we propose an AGI that has enough human-like properties to act in a democratic society while still retaining its essential artificial general intelligence properties. We discuss ways of arranging the coexistence of humans and such AGIs through a democratic system of coordination. If successful, such a system could be used to manage a society consisting of both AGIs and humans. A democratic system in which each member of society is represented at the highest level of decision-making guarantees that even minorities can have their voices heard. The unpredictability of the AGI era makes it necessary to consider the possibility that a population of autonomous AGIs could turn us humans into a minority.