Abstract
This article discusses the legal implications of a novel phenomenon, namely, digital reincarnations of deceased persons, sometimes known as post-mortem avatars, deepfakes, replicas, holographs, or chatbots. To avoid this multiplicity of names, we use the term 'ghostbots'. The piece is an early attempt to discuss the potential social and individual harms arising from ghostbots, roughly grouped around notions of privacy (including post-mortem privacy), property, personal data and reputation, how they are currently regulated, and whether further regulation is needed. For reasons of space and focus, the article does not deal with copyright implications, fraud, consumer protection, tort, product liability, and pornography laws, including the non-consensual use of intimate images ('revenge porn'). This paper focuses on law, although we fully acknowledge and refer to the role of philosophy and ethics in this domain.
We canvass two interesting legal developments with implications for ghostbots, namely, the proposed EU Artificial Intelligence (AI) Act and the 2021 New York law amending publicity rights to protect the rights of celebrities whose personality is used in post-mortem 'replicas'. The latter especially evidences a remarkable shift from the norm we have chronicled in previous articles of no respect for post-mortem privacy to a growing recognition that personality rights do need protection post-mortem in a world where pop stars and actors are routinely re-created using AI. While the legislative motivation here may still be primarily to protect economic interests, we argue it also shows a concern for dignitary and privacy interests.
Given the apparent concern for the appropriation of personality post-mortem, possibly in defiance or ignorance of what the deceased would have wished, we propose an early solution to regulate the rise of ghostbots, namely an enforceable ‘do not bot me’ clause in analogue or digital wills.
| Original language | English |
| --- | --- |
| Article number | 105791 |
| Number of pages | 12 |
| Journal | Computer Law and Security Review |
| Volume | 48 |
| Early online date | 17 Feb 2023 |
| DOIs | |
| Publication status | Published - Apr 2023 |
Bibliographical note
Copyright © 2023 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (https://creativecommons.org/licenses/by/4.0/).

Funding: This paper is a result of the "Emerging Technologies, Privacy Law and the Dead" workshop in April 2021, funded by the Modern Law Review. The workshop was facilitated by members of the Leverhulme-funded research group "Modern Technologies, Privacy Law and the Dead", grant number: RPG 2020-048.
Keywords
- Deepfakes
- Digital legacy
- Digital remains
- Ghostbots
- Post-mortem privacy