# Regulating Unreality: From Fake News to Deep Fakes
"Deepfakes" - or the use of AI to convincingly simulate or synthesis content, voice, images or video for malicious purposes (Nesta, 2018) - has become prominent recently, perhaps most obviously as a means to create realistic but fake pornography involving celebrities or particular victims.
As such, there has already been some discussion regarding whether these "unreal" products constitute criminal material such as revenge porn (better referred to as nonconsensual sharing of private or sexual images) or images of child sexual abuse ("child pornography") (Chesney and Citron, 2018).
Its implications are, however, far greater. Techniques to generate deepfakes are evolving in response to a parallel arms race in detection techniques, and may eventually result in a world where the problems currently being experienced with "fake news" expand to everything we see, hear and experience, not just the news we read.
Obvious areas where this may impact on the law include the law of evidence; the law of intellectual property, primarily copyright; and fraudulent misrepresentation and anti-consumer scams; but these will only be the start of a deluge when our world of reality inscrutably becomes a constructed and manipulated text. Should the right to know what is real be a new human right?
### Free public lecture
We welcome registrations from a diverse audience, including not only university researchers but also representatives from industry, local government, and third sector organisations.
1330 - 1400 - Arrival and Registration
1400 - 1500 - Lecture
1500 - 1630 - Refreshments and Networking
### About the Speaker
Professor Lilian Edwards is Professor of Law, Innovation and Society at Newcastle Law School, Newcastle University, and a frequent speaker on issues of Internet law, intellectual property and artificial intelligence. She is on the Advisory Boards of the Open Rights Group and the Foundation for Internet Privacy Research.