Channel 4 has sparked controversy and debate with a deepfake video of the Queen, which will air on Christmas Day as its alternative to the monarch’s traditional Christmas broadcast.
The broadcaster will show a five-minute video in which a digitally altered version of the Queen shares her thoughts on the year, including the departure of Prince Harry and Meghan Markle from their roles as senior members of the Royal Family and the Duke of York’s involvement with the disgraced financier Jeffrey Epstein.
The deepfake Queen, voiced by the actor Debra Stephenson, is also seen performing a TikTok-style dance routine.
Channel 4 said the show aimed to give a “stark warning” about the threat of fake news in the digital age, with its director of programmes Ian Katz describing the video as “a powerful reminder that we can no longer trust our own eyes”.
Some pundits have suggested that the show might trick audiences into believing deepfake technology is more commonly used than it is.
“We have yet to see deepfakes widely used, except to attack women,” said Sam Gregory, program director of Witness, an organization using video and technology to protect human rights. “We have to be very careful about making people think they can’t believe what they see. If you haven’t seen them before, it might make you think that deepfakes are a more common problem than they are,” he said.
“It’s okay to expose people to deepfakes, but we shouldn’t step up the rhetoric to pretend we’re surrounded by them.”
Areeq Chowdhury, a tech policy researcher who created deepfakes of Jeremy Corbyn and Boris Johnson during the 2019 general election, said he supported the move to highlight the impact of deepfakes, but that the technology was not a widespread threat to information sharing.
“The risk is that it becomes easier and easier to make deepfakes, and there is the obvious challenge of false information circulating, but also the threat that they undermine real video footage, which could be dismissed as deepfaked,” he said.
“My opinion is that we should generally be concerned about this technology, but the main problem with deepfakes today is their use in non-consensual deepfake pornography, rather than news.”
Deepfakes expert Henry Ajder said: “I think in this case the video is not realistic enough to be of concern, but adding disclaimers before a deepfake video is released, or adding a watermark so that it can’t be cropped and altered, can help deliver such videos responsibly.
“As a society, we need to determine what uses for deepfakes we find acceptable and how we can navigate a future where synthetic media is becoming increasingly important in our lives. Channel 4 should encourage best practice.”