The Capture: Deep Learning and Deep Fakes
Forget artificial intelligence—in the brave new world of big data, it’s artificial idiocy we should be looking out for. —Tom Chatfield, British author and tech philosopher
China has the reputation of being the most CCTV-monitored country in the world, with over 600 million cameras in public places. With a population of 1.4 billion, that is almost one camera for every two people. The pro-consumer website Comparitech rates London as the most surveilled city in Europe, with 40 per cent more cameras per person than Beijing.
Australia, so far, is relatively CCTV-free, with merely 24,000 cameras in Sydney and 11,000 in Melbourne. But a recent news story uncovered more than 900 pieces of surveillance equipment, built by companies linked to the Chinese government, installed in Commonwealth government buildings. Following the precedent set by the UK and US governments, and in the interest of national security, the Australian government is removing them all.
Joe Dolce appears in every Quadrant.
The Capture (2019) is a BBC One six-part thriller television series, focusing on the abuse and manipulation of cutting-edge deep-fake CCTV footage for personal and political ends. It was created by the British director-writer Ben Chanan, and stars Callum Turner, Ron Perlman and Holliday Grainger. It is currently streaming in Australia on Stan and Apple TV.
Chanan worked as a director on the series The Last Kingdom (2015; reviewed in Quadrant, October 2018) and The Missing (2016). Originally a documentary-maker, he interviewed many counter-terrorism operatives, discovering the importance of video surveillance evidence in the successful prosecution of cases. His intention with The Capture was to create a show with the tone of a 1970s conspiracy thriller, in the style of Three Days of the Condor (1975; reviewed in Quadrant, November 2018).
According to Hannah Smith and Katherine Mansted of the Australian Strategic Policy Institute:
A deep fake is a digital forgery created through Deep Learning (a subset of AI) … any technology that can be used to generate false or misleading content, from photocopiers and Photoshop software to deep fakes, can be weaponised.
In the series, ex-Lance-Corporal Shaun Emery (played by Callum Turner) has just had his conviction for a war crime, the murder of an unarmed Taliban fighter in Afghanistan, overturned. He has spent six months in prison but is now free, due to what appears to be manipulated video evidence. After a celebratory release party, he meets up with his barrister, Hannah Roberts, near a bus stop. They embrace. He says goodbye to her as she turns to board the bus. But she subsequently disappears, and local CCTV footage shows Emery attacking and kidnapping her.
DI Rachel Carey is put on the case and, with the assistance of facial recognition technology, identifies and arrests Emery. Emery claims he is innocent and the footage must have been tampered with, but Carey tells him that falsifying live CCTV is impossible. Nobody believes Emery, but nobody can find Roberts.
Carey decides to re-examine the footage from the scene but is told that crucial sections have now been redacted in the interests of national security. Emery is released on bail but later abducted off the street by masked gunmen and taken to an undisclosed location for interrogation. Frank Napier, a US intelligence officer, questions Emery, who continues to deny having any knowledge of his missing barrister.
An upper-level administrator from SO15, DSU Gemma Garland, is put in charge of the investigation, stopping Carey’s initiatives at every turn. Emery manages to escape from Napier. He finds a video-tech specialist named Marcus Levy who explains that “deep fake” techniques are commonly used to manipulate film but no method has yet been discovered for falsifying live CCTV. But he finally admits that it could be possible, using delayed feeds.
DS Patrick Flynn uncovers unredacted footage from the CCTV bus camera that proves Roberts was not abducted but boarded her bus on the evening of her disappearance. This evidence apparently exonerates Emery, but he is again taken hostage—this time by a group of anti-establishment activists.
One of them, Alma, tells him about the practice of “correction”, a covert intelligence technique in which CCTV is regularly manipulated by a rogue faction in the government in order to gain convictions in cases with insufficient evidence. Alma’s activist group was behind the bus stop deception, part of a plan to publicly expose this illegal practice of “correction”. Alma says Roberts has been a willing participant in the plan from the beginning.
Napier decides that, in order to shield the CIA, the faked abduction and death of Roberts must now become real. Shortly afterwards, her body is discovered in the boot of a car belonging to Emery.
Napier confronts Emery, warning him that they have compiled enough footage of him that can be “corrected” in any number of ways, including to show him abusing his own daughter, unless he confesses to the murder of Roberts. To protect his daughter, Emery signs the confession.
While in prison, Emery confides to his ex-girlfriend, Karen Merville, that he was guilty of killing the Afghan insurgent in cold blood: “Deep down, I knew he wasn’t reaching for his weapon. Just a bloke begging for his life.” He accepts now, in a twist of fate, that justice has been served.
How close are we to the kind of Big Brotherism depicted in The Capture in today’s world of surveillance?
UK police are employing live facial recognition via CCTV cameras mounted on patrol cars. South Wales Police are even testing facial recognition software on officers’ mobile phones. But the present technology has known flaws, including difficulty telling black people apart and even distinguishing one woman from another.
The UK human rights organisation Liberty is arguing for a ban on all police use of facial recognition software. They have successfully represented clients in legal challenges against South Wales Police.
In Australia, during the Covid-19 quarantines, New South Wales and Victoria trialled facial recognition software to help enforce movement-restriction laws. Reuters reported that police in Western Australia used it to put almost 100,000 people through home quarantine. However, the West Australian government banned the use of the data for any non-Covid matters.
Ray Kurzweil, an American inventor and futurist, has said:
Artificial intelligence will reach human levels by around 2029. Follow that out further to, say, 2045, we will have multiplied the intelligence, the human biological machine intelligence of our civilization a billion-fold.
Louisa Mellor, UK TV Editor for Den of Geek, wrote:
Real-time deep fakes are already here. Thanks in large part to a grimy appetite for deep fake pornography—in which celebrities’ faces are digitally mapped onto those of sex performers at work—we have the technology … deep faker cybercriminals have already successfully committed financial fraud by cloning voices and duping employees into believing they’re talking to their company director … a Hong Kong bank manager who transferred $35 million to fraudsters [did] just that in 2020.
Laura Martin of iNews quoted a senior military intelligence officer and security expert, Colonel Philip Ingram. Ingram served with the British Army for twenty-six years. He commanded the Military Intelligence Battalion and was part of the team that brought peace to Bosnia. He served operationally in Germany during the Cold War, and in Northern Ireland, Cyprus, North Macedonia, Croatia and Iraq. Ingram said:
The technology to do this certainly exists and with the growth of artificial intelligence it’s going to become easier and easier … I think in a short space of time it could become an issue. But there are programs that allow you to feed it digital imagery to see if it’s been manipulated. It then looks at the metadata behind the imagery and you can see what’s been done with it or if it’s been altered. You beat technology with technology. You don’t beat it by banning it, because then it will go underground.
Rotten Tomatoes, the popular film review site, gave The Capture an impressive 92 per cent rating based on audience response. Most press reviews of the series have been positive.
Sarah Hughes of the Guardian said, “Ben Chanan’s surveillance thriller isn’t just one of the most cleverly plotted dramas of recent years—it’s also one of the most satisfying … the most adult and interesting thriller on TV this year.” Carol Midgley wrote in the Times, “Duplicity reigns in the best drama on TV.”
But Rob Owen, writing for the Pittsburgh Post-Gazette, disagreed, saying The Capture “offers a whiplash-inducing premiere that goes from, ‘This is a ridiculous investigation that appears to lack a crime’ to ‘How is that possible?’ … A somewhat dubious premise.” Louise Rugendyke of the Sydney Morning Herald wrote, “It’s gripping up to a point, but really it’s just you staring at a screen while watching another person staring at a screen.” Chuck Bowen of Slant Magazine also played devil’s advocate:
What The Capture doesn’t have is the sense of violation that made 24 such an unmooring experience in its best seasons. That show’s protagonist, Jack Bauer, was a charismatic hawk who did things that most people to the left of Dick Cheney would find monstrous. Kiefer Sutherland allowed you to see the humanity and the savagery of Bauer, which rendered the character all the more disturbing. Whatever its faults, 24 is a distinctive, authentic reaction to the political atrocities that marked the post-9/11 world. By contrast, the violence of The Capture is just noise to further the plot. Even the notion of doctored surveillance footage has been examined before and more artfully, especially in Philip Kaufman’s atmospheric Rising Sun. A newer element of our surveillance state, social media, is mentioned obligatorily but is barely explored.
Many of us have had our social media accounts, such as Facebook, cloned by impersonators. Anyone can set up a fake social media page, appropriate a few photos from a legitimate page, and make a clone of it. This has been done to me a few times. There are hundreds of people who think they are one of my Facebook friends when in fact they are following these imposters. We could call it “identity theft lite”. I used to worry about it, but now I ignore it.
A more sophisticated way of manipulating social media, within the grasp of just about anyone who can navigate a keyboard, is known as “catfishing”. In this scam, you set up a fake Facebook page under an alias, using a separate Yahoo or Gmail address, and fill it with attractive and alluring photographs of “yourself” (actually pictures of models or little-known foreign film stars, all taken from the internet). Then, in your new disguise, you encourage romantic connections with the unwitting via extremely personal messaging. Many people have fallen in love with these fake personae (also known as “sock puppets”) or lost significant amounts of money through associated scams. Some of the impersonators are eventually found out; others get away with it and simply disappear.
The hit documentary film Catfish (2010), which later became a television series, centres on the brother of one of the New York film-makers, who fell for a charming young woman he met on Facebook. They exchanged over 1500 messages across nine months. The social media profile of his paramour was actually the fake creation of a middle-aged rural Michigan housewife named Angela (clearly with above-average computer skills), who lived with her husband (unaware of his wife’s online activities) and looked after a severely disabled son. The film-makers suspected something strange and tracked her down, paying her and her family a visit in person and documenting the entire drama. They discovered that Angela had set up fifteen fake Facebook profiles. In the end all was forgiven and everyone remained on good terms, not to mention profiting from a commercial and creative film project.
Fergus Hanson, director of the Australian Strategic Policy Institute’s International Cyber Policy Centre, told Anthony Galloway of the Sydney Morning Herald that deep-fake manipulation will only get more sophisticated:
The way we typically verify identities is going to have to change. We just take it for granted that I can recognise your voice. We need to be looking at an extra layer of authentication—especially for phone calls that involve commercial dealings and political dealings. With voice calls, voice is now easily synthesised, particularly when you’ve got a body of voice samples. Politicians and diplomats talk a lot publicly so that’s easy to do. So you could have a CEO ringing up a staff member asking them to make payment to a particular bank account. You could have a defence minister ringing the Chief of Defence Force asking them to do something.
Smith and Mansted wrote:
In May 2019, a video circulated on social media showing US House of Representatives Speaker Nancy Pelosi slurring her words during a news conference, as though she were intoxicated or unwell. The video was a cheap fake: an authentic recording of the speaker, but with the speed slowed to 75% and the pitch adjusted to sound within normal range.
In the world of AI, “machine learning” is where machines perform tasks that humans usually do. For example, there are pizza-making robots now doing most of the work in several US pizza shops.
Deep Learning is a subset of machine learning in which layered neural networks and algorithms are trained on vast amounts of data. It is estimated that we generate 2.5 quintillion bytes of data daily. As data accumulates and computing power grows, Deep Learning algorithms come ever closer to approximating human thought.
Some common applications of Deep Learning we are familiar with today are our virtual assistants, such as Siri and Alexa, language translating apps, driverless vehicles, personalised shopping (those pesky ads that suddenly appear on your social media platform right after you have been looking around on the internet for something), medicines that can be tailored to one’s individual genome, and facial recognition.
But in case you start losing sleep about all this, remember what Alan Kay, an American computer scientist and winner of the 2003 A.M. Turing Award, the highest honour in computer science, once said: “Some people worry that artificial intelligence will make us feel inferior, but then anybody in his right mind should have an inferiority complex every time he looks at a flower.”
For a comprehensive description of existing surveillance technology and the use of deep fakes, see the report Weaponised Deep Fakes, by Hannah Smith and Katherine Mansted, prepared for the Australian Strategic Policy Institute.