• Dr. Timothy Smith

Article: Is That Really You? Deep Fake on the Rise


Pictured Above: Peyton Manning, former NFL quarterback

Photo Source: US National Archives


The often-quoted “What has been seen cannot be unseen, what has been learned cannot be unknown” by the writer C.A. Woolf stands as a profound warning about the rise of maturing deepfake technology and the ways it can change the world. In other words, we may have lost control of our own narratives. Recently, advertising firms have used deepfake technology in commercials for Michelob Ultra beer and for ESPN with State Farm Insurance. Much more concerning, malicious people have released deepfake videos of others without their consent or knowledge.


The term deepfake derives from the mashup of two concepts: deep learning and fake or forged. Deepfake technology uses a form of artificial intelligence called deep learning that allows a machine to learn the details of one person's appearance from video and transpose those details onto a video of someone else. Deepfake works for voices, too! Unfortunately, such technology in the wrong hands can turn unsuspecting people into puppets that appear to say and do things they never did.
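For readers curious about the mechanics, classic deepfake face-swapping trains an autoencoder with one shared encoder and a separate decoder per person: the encoder learns pose and expression, and each decoder learns one person's appearance. The sketch below illustrates only that data flow; the dimensions, names, and random linear maps are illustrative stand-ins for the deep neural networks a real system would train on thousands of video frames.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: a flattened 8x8 grayscale "face" and a small latent code.
FACE_DIM, LATENT_DIM = 64, 8

# In real deepfake systems these are trained neural networks; random linear
# maps stand in here purely to show the shape of the pipeline.
encoder   = rng.normal(size=(LATENT_DIM, FACE_DIM))   # shared: captures pose/expression
decoder_a = rng.normal(size=(FACE_DIM, LATENT_DIM))   # learns to render person A
decoder_b = rng.normal(size=(FACE_DIM, LATENT_DIM))   # learns to render person B

def face_swap(frame_of_a: np.ndarray) -> np.ndarray:
    """Encode a frame of person A, then decode it with person B's decoder.

    Because the encoder is shared, the latent code holds pose and expression
    only; decoding with decoder_b re-renders that same pose and expression
    with person B's appearance -- the core trick behind face swapping.
    """
    latent = encoder @ frame_of_a     # compress the frame to a pose/expression code
    return decoder_b @ latent         # re-render it with B's identity

frame = rng.normal(size=FACE_DIM)     # a stand-in video frame of person A
swapped = face_swap(frame)            # a full-size frame, now "wearing" B's face
```

Running `face_swap` on every frame of a video, then stitching the outputs back together, is conceptually how a target's face ends up in footage they never appeared in.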

Recently, advertising for Michelob Ultra beer in support of Capital One’s The Match: Champions for Charity golf event used deepfake technology to superimpose the face of Peyton Manning onto a scene from the iconic comedy Caddyshack. The commercial depicts the climactic scene of the final putt of a highly wagered round of golf. With deepfake technology, the face and voice of the former NFL quarterback, Manning, replace those of Judge Smails, played by Ted Knight, as Smails famously says, “Well, we’re waiting!” (twitter.com) According to the New York Times, in an article titled “An ESPN Commercial Hints at Advertising’s Deepfake Future,” ESPN expertly transformed old video footage of the sports anchor Kenny Mayne from 1998. The doctored footage has him pitching a documentary about the Chicago Bulls to be released in 2020 and using slang that did not exist in 1998. (nytimes.com) The author of the article, Tiffany Hsu, adds that the commercial intentionally did not appear completely real; the ESPN-State Farm team expressly wanted it to play as a joke.


On a different note, The Wall Street Journal recently reported two separate deepfake incidents. In one case, according to Catherine Stupp, a reporter for the Journal, criminals used a deepfake voice emulator to defraud a UK energy firm of over €220,000 ($243,000). Using the deepfake voice over the phone, the criminals impersonated the CEO of the UK energy firm’s parent company in Germany. They convinced the UK CEO to transfer money to Hungary to meet an “urgent obligation.” Once transferred, the money was dispersed to other countries. Neither the crooks nor the money have been found. (wsj.com) In another instance, Sinan Aral, the David Austin Professor of Management at MIT’s Sloan School of Management, notified his Twitter followers that a deepfake video of him had surfaced, fraudulently depicting him endorsing an investment fund’s stock-trading algorithm. (wsj.com) Although it appears no harm has come from this video, it does signal that anyone can be targeted by deepfakes.


The wide availability of digital video and voice media and the lightning speed of social media have transformed how people connect and get news. We have grown up trusting video and television news. Deepfake technology, however, gives people an unprecedented ability to forge realistic-looking and realistic-sounding videos of others doing and saying things they never did. Advertisers have used it to produce a new type of commercial that blends historical scenes from film and video with elements unique to today. On the darker side, deepfakes can be created for evil purposes such as humiliation, crime, and disinformation. Some cases may seem small, but as the technology improves, it could profoundly affect political and economic events. Imagine a deepfake planting misinformation about a candidate just before an election. Once seen, even if untrue, it could be tough to unwind. As consumers of media, we must take a skeptical view of media that feels untrue, and our news and tech companies more than ever need to carefully analyze video and voice for signs of deepfakes.




Dr. Smith’s career in scientific and information research spans the areas of bioinformatics, artificial intelligence, toxicology, and chemistry. He has published a number of peer-reviewed scientific papers. Over the past seventeen years, he has developed advanced analytics, machine learning, and knowledge management tools to enable research and support high-level decision making. Tim completed his Ph.D. in toxicology at Cornell University and a Bachelor of Science in chemistry at the University of Washington.


You can buy his book on Amazon in paperback and Kindle format here.