Disruptive by Design: The Realities and Challenges of Legislating Deepfakes

September 1, 2019
By Jennifer Miller

A deepfake is artificial intelligence-based content that presents something that never actually occurred. This includes text, audio, video and images. Deepfakes are a recent disruptive development in the technology world, and both the House and Senate are investigating the phenomenon and have introduced bills to combat these potentially dangerous creations.

A House of Representatives website explains that advances in machine learning algorithms have made it cheaper and easier to create deepfakes. Such advances also support the production of fake audio, imagery and text at scale, and these capabilities are fast becoming more accessible and widely available. “Deepfakes raise profound questions about national security and democratic governance, with individuals and voters no longer able to trust their own eyes or ears when assessing the authenticity of what they see on their screens,” the site says. 

The word “deepfake” is a portmanteau combining deep learning and fictional or fake content. In late 2017, an individual created a stir by superimposing celebrities’ faces onto pornographic images. More recently, two artists and an advertising agency collaborated on a deepfake video featuring Mark Zuckerberg, co-founder of Facebook. At least one U.S. politician also has been a victim.

Deepfakes can have both positive and negative effects. On the one hand, they enable effective and efficient techniques for filmmaking and other art forms as well as low-cost enhancements for modeling and simulation, and for training and education. On the other hand, deepfakes can be used to rapidly spread fake news, sensationalism and propaganda for any number of nefarious purposes. Like almost any technology, deepfakes can be used for either good or evil. That’s where the bills introduced in the House and Senate come into the picture—the real picture.

The current efforts by social networks and video hosting sites to manage deepfakes are not enough in the eyes of lawmakers. Hence, the Deepfake Report Act of 2019, which, if passed, would require the Department of Homeland Security (DHS) to conduct an annual study of deepfakes and related content. In my opinion, the bills will struggle with implementation and execution given their focus on proving deepfake creators’ intent to mislead. It would be like a parent accusing a child of drawing the neighbor as a gun-wielding murderer when the child actually intended to sketch a person with a broom.

Only the creator can provide the absolute truth of the intent behind a piece of content. Too many times I have seen an innocent joke perceived as a hate crime, bullying or harassment. Defining intent can be as challenging as defining pornography. Regardless of my opinions, if the bills succeed, the DHS will be directed to address deepfakes through research, hearings, reports and consultations.

Other organizations tasked with addressing deepfakes include the Department of Justice, the Federal Election Commission and the National Science Foundation. I predict other government entities will also engage, whether voluntarily or voluntold. Those might include the U.S. Department of Defense, the CIA, and state and local governments.

With some misgivings about the proposed legislation, I’ll be watching some version of reality play out in the coming months as the bills progress.


Jennifer Miller is an operations research analyst for the Air Force’s Cost Analysis Agency. She previously supported the National Guard Bureau Headquarters’ Joint Staff, and the Air Force and Army at locations along the East Coast. She is a certified government financial manager, a certified defense financial manager with acquisition specialty, and a member of the American Society of Military Comptrollers’ Washington Chapter.

