A hot potato: As deepfake technology continues to evolve, the potential for misuse grows. While current tools still require users to mimic mannerisms, voice, and other details, advancements in voice cloning and video synthesis could make creating digital doppelgängers in real-time even more convincing.

In the past few days, a new software package called Deep-Live-Cam has been making waves on social media, drawing attention for its ability to create real-time deepfakes with incredible ease. The software takes a single photo of a person and applies their face to a live webcam feed, matching the subject's pose, lighting, and expressions frame by frame. While the results are not flawless, the technology's rapid advancement underscores how much easier it has become to deceive others with AI.
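Conceptually, the loop behind such tools is simple: grab a webcam frame, locate the face, render the source identity at the matching pose, and composite the result back into the frame. The final compositing step can be sketched in plain NumPy. This is an illustrative simplification only, not Deep-Live-Cam's actual code, which relies on trained face-swap models:

```python
import numpy as np

def blend_face(frame: np.ndarray, face: np.ndarray,
               top: int, left: int, alpha: float = 0.8) -> np.ndarray:
    """Alpha-blend a rendered source-face patch onto a video frame.

    frame: H x W x 3 target frame (uint8)
    face:  h x w x 3 rendered face patch (uint8)
    alpha: blend weight for the source face (1.0 = fully opaque)
    """
    out = frame.astype(np.float32)
    h, w = face.shape[:2]
    # The slice is a view into `out`, so blending in place updates the frame.
    region = out[top:top + h, left:left + w]
    region[:] = alpha * face.astype(np.float32) + (1 - alpha) * region
    return out.clip(0, 255).astype(np.uint8)

# Stand-ins for one iteration of the real-time loop: a captured frame
# and a face patch already rendered at the detected position.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
face = np.full((100, 100, 3), 200, dtype=np.uint8)
result = blend_face(frame, face, top=50, left=100, alpha=0.8)
```

In a real pipeline the face patch would come from a neural face-swap model and the blend mask would follow the facial contour rather than a rectangle, but the per-frame structure is the same, which is why commodity GPUs can now run it live.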

The Deep-Live-Cam project has been in development since late last year, but it recently attracted viral attention after example videos began circulating online. These clips show individuals imitating prominent figures like Elon Musk and George Clooney in real time.

The sudden surge in popularity briefly propelled the project to the top of GitHub's trending repositories list, where the free, open-source software is available to anyone with a basic understanding of programming.

The potential misuse of Deep-Live-Cam has sparked concern among tech observers. Illustrator Corey Brickley wryly observed that many of tech's recent breakthroughs seem ripe for abuse. "Weird how all the major innovations coming out of tech lately are under the Fraud skill tree," Brickley tweeted, adding, "Nice remember to establish code words with your parents everyone."

While Brickley's comment is intentionally sardonic, it highlights the potential for bad actors to use such tools for deception. Considering the prevalence and accessibility of deepfake technologies, setting up a safe word to confirm your identity to family and friends is not that crazy an idea.

Face-swapping technology itself is not new. The term "deepfake" dates back to 2017, when a Reddit user with that handle frequently posted pictures and videos in which he swapped a porn performer's face with that of a celebrity. At the time, the technology was slow, expensive, and far from real-time. However, those primitive techniques have since improved to an incredible degree. Projects like Deep-Live-Cam are smarter and faster and have lowered the barrier to entry, allowing anyone with a standard PC to create deepfakes using free software.

The potential for abuse is already becoming well documented.

In February, scammers in China impersonated company executives, including the CFO, on a video call and tricked an employee into making over US$25 million in money transfers. The employee was the only real person on the conference call. In a similar case, someone in the US recently cloned Joe Biden's voice to dissuade people from voting in the New Hampshire primary. With the rise of real-time deepfake software like Deep-Live-Cam, instances of remote video fraud may become more common, affecting not just public figures but ordinary individuals as well.