Last year, Ashley Suarez discovered that someone had reposted her images, which showcased dresses and hairstyles, on Instagram with her face swapped for another. The impersonator, posting under the name Elizabeth Reid, built a large social media following and profited by accepting gifts from fans and by selling pornography.

“I received many of these videos because people recognized the background as my own. I made most of my TikToks at my parents’ home,” says Suarez, who earns her living from modeling and promoting products online to her nearly 500,000 followers on Instagram and TikTok. “They were like, ‘This looks just like your video, but it’s not your face.’”

This incident was just one of many where Suarez’s content was stolen, altered, and reposted online. The increasing availability and sophistication of “faceswap” deepfake technology have made such theft easier than ever.

The perpetrators make slight alterations to the stolen images, which makes the copies hard to detect automatically. Copyright law can even offer the thieves some cover, thanks to an exception for works that are sufficiently altered from the original.
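To make the detection problem concrete, here is a minimal Python sketch, assuming the open-source Pillow and imagehash packages and purely hypothetical file names: a byte-exact fingerprint changes completely once an image is re-encoded or its face is swapped, while a perceptual hash of the overall visual structure stays close enough to flag the repost.

```python
# Minimal sketch: why exact matching misses altered reposts, and what a
# perceptual hash catches instead. Assumes the Pillow and imagehash packages;
# the file names below are hypothetical placeholders.
import hashlib

from PIL import Image
import imagehash


def exact_fingerprint(path: str) -> str:
    """Byte-level hash: changes entirely if even one pixel or the encoding differs."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def perceptual_distance(path_a: str, path_b: str) -> int:
    """Hamming distance between perceptual hashes: small for near-duplicate images."""
    return imagehash.phash(Image.open(path_a)) - imagehash.phash(Image.open(path_b))


original, repost = "original_post.jpg", "faceswapped_repost.jpg"

# A byte-for-byte comparison almost never matches once the image has been
# cropped, re-encoded, or faceswapped, so naive automation misses the theft.
print(exact_fingerprint(original) == exact_fingerprint(repost))  # typically False

# A perceptual hash compares visual structure, so a small distance (roughly
# under 10 of 64 bits) still flags the repost for human review.
print(perceptual_distance(original, repost))
```

That gap between exact and approximate matching is a big part of why slightly altered reposts slip past simple automated filters.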

Many social media companies have been unprepared for the influx of faceswap images, creating significant hurdles for image owners trying to get the images removed. Only recently have they begun addressing the problem more aggressively.

Suarez first encountered faceswapping three years ago when she saw her content, shot at her family home in Naples, Florida, repurposed on Instagram with a different face. She reported the content to Instagram, told her parents, and contacted the police, but none of her efforts succeeded. Frustrated, Suarez deleted her social media accounts and began seeing a therapist. A few weeks later, she hired Takedowns.AI, a service that scans the internet for stolen or AI-altered images of its clients and requests their removal from social platforms.

“I realized I didn’t want to live in fear,” recalls Suarez, now 21.

According to Takedowns.AI CEO Kunal Anand, faceswap deepfake instances have increased by 500% recently, appearing across TikTok, Instagram, Facebook, and sites like Patreon and Fanvue, where pornography is allowed.

When Takedowns.AI contacted Instagram, they received an email stating, “It’s not clear you are the rights owner or are otherwise authorized to submit this report on the rights owner’s behalf.” Instagram asked Suarez to submit a new report from her email address or provide documentation proving her authorization. Even after following these instructions, Takedowns.AI received the same response.

Frustrated, Takedowns.AI sent messages directly to Suarez’s impersonator, threatening legal action unless the account was deleted. A few days later, the “Elizabeth Reid” account was removed.

Takedowns.AI’s services are costly. Suarez and her family spent between $2,000 and $3,000 a month, going into debt, before hiring the company on a discounted two-year retainer in 2023.

Despite the expense, Suarez believes the investment is worthwhile because platforms like Instagram rarely remove content based on creator complaints alone. She thinks platforms respond more to companies like Takedowns.AI, which sometimes employs attorneys to send complaints.

Creating faceswap images may seem complex, but AI tools like Stable Diffusion and open-source code shared on GitHub make it relatively easy. One GitHub repository offering faceswap software has nearly 50,000 stars.

Faceswap scammers find the effort financially rewarding. One user on an anonymous developer forum claimed to have made over $100,000 since starting in January.

Initially, social media platforms like TikTok and Facebook-parent Meta were unresponsive to Takedowns.AI’s complaints about faceswapping, treating the images as “transformative works.” That legal exception allows copyrighted work to be used without permission if it is sufficiently altered or put to a different use than the original. Creators are collectively losing millions because others monetize their content while they cannot, Anand laments.

Faceswaps may not be legal

Kristin Grant, a managing partner at Grant Attorneys at Law specializing in intellectual property, acknowledges that the transformative-works doctrine complicates copyright claims over stolen images. But she suggests that creators like Suarez may have a strong claim under a different legal theory: the right of publicity, which protects against the unauthorized use of a person’s likeness for commercial purposes.

Rights of publicity laws differ from state to state, so the strength of a case involving faceswap fakes depends on the creator’s location. “Using someone else’s image without their authorization for commercial purposes is illegal,” Grant states.

Social platforms struggle to keep up with AI advances

Anand says Facebook, Instagram, and TikTok have recently gotten better at removing faceswap images after reports from Takedowns.AI, though there is still room for improvement. Instagram, for instance, typically removes only the offending post rather than the entire account when it responds to a complaint.

Some social media platforms have started cracking down more broadly on AI-generated content. Meta plans to use automation and creator complaints to decide when to apply “made with AI” labels to AI-generated videos, images, and audio on its platforms, helping users identify computer-generated content. However, the automation is imperfect, and Takedowns.AI often has to flag faceswaps that Meta’s systems fail to detect.

Meta’s policies do not proactively ban content that infringes on copyright or trademark rights; the company removes content only after a violation is reported, and even then the offending account sometimes stays up. There is no specific policy for faceswap images like those targeting Suarez. Meta declined to comment for this article.

Until early May, TikTok required creators to label their own AI-generated videos and had no policy specifically addressing faceswaps. That honor system did little to stop faceswappers from passing off AI-generated content as authentic to the platform’s 1 billion active users.

In May, TikTok updated its policies, joining rivals Meta and Google’s YouTube in automatically applying an “AI-generated” label to AI videos posted on its platform. The effectiveness of this tool for TikTok remains unclear. TikTok declined to comment for this article.

How porn sites fit in

Faceswapping might not be a significant concern for the average social media user, but it poses a major problem for creators, particularly women who post sexually suggestive or explicit content. Loose verification requirements on many adult-oriented creator sites and the lure of easy money for those who steal images make it an attractive venture.

“I think women who are sexualized at a young age are the biggest victims,” says Suarez. “I was very sexualized when I was young… it has been going on for years.”

OnlyFans, primarily known for its adult content, may be the strictest platform in policing AI-generated content. It bars creators who use AI from monetizing on its site and requires creators to verify their accounts with photos of their driver’s licenses that match their likeness in the content they post. Even so, pornography on OnlyFans continues to be stolen, manipulated by faceswappers, and reposted on other platforms, says Anand, who also founded ChatPersona, a company that lets OnlyFans creators use AI bots for racy chats with customers.

An OnlyFans representative declined to comment for this article.

Fanvue also requires creators to upload their driver’s licenses to open accounts. However, hackers on a digital forum claim that these images aren’t checked to ensure they match the creators’ likenesses.

Tools for detecting faceswaps may be as controversial as the faceswaps themselves

Takedowns.AI offers creators a system for keeping their likenesses out of faceswaps, though it raises privacy concerns. Users upload photos of their faces to Takedowns.AI, which uses the third-party service PimEyes to crawl the internet for identical and modified versions.

“You can just scan your face, and whatever it finds, you can just take it down,” Anand explained.

The process is similar to Google’s reverse image search, where users submit an image instead of text to find matches. Unlike Google, however, Takedowns.AI links the results to social media profiles, which is what lets creators locate and report faceswapped content. Some privacy advocates worry that the tool could also sweep up images of people who have no social media presence themselves but whose faces have been shared online by friends and family.
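For illustration only, here is a rough Python sketch of that kind of matching under stated assumptions: it uses the open-source Pillow and imagehash packages rather than PimEyes or any Takedowns.AI interface, the directory names are hypothetical, and the threshold is arbitrary. It hashes a creator’s own images once, then flags downloaded candidates whose perceptual hash falls within a small Hamming distance of any reference.

```python
# Illustrative sketch of reverse-image-search-style matching against a batch
# of candidate files. This is not PimEyes or Takedowns.AI code: the packages
# are the open-source Pillow and imagehash, the directories are hypothetical,
# and the threshold is an arbitrary example value.
from pathlib import Path

from PIL import Image
import imagehash

MATCH_THRESHOLD = 12  # max Hamming distance (out of 64 bits) to count as a likely match


def build_reference_hashes(reference_dir: str) -> list[imagehash.ImageHash]:
    """Hash the creator's own images once so candidates can be compared cheaply."""
    return [imagehash.phash(Image.open(p)) for p in Path(reference_dir).glob("*.jpg")]


def find_likely_reposts(candidate_dir: str, references: list[imagehash.ImageHash]) -> list[str]:
    """Return candidate files whose perceptual hash is close to any reference image."""
    flagged = []
    for path in Path(candidate_dir).glob("*.jpg"):
        candidate_hash = imagehash.phash(Image.open(path))
        if any(candidate_hash - ref <= MATCH_THRESHOLD for ref in references):
            flagged.append(str(path))
    return flagged


if __name__ == "__main__":
    refs = build_reference_hashes("my_originals/")
    for hit in find_likely_reposts("downloaded_candidates/", refs):
        print("Possible altered repost:", hit)
```

The threshold is the central trade-off: set it too low and heavily edited reposts slip through, too high and reviewers drown in false positives, which is why results like these are best treated as leads for human review rather than automatic takedowns.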

“Love excuses everything”

Another creator represented by Takedowns.AI is former high school weightlifter Carolina Samani, who posts provocative photos and videos on Instagram and earns money through OnlyFans. The impersonator faceswapping Samani’s content, posting under the name Julia Rossi, kept the fake Instagram account live despite multiple complaints from Takedowns.AI until early June, when Instagram finally removed it.

Using Samani’s faceswapped content, Rossi amassed 210,000 Instagram followers, roughly half the following Samani had built over years of posting.

In response to a fan on Instagram who asked, “Is it worth committing arson for you? 😘😍🤎,” Rossi, or whoever managed the account, casually replied, “Love excuses everything 🫢😂.”
