Sexy MAGA influencer supports Trump! The face is actually an AI created by an Indian man who earns an estimated thousands of dollars a month

An Indian medical student used AI to build a sexy MAGA influencer targeting conservative American men, combining political and erotic content to harvest traffic and earning thousands of dollars a month. Experts worry that a flood of such virtual influencers could ultimately become tools of information warfare and trigger a trust crisis.

Sexy MAGA influencer supports Trump, and behind her is AI

Sexy influencer Emily Hart often shares glamorous lifestyle photos on social media. She is a loyal Trump MAGA fan: she opposes abortion, opposes "woke culture," and opposes immigration. Her true identity, however, turns out to be an AI created by a man.

Using the pseudonym Sam, the 22-year-old Indian medical student recently told the international outlet Wired that, to raise funds for his medical licensing exam fees and his planned immigration to the United States, he used AI tools to create Emily Hart. He spends only 30 to 50 minutes per day managing her social media accounts, and individual short videos have drawn 3 million to 10 million views.

Within just one month, Emily Hart's account accumulated more than 10,000 followers on Instagram. Fans even pay to subscribe to her adult content on the platform Fanvue, or buy clothing bearing political slogans.

Sam estimated that this model could easily earn him several thousand dollars a month. The good times didn't last, however: by this February, Emily Hart's Instagram account had been banned, although her Facebook account remains active.

Image source: The Independent (UK). Sexy influencer Emily Hart supports Trump, but she is actually AI.

The management strategy behind the MAGA AI girls

Emily Hart's success stems mainly from Sam following the recommendations of AI tools: he targeted older conservative American men, who have higher disposable income and greater loyalty, as the main audience, and focused on the Make America Great Again (MAGA), pro-Trump agenda.

These AI-generated girls follow a specific template: they are usually portrayed as blonde white women whose jobs are first responders such as nurses, police officers, or firefighters. They wear bikinis printed with the American flag, and their posts make far-right statements supporting gun rights, opposing abortion, or opposing immigration.

Sam revealed that because social media algorithms favor controversial content, such posts attract not only conservative supporters but also critical comments from liberals, which in turn greatly boosts engagement.

This is an attention-harvesting strategy that combines patriotism with soft pornography: creators draw attention through political fervor and ultimately funnel followers to paid platforms for monetization.

However, because the well-known adult platform OnlyFans strictly requires creators to be real humans, these AI creators typically direct fans to Fanvue, a platform that accepts AI-generated content.

From traffic monetization to information warfare: the flood of virtual influencers raises hidden concerns

Before Wired reported on Emily Hart, The Washington Post reported in March on Jessica Foster, an AI-generated virtual female soldier shown in photos posing with Trump and Russian President Vladimir Putin; within four months, the account attracted more than 1 million followers.

Image source: Jessica Foster. AI virtual influencer Jessica Foster's account attracted more than 1 million followers within four months.

Although Jessica Foster's Instagram account has been banned, these MAGA AI girls still worry experts.

Valerie Wirtschafter, a researcher at the Brookings Institution, said that many fans simply don't care whether these influencers are real; they only care that the content aligns with their political identity. Joan Donovan, an assistant professor at Boston University, warned that these accounts are easy to set up and carry clear profit incentives.

Ultimately, the biggest risk of these AI accounts is that they could be converted into tools of information warfare, becoming bot armies that spread political propaganda and misinformation, and bringing unprecedented trust crises and social problems to online communities.

Further reading:
Classic tournament AI-image rumor of trash piled up by Taiwan fans at the Tokyo Dome: the rumor-spreaders have already been identified as foreign-influence account operators

Popular posts trigger misreporting by Taiwanese media over whether the photographer who climbed Taipei 101 is Jin Guowei: media literacy faces its test in the AI era
