As the banhammer falls on deepfakes communities and hosting on Discord, Gfycat, Pornhub, Twitter, and Reddit, people interested in making and consuming AI-generated fake porn have been searching for new internet clubhouses where they can share nonconsensual porn clips and tips on how to make them.
“Deepfakes” are videos created using a machine learning algorithm that swaps one person’s face onto another person’s body. Most frequently, this is used to put a celebrity’s face on a video of a porn performer.
Some deepfakes fans are attempting to avoid watchful admin eyes by setting up their own websites, independent of other platforms. But at least one of these websites, deepfakes.cc, contains malware that hijacks visitors’ computing power to mine cryptocurrency without alerting them. Deepfakes enthusiasts may make particularly lucrative targets: the profitability of cryptocurrency mining depends on a computer’s power, and people running machine learning programs may have more powerful CPUs than the average consumer.
A member of the r/fakeapp subreddit (which was not banned because it does not allow porn) first pointed out the surreptitious mining on deepfakes.cc, in an attempt to alert other members to the issue. Motherboard ran the site through an online antivirus scanner, which showed that deepfakes.cc is running code from Coinhive’s in-browser miner.
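A user worried about this kind of hidden miner can do something as simple as scanning a page’s HTML source for the script names that known in-browser miners load. A minimal sketch, assuming a small hand-maintained signature list (the helper name and list are illustrative, not the tooling Motherboard or the Reddit user actually used; `coinhive.min.js` is the script Coinhive distributed):

```python
# Sketch: look for known in-browser miner signatures in a page's HTML.
# The signature list is a hypothetical example, not an exhaustive blocklist.

MINER_SIGNATURES = (
    "coinhive.min.js",     # Coinhive's in-browser Monero miner
    "CoinHive.Anonymous",  # the JavaScript constructor Coinhive's docs describe
)

def find_miner_signatures(html: str) -> list:
    """Return the miner signatures present in the given HTML source."""
    lowered = html.lower()
    return [sig for sig in MINER_SIGNATURES if sig.lower() in lowered]

sample = '<script src="https://coinhive.com/lib/coinhive.min.js"></script>'
print(find_miner_signatures(sample))  # ['coinhive.min.js']
```

String matching like this only catches miners loaded under their published names; a site that renames or inlines the script would evade it, which is why the Reddit user’s tip-off and a proper antivirus scan matter.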