Research indicates a sharp rise in the number of deepfake videos uploaded to the internet, with the growth occurring over a period of just nine months.
According to research by the cybersecurity firm Deeptrace, the videos are increasingly being used not only in political contexts but also in pornography.
The prospect that they will be used for revenge porn and online bullying, together with the growing profitability of the business, has heightened anxiety. According to Deeptrace’s research director, the pornographic use in particular stands to harm a huge number of women.
In its investigation, Deeptrace identified four pornography websites built on deepfake technology whose videos had received a combined total of around 134 million views since February 2018.
Several apps that make it easier to create such material have sprung up, and while website owners are occasionally forced to remove it, the software remains freely available to anyone who wants to turn it into a profitable business.
According to Deeptrace’s researchers, a number of web firms create and sell such videos for profit.
The small amount of input material now needed to produce a convincing deepfake is a source of great concern, which makes it critical for policymakers to find ways to limit the spread of the technology. Areas that require further attention include developing detection technologies, raising public awareness, and addressing the political and social conditions that have made deepfakes so dangerous.
At the moment, deepfakes made for entertainment are more common than ever, while only a small number of malicious uses have been documented.
Debate continues over the best approach to the deepfake problem. Any alert raised by deepfake detection technology would come only after a doctored video has already gone viral, so detection alone may prove less effective than hoped. And since the victims in the short term are almost certain to be women, hiring specialists to pursue their abusers will be a costly endeavour for them.