Singer Chinmayi Sripada Accuses Loan Apps of Extortion Through Image Manipulation Targeting Women

Image Credit: India Today

The internet recently witnessed a shockwave when a deepfake video featuring actor Rashmika Mandanna surfaced, raising critical concerns about the misuse of artificial intelligence (AI). In response to this unsettling incident, singer Chinmayi Sripada spoke out against the dark side of AI technology, pointing out that these manipulated videos target not only celebrities but also ordinary individuals.

Sripada called for legal action against those who misuse deepfake technology and warned that it could become a dangerous weapon for extortion, blackmail, and harm, especially against women. She emphasized the risk of deepfakes being used to harass women, extort money from them, or even facilitate sexual assault, and noted that victims from smaller villages or towns might struggle to make their families understand the gravity of the situation when their honor is at stake.

Furthermore, Sripada described a disturbing trend in which women who have borrowed money from loan apps are harassed by collectors who manipulate their images into explicit content to extort money from them. She highlighted that deepfake technology is becoming increasingly sophisticated, making it difficult for the average person to detect these forgeries, and called for a nationwide awareness campaign to educate the public about the dangers of deepfakes, encouraging individuals to report such incidents rather than take matters into their own hands.

Recognizing the gravity of the situation, the central government has issued an advisory to social media platforms to address deepfake videos. The incident serves as a stark reminder of the importance of protecting individuals from harm resulting from the misuse of AI technology.

Repurposed article originally published in India Today
