Rashmika Mandanna, Katrina Kaif Deepfake Videos in ‘AI’ of Storm: Govt to Film Industry Stance | Explained
Rashmika Mandanna, Katrina Kaif deepfake videos: The IT Ministry has issued an advisory reminding social media firms of Section 66D of the Information Technology Act, 2000, which deals with cheating by impersonation using computer resources, and of the IT Intermediary Rules

Amid the row over deepfake videos of actors Rashmika Mandanna and Katrina Kaif, the Ministry of Electronics and Information Technology (MeitY) has issued an advisory to social media companies, reiterating the guidelines and the action to be taken in case of violation, sources said.

In Mandanna’s case, the deepfake shows a woman in a black outfit entering a lift, with her face swapped for the actor’s. The video originally featured Zara Patel, a British-Indian influencer, and was first shared on October 9. Kaif’s deepfake, which surfaced on Tuesday, is based on a behind-the-scenes image from her upcoming film, Tiger 3.

WHAT ARE DEEPFAKES?

Deepfakes are a form of media manipulation that uses Artificial Intelligence (AI) to create highly convincing fake content, typically images or videos. Freely available tools, including AI-based face-swapping software, machine-learning models and editors such as Photoshop, have been used extensively to create deepfake videos, clips and other content.

This AI-generated content is designed to appear as if it was created by, or features, real individuals when, in fact, it is entirely fabricated. Deepfake technology can create fictional photos, morphed videos or even ‘voice clones’ of public figures.

Deepfakes often transform existing content, such as an image or a video, by swapping one person for another to generate realistic morphed media. The technology can also be used to create entirely new content in which someone is shown doing or saying something they never did or said.

GOVT REMINDS SOCIAL MEDIA FIRMS OF IMPERSONATION RULES, WARNS OF STRICT ACTION

In an exclusive conversation with CNN-News18, Union Minister for Electronics and IT Ashwini Vaishnaw confirmed that a notice had been issued to all intermediaries about compliance in the wake of the deepfake videos, and reminded them of the action they would face if they fail to comply.

The advisory stresses Section 66D of the Information Technology Act, 2000, which deals with cheating by impersonation using computer resources. The offence is punishable with imprisonment of up to three years and a fine of up to Rs 1 lakh.

It also mentions the ‘IT Intermediary Rules’. Rule 3(1)(b)(vii) states that social media intermediaries must exercise due diligence by ensuring that their rules, regulations, privacy policies, or user agreements inform users not to host content that impersonates another person.

Rule 3(2)(b) states that the intermediary shall, within 24 hours from the receipt of a complaint in relation to any content … in the nature of impersonation in an electronic form, including artificially morphed images of such individual, take all … measures to remove or disable access to such content.

In response to a post on X, Union Minister of State for Entrepreneurship, Skill Development, Electronics & Technology Rajeev Chandrasekhar had posted on Monday: “PM @narendramodi ji’s Govt is committed to ensuring Safety and Trust of all DigitalNagriks using Internet. Under the IT rules notified in April, 2023 – it is a legal obligation for platforms to ensure no misinformation is posted by any user and ensure that when reported by any user or govt, misinformation is removed in 36 hrs. If platforms do not comply wth this, rule 7 will apply and platforms can be taken to court by aggrieved person under provisions of IPC. Deep fakes are latest and even more dangerous and damaging form of misinformation and needs to be dealt with by platforms.”

RASHMIKA MANDANNA ‘HURT’, ZARA PATEL ‘DISTURBED’

Reacting to the video, Mandanna on Instagram stated that she felt “really hurt to share this and have to talk about the deepfake video”. “Something like this is honestly, extremely scary not only for me, but also for each one of us who today is vulnerable to so much harm because of how technology is being misused. Today, as a woman and as an actor, I am thankful for my family, friends and well wishers who are my protection and support system. But if this happened to me when I was in school or college, I genuinely can’t imagine how could I ever tackle this. We need to address this as a community and with urgency before more of us are affected by such identity theft,” she wrote.

Patel, meanwhile, posted on Instagram: “Hi all, it has come to my attention that someone created a deepfake video using my body and a popular Bollywood actress’s face. I had no involvement with the deepfake video and I’m deeply disturbed and upset by what is happening. I worry about the future of women and girls who now have to fear even more about putting themselves on social media. Please take a step back and fact-check what you see on the internet. Not everything on the internet is real.”

AMITABH BACHCHAN TO MRUNAL THAKUR, NAGA CHAITANYA: THE INDUSTRY SUPPORT FOR MANDANNA

As the video surfaced on X, Amitabh Bachchan, who has worked with Mandanna in Goodbye, retweeted the post and urged that legal action be taken against such theft. “Yes this is a strong case for legal,” he wrote.

Sharing a statement on Instagram, actor Mrunal Thakur said, “Shame on people who resort to such things, it shows that there is no conscience left at all in such people.”

“It’s truly disheartening to see how technology is being misused and the thought of what this can progress to in the future is even scarier. Action has to be taken and some kind of law has to be enforced to protect people who have and will be a victim to this. Strength to you,” said actor Naga Chaitanya.

Mandanna also thanked Bachchan for standing up for her.
