Modi warns against deep fakes

On Friday, Prime Minister Narendra Modi said the misuse of artificial intelligence to create deepfakes was worrying, and urged the media to educate people about the threat. He was speaking to reporters at a Diwali Milan event at the BJP’s national office in New Delhi.

What are Deepfakes?

  • Deepfakes are realistic and frequently deceptive media, such as videos, audio recordings, or photographs, generated or altered by artificial intelligence (AI).
  • Deep learning techniques, particularly generative neural networks, are used to replace or superimpose existing content with fabricated material.
  • The name “deepfake” is derived from the words “deep learning” and “fake.”

What are the manipulation techniques used in deepfakes?

  • Face swapping: Deepfakes can swap one person’s face onto another in a video, making it appear as if someone else is speaking or acting.
  • Voice cloning: Some deepfakes focus on cloning voices, enabling the generation of audio that sounds like a specific individual.
  • Lip syncing: Deepfakes can match synthesized speech to lip movements, increasing the apparent authenticity of faked videos.
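The face-swapping technique above is classically built on a shared-encoder, two-decoder autoencoder: one encoder learns a common representation of faces, while each person gets their own decoder. The sketch below illustrates only the architecture's data flow with untrained random weights; all names and sizes (`DIM`, `LATENT`, `W_enc`, etc.) are illustrative assumptions, not from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, LATENT = 64, 16  # hypothetical flattened-face size and latent size

# One shared encoder is trained on faces of both people...
W_enc = rng.standard_normal((DIM, LATENT)) * 0.1
# ...while each person gets their own decoder.
W_dec_a = rng.standard_normal((LATENT, DIM)) * 0.1
W_dec_b = rng.standard_normal((LATENT, DIM)) * 0.1

def encode(face):
    """Map a face vector to the shared latent representation."""
    return np.tanh(face @ W_enc)

def decode(latent, W_dec):
    """Reconstruct a face from the latent code with a person-specific decoder."""
    return latent @ W_dec

# The swap itself: encode person A's face, but decode it with person B's
# decoder, yielding B's appearance with A's expression and pose.
face_a = rng.standard_normal(DIM)
swapped = decode(encode(face_a), W_dec_b)
```

In a real system the encoder and decoders are deep convolutional networks trained on thousands of face images, but the swap step is the same: encode with the shared network, decode with the target identity's decoder.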

What are the potential applications of deepfakes?

  • Entertainment: Deepfakes have been utilized creatively in the film and entertainment industries to digitally reproduce or imitate characters.
  • Education: They can be employed in educational settings such as language instruction or historical reenactments.
  • Misinformation and Malicious Use: Deepfakes pose a serious risk when used maliciously, such as spreading misinformation, creating fake news, or impersonating individuals for fraud.

What are the concerns and risks of Deepfakes?

  • Misuse in Pornography: Deepfakes first drew attention for their use in creating realistic but fake pornographic content, typically without the consent of the individuals depicted.
  • Political Manipulation: Deepfakes can be used to distort political narratives, sway public opinion, and potentially interfere with elections.
  • Damage to reputation: Individuals may become victims of character assassination because deepfakes can represent them as engaged in actions or making statements that they did not make.

What are the laws available in India to counter the misuse of deepfakes?

  • There are no specific laws or regulations in India that prohibit or restrict the use of deepfake technology.
  • India has called for a global framework to govern the development of “ethical” AI tools.
  • Existing laws, such as Sections 67 and 67A of the Information Technology Act (2000), contain provisions that may apply to certain aspects of deepfakes, such as the publication of obscene or sexually explicit material.
  • Defamation is punishable under Section 500 of the Indian Penal Code (1860).
  • The Digital Personal Data Protection Act (2023) provides various safeguards against the misuse of personal data.
  • The Information Technology Rules of 2021 require intermediaries to remove content impersonating others, including digitally manipulated images, within 36 hours.
  • India must create a comprehensive legal framework that expressly targets deepfakes, taking into account the potential implications for privacy, social stability, national security and democracy.

What are the countermeasures for Deepfakes?

  • Deepfake Detection Tools: Efforts are underway to develop tools and techniques that detect deepfakes and flag manipulated content.
  • Legislation and Policies: Some jurisdictions are exploring or enacting legislation to combat the malicious use of deepfakes, with a focus on privacy and harm prevention.
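A common pattern in the detection tools mentioned above is to score each video frame independently with a classifier and then aggregate the per-frame scores into one video-level verdict. The function below is a minimal sketch of that aggregation step only; the name `video_verdict` and the 0.5 threshold are illustrative assumptions, and a real classifier would supply the frame scores.

```python
def video_verdict(frame_scores, threshold=0.5):
    """Aggregate per-frame 'fake' probabilities (0-1) into a video-level call.

    Returns (is_likely_fake, mean_score): the video is flagged when the
    mean frame score exceeds the threshold.
    """
    mean_score = sum(frame_scores) / len(frame_scores)
    return mean_score > threshold, mean_score

# Example: consistently high frame scores flag the video as likely manipulated.
flagged, score = video_verdict([0.92, 0.88, 0.75, 0.81])
```

Averaging is only one aggregation choice; production detectors may instead use the maximum score or a learned temporal model so that a few heavily manipulated frames are not diluted by many clean ones.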


While deepfake technology has many beneficial applications, its potential for abuse raises ethical, legal, and societal concerns. To limit the risks associated with deepfakes, efforts must be made to increase awareness, develop detection systems, and establish suitable regulations.