# GS 03 Cybersecurity
- Deepfakes are synthesised media that are either wholly generated or manipulated by Artificial Intelligence.
- They can take the form of images, audio, or video, and may show real people saying or doing things they never did, or depict entirely new, fabricated scenes.
- Deepfakes are used to tarnish reputations, create mistrust, question facts, and spread propaganda.
- The word ‘deepfake’ combines ‘deep learning’, a subset of artificial intelligence, with the word ‘fake’.
- India has not enacted any specific legislation to deal with deepfakes so far.
- Currently, most social media companies such as Facebook and Twitter have banned deepfake videos.
- They have declared that any video detected as deepfake-manipulated or synthetically generated will be taken down.
- A voice deepfake closely mimics a real person’s voice, accurately replicating the tonality, accent, cadence, and other unique characteristics of the target.
## Tools used for voice cloning
- Software such as Microsoft’s VALL-E, My Own Voice, Resemble, Descript, Respeecher, and iSpeech can be used for voice cloning.
## Threat of Deepfakes
- Legal ambiguity, coupled with a lack of accountability and oversight, creates avenues for individuals, firms, and even non-state actors to misuse AI.
- They compromise the public’s ability to distinguish between fact and fiction and can be used to spread misinformation and propaganda.
- Deepfakes have been commonly used to misrepresent or malign well-known politicians in videos.
- This technology is being increasingly used for creating fake news, hoaxes, and committing financial fraud.
## Detecting voice deepfakes
- Detecting voice deepfakes requires highly advanced software and hardware that break down speech patterns, background noise, and other elements.
- However, cybersecurity tools have yet to offer foolproof ways of detecting audio deepfakes.
- Research labs are also experimenting with watermarking and blockchain-based provenance techniques to detect and trace deepfakes.
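The watermarking/provenance idea in the last point can be sketched in miniature: register a cryptographic fingerprint of authentic media in an append-only registry, then check any circulating copy against it. The `ProvenanceLedger` class, its method names, and the source labels below are hypothetical illustrations only; real systems (e.g. blockchain-backed content-provenance registries) are far more elaborate.

```python
import hashlib


def fingerprint(media_bytes: bytes) -> str:
    """Compute a SHA-256 fingerprint of a media file's raw bytes."""
    return hashlib.sha256(media_bytes).hexdigest()


class ProvenanceLedger:
    """Toy append-only registry standing in for a blockchain ledger."""

    def __init__(self):
        self._records = {}  # fingerprint -> source label

    def register(self, media_bytes: bytes, source: str) -> str:
        """Record the fingerprint of an authentic clip at publication time."""
        fp = fingerprint(media_bytes)
        self._records[fp] = source
        return fp

    def verify(self, media_bytes: bytes):
        """Return the registered source, or None if the media is unknown or altered."""
        return self._records.get(fingerprint(media_bytes))


# Usage: a broadcaster registers the authentic clip when publishing it.
ledger = ProvenanceLedger()
original = b"...original audio bytes..."
ledger.register(original, "Official broadcast, 2024-01-15")

# Any resynthesised or edited copy produces a different fingerprint.
tampered = b"...original audio bytes, resynthesised..."
print(ledger.verify(original))  # -> 'Official broadcast, 2024-01-15'
print(ledger.verify(tampered))  # -> None (no provenance record)
```

The key limitation, consistent with the note above that no foolproof method exists yet, is that this only proves what *is* authentic; a deepfake that was never registered simply comes back unknown.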