AI Voice Cloning Scams on the Rise: How Scammers Are Using Technology to Mimic Voices of Loved Ones and Dupe Victims

According to Bharti Airtel Chairman Sunil Mittal, a new wave of AI voice scams is targeting unsuspecting people by cloning the voices of their friends and family members. Speaking at the NDTV World Summit, Mittal revealed that scammers had cloned his voice so convincingly that they used it to instruct one of his company's executives to transfer a large sum of money. The executive did not fall for the ruse, but Mittal said he was shocked at how close the clone came to his real voice, warning that AI is advancing at a lightning pace toward high-level fraud.

The scam works through AI-powered voice cloning, which lets scammers create a close digital copy of a person's voice. As little as a 30-second audio clip, uploaded to one of the many voice cloning services available online, is enough. With some tools priced as low as $5, the technology is alarmingly accessible, allowing scammers to clone a voice in minutes. The software analyzes and emulates the recording to produce a nearly indistinguishable duplicate that can be made to say anything, letting the scammer fabricate virtually any situation.

Voice cloning AI has many legitimate applications, including text-to-speech services, customer support, and educational content. Scammers, however, have abused the technology by using familiar voices to spin false stories about where loved ones are or what has happened to them, such as an arrest or an accident. The calls typically claim that a friend or family member needs urgent help because they have been detained by police or are facing a serious medical emergency, creating panic that pushes the victim to act quickly.

Scammers often pair the cloned voice with an impersonated authority figure, such as a police officer or government official, which adds a layer of apparent authenticity to their claims. Victims report that fraudsters insist money is needed immediately to secure a loved one's release or to pay for emergency medical treatment. Many victims described being terrified by the voice itself; parents in particular said it sounded exactly like their own child pleading to be saved.

Experts note that as generative AI advances, voice cloning is bound to improve further, mimicking not just the sound of a real voice but also its emotional cues and speech patterns. Their advice is to be wary of calls from unknown numbers, especially those that do not begin with the national prefix, and to verify any claim directly with the family member concerned before transferring money or sharing sensitive information.

AI voice cloning scams are a reminder of the ethical issues raised by advancing AI technologies. While these technologies hold immense potential for beneficial applications, they also introduce new threats and leave individuals more vulnerable to manipulation by bad actors.