Consumer Rights

Be wary of AI-assisted crimes using cloned voices and faces

Learn to be sceptical of information and communication from all online sources

Pushpa Girimaji

If you get a call from a family member or a close friend living in another city, saying that he or she has been involved in an accident and needs money urgently, you will not hesitate to send it, even if the call comes from an unknown number, so long as you recognise the voice on an audio call or the face on a video call.

However, today, that identity can no longer be taken at face value, thanks to voice cloning and deepfake technologies that are helping cyber fraudsters dupe people and extract money. The arrival of generative Artificial Intelligence (AI) and large language models has made the technology more accessible and easier to use. Cybercriminals have been quick to adopt it to impersonate people, imitate voices and come up with ever more innovative ways to con their victims.

So, today, as we mark National Consumer Day, we need to recognise one of the biggest threats to consumer safety — AI-aided cyber fraud. The Kerala case is a good example. On July 9, a Kozhikode resident, Radhakrishnan, got a call purportedly from a former colleague, asking for Rs 40,000 for an emergency surgery of his sister-in-law in Mumbai. Radhakrishnan was a little hesitant to send the money, even though the voice was certainly that of his old friend. He even expressed his apprehension about online scams to the caller, who promptly made a WhatsApp video call to confirm his identity. It was only after sending the Rs 40,000 that the senior citizen realised the fraudster had used deepfake technology to clone his friend’s face and voice.

Fortunately for him, the police have traced the money trail, identified the culprit and frozen his bank account, but not every victim may be so fortunate. With scams becoming more sophisticated and transnational, enforcement agencies today face many technological and legal challenges — so much so that the chargesheeting rate for cybercrime complaints in 2022 was just 29.6 per cent (national average), as per National Crime Records Bureau data. Delay in victims reporting the crime is also a factor. The total number of cybercrime cases rose to 65,893 in 2022, up from 52,974 in 2021, and the rate of crime, calculated per lakh of population, increased from 3.9 in 2021 to 4.8 in 2022.

During November and December, I came across at least four cases of AI-assisted voice cloning used by fraudsters. In Hyderabad, for example, a 59-year-old resident lost Rs 1.4 lakh to one such call that left her in no doubt that it was from her nephew living in Canada. The caller spoke to her in Punjabi, and the imitation was so perfect that when he told her he had met with an accident, was about to be jailed and needed money desperately, she did not hesitate to send it.

Shimla resident Sanjay was also a victim of such voice cloning — he lost Rs 2 lakh by responding to a call that he believed was from his uncle. A 25-year-old resident of Gomti Nagar in Lucknow lost Rs 44,500 thanks to a perfect copy of his uncle’s voice. In Delhi, Lakshmi Chand Chawla sent Rs 50,000 as ransom money to save his nephew, who he believed had been kidnapped. In order to show that the nephew was in their custody, the so-called kidnappers made Chawla listen to his nephew’s duplicated voice!

Clearly, this kind of scam using cloned voices is gaining ground. A global survey published in May this year by McAfee, titled “Artificial Imposters — Cybercriminals turn to AI voice cloning for a new breed of scam”, reported that 47 per cent of respondents in India said they had either been a victim themselves (20 per cent) or knew someone who had been a victim (27 per cent). Just three seconds of audio is enough for such cloning, and 86 per cent of Indians made their voices available online at least once a week, making it easy for fraudsters to pick up voice samples, the survey said.

This is just the beginning. Such AI-assisted crimes will only become more high-tech, innovative, devious and difficult to detect. So, while law enforcement agencies need to quickly gear up to deal with the problem — fortunately, AI can also be used effectively to counter the menace — consumers must also become extremely cautious about calls and messages from unknown numbers and sources. If there is a request for money, cross-check the identity of the caller and the veracity of the information. Do not be in a hurry to send money. Learn to be sceptical of information and communication from all online sources. I would also advise consumers to regularly check the websites of state cyber police to understand the different modus operandi used by cybercriminals. Remember, the best deterrent is consumer awareness and alertness.

