The emergence of Artificial Intelligence (AI) has transformed our daily lives, enhancing efficiency and streamlining numerous tasks. AI has also disrupted traditional educational teaching methods (1). Nonetheless, as with any technological innovation, AI can be weaponized for criminal purposes, and its use in senior fraud scams (2) and kidnapping scams (3) is an alarming trend that we all need to be aware of.
In a traditional kidnapping scam, criminals use deception to trick victims into believing that a loved one has been kidnapped or is in danger. In the past, these scams were carried out through phone calls: the intended target receives a call from someone claiming to have kidnapped the target’s spouse or child and demanding that a ransom be paid through an e-transfer, or the loved one will be harmed. To add realism, an adult or child could be heard crying in the background of the call, leading the target to believe the call is legitimate (4). To make the call even more convincing, the criminal may spoof the family member’s phone number so the call appears to come from the spouse’s or child’s cellphone (5).
However, with the advent of AI, scammers have found a new tool to make their scams more convincing and harder to detect. Through voice synthesis technology, scammers can create a lifelike synthetic voice that sounds just like the voice of the victim’s loved one (6). The criminal can then use this voice to make phone calls to their target, pretending to be the kidnapped person. By using a synthetic voice, the scammer can disguise their true identity and make it more difficult for the victim to detect the fraud.
We believe that in the near future, voice cloning technology will be combined with deepfakes: realistic digital manipulations of images and video that can make it appear as though someone is saying or doing something they never actually did. In the context of kidnapping scams, deepfakes could be used to create convincing videos showing the victim’s loved one being held captive or in danger. By pairing a realistic video with a cloned voice of the family member, the criminal can further deceive their target and make it more likely that the ransom will be paid.
To obtain a voice to clone, these criminals exploit social media platforms. TikTok and YouTube, for instance, provide ideal sources of voice samples for cloning purposes (7). With as little as a three-second audio clip, a convincing voice clone can be created, which, when combined with other personal information harvested from the victim’s public social media accounts, adds to the credibility of the scam call.
Let us consider the following scenario:
• A criminal monitors your family member’s social media activity.
• Your loved one shares a video clip online discussing their travel plans or vacation.
• The criminal clones their voice from the shared video.
• Once your family member has left for vacation, the criminal uses the cloned voice, along with other personal data gathered online, to deceive you into believing you are speaking with your family member, lending the call both authenticity and urgency.
So how do we protect ourselves from these AI voice cloning scams?
Strategy #1 – AWARENESS
- Individuals should be vigilant and skeptical of unsolicited phone calls, emails, or messages from unknown or unexpected sources demanding a ransom or requesting bail money to get a loved one out of jail.
- If the criminal hasn’t spoofed your family member’s phone number, the call may come from an area code you don’t recognize, which is a good indicator that it may be a scam.
Strategy #2 – IDENTIFICATION CONFIRMATION
- Some social media safety advocates suggest using a safe word to validate the caller’s identity and confirm that you are not hearing a cloned voice of your loved one. However, the challenge with safe words is the likelihood of forgetting them, especially if they are not regularly practiced.
- A better alternative to a safe word is to pose a question that only your family member would know, such as “What is the color of our family car?” or “What is the name of our aunt residing in Ontario?” If the person cannot respond instantly and accurately, it should raise suspicion that the situation may not be as authentic as it appears.
- If you can, immediately call your family member, or the friends they are with, to confirm their status and safety.
Strategy #3 – KEEP THINGS PRIVATE
- Given that fabricating a cloned voice and gathering personal information are crucial to the credibility of this online crime, it is now more important than ever to lock down your social media accounts and make them private.
The use of AI has undoubtedly brought significant benefits to our daily lives, from increasing efficiency to transforming the way we learn. However, as with any technological advancement, there is always the potential for it to be exploited for criminal purposes.
Unfortunately, online scams are becoming increasingly prevalent, and the use of AI in these scams is a concerning trend. Through voice cloning technology and deepfake videos, criminals can now create convincing impersonations of our loved ones, making it harder for us to detect the fraud.
To protect ourselves and our loved ones from this modern online crime, we must remain aware, be cautious of unsolicited communication, confirm identities through personal questions, and take steps to make our social media accounts private.
Digital Food For Thought
The White Hatter
References:
1. https://thewhitehatter.ca/blog/chatgpt-friend-or-foe-what-parents-educators-need-to-know/
2. https://www.cbc.ca/news/canada/newfoundland-labrador/ai-vocal-cloning-grandparent-scam-1.6777106
3. https://globalnews.ca/news/9629883/ai-kidnapping-scam-teen-girl-voice-cloned-extortion-arizona-jennifer-destefano/
4. https://globalnews.ca/news/9301636/virtual-kidnapping-scam-north-vancouver/
5. https://www.androidauthority.com/best-spoof-call-apps-android-1038299/
6. https://www.marketwatch.com/press-release/2023-voice-cloning-market-latest-research-report-by-2029-2023-02-22
7. https://www.freethink.com/robots-ai/voice-cloning-vall-e