Like something out of a science fiction movie, criminals are using AI to create voice clones of your family, friends and coworkers, all to scam you. In a 2023 survey, the antivirus software company McAfee found that a quarter of adults across seven countries have experienced some form of AI voice scam.
The MSN article describes how scammers use AI to clone voices for fraud. Using voice clips harvested from social media and other online platforms, they generate convincing replicas of a person's voice, then impersonate that person to trick friends, family, or colleagues into believing they are speaking with the real one.

The article outlines several safety measures against such scams: be cautious about the personal information you share online, use two-factor authentication, set social media profiles to private, and be skeptical of unsolicited calls or requests for money, even when the voice sounds familiar. It also advises establishing a secret code word or phrase with close contacts to verify identity during suspicious calls. The piece closes by stressing awareness and vigilance in a digital age where AI can be turned to malicious ends.