According to the Federal Trade Commission (FTC), imposter scams, in which a criminal impersonates someone to steal money or personal information from a target, continue to be one of the most common forms of fraud in the U.S. A reported $2.6 billion was lost to imposter scammers in 2022, and impersonation schemes have grown in sophistication along with advancing technology. While some scammers still attempt impersonation via email or text, there is a growing trend of using Artificial Intelligence (AI) to clone a person's voice, then calling targets and tricking them into revealing sensitive information. Read on to learn how title and real estate industry professionals and consumers can avoid falling victim to this rising crime.
With AI becoming more mainstream and affordable, more people are using it in the workplace and at home. However, its accessibility also makes it a dangerous tool for criminals looking to use the technology maliciously. AI voice impersonation (also known as voice cloning) is the creation of an artificial rendering of a person's voice using AI tools or software. The software creates a synthetic, digital copy of a unique human voice, accounting for the person's gender, accent, speech patterns, inflection and breathing. According to a recent report from computer security software company McAfee, these copies can achieve an 85 percent match to the real voice. The same report included other alarming statistics, including that, in some cases, just three seconds of audio is all that is needed to clone a person's voice. Scroll through any social media platform or video messaging app, and it's easy to find a few seconds of someone's recorded voice. In fact, McAfee found that 53 percent of all adults surveyed share their voice online at least once a week.
Once a voice is copied, criminals can use the cloned version to activate voice-controlled digital devices, contact clients or business associates to extract money or personal information, or even attempt virtual kidnapping. Voice cloning has become so ubiquitous that the FTC recently released a Consumer Alert warning about the rise of "family emergency" schemes that pressure parents and grandparents into wiring money to a supposed child or grandchild in distress.
So, how does one combat such a chillingly persuasive method of fraud, particularly as a title or real estate professional interacting with clients and customers? Following these best practices can help you avoid falling victim to an AI voice impersonation scam:
Though AI voice impersonation may seem like something out of a dystopian nightmare, knowing the warning signs, staying calm and always verifying a call yourself can help thwart a potential scam.