
A Scammer Used AI Technology To Clone The Voice of A Company Director and Steal $35 Million

AI voice cloning was used in a massive heist now being investigated by Dubai police, despite warnings about cybercriminal exploitation of the new technology. In early 2020, a bank manager in the United Arab Emirates received a call from a man whose voice he recognized — a director at a company he had previously spoken with.

Authorities say the director had excellent news: his company was about to make an acquisition, and he needed the Bank’s approval for transfers totaling $35 million. According to Forbes, a lawyer named Martin Zelner had been engaged to coordinate the transaction, and the bank manager had emails from the director and Zelner in his inbox verifying what money needed to move and where. The bank manager started the transfers. He had no idea he had been duped as part of an elaborate swindle in which cybercriminals had cloned the director’s voice using “deep voice” technology.

The UAE believes it was a sophisticated scheme involving at least 17 people. “Manipulating audio, which is much easier to orchestrate than deep fake videos, will only grow in volume, and without education and awareness of this new type of attack vector, as well as better authentication methods, more businesses are likely to fall victim to very convincing conversations,” said Jake Moore, a former officer with Dorset Police in the United Kingdom and now a cybersecurity specialist at security firm ESET.

HE Hamid Al Zaabi, director-general of the UAE Executive Office of Anti-Money Laundering and Counter-Terrorism Financing, said in a statement: “Even with incidents occurring outside the UAE, we will work closely with law enforcement partners around the world to identify and detect those individuals who knowingly engage in deceptive practices such as imposter fraud. The UAE will then pursue these individuals to the fullest extent of the law, ensuring they are held accountable and brought to justice as soon as possible.”

What Is a Robot Scam?


An old fraud has taken on a new form in automated forex trading systems. The con artists behind them boast about their system’s ability to generate automatic trades that earn large sums of money even while you sleep.

Because the process is now entirely automated by computers, the new terminology is “robot.” Forex robots, often known as Expert Advisors (EAs), are programs that purport to automate forex trading. It’s the equivalent of putting a plane on autopilot: traders may sleep soundly at night knowing that their trades will be executed exactly when they specify. Isn’t it simple? In many cases, however, these systems have never been submitted for formal review or evaluated by a third party. Any examination of a forex robot must include an assessment of the trading system’s parameters and optimization code. If either is wrong, the system will produce effectively random buy and sell signals, and naïve traders will do nothing but gamble (see the sketch below).
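To make the “random signals” point concrete, here is a minimal, hypothetical Python sketch of the moving-average-crossover logic many such robots are built on. The function names, parameters, and price series are invented for illustration only; the point is that on pure market noise an untested robot still emits confident buy and sell signals that are no better than coin flips.

```python
# A minimal, illustrative sketch (not any real Expert Advisor): a naive
# moving-average-crossover "forex robot". On a random-walk price series,
# its buy/sell signals carry no predictive power.
import random

def moving_average(prices, window):
    """Simple moving average over the last `window` prices."""
    return sum(prices[-window:]) / window

def robot_signal(prices, fast=5, slow=20):
    """'buy' when the fast average sits above the slow one, else 'sell'."""
    if len(prices) < slow:
        return None  # not enough history yet
    return "buy" if moving_average(prices, fast) > moving_average(prices, slow) else "sell"

# Simulate a random-walk price series -- the kind of market noise on which
# an unvalidated robot's "signals" amount to gambling.
random.seed(42)
prices = [1.1000]
signals = []
for _ in range(500):
    prices.append(prices[-1] + random.gauss(0, 0.0005))
    s = robot_signal(prices)
    if s:
        signals.append(s)

print(f"buy signals:  {signals.count('buy')}")
print(f"sell signals: {signals.count('sell')}")
# The robot trades confidently either way, but nothing here predicts the market.
```

Run on noise like this, the “robot” still produces a steady stream of trades, which is exactly how scam vendors generate impressive-looking activity with no real edge behind it.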

Forex robots have recently received a lot of attention, and forex robot frauds are not far behind. Almost every forex broker now allows its account holders to employ a forex robot to execute trades. Scammers back up the supposed credibility of these machines with claims of massive gains, lulling traders into a false sense of security only to leave them bankrupt. Those claims are typically predicated on a relatively small time frame during which the specific product happened to be successful.

Over the last week, there has been an increase in cases involving automated calls from fraudsters impersonating your bank. It is critical that individuals remain watchful and scrutinize any unexpected phone call, especially one claiming there has been fraud on their account. Fraudsters may already know things about you, so don’t treat that as proof that their approach is genuine. If you are suspicious, never give out any personal information. Instead, take five minutes to reflect before contacting your bank directly on a number you can trust, such as the one on its official website.


The Bank’s Name Is Unknown, but 17 People Were Involved in This Scam

The Bank’s name has been kept secret to protect its clients’ privacy and security; however, according to UAE officials, 17 people were involved in the crime, and the money was transferred to bank accounts around the world. For further investigation, the UAE authorities enlisted the assistance of American agencies.

“We are planning to make a substantial investment soon, for which a large quantity of money will be necessary,” the criminals, posing as the company’s director, told the bank. “We urgently require $35 million.” The incident occurred in early 2020, according to reports. Unfortunately, the Dubai Public Prosecution Office, which is handling the inquiry, has yet to identify any of those individuals.

Martin Zelner Had Been Hired To Coordinate Scam Procedures

In early 2020, the bank manager received a call from a man whose voice he recognized — a director at a company with whom he’d previously spoken.

The director had fantastic news: his company was poised to make an acquisition, and he needed the Bank to authorize transfers of some $35 million. Martin Zelner, a lawyer, had been hired to coordinate the procedures, and the bank manager could see emails from the director and Zelner in his inbox, verifying what money needed to move and where. Concluding that everything appeared to be in order, the bank manager began making the transfers.


What he didn’t realize was that he’d been duped as part of an elaborate swindle in which fraudsters used “deep voice” technology to clone the director’s speech. That is according to a court document obtained by Forbes, in which the UAE requests American investigators’ assistance in tracing $400,000 in stolen funds that went into Centennial Bank accounts in the United States. The United Arab Emirates, which is investigating the crime because it affected organizations within the country, believes it was an extensive conspiracy involving at least 17 people who moved the stolen money to bank accounts around the world.

The document provided little further information and did not name any of the victims. At the time of publication, the Dubai Public Prosecution Office, which is leading the inquiry, had not responded to requests for comment. Martin Zelner, a lawyer residing in the United States, was also contacted for comment but had not responded. It is only the second recorded case of fraudsters allegedly employing voice-shaping technology to commit a theft, but it appears to have been significantly more effective than the first, in which fraudsters used the technology to mimic the CEO of a U.K.-based energy corporation in an attempt to steal $240,000 in 2019.


Voice-mimicking software was employed

According to the Washington Post, voice-mimicking software was employed in a significant theft in 2019. In a case that some researchers are calling one of the world’s first publicly reported artificial-intelligence (AI) heists, thieves used the software to imitate a company executive’s speech and dupe his subordinate into sending hundreds of thousands of dollars to a secret account.

According to representatives of the French insurance firm Euler Hermes, the managing director of a British energy company followed orders in March to send more than $240,000 to an account in Hungary, believing his supervisor was on the phone. The request was “rather strange,” the director later noted in an email, but the voice was so realistic that he felt compelled to comply. The insurer, whose case was reported by the Wall Street Journal, provided new details on the theft to The Washington Post on Wednesday, including an email from the employee duped by “the false Johannes.”

Silicon Valley tycoons come under fire

Voice-synthesis software, which is now being developed by a wide range of Silicon Valley titans and AI startups, can mimic the rhythms and intonations of a person’s voice and be used to produce convincing speech. 

Google and smaller firms like the “ultra-realistic voice cloning” startup Lyrebird have helped refine the technology. Its developers have highlighted its beneficial applications, claiming that it can help humanize automated phone systems and allow people who have lost the ability to speak to talk again. However, its unregulated growth has raised concerns about the possibility of fraud, targeted hacking, and cybercrime.

“Criminals will utilize whatever methods allow them to achieve their aims the cheapest,” said Andrew Grotto, a fellow at Stanford University’s Cyber Policy Center and a senior director for cybersecurity policy in the Obama and Trump administrations. “This is a technique that would have sounded exotic in the extreme ten years ago but is now well within reach of any lay criminal with a little inventiveness,” Grotto added.


Symantec researchers discovered at least three incidents of executives’ voices being imitated in order to defraud businesses. Symantec declined to identify the victim companies or say whether the Euler Hermes case was one of them, but it did say that one of the cases resulted in multimillion-dollar losses.

The systems function by breaking down a person’s voice into components, such as sounds or syllables, which can then be reassembled to generate new sentences with comparable speech patterns, pitch, and tone. The insurer did not know what software was used, although several such systems are publicly available on the internet and require little expertise, speech data, or computational capacity.
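As a rough illustration of that reassembly idea (and not of the actual software used in these attacks), the hypothetical Python sketch below slices a stand-in waveform into short units and stitches them back together in a new order. The sample rate, unit length, and signal are invented for the example; real voice-cloning systems additionally model pitch, timbre, and cadence so that the joins sound natural.

```python
# Toy illustration of "break a voice into units, then reassemble them":
# not the attackers' software, just the concept the article describes.
import numpy as np

SAMPLE_RATE = 16_000          # samples per second (assumed)
UNIT_MS = 50                  # length of each reusable unit, in milliseconds
UNIT_LEN = SAMPLE_RATE * UNIT_MS // 1000

# Stand-in for a captured recording: a few seconds of varying tones.
t = np.arange(SAMPLE_RATE * 3) / SAMPLE_RATE
recording = np.sin(2 * np.pi * (120 + 40 * np.sin(2 * np.pi * 0.5 * t)) * t)

# 1. Break the "recording" into fixed-length units ("sounds or syllables").
n_units = len(recording) // UNIT_LEN
units = recording[: n_units * UNIT_LEN].reshape(n_units, UNIT_LEN)

# 2. Reassemble the units in a new order to produce audio never spoken as such.
rng = np.random.default_rng(0)
forged = units[rng.permutation(n_units)].reshape(-1)

print(f"original units: {n_units}, forged signal: {len(forged)} samples")
# Real systems go much further, learning the speaker's pitch, timbre and cadence
# so that arbitrary new sentences sound like the original voice.
```

The takeaway for defenders is in the inputs, not the code: a few minutes of publicly available speech can be enough raw material for this kind of reassembly.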


Once restricted to fictional capers like Mission: Impossible, voice cloning is now widely available. From London’s Aflorithmic to Ukraine’s Respeecher, various tech businesses are developing increasingly sophisticated AI voice technology. In recent months, the late Anthony Bourdain’s voice was synthesized for a documentary on his life, causing quite a stir. Meanwhile, recognizing the potential for malevolent use of AI, a few companies, including the $900 million-valued security startup Pindrop, claim they can detect synthesized speech and so prevent fraud. If recordings of anyone speaking are available online, whether on social media, YouTube, or an employer’s website, a secret war for control of their voice may be taking place without their knowledge.

If you have been scammed online, then contact us to get your money back!


What Would Funds Trace Do in a Case Similar to This?

Funds Trace is a lifesaver for fraud victims: it is an investigative recovery firm equipped with industry specialists who can examine your case, collect information and data on the criminals, and assist you in identifying your fraudster.

They don’t stop there; they make certain that the victim’s money is returned in full. Funds Trace provides a fully integrated and functional solution for identifying and recovering stolen assets. The full recovery process can take one to three months, depending on the severity of your fraud case. If you suspect funds have been taken from your account, their local team will conduct an investigation and pursue legal action on your behalf. They will facilitate the process by registering a dispute and contesting the case for you. What else could someone wish for?

Key Takeaways!

The UAE case demonstrates how destructive such high-tech swindles can be, and it comes amid warnings about the use of AI in cybercrime to create so-called deep fake images and voices.

“Audio and visual deep fakes are a fascinating development of 21st-century technology, but they are also potentially incredibly dangerous, posing a huge threat to data, money, and businesses,” says Jake Moore, a former police officer with Dorset Police in the United Kingdom and now a cybersecurity expert at security firm ESET. “We are currently on the verge of malevolent actors moving their skills and resources into employing cutting-edge technology to deceive people who are blissfully oblivious of the realms of deep fake technology, let alone their existence.”

We need to be vigilant and have firms like Funds Trace at our backs in situations like this.
