Thieves use deepfakes to trick companies into sending them money

Ever since they first appeared in December 2017, deepfakes, videos with an almost perfect neural-network-generated face swap, have caused panic among experts. Many feared, for example, that "revenge porn" would now become even easier: any ex-boyfriend with a sufficiently powerful PC could produce whatever filth he liked starring his former girlfriend. Natalie Portman and Scarlett Johansson, two of the most popular targets of deepfake porn, publicly cursed the internet over it.

To combat the looming threat, Facebook and Microsoft recently put together a coalition against deepfakes, announcing a $10 million prize pool for the developers who come up with the best detection algorithms. That is on top of DARPA, the US Department of Defense's research agency, which has allocated $68 million to the same goal over the past two years.

Well, it's too late. The first deepfake crime has already taken place.

According to the Wall Street Journal, in March of this year the managing director of a British energy company was defrauded of €220,000 (about $240,000). He sent the money to a supplier in Hungary because his boss, the head of the parent company in Germany, had confirmed the instruction to him several times. In reality, a cunning attacker had simply used AI-powered software to imitate the executive's voice in real time and demand that the payment be made within the hour.

The program the thief used could imitate a person's voice completely: the tone, the intonation, even the German accent. The request came from the German boss's e-mail address, and a confirmation e-mail with the contact details was sent to the British director. The only thing that might have hinted something was wrong was the boss's insistence on pushing the whole deal through as quickly as possible, but that was far from the first rush job in their business.

As a result, all the money was gone. From the Hungarian account it was moved to Mexico and then scattered around the world. But the thieves did not stop there: they asked for a second urgent transfer so that the "supplies from Hungary" would "arrive even faster." At that point the British director sensed something was amiss and called his real boss. What followed was pure surrealism: he found himself taking calls in turn from the fake boss and from the real one, both speaking in the same voice. The names of the company and its employees have not been disclosed, since the case is still under investigation and the thieves have not yet been found.

This may not even be the first theft carried out with deepfake AI (or its more advanced successors). Symantec reports that it has spotted at least three cases in which voice spoofing helped thieves outwit companies and get them to send money. In one of those cases the damage ran into millions of dollars. Moreover, judging by indirect evidence, that trick was pulled off by different attackers, not the ones who robbed the British director. In other words, deepfake crime is gradually becoming commonplace; this is not the brainchild of a single ingenious hacker.

In fact, soon any student will be able to pull this off. All it takes is a sufficiently trusting victim and enough video and audio samples of the person you want to impersonate. Google Duplex is already making calls on a user's behalf in a convincingly human-sounding voice. Many small startups, mostly from China, are working to offer deepfake-like services for free. Different deepfake programs even compete over who can generate a convincing video of a person from the least amount of data, and some claim that a single photo of you will soon be enough.

In July, Israel's national cyber defense agency issued a warning about a fundamentally new type of cyberattack aimed at company executives and senior government officials, calling it the first and, so far, the most tangible AI threat. According to the agency, there are now programs that can faithfully reproduce your voice and accent after listening to you for 20 minutes. If there is a half-hour recording of your speech somewhere online, or if someone spent a little while next to you in a cafe with a voice recorder, your voice can now be used to say anything to anyone.

So far there are no tools to combat this, and only one real way to defend yourself: if someone calls and asks you to transfer a substantial amount of money, it never hurts to confirm through another channel that they are who they claim to be, via a messenger, Skype, e-mail, corporate systems, or social networks. And ideally, of course, face to face.

And if you have deep knowledge of machine learning and would not mind a piece of that $10 million pie, you can try your hand at the Microsoft and Facebook contest. Or launch your own startup offering governments and large companies a business solution for detecting deepfakes by image or by voice. Soon we will not be able to do without one.

P.S. Pochtoy.com can deliver any goods from the USA, and now not only to Russia but also to Ukraine, thanks to a partnership with Nova Poshta. Delivery of a 0.5 kg parcel starts at $11.99 (for Ukraine, at $8.00). New users who register with the HABR promo code get a 7% discount on their first delivery, plus free purchasing from American stores by our operators.