Re: Audio Deepfake Used To Steal $240,000
Wow, if they had stopped after the first transfer, they might have gotten away with it. These kinds of calls are getting more and more common; a few months ago I was getting calls claiming my Social Security number had been suspended. I knew that was impossible, so I paid no attention. I wonder if this means governments and companies are going to crack down on tools like Lyrebird? Maybe it would be for the best if they did. URL: https://forum.audiogames.net/post/460123/#p460123 -- Audiogames-reflector mailing list Audiogames-reflector@sabahattin-gucukoglu.com https://sabahattin-gucukoglu.com/cgi-bin/mailman/listinfo/audiogames-reflector
Re: Audio Deepfake Used To Steal $240,000
Hm, I mentioned Lyrebird offhandedly because it's a similar voice synthesis method using deep learning networks like WaveNet or DeepVoice, but yes, it has some safeguards in place for account registration and data management that at the very least discourage casual deepfakery. They aren't responsible for hacked or compromised accounts or potential data breaches, though, seeing as the biometric data is stored on third-party Amazon servers. Depending on how thick a tinfoil hat you're wearing, some could argue that even having the data out there at all, where something could happen to it, is a cybersecurity risk, given how new and rapidly evolving the technology is. But that's ultimately a decision for people to weigh and make for themselves, depending on how much tinfoil they happen to have on hand, heh. Realistically, it's unlikely Lyrebird was involved, given the ease of access and sophistication of other voice AI libraries, and the possible methods used to generate such a sophisticated deepfake of the target. Why spend the time compromising a platform like Lyrebird, with the added risk of oversight, when you can more easily roll an open API and use a scraped dataset? URL: https://forum.audiogames.net/post/460121/#p460121
Re: Audio Deepfake Used To Steal $240,000
Well, doing that wouldn't exactly work with something like Lyrebird, because they give you specific things to say, and I believe you have to say them in order for it to work. I also think they go through the recordings manually. URL: https://forum.audiogames.net/post/460063/#p460063
Re: Audio Deepfake Used To Steal $240,000
2019-09-06
Thread
AudioGames . net Forum — Off-topic room : ashleygrobler04 via Audiogames-reflector
Oops... sorry, I meant to type "think", but it seems my little sister can't keep her hands off the keyboard. URL: https://forum.audiogames.net/post/460024/#p460024
Re: Audio Deepfake Used To Steal $240,000
Speaking of where you found the article: doesn't RAM also have some news for us about this? Just kidding. Well, to the point: it sounds bad. To think that I could impersonate someone's voice and get away with stealing money... What about the devices that require voice recognition? Voice recognition doesn't sound that smart to me any more... URL: https://forum.audiogames.net/post/460023/#p460023
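The worry about voice recognition above can be made concrete with a toy sketch (purely hypothetical numbers and a simplified scheme, not any real product's algorithm): speaker-verification systems commonly compare a voice embedding from the incoming call against an enrolled embedding using cosine similarity, accepting the caller if the score clears a threshold. A good synthetic voice can land essentially as close to the enrolled embedding as the genuine speaker does:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical 4-dimensional voice embeddings (real systems use hundreds
# of dimensions); all numbers below are made up for illustration.
enrolled = [0.90, 0.10, 0.40, 0.30]   # stored during enrollment
genuine  = [0.88, 0.12, 0.41, 0.29]   # the real speaker on a later call
deepfake = [0.86, 0.14, 0.38, 0.33]   # a convincing synthetic imitation

THRESHOLD = 0.95  # accept the caller if similarity clears this bar
print("genuine accepted: ", cosine(enrolled, genuine) > THRESHOLD)   # True
print("deepfake accepted:", cosine(enrolled, deepfake) > THRESHOLD)  # True -- fooled
```

The point of the toy: once a synthesizer reproduces the acoustic features the embedding captures, a fixed similarity threshold cannot distinguish the fake from the real speaker.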
Re: Audio Deepfake Used To Steal $240,000
@4 Theoretically, yes, you could. Some people have already created deepfakes of Mark Zuckerberg, Obama, and other celebrities by mining online videos. For example, there's an entirely fake audio recording [here] of Joe Rogan, made by engineers with a system called RealTalk; there are links to some articles about it on Medium in the description. I've also dug up a more recent [Guide To Speech Synthesis With Deep Learning] posted a week ago, and a list of the Tacotron research articles, with some more recent articles up to July, [here]. URL: https://forum.audiogames.net/post/459922/#p459922
Re: Audio Deepfake Used To Steal $240,000
Hi. Ooh, you mean I could go and try replicating the voice of a politician or any other public person? Not to generate money, but just trying this out, to see how good these programs are, would be quite awesome. Greetings, Moritz. URL: https://forum.audiogames.net/post/459921/#p459921
Re: Audio Deepfake Used To Steal $240,000
There are links to the Wall Street Journal embedded in the article, though they insist on you turning off your ad blocker... It's difficult to say exactly how the attackers managed to synthesize the chief executive's voice, considering they haven't been caught, but open-source [deep learning] speech synthesizers like DeepVoice, WaveNet, or Tacotron are likely candidates, among others. All they would need is a sufficient number of voice recordings of the chief executive, which they could have gotten by installing malware on his phone, compromising an existing database or service, mining YouTube videos or podcasts, or maybe hacking Alexa or Nest devices to record his conversations, building a suitable training dataset to work with. These deep learning tools are publicly available to anyone online, and only require a certain amount of training material to create passable fakes. If you're interested in reading up on some of the progress, with audio examples, I have some previous links to the [WaveNet Library], [Char2Wav], and [Tacotron] research articles. URL: https://forum.audiogames.net/post/459920/#p459920
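To make the "certain amount of training material" point concrete, here is a minimal sketch, using only Python's standard library, of the dataset-auditing step that precedes training: tallying how much usable audio a scraped collection of clips contains. The tone generator and file names are stand-ins for scraped recordings, not anything the attackers actually used:

```python
import math
import struct
import tempfile
import wave
from pathlib import Path

def write_tone(path, seconds=2.0, rate=16000, freq=220.0):
    """Write a mono 16-bit sine wave as a stand-in for a scraped voice clip."""
    n = int(seconds * rate)
    frames = b"".join(
        struct.pack("<h", int(32767 * 0.3 * math.sin(2 * math.pi * freq * t / rate)))
        for t in range(n)
    )
    with wave.open(str(path), "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(frames)

def total_minutes(folder):
    """Sum the duration of every .wav clip in a folder, in minutes."""
    total = 0.0
    for p in Path(folder).glob("*.wav"):
        with wave.open(str(p), "rb") as w:
            total += w.getnframes() / w.getframerate()
    return total / 60.0

# Fake a "dataset" of three 2-second clips, then audit it.
dataset = Path(tempfile.mkdtemp())
for i in range(3):
    write_tone(dataset / f"clip{i}.wav", seconds=2.0)
print(f"{total_minutes(dataset):.2f} minutes collected")
```

How many minutes are actually "sufficient" varies by model and by how closely the fake must match; the sketch only shows the bookkeeping, not the synthesis itself.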
Re: Audio Deepfake Used To Steal $240,000
The only thing I know of that can create synthetic voices with AI is Lyrebird, and to create a voice for that, you need to read specific sentences. Obviously the CEO wouldn't make his own voice and then let thieves, or anyone else, use it at will. Lyrebird makes it very clear that only you should have access to your account, and that all of your recorded data belongs to you, so I'm a little confused as to how this happened, unless Lyrebird was hacked or the thieves used another service that's prone to this sort of thing. But without the details of what exactly was compromised, and how, the article is too vague to be useful to people who actually want to educate themselves, imho. URL: https://forum.audiogames.net/post/459894/#p459894
Audio Deepfake Used To Steal $240,000
As reported by [motherboard]: Remember those days spent playing with Lyrebird, Google's WaveNet, or other speech synthesis AIs? Ahh, such fun... Well, we all knew where that could end up, and that moment has come. The Wall Street Journal reported that thieves used a deepfake audio recording to impersonate a chief executive and con the managing director of a UK-based energy company into transferring $240,000 to a Hungarian account to save on "late-payment fines", sending financial details over email while on the phone. According to the director, the synthesis was able to imitate not just the executive's voice, but his tonality, punctuation, and German accent. The ruse was uncovered when the thieves called back demanding a second payment, making the managing director suspicious and prompting him to call the real chief executive, landing him in the bizarre situation of talking to his boss while the deepfake copy of his boss was simultaneously demanding to speak to him. URL: https://forum.audiogames.net/post/459873/#p459873