A novice just used ChatGPT to create terrifyingly sophisticated malware
- A self-proclaimed novice reportedly created data-stealing malware using ChatGPT.
- The malware is said to be as sophisticated as nation-state-level malware.
- Building this kind of malware would normally take a team weeks.
Now that AI tools like Bard and ChatGPT are available to the public, it may be time to start asking questions about safety, especially now that someone has used one of them to create a sophisticated malware program that's almost undetectable.
According to Digital Trends, Forcepoint security researcher Aaron Mulgrew says that he created zero-day malware using only ChatGPT. While OpenAI's chatbot has protections that normally prevent users from doing this, the self-proclaimed novice found a loophole.
Instead of having the software create the malware all at once, Mulgrew reportedly had the AI write separate lines of malicious code. Once the process was done, Mulgrew was able to compile the individual functions into a single cohesive data-stealing program.
The malware in question is said to disguise itself as a screensaver app that auto-launches on Windows. It’s capable of taking data from files, breaking it down into smaller pieces that hide in images, and uploading that data to a Google Drive folder.
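The report doesn't publish Mulgrew's code, but hiding data inside image files is a well-known technique called steganography, most commonly done by overwriting the least significant bit of each pixel byte. As a benign, minimal sketch of that general idea (operating on a raw byte buffer standing in for pixel data, not on Mulgrew's actual implementation):

```python
def embed_lsb(pixels: bytearray, payload: bytes) -> bytearray:
    """Hide each bit of `payload` in the least significant bit of one carrier byte."""
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("payload too large for carrier")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear the LSB, then set it to the payload bit
    return out

def extract_lsb(pixels: bytearray, n_bytes: int) -> bytes:
    """Recover `n_bytes` of hidden data by reading the carrier's LSBs back out."""
    bits = [pixels[i] & 1 for i in range(n_bytes * 8)]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[b * 8:(b + 1) * 8]))
        for b in range(n_bytes)
    )

# Toy example: a 512-byte "pixel buffer" carrying one small data chunk.
carrier = bytearray(range(256)) * 2
secret = b"chunk-01"
stego = embed_lsb(carrier, secret)
print(extract_lsb(stego, len(secret)))  # b'chunk-01'
```

Because only the lowest bit of each byte changes, the carrier image looks visually unchanged, which is what makes this style of exfiltration hard to spot.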
The fact that Mulgrew was able to create malware this way is scary enough, but it gets worse. Reportedly, Mulgrew was able to refine and strengthen his code with ChatGPT to the point that VirusTotal tests could no longer detect it. It’s also said to be as sophisticated as any nation-state-level threat.
However, the truly scary part is the fact that he did this all on his own in a matter of hours. Creating malware of this level would usually be a team effort requiring weeks of work.
Thankfully, the malware is not publicly available. This was just a test that Mulgrew was conducting. But this goes to show just how dangerous ChatGPT could be in the wrong hands.