Features include a “strict no-logs policy”, access via Telegram and use for coding.
A GenAI tool named GhostGPT is being offered to cyber-criminals for help with writing malware code and phishing emails.
According to a blog post by Abnormal Security, as reported by SC US, GhostGPT is marketed as an “uncensored AI” and is likely a wrapper for a jailbroken version of ChatGPT or an open-source GenAI model.
It offers several features including a “strict no-logs policy” ensuring no records are kept of conversations, and access via a Telegram bot.
The researchers tested GhostGPT’s capabilities by asking it to write a phishing email impersonating Docusign, and the chatbot responded with a template for a convincing email directing the recipient to click a link to review a document.
GhostGPT can also be used for coding, with the blog post noting marketing related to malware creation and exploit development.
“Attackers now use tools like GhostGPT to create malicious emails that appear completely legitimate. Because these messages often slip past traditional filters, AI-powered security solutions are the only effective way to detect and block them,” the researchers wrote.
Written by
Dan Raywood
Senior Editor
SC Media UK
Dan Raywood is a B2B journalist with more than 20 years of experience, including covering cybersecurity for the past 16 years. He has extensively covered topics from Advanced Persistent Threats and nation-state hackers to major data breaches and regulatory changes.
He has spoken at events including 44CON, Infosecurity Europe, RANT Conference, BSides Scotland, Steelcon and ESET Security Days.
Outside work, Dan enjoys supporting Tottenham Hotspur, managing mischievous cats, and sampling craft beers.