More Than 100,000 Compromised ChatGPT Account Credentials Sold on Dark Web

Over 100,000 compromised OpenAI ChatGPT account credentials made their way onto illicit dark web marketplaces between June 2022 and May 2023, with India alone accounting for 12,632 stolen credentials.

The credentials were discovered within information stealer logs made available for sale on the cybercrime underground, Group-IB said in a report shared with The Hacker News.

“The number of available logs containing compromised ChatGPT accounts reached a peak of 26,802 in May 2023,” the Singapore-headquartered company said. “The Asia-Pacific region has experienced the highest concentration of ChatGPT credentials being offered for sale over the past year.”

Other countries with the highest numbers of compromised ChatGPT credentials include Pakistan, Brazil, Vietnam, Egypt, the U.S., France, Morocco, Indonesia, and Bangladesh.

Further analysis revealed that the majority of logs containing ChatGPT accounts were breached by the notorious Raccoon info stealer, followed by Vidar and RedLine.

Information stealers have become popular among cybercriminals for their ability to harvest passwords, cookies, credit card details, and other information from browsers and cryptocurrency wallet extensions.

“Logs containing compromised information harvested by info stealers are actively traded on dark web marketplaces,” Group-IB said.

“Additional information about logs available on such markets includes the lists of domains found in the log as well as the information about the IP address of the compromised host.”

Typically offered under a subscription-based pricing model, these stealers have not only lowered the bar for cybercrime but also serve as a conduit for launching follow-on attacks using the siphoned credentials.

“Many enterprises are integrating ChatGPT into their operational flow,” Dmitry Shestakov, head of threat intelligence at Group-IB, said.

“Employees enter classified correspondences or use the bot to optimize proprietary code. Given that ChatGPT’s standard configuration retains all conversations, this could inadvertently offer a trove of sensitive intelligence to threat actors if they obtain account credentials.”

To mitigate such risks, it’s recommended that users follow appropriate password hygiene practices and secure their accounts with two-factor authentication (2FA) to prevent account takeover attacks.
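One concrete piece of password hygiene is checking whether a password already appears in known breach corpora before using it. Below is a minimal sketch of the k-anonymity scheme used by the public Pwned Passwords range API (assumed here as an illustration; only the first five hex characters of the SHA-1 hash are ever sent to the server, and the suffix is matched locally). The sample response body is fabricated for demonstration; a real lookup would issue an HTTPS GET to `api.pwnedpasswords.com/range/<prefix>`.

```python
import hashlib

def pwned_range_query(password: str) -> tuple[str, str]:
    """Split the SHA-1 hash of a password into the 5-character prefix
    sent to the range API and the 35-character suffix checked locally,
    so the full hash never leaves the machine (k-anonymity)."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def breach_count(suffix: str, range_response: str) -> int:
    """Scan a range-API response body (lines of 'SUFFIX:COUNT') for our
    hash suffix; returns the breach count, or 0 if not found."""
    for line in range_response.splitlines():
        candidate, _, count = line.strip().partition(":")
        if candidate == suffix:
            return int(count)
    return 0

# Demonstration with a fabricated response body:
prefix, suffix = pwned_range_query("password123")
sample_response = f"{suffix}:12345\n0018A45C4D1DEF81644B54AB7F969B88D65:2"
print(prefix, breach_count(suffix, sample_response))
```

A hit (a nonzero count) means the password has appeared in a breach dump and should be rotated; pairing this check with 2FA removes single-credential compromise as a takeover vector.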

The development comes amid an ongoing malware campaign that’s leveraging fake OnlyFans pages and adult content lures to deliver a remote access trojan and an information stealer called DCRat (or DarkCrystal RAT), a modified version of AsyncRAT.

“In observed instances, victims were lured into downloading ZIP files containing a VBScript loader which is executed manually,” eSentire researchers said, noting the activity has been underway since January 2023.

“File naming convention suggests the victims were lured using explicit photos or OnlyFans content for various adult film actresses.”

It also follows the discovery of a new VBScript variant of the GuLoader malware (aka CloudEyE) that employs tax-themed decoys to launch PowerShell scripts capable of retrieving and injecting Remcos RAT into a legitimate Windows process.

“GuLoader is a highly evasive malware loader commonly used to deliver info-stealers and Remote Administration Tools (RATs),” the Canadian cybersecurity company said in a report published earlier this month.

“GuLoader leverages user-initiated scripts or shortcut files to execute multiple rounds of highly obfuscated commands and encrypted shellcode. The result is a memory-resident malware payload operating inside a legitimate Windows process.”

– The Hacker News
