Fake ChatGPT domains and extensions steal personal information: study


A new study has warned against the proliferation of fake ChatGPT domains and extensions, which, it said, are being used to either steal personal information or compromise user devices.

Technisanct, a digital enterprise risk management firm, said malicious actors are using social engineering tactics to trick users into downloading and installing fake ChatGPT applications. These tactics include creating convincing logos and web pages, as well as using persuasive language in marketing materials.

ChatGPT, a generative artificial intelligence-based chatbot launched a few months ago by OpenAI, has rapidly gained popularity worldwide.

Last week, security firm CloudSEK said at least 13 Facebook accounts with more than 500,000 followers had been compromised and were being used to disseminate malware through Facebook ads disguised as links to an OpenAI page.

“Fake domains are usually used for phishing and to an extent to push malware and stealers. This could potentially steal data of consumers. There is a high demand to access the ChatGPT platform and consumers with less awareness could potentially be a victim of all these,” said Nandakishore Harikumar, CEO, Technisanct.

A few weeks after ChatGPT was launched, Technisanct identified a domain named ‘Chat GpT for Windows’ that asked users to download an executable file. The file was malware, designed to steal data from Windows devices.

Similarly, Technisanct came across a Google Chrome extension that, once installed, worked as a browser data stealer, said Harikumar. This means it could steal information such as login credentials, along with other data, he said.

Researchers at CloudSEK said they have found several instances in the past two months of Facebook and YouTube pages being taken over by cyber criminals.

“After taking over a Facebook account or page, the threat actors modify the profile information to make it appear as if it is an authentic ChatGPT page. This involves using the username “ChatGPT OpenAI” and setting the ChatGPT image as the profile picture. These accounts are then used to run Facebook ads offering links to the “latest version of ChatGPT, GPT-V4” which, when downloaded, deploys a stealer malware into the victim’s device,” said the report.

The ads are designed to appear legitimate, containing all the details needed to convince unsuspecting users, said Bablu Kumar, cyber intelligence analyst at CloudSEK.

“The download link is accompanied by a password to lend further credibility to the scam. Furthermore, compromised accounts can also result in the theft of personally identifiable information and sensitive details such as payment information, etc.,” he said.

Users would do well to remember that ChatGPT does not require a download or an app, and can be accessed only through the browser. “All those apps claiming to be a ChatGPT app are not owned by OpenAI. They are just using the popularity of ChatGPT to promote their platform or app,” said Harikumar.
