Fake ChatGPT extension steals victims’ account details

Palo Alto Networks, a global cybersecurity company, has released findings from its threat intelligence team, Unit 42, on how the growing popularity of generative AI is leading to a surge in ChatGPT-themed fraud.

The research sheds light on the various tactics scammers use to trick users into sharing sensitive information or installing malicious software, with concrete examples and case studies demonstrating these methods.

Unit 42 examined several phishing URLs masquerading as OpenAI’s official website. Scammers behind such schemes typically create fake websites that closely mimic the official ChatGPT site to trick users into downloading malicious software or disclosing their personal and confidential information.

Additionally, scammers can use ChatGPT-related social engineering to commit identity theft and financial fraud. Although OpenAI offers a free version of ChatGPT, scammers often lure victims to scam websites and charge them for these services. For example, fake ChatGPT sites may try to lure victims into providing confidential information such as credit card details and email addresses.

Researchers have also noticed that some scammers are exploiting the growing popularity of OpenAI for cryptocurrency scams. For example, scammers may exploit the OpenAI logo and Elon Musk’s name to lure victims into fraudulent cryptocurrency giveaway events.

Sean Duca, Vice President and Regional Chief Security Officer, Palo Alto Networks, said:

“We need to educate ourselves and stay informed about the tactics scammers employ in order to protect ourselves and our sensitive information.”

“Unit 42’s research highlights the need for a robust cybersecurity framework to protect against the growing threat landscape. As technology continues to evolve, so must our approach to cybersecurity.”

Key findings from the Unit 42 report include:

  • A fake ChatGPT browser extension can add background scripts containing highly obfuscated JavaScript to the victim’s browser. This JavaScript can call Facebook APIs to steal the victim’s account details and gain access to the account.
  • Between November 2022 and April 2023, Unit 42 observed a 910% increase in monthly registrations for domains related to ChatGPT.
  • Over 100 ChatGPT-related malicious URLs were detected daily, captured from traffic seen by Palo Alto Networks’ advanced URL filtering system.
  • In the same timeframe, the team observed a nearly 18,000% increase in squatting domains in DNS security logs. Unit 42 has seen multiple phishing URLs attempting to impersonate the official OpenAI site; scammers typically create fake websites that closely resemble the official ChatGPT website to trick users into downloading malware or sharing sensitive information.
  • Even though OpenAI offers a free version of ChatGPT, scammers direct victims to fraudulent websites and claim that they must pay for these services.
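The squatting domains described above typically differ from a legitimate domain by only a character or two. As a rough illustration of how such lookalikes can be flagged (a minimal sketch, not Unit 42’s actual tooling; the domain list and distance threshold here are hypothetical):

```python
# Illustrative sketch: flag domains that closely resemble official
# OpenAI/ChatGPT domains. Hypothetical example, not Unit 42's method.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

# Hypothetical allow-list of legitimate domains.
LEGIT = ["openai.com", "chat.openai.com"]

def is_likely_squat(domain: str, max_dist: int = 2) -> bool:
    """Flag a domain within a small edit distance of a legitimate one."""
    return domain not in LEGIT and any(
        edit_distance(domain, legit) <= max_dist for legit in LEGIT)

print(is_likely_squat("openai.com"))  # False: the real domain
print(is_likely_squat("0penai.com"))  # True: one-character swap
```

Real detection systems combine many more signals (registration date, certificate data, hosting infrastructure), but edit distance captures the basic idea of a squatting domain.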

