AI Deepfakes Used by Criminals to Bypass Crypto Security

In Summary

  • AI deepfake tool sold in underground markets bypasses crypto security.
  • Used to create fake credentials for illicit activities.
  • Over $5.3B lost to fraud in 2023, up from $3.9B in 2022.
  • Experts urge crypto exchanges to enhance security measures.


New York, Friday, October 18, 2024 – A network security firm has identified a new AI tool being sold in underground markets, enabling criminals to bypass security protocols at cryptocurrency exchanges.

According to a report by Cato Networks, the deepfake tool is being used to create synthetic accounts for illicit activities like money laundering and fraud.

The firm warns that criminals are generating fake credentials using AI-rendering websites and employing the deepfake tool to forge documents such as passports.

These documents are then used to pass crypto exchanges’ identity verification systems, allowing the creation of verified but fake accounts.

Cato Networks shared a demonstration in which the tool was used to create a verified account on a crypto exchange within minutes. Instead of capturing a live camera feed, the tool injects a pre-made deepfake video to satisfy the facial verification step.
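One common defense against this kind of replay is a randomized liveness challenge: a pre-recorded deepfake video cannot anticipate an action chosen at verification time. The sketch below is purely illustrative and assumes no particular exchange's API; the challenge list and function names are hypothetical.

```python
import secrets

# Illustrative challenge-response liveness check. A static, pre-made
# deepfake video cannot react to a prompt selected at random during
# verification, so requiring a live response defeats simple replay.
CHALLENGES = ["turn head left", "turn head right", "blink twice", "smile"]

def issue_challenge() -> str:
    """Pick an unpredictable action the user must perform on camera."""
    return secrets.choice(CHALLENGES)

def verify_response(challenge: str, observed_action: str) -> bool:
    """Accept only if the observed action matches the issued challenge."""
    return observed_action == challenge
```

A production system would analyze the video stream itself to detect the action (and look for injection artifacts); this only shows the challenge-response idea that makes pre-recorded footage insufficient.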

Fraud involving new accounts has surged, with over $5.3 billion in losses reported in 2023, up from $3.9 billion in 2022. As these AI-based tools evolve, cybersecurity experts urge crypto exchanges to strengthen their security systems to combat such threats.

Cato Networks advises exchanges to stay updated on the latest cybercrime trends and adopt intelligence-driven strategies to safeguard against these AI-driven attacks.
