New York, Friday, October 18, 2024 – A network security firm has identified a new AI-powered deepfake tool being sold in underground markets that enables criminals to bypass identity verification protocols at cryptocurrency exchanges.
According to a report by Cato Networks, the deepfake tool is being used to create synthetic accounts for illicit activities like money laundering and fraud.
The firm warns that criminals are generating fake credentials with AI image-generation websites and using the deepfake tool to forge identity documents such as passports.
These forged documents are then used to pass crypto exchanges’ identity verification systems, allowing criminals to open accounts that are verified yet fake.
Cato Networks shared a demonstration in which the tool was used to create a verified account on a crypto exchange within minutes. Instead of supplying a live camera feed for facial recognition, the tool feeds a pre-made deepfake video into the verification process to mimic the required live check.
Fraud involving new accounts has surged, with over $5.3 billion in losses reported in 2023, up from $3.9 billion in 2022.
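Those figures imply a year-over-year increase of roughly 36 percent. The short Python sketch below is an illustrative calculation added for context, not part of the Cato Networks report, and uses only the loss totals cited above.

```python
# Illustrative calculation (not from the Cato Networks report):
# year-over-year change in reported new-account fraud losses,
# using the totals cited above, in billions of US dollars.
losses_2022 = 3.9
losses_2023 = 5.3

absolute_increase = losses_2023 - losses_2022              # ~$1.4B
percent_increase = absolute_increase / losses_2022 * 100   # ~35.9%

print(f"Increase: ${absolute_increase:.1f}B ({percent_increase:.1f}%)")
```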
As these AI-based tools evolve, cybersecurity experts urge crypto exchanges to strengthen their security systems to combat such threats. Cato Networks advises exchanges to stay updated on the latest cybercrime trends and adopt intelligence-driven strategies to safeguard against these AI-driven attacks.