The conversation around uniting artificial intelligence and the cryptocurrency industry has frequently revolved around AI's capacity to help deter fraud. A critical consideration is being overlooked, however: AI could paradoxically exacerbate the problem. Meta recently warned that malicious actors are leveraging OpenAI's ChatGPT in attempts to infiltrate Facebook user accounts.
Between March and April, Meta registered more than a thousand deceptive links disguised as ChatGPT extensions, and went so far as to call ChatGPT the "new crypto" from a fraudster's perspective. The warning comes as more than 700 token trading pairs on DEXTools, an interactive crypto trading platform that tracks numerous tokens, now include "ChatGPT" or "OpenAI" in their names. This trend is unfolding despite OpenAI's conspicuous silence on any intention to enter the blockchain realm.
Cybercriminals are increasingly exploiting social media platforms as launchpads for promoting sham coins. The broad influence and extensive reach of these platforms give fraudsters ideal conditions for amassing considerable followings in short order. With AI-driven tools, they can amplify that influence further and create the illusion of a devoted following, enhancing the perceived credibility of their schemes.
A considerable portion of cryptocurrency activity hinges on social proof, with investors and potential buyers placing faith in projects that display large, seemingly loyal online followings. The advent of AI threatens this trust-based system.
As AI advances, follower counts and realistic-looking interactions can no longer be treated as indicators of legitimacy. Other threats are likely to follow, including "pig butchering" scams, in which an AI interacts with victims, typically elderly or otherwise vulnerable people, over an extended period before defrauding them. AI's sophistication lets fraudsters automate these operations, extending their reach to susceptible individuals across the crypto ecosystem.
Fraudsters could use AI-powered chatbots and virtual assistants to engage potential victims, offer investment advice, promote fake tokens and initial coin offerings, and pitch lucrative-sounding investment opportunities. Because these AI-driven scams can mimic human conversation convincingly, they are particularly perilous. By combining social media reach with AI-generated content, criminals can orchestrate sophisticated pump-and-dump schemes that leave many investors with losses.
Investors have also been cautioned about deepfake scams, which use AI to fabricate highly realistic digital content by manipulating images, videos, and audio to falsely portray influencers or other well-known personalities endorsing fraudulent schemes. One notorious example involved a deepfake video of former FTX CEO Sam Bankman-Fried that directed users to a deceptive website promising to double their cryptocurrency.
In March 2023, the self-proclaimed AI project Harvest Keeper defrauded its users of approximately $1 million. Concurrently, projects began appearing on Twitter under the moniker “CryptoGPT.”
In contrast, AI holds promise for automating the more mundane aspects of crypto development, making it a valuable tool for blockchain specialists. Routine tasks, such as setting up Solidity environments or generating boilerplate code, could be simplified with AI. The resulting reduction in entry barriers could shift the industry's focus from development skill to the intrinsic value of an idea.
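As a toy illustration of the kind of boilerplate generation described above, the sketch below templates out a minimal Solidity contract skeleton from Python. The template contents and the helper name `scaffold_contract` are illustrative assumptions, not the output of any real AI tool.

```python
# Illustrative sketch: generating base Solidity code from a template.
# The template and function name are assumptions made for this example.

SOLIDITY_TEMPLATE = """\
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract {name} {{
    address public owner;

    constructor() {{
        owner = msg.sender;
    }}

    // TODO: project-specific logic goes here
}}
"""

def scaffold_contract(name: str) -> str:
    """Return a minimal Solidity contract skeleton for the given name."""
    return SOLIDITY_TEMPLATE.format(name=name)

print(scaffold_contract("MyToken"))
```

An AI assistant would of course produce richer output than a fixed template, but the point stands: the repetitive scaffolding step is exactly the part that lends itself to automation.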
AI's potential to democratize processes typically reserved for a small class of seasoned developers is exciting, as is the expanding access to advanced development tools and launchpads in the crypto space. But this must be balanced with caution: the easier it becomes to launch a project, the easier it becomes to launch a fraudulent one. Users should remain vigilant and investigate thoroughly before investing, for example by scrutinizing suspicious URLs and steering clear of unsolicited investment offers.
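One simple form of that URL vigilance can even be automated: flagging lookalike domains that are close to, but not exactly, a trusted one (a common phishing pattern, e.g. "0penai.com" for "openai.com"). The sketch below is a minimal heuristic using only Python's standard library; the trusted-domain list and the 0.8 similarity threshold are assumptions for illustration, not a complete phishing defense.

```python
from difflib import SequenceMatcher

# Example allowlist of domains the user expects to interact with
# (an assumption for this sketch -- any real tool would use a larger list).
TRUSTED_DOMAINS = ["openai.com", "uniswap.org", "etherscan.io"]

def similarity(a: str, b: str) -> float:
    """String similarity in [0, 1]; 1.0 means identical."""
    return SequenceMatcher(None, a, b).ratio()

def looks_like_spoof(domain: str, threshold: float = 0.8) -> bool:
    """Flag a domain that is suspiciously close to, but not exactly,
    a trusted domain -- the classic lookalike phishing pattern."""
    if domain in TRUSTED_DOMAINS:
        return False  # exact match with a trusted domain
    return any(similarity(domain, t) >= threshold for t in TRUSTED_DOMAINS)

print(looks_like_spoof("openai.com"))  # False: exact trusted match
print(looks_like_spoof("0penai.com"))  # True: one-character lookalike
```

A character-similarity ratio is a deliberately crude stand-in here; real phishing detection also weighs homoglyphs, registration age, and TLD swaps, but the sketch shows why a near-miss domain deserves suspicion rather than trust.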