Scammers have found a new use for AI: creating custom chatbots posing as real AI assistants to pressure victims into buying worthless cryptocurrencies.
We recently came across a live “Google Coin” presale site featuring a chatbot that claimed to be Google’s Gemini AI assistant. The bot guided visitors through a polished sales pitch, answered their questions about the investment, projected returns, and ultimately steered victims toward sending an irreversible crypto payment to the scammers.
Google does not have a cryptocurrency. But “Google Coin” has appeared in scams before, so anyone checking it out may find enough references to think it’s real. And the chatbot was very convincing.
The chatbot introduced itself as,
“Gemini — your AI assistant for the Google Coin platform.”
It used Gemini-style branding, including the sparkle icon and a green “Online” status indicator, creating the immediate impression that it was an official Google product.
When asked, “Will I get rich if I buy 100 coins?”, the bot responded with specific financial projections. A $395 investment at the current presale price would be worth $2,755 at listing, it claimed, representing “approximately 7x” growth. It cited a presale price of $3.95 per token, an expected listing price of $27.55, and invited further questions about “how to participate.”
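Those numbers carry no market insight; they are a fixed multiplier applied to whatever amount the visitor names. A minimal sketch of the arithmetic, using only the figures the bot itself quoted:

```python
# The figures quoted by the fake "Gemini" chatbot in its pitch.
PRESALE_PRICE = 3.95    # claimed presale price per token, USD
LISTING_PRICE = 27.55   # claimed listing price per token, USD

tokens = 100
cost = tokens * PRESALE_PRICE            # 100 * 3.95  = 395.00
claimed_value = tokens * LISTING_PRICE   # 100 * 27.55 = 2,755.00
multiple = LISTING_PRICE / PRESALE_PRICE # ~6.97, rounded to "approximately 7x"

print(f"Cost today:               ${cost:,.2f}")
print(f"Claimed value at listing: ${claimed_value:,.2f}")
print(f"Claimed growth:           {multiple:.2f}x")
```

In other words, the “projection” is just the claimed listing price divided by the presale price; the bot would produce the same 7x figure for any purchase size.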
This is the kind of personalized, responsive engagement that used to require a human scammer on the other end of a Telegram chat. Now the AI does it automatically.
What stood out during our analysis was how tightly controlled the bot’s persona was. When pressed, it didn’t get confused or break character. It looped back to the same scripted claims: a “detailed 2026 roadmap,” “military-grade encryption,” “AI integration,” and a “growing community of investors.”
Whoever built this chatbot locked it into a sales script designed to build trust, overcome doubt, and move visitors toward one outcome: sending cryptocurrency.
Scammers have always relied on social engineering. Build trust. Create urgency. Overcome skepticism. Close the deal.
Traditionally, that required human operators, which limited how many victims could be engaged at once. AI chatbots remove that bottleneck entirely.
A single scam operation can now deploy a chatbot that engages any number of victims at once, around the clock, never tires, and never strays from its script.
This matches a broader trend identified by researchers. According to Chainalysis, roughly 60% of all funds flowing into crypto scam wallets were tied to scammers using AI tools. AI-powered scam infrastructure is becoming the norm, not the exception. The chatbot is just one piece of a broader AI-assisted fraud toolkit—but it may be the most effective piece, because it creates the illusion of a real, interactive relationship between the victim and the “brand.”
The chatbot sits on top of a convincing scam operation. The Google Coin website mimics Google’s visual identity with a clean, professional design, complete with the “G” logo, navigation menus, and a presale dashboard. It claims to be in “Stage 5 of 5” with over 9.9 million tokens sold and a listing date of February 18—all manufactured urgency.
To borrow credibility, the site displays logos of major companies—OpenAI, Google, Binance, Squarespace, Coinbase, and SpaceX—under a “Trusted By Industry” banner. None of these companies have any connection to the project.
If a visitor clicks “Buy,” they’re taken to a wallet dashboard that looks like a legitimate crypto platform, showing balances for “Google” (on a fictional “Google-Chain”), Bitcoin, and Ethereum.
The purchase flow lets users buy any number of tokens they want and generates a corresponding Bitcoin payment request to a specific wallet address. The site also layers on a tiered bonus system that kicks in at 100 tokens and scales up to 100,000: buy more and the bonuses climb from 5% up to 30% at the top tier. It’s a classic upsell tactic designed to make you think it’s smarter to spend more.
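The bonus ladder itself is a trivial lookup on purchase size. Here is a minimal sketch of how such a tier table behaves; only the 100-token entry point at 5% and the 100,000-token ceiling at 30% come from the site, and the intermediate tiers are hypothetical placeholders for illustration:

```python
# Illustrative sketch of a tiered "presale bonus" upsell.
# Only the lowest (100 tokens -> 5%) and highest (100,000 tokens -> 30%)
# tiers come from the scam site; the middle tiers are invented here
# purely for illustration.
BONUS_TIERS = [
    (100_000, 0.30),  # top tier quoted on the site
    (10_000, 0.20),   # hypothetical
    (1_000, 0.10),    # hypothetical
    (100, 0.05),      # entry tier quoted on the site
]

def bonus_rate(tokens: int) -> float:
    """Return the bonus rate for a purchase of `tokens` tokens."""
    for threshold, rate in BONUS_TIERS:
        if tokens >= threshold:
            return rate
    return 0.0

for amount in (50, 100, 5_000, 100_000):
    print(f"{amount:>7} tokens -> {bonus_rate(amount):.0%} bonus")
```

However the tiers are arranged, the ladder has one job: to make a larger payment look like a better deal. The “bonus” tokens are exactly as worthless as the rest.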
Every payment is irreversible. There is no exchange listing, no token with real value, and no way to get your money back.
We’re entering an era where the first point of contact in a scam may not be a human at all. AI chatbots give scammers something they’ve never had before: a tireless, consistent, scalable front-end that can engage victims in what feels like a real conversation. When that chatbot is dressed up as a trusted brand’s official AI assistant, the effect is even more convincing.
According to the FTC’s Consumer Sentinel data, US consumers reported losing $5.7 billion to investment scams in 2024 (more than any other type of fraud, and up 24% on the previous year). Cryptocurrency remains the second-largest payment method scammers use to extract funds, because transactions are fast and irreversible. Now add AI that can pitch, persuade, and handle objections without a human operator—and you have a scalable fraud model.
AI chatbots on scam sites will become more common. Here’s how to spot them, with a rough illustrative sketch after the list:
They impersonate known AI brands. A chatbot calling itself “Gemini,” “ChatGPT,” or “Copilot” on a third-party crypto site is almost certainly not what it claims to be. Anyone can name a chatbot anything.
They won’t answer due diligence questions. Ask what legal entity operates the platform, what financial regulator oversees it, or where the company is registered. Legitimate operations can answer those questions; scam bots try to avoid them (and if they do answer, verify it).
They project specific returns. No legitimate investment product promises a specific future price. A chatbot telling you that your $395 will become $2,755 is not giving you financial information—it’s running a script.
They create urgency. Pressure tactics like “Stage 5 ends soon,” “listing date approaching,” and “limited presale” are designed to push you into making fast decisions.
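None of these checks requires special tooling, but they can also be automated. As a purely illustrative sketch (the phrase lists are examples, not a real or exhaustive detection rule), here is one way to scan a chatbot’s replies for the red flags above:

```python
import re

# Illustrative phrase lists based on the red flags described above;
# these are examples only, not an official or exhaustive detection rule.
IMPERSONATED_BRANDS = ["gemini", "chatgpt", "copilot"]
URGENCY_PHRASES = ["ends soon", "listing date", "limited presale", "last chance"]
RETURN_PROMISE = re.compile(r"\b\d+(\.\d+)?x\b|guaranteed|will be worth", re.IGNORECASE)

def red_flags(message: str, on_official_site: bool = False) -> list[str]:
    """Return the red flags found in a single chatbot message."""
    text = message.lower()
    flags = []
    if not on_official_site and any(brand in text for brand in IMPERSONATED_BRANDS):
        flags.append("claims to be a known AI assistant on a third-party site")
    if RETURN_PROMISE.search(message):
        flags.append("projects specific or guaranteed returns")
    if any(phrase in text for phrase in URGENCY_PHRASES):
        flags.append("manufactures urgency")
    return flags

print(red_flags("I am Gemini. Your $395 will be worth $2,755 at listing - stage 5 ends soon!"))
```

A legitimate assistant has no reason to promise returns or manufacture urgency; a scam bot running a sales script tends to trip several of these at once.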
Google does not have a cryptocurrency. It has not launched a presale. And its Gemini AI is not operating as a sales assistant on third-party crypto sites. If you encounter anything suggesting otherwise, close the tab.
If you’ve already sent funds, report it to your local law enforcement, the FTC at reportfraud.ftc.gov, and the FBI’s IC3 at ic3.gov.
We don’t just report on scams—we help detect them
Cybersecurity risks should never spread beyond a headline. If something looks dodgy to you, check if it’s a scam using Malwarebytes Scam Guard. Submit a screenshot, paste suspicious content, or share a link, text or phone number, and we’ll tell you if it’s a scam or legit. Available with Malwarebytes Premium Security for all your devices, and in the Malwarebytes app for iOS and Android.
Stefan Dasic
Passionate about antivirus solutions, Stefan has been involved in malware testing and AV product QA from an early age. As part of the Malwarebytes team, Stefan is dedicated to protecting customers and ensuring their security.