At its annual Snap Partner Summit on Tuesday, the Snapchat maker announced a series of new AI features coming to its app. Most notably, the app’s My AI chatbot is being updated to work more like Google Lens, as it can now respond to more complex Snaps.
You can now snap a picture of a menu in a foreign language and send it to My AI to get an English translation, for instance. Plus, you can take a picture of a plant and send it to the chatbot to find out what it’s called.
You will also be able to snap a picture of a parking sign and send it to the chatbot to quickly understand whether you can park at a specific location.
Google Lens first brought similar capabilities to mobile users.
The launch is another indication that Snapchat is looking to make My AI more of a useful tool, as opposed to just an entertainment feature. A few months ago, Snapchat rolled out the ability for users to set in-app reminders and countdowns through My AI. The new features announced today add more useful functionality to the chatbot, as Snap is trying to get users to open up its social app when they have certain search queries.
In addition to the updates to My AI, Snap announced that its “My Selfie” feature will enable AI-powered edits in Memories (saved Snaps) for Snapchat+ premium subscribers. The feature spruces up Snaps stored in your Memories archive by adding captions and Lenses. For instance, if you saved a picture of yourself in your Memories, Snapchat may add a Lens on top of it to reimagine you as a Renaissance painting.
Plus, you can now opt in to appear in AI-generated images alongside your friends. For instance, the app may create an AI-generated Snap of you and your friend as lawyers or Olympic swimmers.
Snapchat also announced that it’s going to launch an AI-powered Lens that will allow users to see what they will look like in the future. The launch appears to be Snapchat’s response to TikTok’s popular old-age filter.
Elsewhere on the app, Snapchat revealed that it’s rolling out improved HD video calls and Snap Mail, which lets you leave your friend a Snap if they don’t answer your call. The app is also going to start showing local time zones in chats to make it easier for people to know when to connect with friends around the world.
Aisha is a consumer news reporter at TechCrunch. Prior to joining the publication in 2021, she was a telecom reporter at MobileSyrup. Aisha holds an honours bachelor’s degree from the University of Toronto and a master’s degree in journalism from Western University.
You can contact or verify outreach from Aisha by emailing aisha@techcrunch.com or via encrypted message at aisha_malik.01 on Signal.