Mashable
Google's latest AI shopping tool is Cher's Clueless closet IRL
Google's latest shopping feature makes Cher Horowitz's computerized closet a reality. The new virtual try-on tool within its "AI Mode" search option lets users see how outfits look on photos of themselves. Announced during the opening keynote at Google I/O 2025 on Tuesday, the tool uses a new custom image-generation model to place clothing pictured in online product listings onto a full-length shot provided by the user. Per a company blog post, the model "understands the human body and nuances of clothing -- like how different materials fold, stretch and drape on different bodies." According to Google, it will also be able to accommodate different poses.
Whoa: Google Meet can now translate a foreign language in real-time
You already know Google Translate, but what about live voice translation in Google Meet? This feature is one of the major Workspace announcements Google shared at its annual I/O event on Tuesday. Starting today, Google is rolling out real-time speech translation in Google Meet for subscribers of its AI Premium plan. When a user on a Google Meet video call turns on this feature, an AI audio model uses their speech to live translate what they're saying into another language. Google is starting with English and Spanish, with more languages coming in the next few weeks.
Google AI Mode is launching in the U.S., kicking off a new era of AI search
Google just cracked open the future of search, and it talks back. During today's Google I/O 2025 keynote event, Google announced that it is now rolling out the AI Mode search tool to everyone in the United States. Powered by Gemini, AI Mode will now include new "Deep Search" features and some agentic capabilities. AI Mode represents the biggest shift in Google Search since its inception. It's no longer just a place to find links.
Driverless cars need to be more human, study finds
Self-driving cars may need to act more like humans, new research has found. A study published in the Proceedings of the National Academy of Sciences found autonomous cars with a greater focus on being "socially sensitive" would prove safer. What does being socially sensitive mean? In short, it seems that it's driving more like a person. The researchers found that autonomous vehicles would be safer if programmed to "incorporate ethical considerations" and focus on protecting more vulnerable people on the road such as pedestrians or cyclists.
IT: Welcome to Derry teaser gives us our first glimpse of 1960s Pennywise
Just when you thought you were over that fear of clowns. By Sam Haysom, Deputy UK Editor for Mashable, May 20, 2025. All products featured here are independently selected by our editors and writers. If you buy something through links on our site, Mashable may earn an affiliate commission.
Get an AI investment coach for life for just A$86
TL;DR: Sterling Stock Picker has an AI that helps you invest in the stock market, and it's only A$86 for life. The stock market has been especially volatile lately, but that doesn't mean you have to wait to invest. A new specialized AI from the creators of ChatGPT has been trained on the stock market to help you invest your money safely, even in a chaotic market. Sterling Stock Picker can help you determine which investments are worth the money, and a lifetime subscription is even on sale for A$86. Sterling Stock Picker uses AI-driven tools to help simplify the investing process for beginners and experienced investors alike.
Google introduces AI Ultra, a pro subscription plan with a $250-a-month price tag
If you like Google's AI services (and I mean really like them), there's a new subscription for you. At its Google I/O keynote event (and in a company blog post), Google revealed that a new AI subscription plan for professionals is ready to roll out in the United States. The new Google AI Ultra subscription is intended for the hardest of hardcore AI users, and it costs a whopping $250 a month. Yes, you read that right: Two hundred and fifty U.S. dollars per month. While business owners and professionals may be used to paying for Google Workspace access, the average user is probably not accustomed to paying for Google services.
Darren Aronofsky turns to AI to reimagine the future of film
AI and creators mix much like oil and vinegar -- not at all unless you use a very specific technique (whisking) for a very specific purpose (making salad dressing). For Darren Aronofsky, the director behind Requiem for a Dream, The Whale, and Black Swan, that technique involves using Google DeepMind's research team and three filmmakers to produce short films that embrace new technology and storytelling. The partnership between Aronofsky's venture Primordial Soup and Google DeepMind will create frameworks for AI's role in filmmaking in an effort to prioritize artists in the conversation. It was announced during Tuesday's Google I/O, the company's annual developer conference. "Filmmaking has always been driven by technology," Aronofsky said in a press release.
I'm a college professor. My advice to young people who feel hooked on tech
When I was a child, computers were a fixture in my home, from the giant Atari on which I learned my ABCs, to the Commodore Amiga that my dad used for his videography business, to the PC towers that facilitated my first forays onto the internet. But tech was still a niche hobby back then. Even in college in the late 1990s and early 2000s, many of my friends got by just fine without computers. For people in college now--namely, my students--things are decidedly different. Gadgets are everywhere, and are increasingly designed to insert themselves into every aspect of our consciousness, colonizing every spare moment of our time and attention.
Google talked AI for 2 hours. It didn't mention hallucinations.
This year, Google I/O 2025 had one focus: artificial intelligence. We've already covered all of the biggest news to come out of the annual developers conference, including a new AI video-generation tool called Flow. Yet over nearly two hours of Google leaders talking about AI, one word we didn't hear was "hallucination." Hallucinations remain one of the most stubborn and concerning problems with AI models. The term refers to invented facts and inaccuracies that large language models "hallucinate" in their replies.