Skullcandy Method 360 ANC vs. Bose QuietComfort: Comparing Bose-powered earbuds
In the press release for the Method 360 earbuds, Skullcandy called them its "most advanced audio experience to date." After listening to everything from indie rock to video game soundtracks to podcasts, I can see why. The Skullcandy earbuds had a balance that easily matched other impressive budget earbuds I've tested. Whether I was listening to the Final Fantasy VII soundtrack or a live Daft Punk performance, these earbuds punched above their $100 price point. However, when I listened to them side by side with the Bose earbuds, the Skullcandy pair sounded comparatively muffled and muddy (though I wouldn't describe them as muddy on their own).
Welcome to Google AI Mode! Everything is fine.
If the AI lovefest of Google I/O 2025 were a TV show, you might be tempted to call it It's Always Sunny in Mountain View. But here's a better sitcom analogy for the event that added AI Mode to all U.S. search results, whether we want it or not. It's The Good Place, in which our late heroes are repeatedly assured that they've gone to a better world. A place where everything is fine, all is as it seems, and search quality just keeps getting better. Don't worry about ever-present and increasing AI hallucinations here in the Good Place, where the word "hallucination" isn't even used.
Everything you need to know from Google I/O 2025
From the opening AI-influenced intro video set to "You Get What You Give" by New Radicals to CEO Sundar Pichai's sign-off, Google I/O 2025 was packed with news and updates for the tech giant and its products. And when we say packed, we mean it, as this year's Google I/O clocked in at nearly two hours. During that time, Google shared some big wins for its AI products, such as Gemini topping various categories on the LMArena leaderboard. Another example that Google seemed really proud of was the fact that Gemini completed Pokémon Blue a few weeks ago. But we know what you're really here for: product updates and new product announcements.
Introducing Flow, Google's new AI video tool and Sora competitor
Google's AI era is officially here, and at the center of it is a new AI video tool called Flow. At the Google I/O 2025 keynote event on May 20, Google unveiled a new suite of AI video tools powered by state-of-the-art models. The offspring of media models Veo 3 and Imagen 4, Flow is Google's answer to OpenAI's Sora -- an AI toolset for a new era of video generation for filmmakers and creatives. Unlike Sora, however, Flow comes with native audio generation baked right in. Pitched as an "AI filmmaking tool built for creatives, by creatives," Flow is the tech giant's latest attempt to demonstrate how AI can reshape the creative process.
You can sign up for Google's AI coding tool Jules right now
Google just rolled out a product that might make coding a lot easier. The company introduced Jules, its AI coding tool, in Google Labs in December. Today, Jules is available to everyone, everywhere the Gemini model is available, without a waitlist. "Just submit a task, and Jules takes care of the rest -- fixing bugs, making updates. It integrates with GitHub and works on its own," Tulsee Doshi, the senior director and product lead for Gemini Models, said at Google I/O 2025. "Jules can tackle complex tasks in large codebases that used to take hours, like updating an older version of Node.js."
Google adds Gemini to Chrome
Google has injected AI features into practically all of its products, now including Chrome. At Google I/O, the tech giant's annual event, Google announced that Gemini is coming to Chrome as it transitions into the generative AI era, antitrust issues be damned. Gemini's integration with the browser means users can ask questions about information on sites, or even navigate to those sites, while browsing the web. Gemini on Chrome will be available to Chrome users on Windows and macOS, but only for paying subscribers to Google AI Pro and AI Ultra plans, which cost $20 and $250 a month, respectively. Meanwhile, Google is in the remedial phase of its antitrust case, which the U.S. Department of Justice is prosecuting. Google has been ruled a monopoly for leveraging its Chrome browser in anti-competitive ways.
Google's latest AI shopping tool is Cher's Clueless closet IRL
Google's latest shopping feature makes Cher Horowitz's computerized closet a reality. The new virtual try-on tool within its "AI Mode" search option lets users see how outfits look on photos of themselves. Announced during the opening keynote at Google I/O 2025 on Tuesday, the tool uses a new custom image-generation model to place clothing pictured in online product listings onto a full-length shot provided by the user. Per a company blog post, the model "understands the human body and nuances of clothing -- like how different materials fold, stretch and drape on different bodies." According to Google, it will also be able to accommodate different poses.
Whoa: Google Meet can now translate a foreign language in real time
You already know Google Translate, but what about live voice translation in Google Meet? This feature is one of the major Workspace announcements Google shared at its annual I/O event on Tuesday. Starting today, Google is rolling out real-time speech translation in Google Meet for subscribers of its AI Premium plan. When a user on a Google Meet video call turns on this feature, an AI audio model translates their speech into another language live. Google is starting with English and Spanish, with more languages coming in the next few weeks.
Google AI Mode is launching in the U.S., kicking off a new era of AI search
Google just cracked open the future of search, and it talks back. During today's Google I/O 2025 keynote event, Google announced that it is now rolling out the AI Mode search tool to everyone in the United States. Powered by Gemini, AI Mode will now include new "Deep Search" features and some agentic capabilities. AI Mode represents the biggest shift in Google Search since its inception. It's no longer just a place to find links.
Driverless cars need to be more human, study finds
Self-driving cars may need to act more like humans, new research has found. A study published in the Proceedings of the National Academy of Sciences found that autonomous cars with a greater focus on being "socially sensitive" would prove safer. What does being socially sensitive mean? In short, driving more like a person. The researchers found that autonomous vehicles would be safer if programmed to "incorporate ethical considerations" and prioritize protecting more vulnerable road users, such as pedestrians and cyclists.