The Ray-Ban Meta smart glasses' new AI powers are impressive, and worrying

Engadget 

When I first reviewed the Ray-Ban Meta smart glasses, I wrote that some of the most intriguing features were the ones I couldn't try out yet. Of these, the most interesting is what Meta calls "multimodal AI," the glasses' ability to respond to queries based on what you're looking at. For example, you can look at text and ask for a translation, or ask the assistant to identify a plant or landmark. The other major update I was waiting for was the addition of real-time information to the Meta AI assistant. Last fall, the assistant had a "knowledge cutoff" of December 2022, which significantly limited the types of questions it could answer. But Meta has started to make both of these features available (multimodal search is in an "early access" period).
