

The five new gadgets I tried and loved at CES 2026 (that you can buy right now)

The Guardian

I'm a digital nomad who works on the road. The Guardian's journalism is independent. We will earn a commission if you buy something through an affiliate link. Every year, the Consumer Electronics Show (CES) gives us a glimpse of what's around the corner in tech: creepy humanoid robots, robovacs that climb stairs, AI baked into everything.


OpenAI's Ambitions Just Became Crystal Clear

The Atlantic - Technology

Sam Altman is done with keyboards and screens. Earlier today, OpenAI announced its intentions to solve this apparent problem. The company is partnering with Jony Ive, the longtime head of design at Apple, who did pioneering work on products such as the iMac G3, the iPod, and, most famously, the iPhone. Together, Altman and Ive say they want to create hardware built specifically for AI software. Everyone, Altman suggested in a highly produced announcement video, could soon have access to a "team of geniuses"--presumably, ChatGPT-style assistants--on a "family of devices."


EgoCHARM: Resource-Efficient Hierarchical Activity Recognition using an Egocentric IMU Sensor

Padmanabha, Akhil, Govindarajan, Saravanan, Kim, Hwanmun, Ortiz, Sergio, Rajan, Rahul, Senkal, Doruk, Kadetotad, Sneha

arXiv.org Artificial Intelligence

Figure 1: We propose a resource-efficient hierarchical architecture, EgoCHARM, to classify both high and low level activities using a single egocentric, head-mounted IMU. On the left, we show the Aria smartglasses [9, 27] featuring the IMU along with sample acceleration data from selected low and high level activities within our dataset. Our hierarchical architecture, shown on the right, features a low level encoder that inputs 1s low level windows of IMU data and extracts motion embeddings, which can be used to predict low level activities or aggregated over time (30s) and inputted into a high level architecture to predict high level activities.

Human activity recognition (HAR) on smartglasses has various use cases, including health/fitness tracking and input for context-aware AI assistants. However, current approaches for egocentric activity recognition suffer from low performance or are resource-intensive. In this work, we introduce a resource (memory, compute, power, sample) efficient machine learning algorithm, EgoCHARM, for recognizing both high level and low level activities using a single egocentric (head-mounted) Inertial Measurement Unit (IMU). Our hierarchical algorithm employs a semi-supervised learning strategy, requiring primarily high level activity labels for training, to learn generalizable low level motion embeddings that can be effectively utilized for low level activity recognition. We evaluate our method on 9 high level and 3 low level activities, achieving 0.826 and 0.855 F1 scores on high level and low level activity recognition respectively, with just 63k high level and 22k low level model parameters, allowing the low level encoder to be deployed directly on current IMU chips with compute. Lastly, we present results and insights from a sensitivity analysis and highlight the opportunities and limitations of activity recognition using egocentric IMUs.

Work done while at Meta. Email: akhil.padmanabha@gmail.com

The proliferation of wearable devices and sensor-enabled technologies in portable form factors has created numerous opportunities for tracking, analyzing, and generating insights into human actions and behaviors.
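The two-stage design the abstract describes — a small per-second encoder that turns 1s IMU windows into motion embeddings, with those embeddings either classified directly (low level) or pooled over 30s for high-level prediction — can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the feature extraction, layer sizes, and random stand-in weights are all assumptions, and the paper's actual models total roughly 22k (low level) and 63k (high level) parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions (illustrative only)
IMU_CHANNELS = 6   # 3-axis accelerometer + 3-axis gyroscope
LOW_WIN = 50       # samples in one 1s window, assuming a 50 Hz IMU
EMBED_DIM = 16     # motion-embedding size (assumed)
N_LOW, N_HIGH = 3, 9  # low/high level activity counts from the paper

def low_level_encoder(window, W, b):
    """Map one 1s IMU window (channels x samples) to a motion embedding.
    A trivial stand-in encoder: per-channel mean/std features + one
    tanh layer, in place of the paper's trained low level encoder."""
    feats = np.concatenate([window.mean(axis=1), window.std(axis=1)])
    return np.tanh(W @ feats + b)

# Random weights standing in for trained parameters
W_enc = rng.normal(size=(EMBED_DIM, 2 * IMU_CHANNELS))
b_enc = np.zeros(EMBED_DIM)
W_low = rng.normal(size=(N_LOW, EMBED_DIM))    # low level activity head
W_high = rng.normal(size=(N_HIGH, EMBED_DIM))  # high level activity head

# One 30s clip: thirty 1s windows of synthetic IMU data
clip = rng.normal(size=(30, IMU_CHANNELS, LOW_WIN))

# Encode each 1s window into an embedding -> (30, EMBED_DIM)
embs = np.stack([low_level_encoder(w, W_enc, b_enc) for w in clip])

# Per-second low level predictions from the embeddings directly
low_logits = embs @ W_low.T            # shape (30, 3)

# Aggregate 30s of embeddings (mean pooling assumed) for high level
high_logits = W_high @ embs.mean(axis=0)  # shape (9,)

print(low_logits.shape, high_logits.shape)  # (30, 3) (9,)
```

The key property being illustrated is that the high-level classifier never sees raw IMU samples, only pooled embeddings, which is what lets the tiny encoder run on-chip while training can be driven mostly by high-level labels.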


The Metaverse Value-Chain

#artificialintelligence

Trillions of dollars: that's how much private industry is investing into the metaverse. In this article, I provide a description of the value chain of this market, from the experiences that people seek out to the enabling technologies that make it possible. More importantly, I also provide a prescription -- a vision for a future metaverse that is powered by creators and built upon decentralization. Investments and decisions made now will determine whether this is the future that manifests: one that offers the greatest variety of experience, powered by creators who earn a living at it -- or one defined by the next wave of gatekeepers and rent-takers. I'm excited that we're well on the way to the former, which is the more egalitarian market -- and I'm hopeful it will continue. Many people think of the metaverse as 3D space that will surround us.


The Power Trip

#artificialintelligence

The Power Trip, by J. Cafesin. A sharp jolt wakes me. I open my eyes, my heart beating hard and fast, but I stay in bed while the room shakes and the walls creak and groan. Bolting for the bedroom threshold in my boxers to huddle in the doorway seems a bit extreme. The rash of quakes…


Amazon's Echo Frames will soon come with blue-light filtering lenses

Engadget

Amazon's Echo Frames are compatible with most prescription lenses, but they only come with clear lenses out of the box -- until now, anyway. The retail giant has introduced three new options to choose from if you're looking to buy a pair of the Alexa-powered eyewear. You can get them with polarized blue mirror sunglass lenses starting today, and starting on June 9th, you can get a pair with blue-light-filtering lenses or polarized classic sunglass lenses. The Echo Frames have open-ear audio near your temples, so you can listen to music, audiobooks and podcasts, as well as take calls, without blocking out the world around you. Since they also give you hands-free access to Alexa, you can simply issue voice commands, such as "Alexa, play my followed podcasts on Amazon Music" or "Alexa, resume my audiobook."


Part human, part machine: is Apple turning us all into cyborgs?

The Guardian

At the beginning of the Covid-19 pandemic, Apple engineers embarked on a rare collaboration with Google. The goal was to build a system that could track individual interactions across an entire population, in an effort to get a head start on isolating potentially infectious carriers of a disease that, as the world was discovering, could be spread by asymptomatic patients. Delivered at breakneck pace, the resulting exposure notification tool has yet to prove its worth. The NHS Covid-19 app uses it, as do others around the world. But lockdowns make interactions rare, limiting the tool's usefulness, while in a country with uncontrolled spread, it isn't powerful enough to keep the R number low. In the Goldilocks zone, when conditions are just right, it could save lives.


Putting Alexa inside a pair of smartglasses makes a lot of sense

Engadget

Vuzix has been a regular at CES for years, crafting head-mounted displays and smartglasses often aimed at businesses -- or very enthused wearable fans. The Vuzix Blade, its latest pair of augmented reality spectacles, tries to balance that B2B / consumer sales pitch by adding a voice assistant. Amazon Alexa's newest home is a pair of smartglasses. Firstly, the crush of a CES evening show is never the best place to test out a voice assistant: you need a strong connection to make her receptive to your requests. So, pretty much all my Alexa queries fell on deaf robot ears.


50 wearable tech predictions for 2018

#artificialintelligence

We like to think we know a few things about what's hot - and what's not - in the world of wearable tech. As is custom for the Wareable 50, we are making 50 bold predictions about what will happen in wearable tech and the connected self in 2018. Yes, anyone can make predictions, but we have previous form in getting most (not all) things right. This time last year we said that hearables, augmented reality, hybrid smartwatches and Fitbit's first smartwatch would be talking points in 2017. The Wareable 50 2018 edition picks out the people, products, companies, startups and trends that are set to have a big 12 months. The list has been fiercely debated in both US and UK Wareable HQs and now we've decided on the finalists - including the number one thing we think everyone's going to be talking about next year. So take a look, and let us know in the comments section below if you think we're on the money, wide of the mark, or you just want to compliment us on our fine list-making skills.


Amazon to release Alexa-powered smartglasses, reports say

The Guardian

Amazon is planning to release a pair of Alexa-enabled smartglasses as the latest addition to its range of voice-controlled devices, according to reports. Unlike most previous smartglasses, such as the ill-fated Google Glass experiment and Snapchat's Spectacles, the Amazon glasses won't feature a camera in any form, bypassing the privacy concerns that have plagued the form-factor in the past. Instead, they will focus on providing a link to Alexa, Amazon's voice-controlled personal assistant, through a bone-conduction audio system, which transmits sounds into the wearer's head by vibrating their skull, rather than through headphones inserted in their ear. According to a report by the Financial Times, the glasses could be revealed at a product launch event expected to be held soon alongside a home security camera, designed to tie in with its Echo Show video screen. Other reports have suggested the company will shortly release a new version of the Fire TV, its streaming media set-top box, with an Echo-style speaker system built-in.