In a quiet corner of rural Hampshire, a robot called Rachel is pootling around an overgrown field. With bright orange casing and a smartphone clipped to her back end, she looks like a cross between an expensive toy and the kind of rover used on space missions. Up close, she has four USB ports, a disc-like GPS receiver, and the nuts and bolts of a lidar system, which enables her to orient herself using laser beams. She cost around £2,000 to make. Every three seconds, Rachel takes a close-up photograph of the plants and soil around her, which will build into a forensic map of the field and the wider farm beyond. After 20 minutes or so of this, she is momentarily disturbed by two of the farm's dogs, unsure what to make of her.
Advancements in robotics are continually taking place in the fields of space exploration, health care, public safety, entertainment, defense, and more. These machines (some fully autonomous, some requiring human input) extend our grasp, enhance our capabilities, and travel as our surrogates to places too dangerous or difficult for us to go. Gathered here are recent images of robotic technology, including a Japanese probe reaching a distant asteroid, bipedal-robot fighting matches in Japan, a cuddly cat-substitute robotic pillow, an automated milking machine, delivery bots, telepresence robots, technology on the fashion runway, robotic prosthetic limbs and exoskeletons, and much more.
Nearly halfway into the NFL season, the Dallas Cowboys are 3–3 and sit 20th out of 32 on ESPN's power ranking index, which gives them a less than 50–50 shot at making the playoffs. So fans of America's Team don't have a whole lot to get excited about. Unless, that is, they like riding in robot cars. Today, startup Drive.ai is launching a self-driving car service in Arlington, Texas, which sits halfway between Dallas and Fort Worth and is home to the Cowboys' AT&T Stadium. The service will run several routes in multiple parts of the city, shuttling to and from big venues including that stadium, Globe Life Park (where baseball's Texas Rangers play), and the Arlington Convention Center.
But the input can be pushed in certain directions. A quarter-century ago, an electronic surveillance consultant named Scott French used a supercharged Mac to imitate Jacqueline Susann's sex-drenched tales. His approach was different from Mr. Sloan's. Mr. French wrote thousands of computer-coded rules suggesting how certain character types derived from Ms. Susann's works might plausibly interact. It took Mr. French and his Mac eight years to finish the tale; he reckoned he could have done it by himself in one.
Three former executives at Google, Tesla and Uber who once raced to be the first to develop self-driving cars have adopted a new strategy: Slow down. At their new company Aurora Innovation, which is developing self-driving technology for carmakers including Volkswagen and Hyundai, the rules are simple: No flashy launches, mind-blowing timelines or hyper-choreographed performances on closed tracks. "No demo candy," said Chris Urmson, a co-founder and former head of Google's self-driving car team. Aurora's long-game technique reflects a new phase for the hyped promise of computer-piloted supercars: a more subdued, more pragmatic way of addressing the tough realities of the most complicated robotic system ever built. In the wake of several high-profile crashes that dented public enthusiasm for autonomous cars, Aurora's executives are urging their own industry to face a reality check, saying lofty promises risk confusing passengers and dooming the technology before it can truly take off.
Artificial intelligence, often used to identify specific patterns in data, can detect anomalies in the way a person types that may be attributable to specific disorders. Researchers are experimenting with artificial intelligence (AI) software that is increasingly able to tell whether you suffer from Parkinson's disease, schizophrenia, depression, or other types of mental disorders, simply from watching the way you type. The researchers are able to make these astounding diagnoses, they say, because the capabilities of computing devices have become so granular that smartphones, tablets, and computers all can measure typing activity down to the millisecond. Essentially, today's technologies, along with the capability of AI to learn to identify specific patterns in data, offer researchers a powerful lens on even the slightest abnormalities in everyday typing behavior. In a University of Texas study published earlier this year, for example, researchers were able to identify typists suffering from Parkinson's disease simply by capturing how study subjects worked a keyboard over time, then running that data through pattern-finding AI software.
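The studies described above rest on two simple measurements that any timestamped keyboard log provides: how long each key is held down, and the gap between releasing one key and pressing the next. A minimal sketch of that feature extraction might look like the following; the event format, function name, and toy timestamps are illustrative assumptions, not taken from the research itself.

```python
# Hypothetical sketch of keystroke-dynamics feature extraction.
# Each event is a (key, down_ms, up_ms) tuple, ordered by press time,
# with timestamps in milliseconds as the article describes.
from statistics import mean, stdev

def keystroke_features(events):
    """Summarize a typing session into hold-time and flight-time features."""
    # Hold time: how long each key stays pressed (down -> up).
    holds = [up - down for _, down, up in events]
    # Flight time: gap between releasing one key and pressing the next.
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return {
        "hold_mean": mean(holds),
        "hold_std": stdev(holds) if len(holds) > 1 else 0.0,
        "flight_mean": mean(flights) if flights else 0.0,
    }

# Toy three-keystroke session.
session = [("h", 0, 95), ("e", 180, 260), ("y", 400, 510)]
print(keystroke_features(session))
```

Features like these, collected over weeks of ordinary typing, are what a pattern-finding model would compare across sessions to flag drift, for instance, in motor disorders such as Parkinson's.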
For weeks, computer scientist Siwei Lyu had watched his team's deepfake videos with a gnawing sense of unease. Created by a machine learning algorithm, these falsified films showed celebrities doing things they'd never done. They felt eerie to him, and not just because he knew they'd been ginned up. "They don't look right," he recalls thinking, "but it's very hard to pinpoint where that feeling comes from." Then it struck him: the faces in the deepfakes rarely blinked. He, like many kids, had held staring contests with his open-eyed peers, and he knew how unnatural a face looks when the eyes never close.
It's now possible to check in automatically at Shanghai's Hongqiao airport using facial recognition technology, part of an ambitious rollout of facial recognition systems in China that has raised privacy concerns as Beijing pushes to become a global leader in the field. The airport unveiled self-service kiosks for flight and baggage check-in, security clearance and boarding powered by facial recognition technology, according to the Civil Aviation Administration of China.
When Whitney Bailey bought an Amazon Echo, she wanted to use the hands-free calling feature in case she fell and couldn't reach her phone. She hoped that it would offer her family some peace of mind and help make life a little easier. In some ways, she says, it does. But because she has cerebral palsy, her voice is strained when she talks, and she struggles to get Alexa to understand her. To make matters worse, having to repeat commands strains her voice even more.
Delivery drones are real and they're operating on a national level, but they're not dropping off impulse purchases, and some of the most important applications are not in the United States. Zipline, a Bay Area startup, inked a deal with the government of Rwanda in 2016 and now uses small, autonomous planes to deliver medical supplies, and in particular blood, to rural communities across the African country. "It's a pretty cool paradigm shift for people who think all technological revolution is going on in US, and it'll trickle down to poor countries," said Zipline CEO Keller Rinaudo, presenting his vision for drone deliveries on stage at the WIRED25 summit in San Francisco on Monday. "This is the opposite of that." Amazon created internet-wide buzz when it announced, in a 60 Minutes interview in 2013, that it wanted to start delivering online shopping via drone.