Sengupta: Thank you so much for having me today. I'm really excited to be in San Francisco. I don't get to come here that often, which is strange because I live in Los Angeles, but I do like to come whenever I can. For my talk today, I'm going to talk about the future of transportation: specifically the things I worked on that I think are up and coming, the thing I'm working on now, and what's going to happen in the future. Part of my career has always been about doing fun and exciting new things, and all my degrees are in aerospace engineering. Ever since I was a little kid, I loved science fiction. I'm actually a Star Trek person rather than a Star Wars person, but I knew since I was a little kid that I wanted to be involved in the space program, so that's why I decided to go the aerospace engineering route, and I wanted to build technology. I got my Ph.D. in plasma propulsion systems. Has anyone heard of the mission called Dawn, out in the main asteroid belt? My Ph.D. research was developing the ion engine technology for that mission. It actually flew and got the spacecraft to a pretty cool place out in the main asteroid belt, looking at Vesta and Ceres. I did that for about five years, and then I felt like I had done everything I could possibly do on that front from a research perspective. My management asked me if I wanted to work on the next mission to Mars. There are very few engineers in the space program who'd say, "No, I'm just not interested in that." And they said, "We want you to do the supersonic parachute for it."
Former NASA engineer Mark Moore will now be working on Uber's flying car project, Uber Elevate. SAN FRANCISCO -- George Jetson, your ride is on its way. Uber hired a NASA expert on Monday to build out its vision for flying cars. Mark Moore, a 30-year veteran of the space agency with expertise in using electric motors to get a vehicle airborne, will help the ride-hailing giant execute on an expansive white paper it released last fall on developing VTOL (vertical take-off and landing) vehicles. "Uber continues to see its role as a catalyst to the growing and developing VTOL ecosystem," Nikhil Goel, Uber's head of product for advanced programs, said in a statement.
Apple, Intel, Microsoft and Uber will soon start flying drones for a range of tasks including food and package delivery, digital mapping and conducting surveillance as part of 10 pilot programmes approved Wednesday by the US government. The drone-testing projects have been granted waivers from regulations that currently restrict such operations in the US, and they will be used to help the Federal Aviation Administration draw up suitable laws to govern the use of unmanned aerial vehicles (UAVs) for myriad tasks. "The enthusiastic response to our request for applications demonstrated the many innovative technological and operational solutions already on the horizon," said US transportation secretary Elaine Chao. Apple will be using drones to capture images of North Carolina with the state's Department of Transportation. Uber is working on air-taxi technology and will deliver food by drone in San Diego, California, because, as the company's chief executive Dara Khosrowshahi put it, "we need flying burgers."
It's a weird feeling, cruising around Silicon Valley in a car driven by no one. I am in the back seat of one of Google's self-driving cars – a converted Lexus SUV with lasers, radar and low-res cameras strapped to the roof and fenders – as it maneuvers the streets of Mountain View, California, not far from Google's headquarters. I grew up about five miles from here and remember riding around on these same streets on a Schwinn Sting-Ray. Now, I am riding an algorithm, you might say – a mathematical equation, which, written as computer code, controls the Lexus.

The car does not feel dangerous, nor does it feel like it is being driven by a human. It rolls to a full stop at stop signs (something no Californian ever does), veers too far away from a delivery van, taps the brakes for no apparent reason as we pass a line of parked cars. I wonder if the flaw is in me, not the car: Is it reacting to something I can't see?

The car is capable of detecting the motion of a cat, or a car crossing the street hundreds of yards away in any direction, day or night (snow and fog can be another matter). "It sees much better than a human being," Dmitri Dolgov, the lead software engineer for Google's self-driving-car project, says proudly. He is sitting behind the wheel, his hands on his lap.

As we stop at the intersection, waiting for a left turn, I glance over at a laptop in the passenger seat that provides a real-time look at how the car interprets its surroundings. On it, I see a gridlike world of colorful objects – cars, trucks, bicyclists, pedestrians – drifting by in a video-game-like tableau. Each sensor offers a different view – the lasers provide three-dimensional depth, the cameras identify road signs, turn signals, colors and lights. The computer in the back processes all this information in real time, gauging the speed of oncoming traffic, making a judgment about when it is OK to make a left turn.
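To make that left-turn judgment concrete, here is a minimal toy sketch of a gap-acceptance check: given the perceived distance and speed of each oncoming vehicle, decide whether every one of them would arrive later than the time the car needs to clear the intersection. All names and thresholds (`Track`, `safe_to_turn`, the 4-second turn time, the 2-second margin) are illustrative assumptions, not Google's actual logic.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A perceived oncoming vehicle: distance to the intersection (m) and speed (m/s)."""
    distance_m: float
    speed_mps: float

    def time_to_arrival(self) -> float:
        """Seconds until this vehicle reaches the intersection (infinite if stopped)."""
        if self.speed_mps <= 0:
            return float("inf")
        return self.distance_m / self.speed_mps

def safe_to_turn(tracks, turn_time_s=4.0, margin_s=2.0):
    """Accept the gap only if every oncoming vehicle arrives after the time
    we need to complete the turn, plus a safety margin."""
    required = turn_time_s + margin_s
    return all(t.time_to_arrival() > required for t in tracks)

# One car 150 m away at 15 m/s arrives in 10 s: gap accepted.
print(safe_to_turn([Track(150, 15)]))                  # True
# A second car 40 m away at 15 m/s arrives in ~2.7 s: gap rejected.
print(safe_to_turn([Track(150, 15), Track(40, 15)]))   # False
```

A real planner would of course work with uncertain estimates, acceleration, and occlusions rather than point values, but the core decision — comparing each vehicle's time-to-arrival against the time needed to clear the turn — is the same shape as the judgment described above.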