A lot of discussion and ethical thought about self-driving cars has focused on tragic dilemmas, like hypotheticals in which a car has to decide whether to run over a group of schoolchildren or plunge off a cliff, killing its own occupants. But those sorts of situations are extreme cases. As the most recent crash – in which a self-driving car killed a pedestrian in Tempe, Arizona – demonstrates, the mundane, everyday situations at every pedestrian crossing, turn and intersection present much harder and broader ethical quandaries. As a philosopher working with engineers in Stanford's Centre for Automotive Research, I was initially surprised that we spent our lab meetings discussing what I thought was an easy question: how should a self-driving car approach a pedestrian crossing? My assumption had been that we would think about how a car should decide between the lives of its passengers and the lives of pedestrians.
The head of the U.S. military's Special Operations Command said Wednesday that Air Force gunships, needed to provide close air support for American commandos and U.S.-backed rebel fighters in Syria, were being "jammed" by "adversaries." Calling the electronic warfare environment in Syria "the most aggressive" on earth, Air Force Gen. Tony Thomas told an intelligence conference in Tampa that adversaries "are testing us every day, knocking our communications down, disabling our AC-130s, etc." Thomas' remarks, which were first reported by the website The Drive, come on the heels of reports that Russian forces are jamming U.S. surveillance drones flying over the war-torn nation. An Air Force AC-130 gunship was among the U.S. military aircraft used to kill dozens of Russian mercenaries in Syria in early February. The Pentagon said the mercenaries attacked an outpost manned by American commandos and U.S.-backed fighters of the Syrian Democratic Forces (SDF), comprising Syrian Kurdish and Arab fighters. Wednesday was not the first time General Thomas has been so forthcoming about Syria in a public setting.
The accident involving a self-driving Uber car that killed a woman in Arizona in March 2018 made headlines around the world. The vehicle was in autonomous mode when it hit the woman, who was walking outside the crosswalk and later died at a hospital. "What we can learn from the Uber incident is that there will not be 100 percent safety in autonomous driving even in an ideal world where all vehicles are autonomous," says Michael Bruch, head of emerging trends at Allianz Global Corporate & Specialty (AGCS). "An environment with zero accidents or fatalities is unrealistic. Imagine a child running directly in front of an autonomous car: the hardware would not be able to brake quickly enough to avoid an accident."
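The physical limit Bruch describes can be made concrete with a back-of-the-envelope stopping-distance calculation. The figures below – friction coefficient, sensing latency, speeds – are illustrative assumptions, not numbers from the article or the Uber investigation:

```python
# Back-of-the-envelope stopping distance for an autonomous car.
# All constants are illustrative assumptions, not measured values.

G = 9.81           # gravitational acceleration, m/s^2
MU = 0.8           # assumed tire-road friction coefficient (dry asphalt)
LATENCY_S = 0.5    # assumed total sense-plan-actuate delay, seconds

def stopping_distance(speed_kmh: float) -> float:
    """Meters to a full stop: distance covered during latency plus braking."""
    v = speed_kmh / 3.6                  # convert km/h to m/s
    reaction = v * LATENCY_S             # travel before braking even begins
    braking = v ** 2 / (2 * MU * G)      # kinematics: v^2 = 2 * (mu*g) * d
    return reaction + braking

for kmh in (30, 50):
    print(f"{kmh} km/h -> {stopping_distance(kmh):.1f} m")
```

Even under these favorable assumptions, a car at 30 km/h needs roughly 8–9 meters to stop – so a child stepping out a few meters ahead simply cannot be avoided, which is Bruch's point.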
The term "machine learning" might not mean much to you. You might imagine a computer playing chess, calculating the multitude of possible moves and countermoves. When you hear the term "artificial intelligence" or "AI," however, it's more likely you have visions of Skynet and the rise of our inevitable robot overlords. But the truth of artificial intelligence -- and particularly machine learning -- is far less sinister, and it's not something of the far-off future. It's here today, and it's shaping and simplifying the way we live, work, travel and communicate.
Brennan Boblett, who helped pioneer the look of touch screen interfaces in increasingly autonomous vehicles, is joining well-funded startup Mapbox Inc. to help create digital maps for passengers in driverless cars. Mapbox provides mapping and location-search technology to a variety of companies, including messaging-app developer Snap Inc. and General Electric Co. In October, it raised $164 million in a round led by SoftBank Group to expand its efforts into the automotive industry. Mr. Boblett spent several years as a designer at auto maker Tesla Inc., leading a team that created the interfaces of the digital touch screens that rest on the dashboards of the electric vehicles, including how the user interacted with the company's semiautonomous Autopilot system. He also spent about a year at Uber Technologies Inc., where he worked on the user interface for autonomous cars and redesigned the ride-hailing service's map-heavy driver app, according to his online resume.
Two things have moved deep-neural-network-based (DNN) machine learning (ML) from research to mainstream. The first is improved computing power, especially general-purpose GPU (GPGPU) improvements. The second is wider distribution of ML software, especially open-source software. Quite a few applications are driving adoption of ML, including advanced driver-assistance systems (ADAS) and self-driving cars, big-data analysis, surveillance, and improving processes from audio noise reduction to natural language processing. Many of these applications rely on arrays of GPGPUs and special ML hardware, especially for training, which churns through large amounts of data to create models; once trained, those models require significantly less processing power to perform recognition and other ML-related tasks.
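The training/inference asymmetry described above can be sketched with a toy model. This is not a DNN – just a two-parameter linear model fit by gradient descent – but it shows the same shape: training loops over the data thousands of times, while applying the trained model is a single multiply-add:

```python
# Toy illustration of the training/inference asymmetry: training is many
# passes over data; inference with the finished model is one cheap step.

def train(samples, epochs=1000, lr=0.01):
    """Fit y = w*x + b by gradient descent -- the expensive phase."""
    w, b = 0.0, 0.0
    for _ in range(epochs):               # thousands of passes over the data
        for x, y in samples:
            err = (w * x + b) - y
            w -= lr * err * x             # gradient step for the weight
            b -= lr * err                 # gradient step for the bias
    return w, b

def predict(model, x):
    """Apply the trained model -- the cheap phase (one multiply-add)."""
    w, b = model
    return w * x + b

# Learn y = 2x + 1 from a handful of points, then run cheap inference.
model = train([(0, 1), (1, 3), (2, 5), (3, 7)])
print(round(predict(model, 10), 1))       # close to 21
```

Real DNN training scales the same loop up to millions of parameters and terabytes of data, which is why it lands on GPGPU arrays while inference can run on far more modest hardware.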
The project, named 'Enhancing the Region through New Technology for Unmanned Systems,' will implement a new drone technology training program at Dabney S. Lancaster Community College. This program will open up a career pathway by enhancing the learning opportunities for high school students and extending to four-year degree attainment through partnerships with other higher-education institutions. The project aims to capitalize on the "Alleghany Highlands Drone Zone Initiative," a business accelerator program to support enterprises in the UAS industry in Alleghany County. "Growth and Opportunity for Virginia (GO Virginia) is inspiring the innovative thinking that will help to push Virginia's economy forward," says Governor Ralph Northam.
How do self-driving vehicles learn how to drive? How does artificial intelligence become smart in the first place? It all starts with data. An international company that applies data to a variety of practical uses has chosen to place its first U.S. "delivery center" in New Orleans and will hire 100 people within the next 12 to 18 months to staff it. The available jobs range from entry-level to more skilled positions.
Rice University students developed the hardware and software required to coordinate sensor-carrying drones. "The system is designed to be application-agnostic in the sense that you can use our APIs and libraries to build any kind of autonomous solution that you want," said team member Kevin Lin. Each drone is equipped with a Wi-Fi radio that enables communication across long distances and a lidar (light detection and ranging) sensor to avoid obstacles and track altitude. The team tested several applications. "That was useful because it showcased two drones coordinating and sharing their data," said Lin. UAVs have been used to check and inspect towers after natural disasters and to measure signal strength in high-traffic venues. The idea is that people could use the drones to measure the magnitude of a leak and determine where people shouldn't be allowed to go.
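One common way to make a system "application-agnostic" in the sense Lin describes is a publish/subscribe layer that decouples drones from the applications consuming their data. The sketch below is purely hypothetical – every class and method name is invented for illustration and is not the Rice team's actual API:

```python
# Hypothetical sketch of an application-agnostic coordination layer:
# drones publish sensor readings; applications subscribe by reading kind.
# All names here are invented for illustration.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Reading:
    drone_id: str
    kind: str        # e.g. "lidar_altitude", "signal_strength", "gas_ppm"
    value: float

@dataclass
class Coordinator:
    """Routes readings from any drone to any interested application."""
    subscribers: dict = field(default_factory=dict)

    def subscribe(self, kind: str, handler: Callable[[Reading], None]) -> None:
        self.subscribers.setdefault(kind, []).append(handler)

    def publish(self, reading: Reading) -> None:
        for handler in self.subscribers.get(reading.kind, []):
            handler(reading)

# A leak-mapping application stays generic: it only subscribes to a kind.
hub = Coordinator()
hotspots = []
hub.subscribe("gas_ppm", lambda r: hotspots.append(r) if r.value > 50 else None)

hub.publish(Reading("drone-1", "gas_ppm", 12.0))
hub.publish(Reading("drone-2", "gas_ppm", 87.5))
print([r.drone_id for r in hotspots])   # only the high reading is kept
```

The design choice is that neither the drones nor the coordinator know anything about leaks: swapping in a tower-inspection or signal-strength application means subscribing a different handler, not changing the drone side.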
Jennifer Malandra has eight years' active service in the Navy and a Naval Academy education. She is in search of a civilian job in the highly competitive tech industry. She and 23 others are wrapping up a 10-day program run by BreakLine, an educational program sponsored by a who's who of Silicon Valley companies in search of bright candidates. The focus today was artificial intelligence and drones. Both specialties are creating jobs as a result of automation and robotics.