
Breakthrough in safety-critical machine learning could be just the beginning

#artificialintelligence

Safety is the central focus of driverless vehicle systems development. Artificial intelligence (AI) is coming at us fast. It's being used in the apps and services we plug into daily without really noticing, whether it's a personalized ad on Facebook or Google suggesting how to sign off your email. When these applications fail, the worst case is usually some irritation to the user. But we are increasingly entrusting AI and machine learning with safety-critical applications, where system failure costs far more than a minor UX issue.


Follow-the-leader: A shortcut to autonomous trucking

ZDNet

A company spun out of a prestigious university robotics lab is making a big leap in autonomous trucking. Locomation is claiming the world's first autonomous truck purchase order, from a Springfield, MO, company called Wilson Logistics. The order will equip 1,120 trucks with Locomation's convoy technology, which enables driverless trucks to follow a lead truck piloted by a human, combining the best of autonomous technology with reliable human-in-the-loop driving protocols. The first units will be delivered in early 2022. Trucking is considered one of the nearest horizons for on-road autonomy.


Amazon announced Ring's new indoor security drone: How will Always Home Cam work?

USATODAY - Tech Top Stories

Consumer drones are notorious for being hard to fly at first, and the odds are you'll crash one before you learn what you're doing. So how about a drone that flies itself around the home as a roaming security camera? One the manufacturer promises won't crash into a ceiling fan or a flower pot, because it has obstacle-avoidance technology, and that flies back into its cradle when the flight is complete. Jamie Siminoff, founder of the Amazon-owned Ring subsidiary, insists that it will do just that, because there's an app for it.


3 Things You Need to Know About Deep Learning

#artificialintelligence

Deep learning is a machine learning technique that teaches computers to do what comes naturally to humans: learn by example. Deep learning is a key technology behind driverless cars, enabling them to recognize a stop sign, or to distinguish a pedestrian from a lamppost. It is the key to voice control in consumer devices like phones, tablets, TVs, and hands-free speakers. Deep learning is getting lots of attention lately and for good reason. It's achieving results that were not possible before.
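The "learn by example" idea can be sketched with the simplest possible learner: a single artificial neuron fitted to labeled points by gradient descent. This is a toy illustration of the principle, not the deep multi-layer networks the article refers to, and all data below is synthetic:

```python
# Toy sketch of learning by example: one neuron (logistic regression),
# trained by stochastic gradient descent to tell two point clusters apart.
import math
import random

random.seed(0)

# 100 labeled examples: class 0 clustered near (0, 0), class 1 near (3, 3).
data = [((random.gauss(0, 0.5), random.gauss(0, 0.5)), 0) for _ in range(50)]
data += [((random.gauss(3, 0.5), random.gauss(3, 0.5)), 1) for _ in range(50)]

w1 = w2 = b = 0.0
lr = 0.1
for _ in range(500):
    for (x1, x2), y in data:
        p = 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b)))  # sigmoid output
        err = p - y                        # gradient of the log-loss
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b -= lr * err

# After training, the neuron classifies the examples it learned from.
correct = sum(
    (1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b))) > 0.5) == (y == 1)
    for (x1, x2), y in data
)
accuracy = correct / len(data)
```

Deep networks stack many such units in layers, but the training loop — predict, measure the error against a labeled example, nudge the weights — is the same idea at scale.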


Every new Alexa device Amazon just announced: Prices, release dates, and how to buy

ZDNet

Amazon-owned Ring announced a new line of security cameras for cars: the new $199 Car Cam, $60 Car Alarm, and the Car Connect systems, which all integrate with the Ring app. The Car Alarm plugs into your car's OBD-II diagnostic port and sends alerts to your phone. It has a built-in siren that can be remotely triggered, or it can link to other Ring or Alexa devices to emit audible alerts when an event is detected. As for the Car Cam, it is Ring's first camera for outside of the home and has the ability to record both inside and outside of the car when mounted on a dashboard. Like the Car Alarm, the Car Cam can send alerts.


Dark skies and bright satellites

Science

Most ground-based observatories require a dark night sky to uncover answers to some of the most fundamental questions about the nature of our Universe. However, a number of companies and governments are in various stages of planning or deploying bright satellites in low-Earth orbit (or LEOsats) in greater numbers than ever before. These “megaconstellations” will fundamentally change astronomical observing at visible wavelengths. Nighttime images will be contaminated by streaks caused by the passage of Sun-illuminated satellites. If proposals calling for 100,000 or more LEOsats are realized, no combination of mitigations will be able to fully avoid the negative impact on astronomy. This threat comes at a time when new technology offers unprecedented scientific opportunities, all requiring access to dark skies. One example is the Vera C. Rubin Observatory, which is nearing completion. Its Legacy Survey of Space and Time (LSST) will soon offer a dramatic new view of the changing sky. Rubin Observatory will employ the 8.4-m Simonyi Survey Telescope and the 3200-megapixel LSST Camera to capture about 1000 images of the sky, every night, for 10 years. A single 30-s exposure will reveal distant objects that are about 40 million times fainter than those visible with the unaided eye. The observatory's combination of a large light-collecting area and field of view is unparalleled in the history of astronomy, which is why the project was the top ground-based priority for U.S. astronomers in the 2010 National Academies Decadal Survey of Astronomy and Astrophysics. LSST six-color images will contain data for about 20 billion ultrafaint galaxies and a similar number of stars, and will be used for investigations ranging from cosmological studies of the Universe to searches for potentially hazardous Earth-impacting asteroids. However, the discoveries anticipated from Rubin and other observatories could be substantially degraded by the deployment of multiple LEOsat constellations. 
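As a rough consistency check on the "40 million times fainter" figure (assuming the conventional unaided-eye limit of about 6th magnitude, which is not stated in the editorial), the flux ratio converts to a magnitude difference of

```latex
\Delta m = 2.5 \log_{10}\!\left(4 \times 10^{7}\right) \approx 19.0
```

so a single 30-s exposure would reach roughly magnitude 6 + 19 ≈ 25, in line with the single-visit depths quoted for the Rubin Observatory.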
The most exciting science to come out of current and planned astronomical facilities may be the discovery of types of objects and phenomena not yet observed or predicted. Such profound surprises have the potential to revolutionize our understanding of every field from exobiology to cosmology. Rubin Observatory's LSST, for example, opens the prospect of observing how ultrafaint objects change over time. It is precisely this kind of astronomy that is most at risk from image artifacts arising from LEOsat megaconstellations. These satellites scatter sunlight for several hours after sunset or before sunrise, are relatively close and bright, and thus can affect ground-based telescopes observing at visible wavelengths. Constellations in orbits well above 600 km will be illuminated by the Sun all night long. Astronomers worldwide are seeking ways to diminish the satellites' most damaging effects—the focus of a recent virtual workshop sponsored by the U.S. National Science Foundation—and are collaborating with SpaceX (in particular, the Rubin Observatory), the first operator to launch a substantial constellation of LEOsats. SpaceX has shown that satellite operators can reduce reflected sunlight through satellite orientation, Sun shielding, and surface darkening. A joint effort to obtain higher-accuracy public data on the predicted location of individual satellites could help astronomers point their instruments to avoid some of the interference. Although all of these measures are helpful, there are no guarantees, and the research community is left to hope for good corporate citizenship. Future constellations owned and operated by foreign governments pose a different sort of challenge. Although there are international regulations covering radio-frequency interference, there are no such regulations in place for visible-frequency light pollution from space.
Earth orbit is a natural resource without environmental protections, and we are now witnessing its industrialization. Currently there are about a thousand bright LEOsats, but that may be just the beginning. Proposals to expand telecommunications and data relay to serve new technologies like self-driving cars could lead to a 100-fold increase in the number of LEOsats in the next decade. The American Astronomical Society is working with astronomy stakeholders, commercial satellite operators, and international organizations to begin to forge policy on light pollution from space. It is unclear how long this will take and how effective it can be. What is clear is that without productive industry-observatory collaboration, voluntary operator compliance with best practices for mitigation, and subsequent regulatory action, we are slated to lose a clear view of the Universe and its secrets.


CMU's Roborace Team Prepares for First Competition

CMU School of Computer Science

An autonomous car programmed by a Carnegie Mellon University student team will race for the first time Sept. 24-25 when Roborace, an international competition for autonomous vehicles (AVs), begins its season on the island of Anglesey in Wales. In Roborace, each team prepares their own artificial intelligence algorithms to control their race car, but all of the teams use identically prepared AVs, compute platforms and venues. To prepare for this month's race, the CMU team spent the summer working on the fundamentals of driving and on building an optimal driving path. But this week was the first time they had the chance to run their computer code on a hardware simulator. "Our minimum goal is to be able to get the car to start driving crash-free for now," said Anirudh Koul, an alumnus of the Language Technologies Institute's Master of Computational Data Science (MCDS) program and the team's coach. But the CMU team, the first U.S. team in Roborace, is confident that it will soon be competitive with other teams that have previous experience in the racing series.


Why people might never use autonomous cars

MIT Technology Review

Automated driving is advancing all the time, but there's still a critical missing ingredient: trust. Host Jennifer Strong meets engineers building a new language of communication between automated vehicles and their human occupants, a crucial missing piece in the push toward a driverless future. Credits: This episode was reported and produced by Jennifer Strong, Tanya Basu, Emma Cillekens and Tate Ryan-Mosley. We had help from Karen Hao and Benji Rosen.


From bomb-affixed drones to narco tanks and ventilated tunnels: How well-equipped are the Mexican cartels?

FOX News

Mexico's increasingly militarized crackdown on powerful drug cartels has left nearly 39,000 unidentified bodies languishing in the country's morgues – a grotesque symbol of the ever-burgeoning war on drugs and rampant violence. Investigative NGO Quinto Elemento Labs, in a recent report, found that an alarming number of people have been simply buried in common graves without proper postmortems, while others were left in funeral homes. The so-called war on drugs has claimed the lives of nearly 300,000 people over the last 14 years, while another 73,000 have gone missing. All the while, these cartels have yet to be designated formal terrorist organizations despite boasting well-documented arsenals of sophisticated weaponry to rival the most fear-inducing militias on battlefields abroad. Just last month, reports surfaced that Mexico's Jalisco New Generation Cartel (CJNG) now possesses bomb-toting drones – which The Drive's Warzone depicts as "small quadcopter-type drones carrying small explosive devices to attack its enemies."


A Step Towards Sensor Fusion for Indoor Layout Estimation

#artificialintelligence

The vision of smart autonomous robots in indoor environments is becoming a reality this decade, driven by the emerging technologies of sensor fusion and artificial intelligence. Sensor fusion aggregates informative features from disparate hardware sensors. Just like the autonomous vehicle industry, the robotics industry is quickly moving toward smart robots that handle indoor tasks autonomously. Now a major question arises.
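The aggregation idea behind sensor fusion can be sketched in a few lines. The example below fuses two hypothetical range readings by inverse-variance weighting, the simplest statistically-grounded fusion rule; the sensor names, values, and variances are invented for illustration, not taken from the article:

```python
# Minimal sensor-fusion sketch: combine two independent noisy estimates of
# the same quantity by inverse-variance weighting. The fused estimate has
# lower variance than either input, which is the point of fusing sensors.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Fuse two independent estimates; returns (fused_estimate, fused_variance)."""
    w_a = 1.0 / var_a          # weight = precision (inverse variance)
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical example: a lidar reports 2.0 m with variance 0.01,
# a camera depth estimate reports 2.3 m with variance 0.09.
est, var = fuse(2.0, 0.01, 2.3, 0.09)
```

The fused value sits closer to the more precise sensor, and its variance is smaller than either input's — a Kalman filter applies this same weighting recursively over time.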