As mankind expands outwards into the universe, unmanned spacecraft will face a growing problem: as Earth becomes more distant, the transmission time for information and instructions to reach these craft becomes longer and longer. This time lag could make it difficult or even impossible for satellites to respond to fast-moving threats, like space debris, or quickly take opportunities to collect data from unexpected sources, like a passing meteorite. A new grant from NASA to the University of Akron in Ohio will fund research to overcome this issue by helping such spacecraft "think" for themselves using deep-learning artificial intelligence (AI) that works over an Ethereum blockchain network. "I hope to develop technology that can recognize environmental threats and avoid them, as well as complete a number of tasks automatically," Akron Assistant Professor Jin Wei Kocsis, who will lead the research, said in a press release. "I am honored that NASA recognized my work, and I am excited to continue challenging technology's ability to think and do on its own."
The first asteroid discovered in the solar system took an entire year to nail down. It was Ceres, the largest object in the asteroid belt. When it was first spotted in early 1801, scientists didn't know what to make of the object or how to track it in the night sky, TED-Ed explains in a video (below). Astronomers were stumped for months and could not find Ceres once they lost sight of it. It took a new mathematical model for predicting orbits before they were able to pinpoint it again, with its re-discovery taking place on the last day of the year.
Early Saturday morning, the Dragon cargo spacecraft that's been attached to the International Space Station since Dec. 17 is expected to be released and sent back to Earth. The plan was to have flight controllers use the station's robotic arm to move the Dragon into place on Friday so that come Saturday morning, a ground-control crew could release the craft at the perfect time to send it back to Earth, NASA said in a release. Coverage of the release will start at 4:30 a.m. EST Saturday, and the craft is scheduled for a 5 a.m. release. About five and a half hours later, at 10:26 a.m. EST, the craft, full of experiments, is expected to splash into the Pacific Ocean. There, "recovery forces" are expected to pull the Dragon from the water and take it by ship to Long Beach, California.
We just open-sourced a project to create labeled datasets for ML on satellite imagery. There are only a handful of high-quality satellite datasets out there, so our team built something to quickly/easily generate new ones. It pulls label information from OpenStreetMap and saves both the imagery and labels into numpy arrays for incorporation into ML workflows. You can filter by common tags in OSM like roads, buildings, railroads, etc., and it's able to package data for classification, object detection, or segmentation. Heads up that you need to define a source for the satellite imagery (and most high-quality sources aren't free), but you can use free imagery tiles from OpenAerialMap or some available from England here or here.
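For a sense of what the segmentation labels might look like, here's a minimal, hypothetical sketch (not the project's actual code) of rasterizing an OSM-style building-footprint polygon into a binary numpy label mask. The `polygon_to_mask` function and the toy footprint coordinates are illustrative assumptions:

```python
import numpy as np

def polygon_to_mask(polygon, height, width):
    """Rasterize a closed polygon (list of (x, y) pixel vertices) into a
    binary segmentation mask using an even-odd ray-casting test.
    Label arrays like this can be stacked alongside imagery arrays and
    fed into a segmentation model."""
    mask = np.zeros((height, width), dtype=np.uint8)
    n = len(polygon)
    for y in range(height):
        for x in range(width):
            inside = False
            j = n - 1
            for i in range(n):
                xi, yi = polygon[i]
                xj, yj = polygon[j]
                # count crossings of a horizontal ray cast from (x, y)
                if (yi > y) != (yj > y) and \
                        x < (xj - xi) * (y - yi) / (yj - yi) + xi:
                    inside = not inside
                j = i
            mask[y, x] = inside
    return mask

# A toy "building" footprint in pixel coordinates of a 10x10 image tile.
footprint = [(2, 2), (2, 7), (7, 7), (7, 2)]
mask = polygon_to_mask(footprint, height=10, width=10)
```

In a real pipeline the polygon vertices would come from OSM geometries reprojected into the tile's pixel space, and the mask would be saved next to the corresponding imagery array.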
A stunning new Nasa image shows raging storms on Jupiter with clouds that stretch for thousands of miles - and it looks just like an oil painting. Nasa's Juno spacecraft was a little more than one Earth diameter from Jupiter - or 8,292 miles (13,345 kilometres) - when it captured this mind-bending view of the planet's tumultuous atmosphere. The incredible colour-enhanced image was captured at a latitude of 48.9 degrees and depicts vast swirling cloud formations that travel at about 129 mph (60 m/s) over the gas giant planet's surface. Jupiter fills the image, with only a hint of the terminator (where daylight fades to night) in the upper right corner, and no visible limb (the curved edge of the planet). Juno took this image of colorful, turbulent clouds in Jupiter's northern hemisphere on December 16, from 8,292 miles (13,345 kilometers) above the tops of Jupiter's clouds.
In December, NASA announced two finalist concepts for a robotic mission that will launch in the mid-2020s. The first is the Comet Astrobiology Exploration Sample Return (CAESAR), which would send a fairly conventional spacecraft over to a comet to grab a chunk of its nucleus and bring it back to Earth. That's cool and all, but we're much more excited about the second finalist concept: Dragonfly, from the Johns Hopkins University Applied Physics Lab (APL), a dual-quadcopter that would explore Saturn's moon Titan from the air. The idea is that it would work like a planetary rover, except that it would fly instead of drive, allowing it to cover much more ground at the risk of, you know, crashing. We've seen lots of drones that can do amazing things, and also lots of drones that crash very, very badly while trying to do amazing things.
When humans are finally ready to relocate civilization to Mars, they won't be able to do it alone. They'll need trusted specialists with encyclopedic knowledge, composure under pressure, and extreme endurance--droids like Justin. Built by the German space agency DLR, humanoid bots like Justin are being groomed to build the first Martian habitat for humans. Engineers have been refining Justin's physical abilities for a decade; the mech can handle tools, shoot and upload photos, catch flying objects, and navigate obstacles. Now, thanks to new AI upgrades, Justin can think for itself.
Fully assembled on the launch pad. SpaceX's three-core, 27-engine Falcon Heavy launch vehicle sits on pad 39A at Kennedy Space Center in December 2017. SpaceX CEO Elon Musk said Thursday, Jan. 4, 2018, that the demonstration flight will happen before the end of January 2018. MELBOURNE, Fla. -- SpaceX's much-vaunted Falcon Heavy launch vehicle will roar off a historic Kennedy Space Center pad on its demonstration flight before the end of this month, CEO Elon Musk said Thursday. Pad 39A, which once played host to Apollo and space shuttle missions, is expected to see the three-core vehicle lift off on a premiere flight that will test one of the company's most technically challenging undertakings to date.
This article is a slightly modified version of an invited address that was given at the Eighth IEEE Conference on Artificial Intelligence for Applications in Monterey, California, on 2 March 1992. It describes the lessons learned in developing and implementing the Artificial Intelligence Research and Development Program at the National Aeronautics and Space Administration (NASA). In so doing, the article provides a historical perspective of the program in terms of the stages it went through as it matured. These stages are similar to the "ages of artificial intelligence" that Pat Winston described a year before the NASA program was initiated. The final section of the article attempts to generalize some of the lessons learned during the first seven years of the NASA AI program into AI program management heuristics.
Clusters of conversation provide a more valuable way to spend one's time than attending sessions. At the last national meeting we escaped from the celebrations of the recent victory of Deep Blue over the dreaded Kasparov, to find just such a group, already engaged in an animated discussion: A: We need to draw a line between a program that has some intelligence in it and one that doesn't. All Deep Blue does is brute-force search. That hardly counts as AI.