Software startup Z Advanced Computing, Inc. (ZAC) has received funding from the U.S. Air Force to incorporate the company's 3D image recognition technology into unmanned aerial vehicles (UAVs) and drones for aerial image and object recognition. ZAC's in-house image recognition software is based on Explainable-AI (XAI), in which computer-generated image results can be understood by human experts. ZAC, based in Potomac, Maryland, is the first to demonstrate XAI in which various attributes and details of 3D objects can be recognized from any view or angle. "With our superior approach, complex 3D objects can be recognized from any direction, using only a small number of training samples," says Dr. Saied Tadayon, CTO of ZAC. "You cannot do this with the other techniques, such as deep Convolutional Neural Networks (CNNs), even with an extremely large number of training samples. That's basically hitting the limits of the CNNs," adds Dr. Bijan Tadayon, CEO of ZAC.
Most people use Google's search-by-image feature to either look for copyright infringement, or for shopping. See some shoes you like on a frenemy's Instagram? Search will pull up all the matching images on the web, including from sites that will sell you the same pair. In order to do that, Google's computer vision algorithms had to be trained to extract identifying features like colors, textures, and shapes from a vast catalogue of images. Luis Ceze, a computer scientist at the University of Washington, wants to encode that same process directly in DNA, making the molecules themselves carry out that computer vision work. And he wants to do it using your photos.
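The feature-extraction idea described above can be illustrated with a deliberately simple sketch: extract a color histogram from each image and match a query against a catalogue by nearest distance. This is a toy stand-in, not Google's actual pipeline (real systems also use learned texture and shape features); the function names are hypothetical.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Extract a simple color feature: a normalized per-channel intensity histogram."""
    features = []
    for channel in range(image.shape[-1]):
        hist, _ = np.histogram(image[..., channel], bins=bins, range=(0, 256))
        features.append(hist / hist.sum())  # normalize so image size doesn't matter
    return np.concatenate(features)

def most_similar(query, catalogue):
    """Return the index of the catalogue image whose feature vector is closest to the query's."""
    q = color_histogram(query)
    distances = [np.linalg.norm(q - color_histogram(img)) for img in catalogue]
    return int(np.argmin(distances))
```

A query photo of red shoes would thus be matched to the catalogue entry with the most similar color distribution, which is the same look-up that Ceze wants DNA molecules to perform.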
Image recognition has become increasingly critical in applications ranging from smartphones to driverless cars, and on Wednesday UCLA opened up to the public a new algorithm that promises big gains. The Phase Stretch Transform algorithm is a physics-inspired computational approach to processing images and information that can help computers "see" features of objects that aren't visible using standard imaging techniques. It could be used to detect an LED lamp's internal structure, for example, which would be obscured to conventional techniques by the brightness of its light. It can also distinguish distant stars that would normally be invisible in astronomical images, UCLA said. Essentially, the algorithm works by performing a mathematical operation that identifies objects' edges and then detects and extracts their features.
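The core operation can be sketched as follows: transform the image to the frequency domain, apply a frequency-dependent phase kernel, and read edges out of the phase of the result. This is a simplified, illustrative stand-in for UCLA's published algorithm, not its actual implementation; the kernel shape and the `strength`, `warp`, and `threshold` parameters here are assumptions chosen for clarity.

```python
import numpy as np

def phase_stretch_edges(image, strength=0.5, warp=12.0, threshold=0.1):
    """Toy phase-stretch-style edge detector on a 2D grayscale array."""
    rows, cols = image.shape
    # Build a radial frequency grid
    u = np.fft.fftfreq(rows)[:, None]
    v = np.fft.fftfreq(cols)[None, :]
    r = np.sqrt(u**2 + v**2)
    # Nonlinear, frequency-dependent phase kernel (rough stand-in for PST's kernel)
    kernel = strength * r * np.arctan(warp * r)
    # Warp the spectrum's phase, then return to the spatial domain
    warped = np.fft.ifft2(np.fft.fft2(image) * np.exp(-1j * kernel))
    # The output phase concentrates around sharp intensity transitions, i.e. edges
    phase = np.angle(warped)
    return phase > threshold * phase.max()
```

On a uniform image the phase stays flat and nothing is flagged; a sharp step in intensity produces a phase spike along the boundary, which is the edge map the feature-extraction stage would then consume.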
A demo of the Orcam MyEye 2.0 was one of the highlights at the AbilityNet/RNIB TechShare Pro event in November. This small device, an update to the MyEye released in 2013, clips onto any pair of glasses and provides discreet audio feedback about the world around the wearer. It uses state-of-the-art image recognition to read signs and documents as well as recognise people, and does not require an internet connection. It's just one of many apps and devices that are using the power of artificial intelligence (AI) to transform the lives of people who are blind or have sight loss.
Z Advanced Computing, Inc. (ZAC), an AI (Artificial Intelligence) software startup, is developing its Smart Home product line through a paid pilot for smart appliances for BSH Home Appliances, the largest manufacturer of home appliances in Europe and one of the largest in the world. BSH Home Appliances Corporation is a subsidiary of the Bosch Group, originally a joint venture between Robert Bosch GmbH and Siemens AG. ZAC's Smart Home product line uses ZAC's Explainable-AI Image Recognition. ZAC is the first to apply Explainable-AI in Machine Learning. "You cannot do this with other techniques, such as Deep Convolutional Neural Networks," said Dr. Saied Tadayon, CTO of ZAC.