explainable-ai
Cognitive Explainable Artificial Intelligence (AI) breakthroughs in Machine Learning (ML) for US Air Force: 3D Image Recognition using few training samples on CPU (without GPU)
Z Advanced Computing, Inc. (ZAC), the pioneering Cognitive Explainable-AI (Artificial Intelligence) (Cognitive XAI) software startup, has made AI and Machine Learning (ML) breakthroughs: ZAC has achieved 3D Image Recognition using only a few training samples, and using only an average laptop with a low-power CPU for both training and recognition, for the US Air Force (USAF). This is in sharp contrast to other algorithms in the industry, which require thousands to billions of samples and are trained on large GPU servers. "ZAC requires much less computing power and much less electrical power to run, which is great for mobile and edge computing, as well as for the environment, with a smaller carbon footprint," emphasized Dr. Saied Tadayon, CTO of ZAC. ZAC is the first to demonstrate its novel Cognition-based Explainable-AI (XAI) algorithms, in which various attributes and details of 3D (three-dimensional) objects are recognized from any view or angle. "You cannot do this task with the other algorithms, such as Deep Convolutional Neural Networks (CNNs) or ResNets, even with an extremely large number of training samples, on GPU servers. That's basically hitting the limitations of CNNs or Neural Nets, which all other companies are using now," said Dr. Bijan Tadayon, CEO of ZAC.
- Government > Regional Government > North America Government > United States Government (1.00)
- Government > Military > Air Force (1.00)
- Information Technology > Artificial Intelligence > Issues > Social & Ethical Issues (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Explanation & Argumentation (0.96)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.94)
- Information Technology > Artificial Intelligence > Machine Learning > Pattern Recognition > Image Matching (0.65)
Explainable-AI: Where Supervised Learning Can Falter
Disclaimer: I'll be talking mainly about logistic regression and basic feed-forward neural networks, so it's helpful to have programmed with those two models before reading this piece. OK, before statisticians and ML folks come running after me after reading the title: I'm not talking about linear regression, for example. Yes, in linear regression you can use the R-squared (or adjusted R-squared) statistic to talk about explained variance, and since linear regression only involves addition between independent variables (or predictors), it's pretty interpretable. If you were doing a linear regression to predict, say, the price of a car (Car_Price) based on the number of seats, mileage, maximum speed, and battery life, your linear model could be, say, Car_Price = c1*Seats + c2*Mileage + c3*Speed + c4*Battery_Power; the fact that variables are only added makes it pretty interpretable. But when it comes to more complex prediction models like logistic regression and neural networks, everything about the predictors (called "features" in ML) becomes more confusing.
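To make the interpretability point concrete, here is a minimal sketch of that car-price model fitted with ordinary least squares. The data values are made up for illustration; only the model form (Car_Price = c1*Seats + c2*Mileage + c3*Speed + c4*Battery_Power, plus an intercept) comes from the text above.

```python
import numpy as np

# Hypothetical car data: [seats, mileage (thousands of miles), top speed, battery power]
X = np.array([
    [5, 30.0, 180, 60],
    [4, 80.0, 160, 40],
    [7, 10.0, 170, 75],
    [5, 50.0, 200, 55],
    [2,  5.0, 250, 90],
])
y = np.array([32_000, 18_000, 41_000, 27_000, 55_000])  # made-up prices

# Fit Car_Price = c0 + c1*Seats + c2*Mileage + c3*Speed + c4*Battery_Power
A = np.column_stack([np.ones(len(X)), X])        # prepend an intercept column
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Because predictors are only added, each coefficient reads directly:
# holding the others fixed, c2 is the price change per extra thousand miles.
for name, c in zip(["intercept", "seats", "mileage", "speed", "battery"], coeffs):
    print(f"{name:>9}: {c:12.2f}")
```

This direct coefficient reading is exactly what disappears once the features pass through a sigmoid or through stacked nonlinear layers.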
Inside the Black Box: 5 Methods for Explainable-AI (XAI)
Explainable artificial intelligence (XAI) is the attempt to make the results of non-linearly programmed systems transparent and thereby avoid so-called black-box processes. It offers practical methods to explain AI models, which, for example, help meet the requirements of the General Data Protection Regulation (GDPR). The following five methods aim to make AI models more transparent and understandable. Layer-wise Relevance Propagation (LRP) is one such technique: it brings this kind of explainability and scales to potentially highly complex deep neural networks.
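To give a feel for how LRP works, here is a minimal numpy sketch of the epsilon rule on a tiny two-layer ReLU network. The network weights and the input are random, purely illustrative values; a real LRP pipeline would operate on a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer ReLU network with random weights (illustrative only).
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(3, 2))

def relu(x):
    return np.maximum(x, 0.0)

def lrp_epsilon(x, eps=1e-6):
    """Layer-wise Relevance Propagation, epsilon rule.

    A forward pass stores activations; the backward pass then
    redistributes the winning output score back to the inputs in
    proportion to each unit's contribution to the next layer's
    pre-activations: R_j = a_j * sum_k (w_jk / (z_k + eps)) * R_k.
    """
    # Forward pass, keeping intermediate activations
    a0 = x
    z1 = a0 @ W1
    a1 = relu(z1)
    z2 = a1 @ W2
    out = z2

    # Start relevance at the winning output unit
    R = np.zeros_like(out)
    k = np.argmax(out)
    R[k] = out[k]

    # Propagate relevance layer by layer (epsilon stabilizes the denominator)
    for a, W, z in [(a1, W2, z2), (a0, W1, z1)]:
        s = R / (z + eps * np.sign(z))
        R = a * (W @ s)

    return out, R

x = np.array([1.0, -0.5, 2.0, 0.3])
scores, relevance = lrp_epsilon(x)
print("scores:   ", scores)
print("relevance:", relevance)
```

A useful sanity check on any LRP implementation is conservation: the relevance values summed over the input features approximately equal the output score being explained.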
Artificial Intelligence Breakthrough: Training and Image Recognition on Low Power CPU (with no GPU), via Explainable-AI for Smart Appliance Pilot for Bosch
Z Advanced Computing, Inc. (ZAC), the pioneering Explainable-AI (Artificial Intelligence) (XAI) startup, is developing its Smart Home product line through a paid pilot for Smart Appliances for BSH Home Appliances (a subsidiary of the Bosch Group, originally a joint venture between Bosch and Siemens), the largest manufacturer of home appliances in Europe and one of the largest in the world. ZAC has just successfully finished Phase 1 of the pilot program. "Our cognitive-based algorithm is more robust, resilient, consistent, and reproducible, with a higher accuracy, than Convolutional Neural Nets or GANs, which others are using now. It also requires a much smaller number of training samples compared to CNNs, which is a huge advantage," said Dr. Saied Tadayon, CTO of ZAC. "We did the entire work on a regular laptop, for both training and recognition, without any dedicated GPU. So, our computing requirement is much smaller than a typical Neural Net, which requires a dedicated GPU," continued Dr. Bijan Tadayon, CEO of ZAC.
- Europe (0.29)
- North America > United States > Maryland > Montgomery County > Potomac (0.09)
Explainable-AI (Artificial Intelligence) Image Recognition Startup Pilots Smart Appliance with Bosch
Z Advanced Computing, Inc. (ZAC), an AI (Artificial Intelligence) software startup, is developing its Smart Home product line through a paid pilot for smart appliances for BSH Home Appliances, the largest manufacturer of home appliances in Europe and one of the largest in the world. BSH Home Appliances Corporation is a subsidiary of the Bosch Group, originally a joint venture between Robert Bosch GmbH and Siemens AG. ZAC's Smart Home product line uses ZAC's Explainable-AI Image Recognition. ZAC is the first to apply Explainable-AI in Machine Learning. "You cannot do this with other techniques, such as Deep Convolutional Neural Networks," said Dr. Saied Tadayon, CTO of ZAC.
- Europe (0.27)
- North America > United States > Maryland > Montgomery County > Potomac (0.07)
- Information Technology > Artificial Intelligence > Natural Language > Explanation & Argumentation (0.89)
- Information Technology > Artificial Intelligence > Issues > Social & Ethical Issues (0.89)
- Information Technology > Artificial Intelligence > Machine Learning > Pattern Recognition > Image Matching (0.69)
- (2 more...)
US Air Force funds Explainable-AI for UAV tech
Z Advanced Computing, Inc. (ZAC) of Potomac, MD announced on August 27 that it has been funded by the US Air Force to use ZAC's detailed 3D image recognition technology, based on Explainable-AI, for drones (unmanned aerial vehicles, or UAVs) for aerial image/object recognition. ZAC is the first to demonstrate Explainable-AI in which various attributes and details of 3D (three-dimensional) objects can be recognized from any view or angle. "With our superior approach, complex 3D objects can be recognized from any direction, using only a small number of training samples," said Dr. Saied Tadayon, CTO of ZAC. "For complex tasks, such as drone vision, you need ZAC's superior technology to handle detailed 3D image recognition." "You cannot do this with the other techniques, such as Deep Convolutional Neural Networks, even with an extremely large number of training samples. That's basically hitting the limits of the CNNs," continued Dr. Bijan Tadayon, CEO of ZAC.
- Government > Military > Air Force (1.00)
- Government > Regional Government > North America Government > United States Government (0.65)
U.S. Air Force invests in Explainable-AI for unmanned aircraft
Software startup Z Advanced Computing, Inc. (ZAC) has received funding from the U.S. Air Force to incorporate the company's 3D image recognition technology into unmanned aerial vehicles (UAVs) and drones for aerial image and object recognition. ZAC's in-house image recognition software is based on Explainable-AI (XAI), in which computer-generated image results can be understood by human experts. ZAC, based in Potomac, Maryland, is the first to demonstrate XAI in which various attributes and details of 3D objects can be recognized from any view or angle. "With our superior approach, complex 3D objects can be recognized from any direction, using only a small number of training samples," says Dr. Saied Tadayon, CTO of ZAC. "You cannot do this with the other techniques, such as deep Convolutional Neural Networks (CNNs), even with an extremely large number of training samples. That's basically hitting the limits of the CNNs," adds Dr. Bijan Tadayon, CEO of ZAC.
- North America > United States > Maryland > Montgomery County > Potomac (0.28)
- North America > United States > Ohio > Cuyahoga County > Cleveland (0.08)
- Government > Military > Air Force (1.00)
- Aerospace & Defense > Aircraft (0.91)
- Government > Regional Government > North America Government > United States Government (0.77)
- Information Technology > Sensing and Signal Processing > Image Processing (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Pattern Recognition > Image Matching (0.65)
- Information Technology > Artificial Intelligence > Natural Language > Explanation & Argumentation (0.64)
- (3 more...)