What AI still can't do

#artificialintelligence 

Machine-learning systems can be duped or confounded by situations they haven't seen before. A self-driving car gets flummoxed by a scenario that a human driver could handle easily. An AI system laboriously trained to carry out one task (identifying cats, say) has to be taught all over again to do something else (identifying dogs). In the process, it's liable to lose some of the expertise it had in the original task. Computer scientists call this problem "catastrophic forgetting."
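Catastrophic forgetting is easy to reproduce even in a toy model. The sketch below is illustrative and not from the article: it assumes a two-weight logistic classifier trained by plain gradient descent, first on one synthetic task, then on a second, and measures how accuracy on the first task degrades. The tasks, hyperparameters, and helper names (`make_task`, `train`, `accuracy`) are my own choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(axis, n=200):
    # Random points in the plane; the label is the sign of one coordinate.
    X = rng.normal(size=(n, 2))
    y = (X[:, axis] > 0).astype(float)
    return X, y

def train(w, X, y, steps=500, lr=0.1):
    # Logistic-regression gradient descent on a single shared weight vector.
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(((X @ w > 0) == (y > 0.5)).mean())

Xa, ya = make_task(axis=0)  # task A: label = sign of the x-coordinate
Xb, yb = make_task(axis=1)  # task B: label = sign of the y-coordinate

w = np.zeros(2)
w = train(w, Xa, ya)
acc_before = accuracy(w, Xa, ya)  # typically near-perfect on task A

w = train(w, Xb, yb)              # retrain the same weights on task B only
acc_after = accuracy(w, Xa, ya)   # task-A accuracy drops sharply
```

Because the second round of training overwrites the only weights the model has, the skill learned for task A is largely erased; this is the same failure mode, in miniature, that forces large systems to be retrained from scratch.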
