AI in the Age of the Smart Hospital

#artificialintelligence 

While talking about Artificial Intelligence (AI) in healthcare might sound futuristic, the first proof of concept for an AI application took place in the late 1950s.[1] In the 1970s, researchers at Stanford developed the MYCIN program to help doctors identify blood infections.[2] At Intel, we've had the opportunity to see many different types of AI applications in use by our partners in the healthcare industry, from AI-enabled robots that can help clean hospital rooms to algorithms that can perform real-time inference on endoscopic cameras. Many of these AI implementations rely on edge computing, or the ability to process and compute data close to where it originates -- either on a network-connected device or right next to the device. AI at the edge means that data can be processed and analyzed quickly -- before it goes to the cloud or a server for storage.
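To make the idea of edge inference concrete, here is a minimal sketch (not taken from the article) of running a model locally with Intel's OpenVINO toolkit, so data is analyzed on the device before anything is sent to the cloud. The model file name and input shape are hypothetical placeholders, and a real deployment would feed actual camera frames rather than random data.

```python
# Minimal sketch of AI inference at the edge with OpenVINO.
# Assumptions: a pre-converted IR model ("endoscopy_model.xml") and a
# 1x3x224x224 input are placeholders, not a real Intel-provided model.
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("endoscopy_model.xml")            # hypothetical model file
compiled = core.compile_model(model, device_name="CPU")   # compile for the local edge device

frame = np.random.rand(1, 3, 224, 224).astype(np.float32) # stand-in for a camera frame
result = compiled([frame])[compiled.output(0)]            # inference runs on-device
print("Predicted class:", int(result.argmax()))
```

Because the inference happens next to the data source, only the result (or selected data) needs to travel over the network, which is what makes real-time use cases like endoscopic video analysis practical.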
