Expert Systems

DARPA's explainable AI (XAI) program: A retrospective


Dramatic success in machine learning has created an explosion of new AI capabilities. Continued advances promise to produce autonomous systems that perceive, learn, decide, and act on their own. These systems offer tremendous benefits, but their effectiveness will be limited by the machine's inability to explain its decisions and actions to human users. This issue is especially important for the United States Department of Defense (DoD), which faces challenges that require the development of more intelligent, autonomous, and reliable systems. XAI will be essential for users to understand, appropriately trust, and effectively manage this emerging generation of artificially intelligent partners.

Chinese tech companies must undergo government cyber review to list overseas


China on Tuesday evening confirmed it will increase oversight of how local tech companies operate their platforms both locally and overseas through two new sets of rules. The first set of rules, set to be enforced on February 15, is focused on cybersecurity reviews and will require local tech companies holding personal information on over 1 million users to undergo a security review before being allowed to list on overseas stock exchanges. Announced by the Cyberspace Administration of China (CAC), the rules did not specify whether cybersecurity reviews would be required for companies that list in Hong Kong. As part of a cybersecurity review process, the Chinese government can urge tech companies to make organisational changes to fulfil their commitments to the cybersecurity review. The CAC said the new listing requirement was established to address the risk of key infrastructure, data, and personal information being used maliciously by foreign actors.

Applied Sciences


In the fourth industrial revolution, or Industry 4.0, a key objective is to enhance equipment's ability to perceive its own health state and predict future behavior. The development of artificial intelligence, especially the progress made in deep learning over the past decade, provides a promising tool for bolstering this enhancement. Such a tool can be a complement or alternative to conventional physics-based and signal-processing-based techniques in fault detection, diagnosis and prognosis applications. Researchers have started to build data-driven or hybrid models to further boost their prediction accuracy in the above applications, yet there are still some untouched or underexplored territories, such as causal inference, demystifying black-box modelling, domain adaptation, automatic feature learning, etc. This special issue presents current innovations and engineering achievements of scientists and industrial practitioners in adopting artificial intelligence techniques for fault detection, diagnosis and prognosis.

Artificial Intelligence - Expert Systems


Expert systems (ES) are one of the prominent research domains of AI, introduced by researchers at Stanford University's Computer Science Department. Expert systems are computer applications developed to solve complex problems in a particular domain at the level of a human expert. They contain domain-specific, high-quality knowledge, and such knowledge is required to exhibit intelligence.
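A classic expert system pairs a knowledge base of if-then rules with an inference engine that applies them to known facts. A minimal forward-chaining sketch follows; the rules and facts are hypothetical, purely for illustration:

```python
# Minimal forward-chaining inference engine: repeatedly fire rules whose
# conditions are all satisfied by known facts, until nothing new is derived.

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Illustrative diagnostic rules (hypothetical domain knowledge).
rules = [
    ({"engine_cranks", "no_start"}, "suspect_fuel_or_spark"),
    ({"suspect_fuel_or_spark", "fuel_ok"}, "check_ignition"),
]

derived = forward_chain({"engine_cranks", "no_start", "fuel_ok"}, rules)
# The chained conclusion "check_ignition" is now among the derived facts.
```

Real expert system shells (e.g., MYCIN-era systems) add certainty factors and an explanation facility on top of this basic loop, but the rule-matching core is the same idea.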

Web-Based Fault Diagnostic and Learning System - The International Journal of Advanced Manufacturing Technology


Web-based technology holds great potential for enabling the rapid dissemination of information and facilitating distributed decision-making. This paper presents a novel knowledge-based multi-agent system for remote fault diagnosis, which is composed of diagnostic and learning agents (DLAs), machine agents (MAs) and a central management agent (CMA). Machines are remotely diagnosed by the DLAs through the communication channels between the MAs and the DLAs. In addition, the DLAs can learn new expertise from the users, and the CMA can update the central knowledge base (CKB), shared by all the DLAs, with the valuable expertise. When a fault occurs that cannot be solved with the present knowledge base, the DLA can acquire new knowledge, translate it into rules using a rule builder, and add the rules to the CKB.
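The diagnose-then-learn flow described above can be sketched as follows. The class names and rule format here are assumptions for illustration, not the paper's actual implementation: a diagnostic/learning agent matches symptoms against a shared knowledge base, and when no rule applies it acquires new expertise (here, from a user callback) and shares it via the central knowledge base.

```python
# Sketch of the diagnostic/learning flow: agents share one central
# knowledge base (CKB); unseen faults trigger learning and a CKB update.

class CentralKnowledgeBase:
    def __init__(self):
        self.rules = {}  # frozenset of symptoms -> diagnosis

    def match(self, symptoms):
        return self.rules.get(frozenset(symptoms))

    def add_rule(self, symptoms, diagnosis):
        self.rules[frozenset(symptoms)] = diagnosis


class DiagnosticLearningAgent:
    def __init__(self, ckb):
        self.ckb = ckb

    def diagnose(self, symptoms, ask_user=None):
        diagnosis = self.ckb.match(symptoms)
        if diagnosis is None and ask_user is not None:
            diagnosis = ask_user(symptoms)           # learn new expertise
            self.ckb.add_rule(symptoms, diagnosis)   # share it via the CKB
        return diagnosis


ckb = CentralKnowledgeBase()
agent = DiagnosticLearningAgent(ckb)
agent.diagnose({"overheating", "vibration"}, ask_user=lambda s: "bearing wear")
# Any other agent sharing the same CKB now resolves this fault directly.
```

The design point the paper emphasizes is the shared CKB: expertise learned by one agent immediately benefits every other agent connected to it.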

Unsupervised machine learning techniques for fault detection and diagnosis in nuclear power plants


This work develops a fault detection and diagnosis (FDD) approach based on unsupervised learning methods for nuclear power plants (NPPs), conducts a comparative study of the presented methods, and uses PCTRAN simulation to test the efficiency of the proposed approach. Nuclear power plants have proved their importance in the energy sector by generating clean and uninterrupted energy over decades. At the same time, NPPs are large-scale, complex systems with potential radioactive-release risks.
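One widely used unsupervised FDD technique (the abstract does not detail which methods the paper compares, so this is a representative example, not the authors' approach) is PCA-based anomaly detection: fit principal components on normal-operation sensor data and flag samples whose reconstruction error exceeds a threshold learned from that data. A minimal sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "normal operation" data: 2 latent factors driving 5 sensors.
latent = rng.normal(size=(500, 2))
mixing = rng.normal(size=(2, 5))
normal = latent @ mixing + 0.05 * rng.normal(size=(500, 5))

# Fit PCA on normal data: center, then keep the top-k principal directions.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
components = vt[:2]  # top-2 principal directions

def reconstruction_error(samples):
    centered = samples - mean
    projected = centered @ components.T @ components
    return np.linalg.norm(centered - projected, axis=1)

# Threshold set from normal data, e.g. the 99th percentile of its error.
threshold = np.percentile(reconstruction_error(normal), 99)

# A sample with a large deviation in one sensor leaves the learned
# subspace, so its reconstruction error trips the threshold.
faulty = normal[:1] + np.array([[0.0, 0.0, 3.0, 0.0, 0.0]])
is_fault = reconstruction_error(faulty) > threshold
```

Because the model is fitted only on healthy data, this style of detector needs no labeled fault examples, which is exactly why unsupervised methods are attractive for safety-critical plants where fault data is scarce.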

Artificial Intelligence in Healthcare Industry


The transition to information-based healthcare delivery and administration has been expedited by technological advancements. AI/ML-driven information systems are critical to today's multidisciplinary approach to improving healthcare outcomes, which includes sophisticated imaging and genetics-based tailored therapy models. Artificial intelligence represents a major evolution in computer science: it has changed how computing is done, making tasks easier and more automated. AI enables a machine to learn patterns from data and apply that learning to produce the desired results.

Voice-only telehealth might go away with pandemic rules set to expire

NPR Technology

Community clinics say the easing of restrictions on telehealth during the pandemic has made it possible for health workers to connect with hard-to-reach patients via a phone call -- people who are poor, elderly or live in remote areas, and don't have access to a computer or cell phone with video capability. Caswell County, where William Crumpton works, runs along the northern edge of North Carolina and is a rural landscape of mostly former tobacco farms and the occasional fast-food restaurant. "There are wide areas where cell phone signals are just nonexistent," Crumpton says. "Things like satellite radio are even a challenge."

No-Code Analytics – The Best Introduction to Data Science


Although reading books and watching lectures is a great way to learn analytics, it is best to learn by doing. However, getting started by doing can be tricky with languages such as Python and R if you do not have a coding background. You need to know not only the analytical procedures but also the nuances of the programming language, which adds to the list of things to learn just to get started. Therefore, a good middle ground between knowledge acquisition (books, videos, etc.) and conducting advanced analytics (Python, R, etc.) is open-source analytics software. These tools are great for both knowledge acquisition and actually doing analysis, since documentation is built into the software and you can perform relatively complex tasks with only mouse clicks.

AI in the Age of the Smart Hospital


While talking about Artificial Intelligence (AI) in healthcare might sound futuristic, the first proof of concept for AI application took place in the late 1950s. In the 1970s, researchers at Stanford developed the MYCIN program to help doctors identify blood infections. At Intel, we've had the opportunity to see many different types of AI applications in use by our partners in the healthcare industry, from AI-enabled robots that can help clean hospital rooms to algorithms that can perform real-time inference on endoscopic cameras. Many of these AI implementations rely on edge computing, or the ability to process and compute data close to where it originates -- either on a network-connected device or right next to the device. AI at the edge means that data can be processed and analyzed quickly -- before it goes to the cloud or a server for storage.