How to Choose the Right Artificial Intelligence Solution for Your Security Problems

#artificialintelligence

Artificial intelligence (AI) brings a powerful new set of tools to the fight against threat actors, but choosing the right combination of libraries, test suites and training models when building AI security systems depends heavily on the situation. If you're thinking about adopting AI in your security operations center (SOC), the following questions and considerations can help guide your decision-making. Begin by considering what kind of AI security system you need: spam detection, intrusion detection, malware detection and natural language-based threat hunting are all very different problem sets that require different AI tools. Understanding the outputs you expect also helps you choose the right test data.
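For example, spam detection is commonly framed as supervised text classification. The following minimal sketch is not from the article; the messages, labels and model choice are illustrative assumptions, but it shows the general shape of such a pipeline in scikit-learn:

```python
# A minimal sketch (not from the article) of one of the problem sets it names:
# spam detection framed as supervised text classification with scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny made-up training set; a real SOC would use labeled mail or alert data.
messages = [
    "Your invoice is attached, please review",
    "WIN a FREE prize, click this link now!!!",
    "Meeting moved to 3pm, see updated agenda",
    "Urgent: verify your account password here",
]
labels = ["ham", "spam", "ham", "spam"]

# TF-IDF features feeding a Naive Bayes classifier.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(messages, labels)

print(model.predict(["Claim your free prize now"]))       # likely 'spam'
print(model.predict(["Agenda for tomorrow's meeting"]))   # likely 'ham'
```

The same framing generalizes to the other problem sets: what changes is the feature extraction (network flows, binary attributes, query text) and the labels, which is why understanding the expected outputs up front matters.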


Machine learning-based ASCVD risk calculator outperforms ACC/AHA standard

#artificialintelligence

A machine learning (ML)-based risk calculator developed to assess an individual's long-term risk for atherosclerotic cardiovascular disease (ASCVD) identified 13 percent more high-risk patients and recommended unnecessary statin therapy 25 percent less often than standard risk assessment tools in initial tests, researchers reported in the Journal of the American Heart Association. First author Ioannis A. Kakadiaris, PhD, and colleagues with the Society for Heart Attack Prevention and Eradication (SHAPE) wrote in JAHA that the current gold standard for ASCVD risk assessment--the American College of Cardiology and American Heart Association's Pooled Cohort Equations Risk Calculator--falls short on accuracy. "Studies have demonstrated that the current U.S. guidelines based on the ACC/AHA risk calculator may underestimate risk of atherosclerotic CVD in certain high-risk individuals, therefore missing opportunities for intensive therapy and preventing CVD events," Kakadiaris and coauthors said. "The existing approach to CVD risk assessment desperately needs an overhaul." According to a consensus report from SHAPE, comprehensive ASCVD risk assessment should include evaluation of plaque, blood and myocardial vulnerability factors if it is to be anywhere near accurate.


PopcornApps Case Study

#artificialintelligence

UK National Rail is a public transportation system operating roughly 9,000 passenger trains on 21,000 miles of track. Each day, 2.5 million passengers use its services across more than 2,500 destinations in the UK. PopcornApps, a Microsoft Gold partner and a member of the Microsoft AI Inner Circle program, has accelerated AI-led digital transformation in several industries, and it has done the same for UK National Rail. With a focus on user experience, complex problem solving, and meaningful business outcomes, PopcornApps combines a variety of Microsoft technologies, including Azure Machine Learning and Bing Cognitive Services, to craft relevant solutions for its customers. Equipped with the right experience and technology, PopcornApps created intelligent chatbots for UK Rail that provide on-demand customer engagement and improve the service experience for passengers.


Must-have skills for successful e-learning professionals MATRIX Blog

#artificialintelligence

From corporations to start-ups and non-profit organizations, every company relies on its workforce -- its people -- to be successful and constantly grow. Since the currency of the business world is information, the more employees know, the better their chances of achieving success for their organizations. And the way to know more is…? E-learning professionals have a reputation for being the teachers of the workforce, with knowledge in many interconnected fields. Their job has always been an important one.


Machine learning, meet quantum computing

#artificialintelligence

Back in 1958, in the earliest days of the computing revolution, the US Office of Naval Research organized a press conference to unveil a device invented by a psychologist named Frank Rosenblatt at the Cornell Aeronautical Laboratory. Rosenblatt called his device a perceptron, and the New York Times reported that it was "the embryo of an electronic computer that [the Navy] expects will be able to walk, talk, see, write, reproduce itself, and be conscious of its existence." Those claims turned out to be somewhat overblown. But the device kick-started a field of research that still has huge potential today. A perceptron is a single-layer neural network.
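To make that closing definition concrete, here is a minimal sketch (not from the article) of a single-layer perceptron trained with the classic perceptron learning rule; the toy dataset (logical AND), learning rate and epoch count are illustrative choices:

```python
import numpy as np

# Minimal single-layer perceptron sketch (illustrative, not from the article).
# It learns logical AND, a tiny linearly separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for _ in range(20):                          # a few passes over the data
    for xi, target in zip(X, y):
        pred = int(np.dot(w, xi) + b > 0)    # step activation
        update = lr * (target - pred)        # perceptron learning rule
        w += update * xi
        b += update

print([int(np.dot(w, xi) + b > 0) for xi in X])  # expected: [0, 0, 0, 1]
```

Stacking many such units into multiple layers, and replacing the step activation and simple update rule with differentiable functions and gradient descent, is essentially what turned Rosenblatt's device into today's deep neural networks.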


Big Data is a Powerful Asset for Business Success

#artificialintelligence

Various business trends today, such as the use of artificial intelligence and multimedia visual marketing, are connected to the concept of Big Data. Every action internet users take generates a data trail, and the amount of machine-generated data is growing too. Using this data effectively can give businesses an edge in today's competitive environment. Analyzing Big Data helps them achieve better results in many areas of business with minimal wasted effort and cost. Big Data refers to large amounts of information.


Webinar: Manufacturing and Artificial Intelligence: How Computer Vision Drives ROI

#artificialintelligence

Manufacturing enterprises are quickly deploying AI solutions to stay ahead, but how to scale these advances -- and where to begin -- remains elusive. This talk, moderated by Levatas' head of Data Science, will walk through how to perform human-in-the-loop analysis of unstructured data such as imagery and video footage, and how it could save businesses time and money. Using real examples in NLP and computer vision from other industries, you'll see how it could be possible for your firm to take advantage of these cost-saving technologies in the near future. We'll walk through what's needed, what kind of results other industries are seeing, and what the potential is for this industry. Daniel is an avid technology enthusiast with 15 years of experience designing and architecting software applications.
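As a rough illustration of the human-in-the-loop pattern the talk refers to (this sketch is not from the webinar; the model stub, field names and confidence threshold are placeholder assumptions), a common approach is to let a model label the confident cases automatically and route uncertain items to a human reviewer:

```python
# Illustrative human-in-the-loop triage (placeholder model and threshold,
# not the webinar's actual pipeline): auto-accept confident predictions,
# route uncertain items to a human review queue.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Item:
    item_id: str
    data: bytes  # e.g. an image or a video frame

def triage(items: List[Item],
           predict: Callable[[Item], Tuple[str, float]],
           threshold: float = 0.90):
    """Split items into auto-labeled results and a human review queue."""
    auto_labeled, needs_review = [], []
    for item in items:
        label, confidence = predict(item)         # model inference
        if confidence >= threshold:
            auto_labeled.append((item.item_id, label))
        else:
            needs_review.append(item)             # send to human annotators
    return auto_labeled, needs_review

# Usage with a stubbed-out model standing in for a real vision classifier:
def fake_predict(item: Item) -> Tuple[str, float]:
    return ("defect", 0.65)

auto, review = triage([Item("frame-001", b"")], fake_predict)
print(len(auto), len(review))  # 0 auto-labeled, 1 queued for review
```

The time and money savings come from the split itself: humans only look at the fraction of imagery the model is unsure about, and their corrections can be fed back as new training data.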


Which Countries Lead in Deep Machine Learning Research?

#artificialintelligence

These terms are now among the most used in tech think tanks around the world. Devices and systems of all kinds are increasingly integrated with this technology to keep up with the demands of the human race, drastically improving the way we live. We've reached an age in which people have created machines capable of thinking independently. These machines can learn and mimic human-like responses that can be applied in a variety of fields, such as medicine, transportation and manufacturing, among others. All of us have benefited from them in some way.


One Robot To Rule Them All

#artificialintelligence

Don't be deceived by the name: this adorable robot isn't going to clean your floors, but it might be able to control another device that does. It's called the IHR (Intelligent Housekeeping Robot), and its primary function is to act as a central hub that can manage all your smart devices efficiently. The IHR will make informed decisions by monitoring the condition of the interior and assigning "jobs" to your devices. If your floors are dirty, it can trigger the vacuum… or if your home has been inactive for a period of time, it can set the alarm and lock the doors… all while looking pretty darn cute. The Intelligent Housekeeping Robot is a winner of the 2018 Red Dot Design Concept Award.


Merging memory and computation, programmable chip speeds AI, slashes power use

#artificialintelligence

By shifting a fundamental property of computation, Princeton researchers have built a new type of computer chip that boosts the performance and slashes the energy demands of systems used for artificial intelligence. The chip, which works with standard programming languages, could be particularly useful on phones, watches or other devices that rely on high-performance computing and have limited battery life. The chip, based on a technique called in-memory computing, is designed to clear a primary computational bottleneck that forces computer processors to expend time and energy fetching data from stored memory. In-memory computing performs computation directly in the storage, allowing for greater speed and efficiency. The announcement of the new chip, along with a system to program it, follows closely on an earlier report that the researchers, in collaboration with Analog Devices Inc., had fabricated circuitry for in-memory computing.
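To see why clearing that bottleneck matters, here is a toy back-of-the-envelope sketch; the energy figures are made-up placeholders and not measurements of the Princeton chip, but they illustrate why moving operands between memory and processor tends to dominate the cost of the multiply-accumulate workloads that in-memory computing keeps inside the storage array:

```python
# Toy cost model (illustrative only; all constants are arbitrary placeholders,
# not figures for the Princeton chip). It compares a conventional fetch-then-
# compute loop with performing the multiply-accumulates inside the memory.

VECTOR_LEN = 1_000_000

# Placeholder per-operation energy costs in arbitrary units.
COST_MAC = 1             # one multiply-accumulate in the processor
COST_FETCH = 100         # fetching one operand pair from off-chip memory
COST_IN_MEMORY_MAC = 2   # multiply-accumulate performed inside the memory array

conventional = VECTOR_LEN * (COST_FETCH + COST_MAC)
in_memory = VECTOR_LEN * COST_IN_MEMORY_MAC

print(f"conventional: {conventional:,} units")
print(f"in-memory:    {in_memory:,} units")
print(f"ratio:        {conventional / in_memory:.0f}x")
```

Whatever the real constants are, the structure of the comparison is the point: when data movement is far more expensive than arithmetic, doing the arithmetic where the data already lives is where the speed and energy savings come from.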