Machine Learning


Artificial intelligence is a totalitarian's dream – here's how to take power back

#artificialintelligence

Individualistic western societies are built on the idea that no one knows our thoughts, desires or joys better than we do. And so we put ourselves, rather than the government, in charge of our lives. We tend to agree with the philosopher Immanuel Kant's claim that no one has the right to force their idea of the good life on us. Artificial intelligence (AI) will change this. It will know us better than we know ourselves.


Assessing Gender Gaps in Artificial Intelligence

#artificialintelligence

As roles and tasks shift in tandem with the expansion of new technologies, and the division of work between human and machine is redrawn, it is of critical importance to monitor how those changes will impact the evolution of economic gender gaps. Artificial Intelligence (AI) is a prominent driver of change within the transformations brought about by the Fourth Industrial Revolution (4IR), and can serve as a key marker of the trajectory of innovation across industries. In partnership with the LinkedIn Economic Graph Team, the World Economic Forum aims to provide fresh evidence of the emerging contours of gender parity in the new world of work through near-term labour market information. The increasing expansion of AI is creating demand for a range of new skills, among them neural networks, deep learning, machine learning, and "tools" such as Weka and Scikit-Learn. AI skills are among the fastest-growing specializations among professionals represented on the LinkedIn platform.


Machine Learning Operations - Run:AI

#artificialintelligence

This article explains how Machine Learning Operations came to be a discipline inside many companies and things to consider when deciding if your organization is ready to form an MLOps team. Machine learning (ML) is a subset of artificial intelligence in which computer systems autonomously learn a task over time. Based on pattern analyses and inference models, ML algorithms allow a computer system to adapt in real time as it is exposed to data and real-world interactions. For many people, ML was, until recently, considered science fiction. But advances in computational power, frictionless access to scalable cloud resources, and the exponential growth of data have fueled an increase in ML-based applications.
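The "adapt in real time as it is exposed to data" behaviour described above is commonly implemented as incremental (online) learning. As a purely illustrative sketch – the synthetic batches, feature layout, and choice of scikit-learn's SGDClassifier are assumptions, not anything from the article – it might look like this:

```python
# Minimal sketch of incremental (online) learning: the model is updated
# batch by batch as new data arrives, rather than trained once offline.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()
classes = np.array([0, 1])  # all labels must be declared up front

for step in range(10):
    # Pretend each batch arrives from a live data stream (synthetic here).
    X_batch = rng.normal(size=(32, 4))
    y_batch = (X_batch[:, 0] + X_batch[:, 1] > 0).astype(int)
    model.partial_fit(X_batch, y_batch, classes=classes)

# The model can now score new observations as they arrive.
print(model.predict(rng.normal(size=(3, 4))))
```

Each call to partial_fit nudges the model using only the newest batch, which is what lets a system keep adapting without retraining from scratch.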


The Emergence Of Hardware As A Key Enabler For The Age Of Artificial Intelligence

#artificialintelligence

Over the past few decades, software has been the engine of innovation for countless applications. From PCs to mobile phones, well-defined hardware platforms and instruction set architectures (ISAs) have enabled many important advancements across vertical markets. The emergence of abundant-data computing is changing the software-hardware balance in a dramatic way. Diverse AI applications in facial recognition, virtual assistance, autonomous vehicles and more share a common feature: they rely on hardware as the core enabler of innovation. Since 2017, the AI hardware market has grown 60-70% annually and is projected to reach $65 billion by 2025.


Machine Learning and Artificial Intelligence

#artificialintelligence

Artificial intelligence and machine learning can teach us about the future. In today's Academic Minute, the University of Alaska Fairbanks' Falk Huettmann explores the benefit to the public good from these technologies. Huettmann is an associate professor of wildlife biology at Fairbanks. A transcript of this podcast can be found here.


EETimes - What the AI Chip Market is All About

#artificialintelligence

Right now, the AI chip market is all about deep learning. Deep learning (DL) is the most successful of the machine learning paradigms at making AI applications useful in the real world, and today's AI chips are all about accelerating DL – acceleration that is needed both during training and during inferencing. The AI chip market has exploded with players: for a recent research report we counted some 80 startups globally, backed by $10.5 billion of investor spending, competing with some 34 established players. Clearly this is unsustainable, but we need to dissect this market to better understand why it is the way it is now, how it is likely to change, and what it all means.
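The training/inferencing split mentioned above maps onto two quite different compute profiles, which is why accelerators target them separately. The toy PyTorch snippet below is only an illustration of that distinction; the model, data, and optimizer are invented for the example:

```python
# Training runs forward + backward passes and updates weights;
# inferencing runs forward passes only (here under torch.no_grad()).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training step: compute-heavy forward and backward passes plus a weight update.
x, y = torch.randn(8, 16), torch.randint(0, 2, (8,))
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

# Inference step: forward pass only, no gradients kept.
with torch.no_grad():
    predictions = model(torch.randn(4, 16)).argmax(dim=1)
print(predictions)
```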


AI in cyber: Using artificial intelligence to create more resilient cyber security

#artificialintelligence

Cyber attacks and threats are considered major disruptors to businesses, nations and consumers alike. Artificial intelligence is seen as a major disruptive force too, but of the positive kind, fuelling a new era of hyper connectivity, hyper intelligence and hyper performance. An increasingly complex business environment is leading organisations to embrace forms of artificial intelligence such as machine learning and facial recognition technology, while using data to build more intimate relationships with consumers. But the flip side of these innovations is that the 'attack surfaces' of an organisation are multiplying, creating a fast-growing world of vulnerability to cyber crime that didn't exist before. At the same time, AI use is on the rise among cyber criminals, who are using it to help drive attacks, employing the technology to uncover unsecured points of entry in enterprise networks.


Cadence Delivers Machine Learning for up to 5X Faster Regressions – IAM Network

#artificialintelligence

Core engine performance enhancements accelerate verification throughput by reducing simulation cycles with matching coverage on randomized test suites. Cadence Design Systems, Inc. today announced that the Cadence Xcelium Logic Simulator has been enhanced with machine learning (ML) technology, called Xcelium ML, to increase verification throughput. Using new machine learning technology and core computational software, Xcelium ML enables up to 5X faster verification closure on randomized regressions. Using computational software and a proprietary machine learning technology that directly interfaces to the simulation kernel, Xcelium ML learns iteratively over an entire simulation regression. It analyzes patterns hidden in the verification environment and guides the Xcelium randomization kernel on subsequent regression runs to achieve matching coverage with reduced simulation cycles. Cadence's Xcelium Logic Simulator provides best-in-class core engine performance for SystemVerilog, VHDL, mixed-signal, low power, and x-propagation. It supports both single-core and multi-core simulation, incremental and parallel build, and save/restart with dynamic test reload. The Xcelium Logic Simulator has been deployed by a majority of top semiconductor companies, and a majority of top companies in the hyper-scale, automotive, and consumer electronics segments. Kioxia reports that it has effectively utilized Xcelium simulation for a variety of its designs and that it addresses the company's ever-growing verification needs.
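Xcelium ML's interface and internals are proprietary, so the following is only a generic, hypothetical sketch of the overall pattern the announcement describes: learn from one regression which stimuli contributed coverage, then bias the next run so that matching coverage is reached with fewer cycles. The seed IDs, coverage bins, and weighting scheme below are invented for illustration and are not Cadence's method.

```python
# Toy coverage-guided test weighting: favour seeds that hit rarely-covered
# bins so a shorter follow-up regression can retain coverage.
import random
from collections import defaultdict

# Coverage observed per test seed in the previous regression (assumed data).
coverage_by_seed = {
    101: {"bin_a", "bin_b"},
    102: {"bin_a"},
    103: {"bin_c", "bin_d", "bin_e"},
    104: {"bin_b"},
}

# Count how often each coverage bin was hit across all seeds.
bin_counts = defaultdict(int)
for bins in coverage_by_seed.values():
    for b in bins:
        bin_counts[b] += 1

def rarity_score(bins):
    # Seeds that hit rarely-covered bins are weighted more heavily.
    return sum(1.0 / bin_counts[b] for b in bins)

weights = {seed: rarity_score(bins) for seed, bins in coverage_by_seed.items()}

# Pick seeds for the next (shorter) regression, biased toward high scorers.
next_run = random.choices(list(weights), weights=list(weights.values()), k=2)
print(sorted(weights, key=weights.get, reverse=True), next_run)
```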


IIT Roorkee joins Coursera to launch 2 AI, ML programmes - Express Computer

#artificialintelligence

The Indian Institute of Technology-Roorkee (IIT-R) in partnership with leading online learning platform Coursera on Thursday launched two new online certificate programmes for professionals looking to build skills in data science, Artificial Intelligence (AI) and Machine Learning (ML). The six-month certificate programme in AI and ML will consist of video lectures, hands-on learning opportunities, team projects, tutorials and workshops. The programme will also teach classical ML techniques and provide hands-on programming experience with 'TensorFlow' software for model building, robust ML production and powerful experimentation. The certificate programme in data science will help professionals build skills in data science, machine learning, critical thinking, data collection, data visualization and data management. "We are delighted to partner with Coursera to help fulfil the goal of inclusive education of the New Education Policy," Professor Ajit K Chaturvedi, Director, IIT Roorkee, said in a statement.
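As a rough idea of what hands-on model building with TensorFlow typically involves – this is an assumed, minimal example, not material from the programme – a first exercise often looks like the following:

```python
# A minimal Keras model trained end to end on synthetic data.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic features and labels, just to show the end-to-end flow.
x = tf.random.normal((256, 20))
y = tf.cast(tf.reduce_sum(x, axis=1) > 0, tf.float32)

model.fit(x, y, epochs=3, batch_size=32, verbose=0)
print(model.evaluate(x, y, verbose=0))  # [loss, accuracy]
```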


New machine learning tool predicts devastating intestinal disease in premature infants

#artificialintelligence

Necrotizing enterocolitis (NEC) is a life-threatening intestinal disease of prematurity. Characterized by sudden and progressive intestinal inflammation and tissue death, it affects up to 11,000 premature infants in the United States annually, and 15-30% of affected babies die from NEC. Survivors often face long-term intestinal and neurodevelopmental complications. Researchers from Columbia Engineering and the University of Pittsburgh have developed a sensitive and specific early warning system for predicting NEC in premature infants before the disease occurs. The prototype predicts NEC accurately and early, using stool microbiome features combined with clinical and demographic information. The pilot study was presented virtually on July 23 at ACM CHIL 2020.
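The paper has the details of the prototype, but the general shape of the approach described – stool microbiome features combined with clinical and demographic information feeding an early-warning classifier – can be sketched generically. The column names, toy data, and random-forest model below are illustrative assumptions, not the researchers' actual pipeline.

```python
# Schematic: combine microbiome, clinical, and demographic features in one
# classifier that outputs a per-infant risk score.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical feature groups standing in for the real study's inputs.
df = pd.DataFrame({
    "taxon_abundance_1": [0.10, 0.40, 0.05, 0.30],   # microbiome (assumed)
    "taxon_abundance_2": [0.20, 0.10, 0.50, 0.25],
    "gestational_age_weeks": [27, 30, 25, 29],       # clinical (assumed)
    "birth_weight_g": [900, 1300, 750, 1200],
    "sex": ["F", "M", "M", "F"],                     # demographic (assumed)
    "nec_onset": [1, 0, 1, 0],                       # label (assumed)
})

preprocess = ColumnTransformer([
    ("numeric", StandardScaler(),
     ["taxon_abundance_1", "taxon_abundance_2",
      "gestational_age_weeks", "birth_weight_g"]),
    ("categorical", OneHotEncoder(handle_unknown="ignore"), ["sex"]),
])

pipeline = Pipeline([
    ("features", preprocess),
    ("classifier", RandomForestClassifier(n_estimators=100, random_state=0)),
])

X, y = df.drop(columns="nec_onset"), df["nec_onset"]
pipeline.fit(X, y)
print(pipeline.predict_proba(X)[:, 1])  # per-infant risk scores
```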