On Machine Learning and Programming Languages

#artificialintelligence

Any sufficiently complicated machine learning system contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of a programming language. As programming languages (PL) people, we have watched with great interest as machine learning (ML) has exploded – and with it, the complexity of ML models and the frameworks people are using to build them. State-of-the-art models are increasingly programs, with support for programming constructs like loops and recursion, and this brings out many interesting issues in the tools we use to create them – that is, programming languages. While machine learning does not yet have a dedicated language, several efforts are effectively creating hidden new languages underneath a Python API (like TensorFlow) while others are reusing Python as a modelling language (like PyTorch). We'd like to ask – are new ML-tailored languages required, and if so, why?
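The contrast between "hidden new languages" and "reusing Python" can be made concrete with a toy sketch (plain Python, not any framework's real API): graph-building frameworks make Python calls construct a program in an embedded language that is evaluated later, while eager frameworks let Python itself do the computing.

```python
# A toy "hidden language": Python operators build an expression graph that a
# separate interpreter evaluates later, as TensorFlow 1.x-style frameworks do
# at vastly larger scale. (Illustrative only; not any framework's real API.)

class Node:
    def __init__(self, op, args):
        self.op, self.args = op, args

    def __add__(self, other):
        return Node("add", [self, other])

    def __mul__(self, other):
        return Node("mul", [self, other])

class Const(Node):
    def __init__(self, value):
        super().__init__("const", [])
        self.value = value

def evaluate(node):
    """The interpreter for the embedded language: a separate execution pass."""
    if node.op == "const":
        return node.value
    lhs, rhs = (evaluate(a) for a in node.args)
    return lhs + rhs if node.op == "add" else lhs * rhs

# Graph style: y is a data structure describing a computation...
y = Const(2) * Const(3) + Const(4)
print(evaluate(y))  # ...that only yields 10 when explicitly evaluated.

# Eager, PyTorch-style reuse of Python: the host language is the model language.
print(2 * 3 + 4)  # 10, computed immediately by Python itself.
```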


Rise of the Machines

#artificialintelligence

Among the many emerging trends in the technology sector, the rise of artificial intelligence (AI) is likely to be one of the most significant over the coming years. AI refers to the ability of machines to perform tasks that would typically be associated with human cognition, such as responding to questions, recognizing faces, playing video games, or describing objects. Over recent years, AI capability has improved to such an extent that a range of commercial applications are now possible in areas like consumer electronics, industrial automation, and online retail. Technology companies of all sizes and in locations all around the world are developing AI-driven products aimed at reducing operating costs, improving decision-making, and enhancing consumer services across a range of client industries. And despite a decline in venture capital funding across industries overall in 2016, AI startups raised a record $5 billion globally last year – a 71% annualized growth rate and a near-tenfold rise over the 2012 level (see Exhibit 1).
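A quick back-of-the-envelope check (assuming the 71% rate compounds annually from 2012 through 2016) shows the article's two growth figures are mutually consistent:

```python
# Assumption: the 71% annualized growth rate compounds from 2012 to 2016.
years = 2016 - 2012
growth = 1.71 ** years
print(round(growth, 2))           # 8.55: the "near-tenfold" rise
print(round(5e9 / growth / 1e6))  # 585: implies roughly $585M raised in 2012
```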


How AI Careers Fit into the Data Landscape – Insight Data

#artificialintelligence

The goal of newly formed AI teams is to build intelligent systems, focused on quite specific tasks, that can be integrated into the scalable data transformations of Data Engineering work and the data products and business decisions of Data Science work. The boundaries between Artificial Intelligence, Data Science, and Data Engineering vary considerably among companies and teams. Artificial Intelligence, or AI, focuses on understanding core human abilities such as vision, speech, language, decision making, and other complex tasks, and on designing machines and software to emulate these processes. The models involved typically require very large datasets, so the efficient manipulation of large amounts of data, a fundamental aspect of Data Engineering work, is also crucial for state-of-the-art AI systems.


A.I. Business Applications (and How It May Impact You)

#artificialintelligence

Technology companies of all sizes and in locations all around the world are developing AI-driven products aimed at reducing operating costs, improving decision-making, and enhancing consumer services across a range of client industries. The sum of these drivers (new programming techniques, more data, and faster chips) has brought AI close to human-level performance in the key areas of image classification and speech recognition in recent years (see Exhibit 2). Chipmakers stand to benefit from increased demand for processing power, particularly makers of the graphics processing units used to train AI programs. And internet companies with AI at the core of their consumer services (such as digital assistants and new software features) stand to benefit directly from improvements in speech recognition and image classification.


4 ways to use AI for better cloud ops efficiency – TechBeacon

#artificialintelligence

The increasing adoption of cloud (soon, 80% of all IT budgets will be committed to cloud solutions) and the emergence of artificial intelligence (AI) and machine learning (ML) technologies are allowing companies to use intelligent software automation to make decisions on known problems, predict issues, and provide diagnostic information that reduces operational overhead for engineers. With AIOps, machine intelligence can detect cost spikes, provide deep visibility into who used what, and help companies deploy intelligent automation to address these issues. AI and ML can also automate other areas of operations, including deployment (with cluster management and auto-healing tooling), application performance management (surfacing not just what is happening but why), log management (real-time streaming of log events and automatic detection of anomalous events across the application stack), and incident management (suppressing noise from different alerting systems and providing diagnostics that get engineers to the root cause faster). Companies should embrace AIOps to transform cloud operations and ease infrastructure management.
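As a concrete illustration of one AIOps capability mentioned above, here is a minimal cost-spike detector. The data and threshold are hypothetical; a real system would pull billing metrics from a cloud provider's API.

```python
# A minimal sketch of automated cloud cost-spike detection: flag days whose
# spend deviates sharply from the trailing average (a simple z-score test).

from statistics import mean, stdev

def detect_spikes(daily_costs, window=7, threshold=3.0):
    """Return (day_index, cost) pairs where cost exceeds the trailing
    `window`-day mean by more than `threshold` standard deviations."""
    spikes = []
    for i in range(window, len(daily_costs)):
        history = daily_costs[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (daily_costs[i] - mu) / sigma > threshold:
            spikes.append((i, daily_costs[i]))
    return spikes

# Hypothetical daily spend in dollars: steady around $100, one runaway day.
costs = [101, 99, 102, 98, 100, 103, 97, 99, 350, 101]
print(detect_spikes(costs))  # [(8, 350)]: day 8 is flagged for review
```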


Reshaping computer-aided design

MIT News

Researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and Columbia University are trying to make the design process faster and easier: in a new paper, they've developed InstantCAD, a tool that lets designers interactively edit, improve, and optimize CAD models using a more streamlined and intuitive workflow. Traditional CAD systems are "parametric," which means that when engineers design models, they can change properties like shape and size ("parameters") based on different priorities. Matusik says InstantCAD could be particularly helpful for intricate designs for objects like cars, planes, and robots, especially in industries like car manufacturing that care a lot about squeezing every little bit of performance out of a product. "In a world where 3-D printing and industrial robotics are making manufacturing more accessible, we need systems that make the actual design process more accessible, too," Schulz says.
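The "parametric" idea is easy to sketch in code: the design is a function of named parameters, so editing one parameter regenerates the derived geometry. This toy example is hypothetical and far simpler than InstantCAD's actual interface.

```python
# A minimal sketch of a parametric model: derived properties are recomputed
# whenever a design parameter changes. (Hypothetical example, not InstantCAD.)

from dataclasses import dataclass
import math

@dataclass
class Flywheel:
    radius_cm: float             # design parameters engineers can tune
    thickness_cm: float
    density_g_cm3: float = 7.8   # assumed material: steel

    @property
    def mass_g(self):
        """Derived property: cylinder volume times density."""
        return math.pi * self.radius_cm ** 2 * self.thickness_cm * self.density_g_cm3

wheel = Flywheel(radius_cm=10, thickness_cm=2)
print(round(wheel.mass_g))  # ~4901 g
wheel.radius_cm = 12        # edit one parameter...
print(round(wheel.mass_g))  # ...and the model re-evaluates: ~7057 g
```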


Experts Predict When Artificial Intelligence Will Exceed Humans

#artificialintelligence

Just one day earlier, Katja Grace at the Future of Humanity Institute at the University of Oxford and four other researchers published "When Will AI Exceed Human Performance? Evidence from AI Experts." The experts surveyed predict that over the next decade, artificial intelligence (AI) will outperform humans at translating languages (by 2024), writing high school essays (by 2026), and driving trucks (by 2027). Moving a little further into the future, they predict AI will replace humans working in retail (by 2031), writing New York Times bestsellers (by 2049), and performing surgery (by 2053).


Moore's Law may be out of steam, but the power of artificial intelligence is accelerating

#artificialintelligence

A paper from Google's researchers says they simultaneously used as many as 800 of the powerful and expensive graphics processors that have been crucial to the recent uptick in the power of machine learning (see "10 Breakthrough Technologies 2013: Deep Learning"). Feeding data into deep learning software to train it for a particular task is much more resource intensive than running the system afterwards, though even that still takes significant oomph. Intel has slowed the pace at which it introduces generations of new chips with smaller, denser transistors (see "Moore's Law Is Dead. Now What?"). That appetite for computing power also motivates the startups, and giants such as Google, that are creating new chips customized to power machine learning (see "Google Reveals a Powerful New AI Chip and Supercomputer").
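To see why training dwarfs inference, and how work is spread across many processors, here is a toy data-parallel training loop in plain Python: a hypothetical linear model, with each "device" simulated as a shard of the batch.

```python
# A minimal sketch of data-parallel training: each simulated device computes a
# gradient on its shard, gradients are averaged, and the model is updated.
# Real systems do this across hundreds of GPUs; the principle is the same.

def predict(w, x):
    """Inference: a single cheap forward pass."""
    return w * x

def local_grad(w, shard):
    """Training also needs gradients: d/dw of squared error on one shard."""
    return sum(2 * (predict(w, x) - y) * x for x, y in shard) / len(shard)

def train_step(w, shards, lr=0.01):
    """Average the per-device gradients, then take one update step."""
    g = sum(local_grad(w, s) for s in shards) / len(shards)
    return w - lr * g

# Hypothetical data for y = 3x, split across 4 simulated devices.
data = [(x, 3 * x) for x in range(1, 9)]
shards = [data[i::4] for i in range(4)]
w = 0.0
for _ in range(200):          # training: many passes over the data...
    w = train_step(w, shards)
print(round(w, 3))            # ~3.0; afterwards, inference is just predict(w, x)
```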

