Modern Computing: A Short History, 1945-2022

#artificialintelligence

Inspired by A New History of Modern Computing by Thomas Haigh and Paul E. Ceruzzi. But the selection of key events in the journey from ENIAC to Tesla, from Data Processing to Big Data, is mine. The Apple I was the first computer made by Apple Computer Inc., which became one of the fastest-growing companies in history, launching a number of innovative and influential computer hardware and software products. Most home computer users in the 1970s were hobbyists who designed and assembled their own machines. The Apple I, devised in a bedroom by Steve Wozniak, Steven Jobs and Ron Wayne, was a basic circuit board to which enthusiasts would add display units and keyboards. April 1945: John von Neumann's "First Draft of a Report on the EDVAC," often called the founding document of modern computing, defines "the stored program concept." July 1945: Vannevar Bush publishes "As We May Think," in which he envisions the "Memex," a memory-extension device serving as a large personal repository of information that could be instantly retrieved through associative links.


Deep Learning Poised to 'Blow Up' Famed Fluid Equations

#artificialintelligence

For more than 250 years, mathematicians have been trying to "blow up" some of the most important equations in physics: those that describe how fluids flow. If they succeed, then they will have discovered a scenario in which those equations break down -- a vortex that spins infinitely fast, perhaps, or a current that abruptly stops and starts, or a particle that whips past its neighbors infinitely quickly. Beyond that point of blowup -- the "singularity" -- the equations will no longer have solutions. They will fail to describe even an idealized version of the world we live in, and mathematicians will have reason to wonder just how universally dependable they are as models of fluid behavior. But singularities can be as slippery as the fluids they're meant to describe.
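
For context (my summary, not from the article): the famed equations here are the incompressible Euler and Navier–Stokes equations, and "blow-up" means a smooth solution that develops an infinity in finite time. For the Euler equations, the Beale–Kato–Majda criterion ties that breakdown to the vorticity:

```latex
% Incompressible Euler equations for velocity u and pressure p:
\[
\partial_t u + (u \cdot \nabla) u = -\nabla p, \qquad \nabla \cdot u = 0 .
\]
% A smooth solution blows up at a finite time T^* precisely when the vorticity
% \omega = \nabla \times u satisfies the Beale--Kato--Majda criterion:
\[
\int_0^{T^*} \lVert \omega(\cdot, t) \rVert_{L^\infty} \, dt = \infty .
\]
```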


8 Biggest Artificial Intelligence (AI) Acquisitions of 2021

#artificialintelligence

Artificial intelligence (AI) helps automate repetitive tasks that involve large datasets. This makes it beneficial in many diverse fields and business environments, and it has led demand for such solutions to rise. It's also driving AI acquisitions as businesses attempt to streamline their workflows. In this article, you'll learn about the 8 biggest AI acquisitions of 2021.


Is the future of AI supervised?

#artificialintelligence

"The future is already here – it's just not evenly distributed," as William Gibson put it. We are gravitating towards technological singularity. Futurists like Louis Rosenberg, Ray Kurzweil, and Patrick Winston have predicted timeframes for 'superintelligence' (between 2030 and 2045). But are these timelines realistic? And which approaches (supervised, semi-supervised, or unsupervised learning) will get us there? Andrew Ng, founder and CEO of Landing AI, swears by smart-sized, "data-centric" AI, whereas Meta's VP and Chief AI Scientist, Yann LeCun, thinks "the revolution will not be supervised".
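
To make that distinction concrete, here is a toy sketch (entirely illustrative, not anyone's actual training code) contrasting a supervised objective, which needs human-provided labels, with a self-supervised one, which manufactures its own target from the data by masking out a feature:

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(256, 4))
X = Z @ rng.normal(size=(4, 8))                    # correlated features, so masking is learnable
y = (X[:, 0] > 0).astype(float)                    # human-provided label (supervised case only)

W_sup = np.zeros(8)                                # logistic-regression weights
W_ssl = np.zeros((7, 1))                           # weights that reconstruct the masked feature

for _ in range(500):
    # Supervised: predict the label y from X (requires the labels).
    p = 1.0 / (1.0 + np.exp(-(X @ W_sup)))
    W_sup -= 0.1 * X.T @ (p - y) / len(X)
    # Self-supervised: predict the masked-out column 0 from the remaining columns.
    pred = X[:, 1:] @ W_ssl
    W_ssl -= 0.01 * X[:, 1:].T @ (pred - X[:, :1]) / len(X)

print("supervised weights:", np.round(W_sup, 2))
print("self-supervised reconstruction weights:", np.round(W_ssl.ravel(), 2))
```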


How AI-powered XDR can secure the hybrid workforce - S.G.E

#artificialintelligence

A year ago, NOV Inc. was in the middle of evaluating a new security product to help with securing its globally distributed workforce, spread across more than 60 countries. The oilfield equipment maker was considering deploying an extended detection and response (XDR) solution from SentinelOne -- and as part of the evaluation, NOV deployed the XDR platform across a company it had recently acquired. "Immediately" after deployment, SentinelOne's Singularity XDR detected and halted a cyberattack in progress against the acquired company, said NOV chief information security officer John McLeod -- and then remediated the attack, as well. "This was all done during the pandemic lockdown, in a country on the other side of the globe, where we didn't speak the same language," McLeod said in an email. Perhaps unsurprisingly, NOV ended up becoming a customer.


La veille de la cybersécurité

#artificialintelligence

Microsoft's Azure and Research teams are working together to build a new AI infrastructure service, codenamed "Singularity." A group of those working on the project have published a paper entitled "Singularity: Planet-Scale, Preemptible and Elastic Scheduling of AI Workloads," which provides technical details about the Singularity effort. The Singularity service is about providing data scientists and AI practitioners with a way to build, scale, experiment and iterate on their models on a Microsoft-provided distributed infrastructure service built specifically for AI. Authors listed on the newly published paper include Azure Chief Technical Officer Mark Russinovich; Partner Architect Rimma Nehme, who worked on Azure Cosmos DB until moving to Azure to work on AI and deep learning in 2019; and Technical Fellow Dharma Shukla. Microsoft officials previously have discussed plans to make FPGAs, or field-programmable gate arrays, available to customers as a service.


Microsoft's 'Singularity' to Enable Global Accelerator Network for AI Training

#artificialintelligence

In science fiction and future studies, the word "singularity" is invoked in reference to a rapidly snowballing artificial intelligence that, repeatedly iterating on itself, eclipses all human knowledge and ability. It is this word that Microsoft--perhaps ambitiously--has invoked for its new AI project, a "globally distributed scheduling service for highly efficient and reliable execution of deep learning training and inference workloads." Microsoft's Singularity is a response to the computational costs of training deep learning workloads--costs that have quickly spiraled as those workloads have grown in size, complexity and number. It is also an attempt to maximize the use of idle time, which has increasingly become a focus of discussions of how to minimize the costs and environmental footprints of high-performance computing systems and AI model training on such systems. "Singularity is built with one key goal," explains the preprint paper, which was written by a team of more than two dozen Microsoft researchers and published on arXiv, "driving down the cost of AI by maximizing the aggregate useful throughput on a given fixed pool of capacity of accelerators on a planet scale, while providing stringent [service-level agreements] for multiple pricing tiers."
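
As a rough illustration of what a goal like that implies (a toy of my own, not Microsoft's scheduler or API): keep a fixed pool of accelerators as full as possible, and let jobs from higher-paying tiers preempt cheaper ones when capacity runs out.

```python
from dataclasses import dataclass

@dataclass
class Job:
    job_id: str
    tier: int        # 0 = highest-paying tier; larger numbers are cheaper and preempted first
    gpus: int

class ToyScheduler:
    """Keeps a fixed GPU pool busy; preempts cheaper tiers to admit pricier jobs."""

    def __init__(self, total_gpus: int):
        self.free = total_gpus
        self.running: list[Job] = []

    def submit(self, job: Job) -> None:
        # Preempt the cheapest running jobs until the new job fits or no victims remain.
        while self.free < job.gpus and self.running:
            victim = max(self.running, key=lambda j: j.tier)
            if victim.tier <= job.tier:
                break                      # never preempt an equal or higher tier
            self.running.remove(victim)
            self.free += victim.gpus
            print(f"preempt {victim.job_id} (tier {victim.tier}), checkpoint and requeue")
        if self.free >= job.gpus:
            self.running.append(job)
            self.free -= job.gpus
            print(f"run {job.job_id} on {job.gpus} GPUs")
        else:
            print(f"queue {job.job_id} (insufficient capacity)")

sched = ToyScheduler(total_gpus=8)
sched.submit(Job("spot-a", tier=2, gpus=8))        # cheap job takes the whole pool
sched.submit(Job("premium-b", tier=0, gpus=4))     # premium job preempts it
```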


Singularity: Planet-Scale, Preemptible, Elastic Scheduling of AI Workloads

arXiv.org Artificial Intelligence

Lowering costs by driving high utilization across deep learning workloads is a crucial lever for cloud providers. We present Singularity, Microsoft's globally distributed scheduling service for highly efficient and reliable execution of deep learning training and inference workloads. At the heart of Singularity is a novel, workload-aware scheduler that can transparently preempt and elastically scale deep learning workloads to drive high utilization without impacting their correctness or performance, across a global fleet of AI accelerators (e.g., GPUs, FPGAs). All jobs in Singularity are preemptible, migratable, and dynamically resizable (elastic) by default: a live job can be dynamically and transparently (a) preempted and migrated to a different set of nodes, cluster, data center or a region and resumed exactly from the point where the execution was preempted, and (b) resized (i.e., elastically scaled up or down) on a varying set of accelerators of a given type. Our mechanisms are transparent in that they do not require the user to make any changes to their code or require using any custom libraries that may limit flexibility. Additionally, our approach significantly improves the reliability of deep learning workloads. We show that the resulting efficiency and reliability gains with Singularity are achieved with negligible impact on the steady-state performance. Finally, our design approach is agnostic of DNN architectures and handles a variety of parallelism strategies (e.g., data/pipeline/model parallelism).
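
The paper's point is that preemption, migration, and resizing happen transparently, with no changes to user code. The user-level analogue is easier to picture: checkpoint enough state that training can be killed, moved, and resumed on a different number of workers. A minimal sketch of that idea (my toy, not Singularity's mechanism):

```python
import numpy as np

def save_checkpoint(path, step, params):
    np.savez(path, step=step, params=params)

def load_checkpoint(path):
    ckpt = np.load(path)
    return int(ckpt["step"]), ckpt["params"]

def train(data, params, start_step, num_workers, steps, ckpt_path):
    shards = np.array_split(data, num_workers)                   # re-shard for the current world size
    for step in range(start_step, steps):
        grads = [params - np.mean(shard) for shard in shards]    # toy per-worker gradient (squared-error loss)
        params = params - 0.1 * np.mean(grads)                   # "all-reduce", then update
        if step % 10 == 0:
            save_checkpoint(ckpt_path, step + 1, params)         # state that survives preemption
    return params

data = np.random.default_rng(0).normal(loc=3.0, size=1024)
params = train(data, np.zeros(1), start_step=0, num_workers=4, steps=25, ckpt_path="ckpt.npz")
# Preemption: the job is killed, migrated, and later resumed elsewhere on 8 workers.
step, params = load_checkpoint("ckpt.npz")
params = train(data, params, start_step=step, num_workers=8, steps=100, ckpt_path="ckpt.npz")
print("final params:", params)
```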


The 5 Technologies That Will Change The Future Of The Human Race

#artificialintelligence

In my book, Tech Trends in Practice, I talk about a lot of technology trends that are already moving out of the R&D departments and into everyday life, but the following five I think will have the most profound impacts on our society and the human race as a whole. Artificial intelligence, or AI, and machine learning refer to the ability of machines to learn and act intelligently, meaning they can make decisions, carry out tasks, and even predict future outcomes based on what they learn from data. AI and machine learning already play a bigger role in everyday life than you might imagine. Alexa, Siri, Amazon's product recommendations, Netflix's and Spotify's personalized recommendations, every Google search you make, security checks for fraudulent credit card purchases, dating apps, fitness trackers... All are driven by AI.


There is a Singularity in the Loss Landscape

arXiv.org Artificial Intelligence

Despite the widespread adoption of neural networks, their training dynamics remain poorly understood. We show experimentally that as the size of the dataset increases, a point forms where the magnitude of the gradient of the loss becomes unbounded. Gradient descent rapidly brings the network close to this singularity in parameter space, and further training takes place near it. This singularity explains a variety of phenomena recently observed in the Hessian of neural network loss functions, such as training on the edge of stability and the concentration of the gradient in a top subspace. Once the network approaches the singularity, the top subspace contributes little to learning, even though it constitutes the majority of the gradient.
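
The "concentration of the gradient in a top subspace" claim is easy to probe on a toy model where the Hessian is known in closed form. A minimal sketch (my illustration, not the paper's experiments) using linear regression, where the Hessian of the mean squared error is simply X^T X / n:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 5000, 100, 10                                     # samples, parameters, size of the "top subspace"
X = rng.normal(size=(n, d)) * (3.0 / np.arange(1, d + 1))   # power-law feature scales -> long-tailed spectrum
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)                                 # parameters early in training
grad = X.T @ (X @ w - y) / n                    # gradient of 0.5 * mean squared error
hess = X.T @ X / n                              # Hessian of the same loss

eigvals, eigvecs = np.linalg.eigh(hess)         # eigenvalues in ascending order
top = eigvecs[:, -k:]                           # top-k curvature directions
frac = np.linalg.norm(top.T @ grad) ** 2 / np.linalg.norm(grad) ** 2
print(f"fraction of gradient energy in the top-{k} Hessian subspace: {frac:.3f}")
```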