MacBook Pro 13in 2020 review: Apple has 'created something extraordinary'

The Independent - Tech

The latest MacBook Pro, just released, means the entire Apple laptop range has now been refreshed with newer processors and, most importantly, the new Magic Keyboard. Apple's complete range of laptops offers striking design, sumptuous trackpads, excellent performance and gorgeous screens. The MacBook Air was the last to gain a Retina display, in late 2018. But there was one key ingredient that wasn't working quite as well as it should have been for many users: the keyboard. A few years back, Apple switched its keyboard mechanism from scissor-switch to butterfly.

New AI Chips Set to Reshape Data Centers - EE Times India


AI chip startups are hot on the heels of GPU leader Nvidia. At the same time, there is also significant competition in data center inference... New computing models such as machine learning and quantum computing are becoming more important for delivering cloud services. The most immediate computing change has been the rapid adoption of ML/AI for consumer and business applications. This new model requires processing vast amounts of data to develop usable information and, eventually, build knowledge models. These models are rapidly growing in complexity – doubling every 3.5 months.
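A doubling period of 3.5 months compounds to roughly an order of magnitude per year. A quick sketch of the arithmetic (the helper function is illustrative, not from the article):

```python
def growth_factor(months: float, doubling_period: float = 3.5) -> float:
    """Multiplicative growth after `months`, given a fixed doubling period."""
    return 2 ** (months / doubling_period)

# Doubling every 3.5 months compounds to roughly 10.8x per year
# and roughly 116x over two years.
one_year = growth_factor(12)    # ~10.8
two_years = growth_factor(24)   # ~115.9
```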

Why the AI revolution now? Because of 6 key factors.


About: Data-Driven Science (DDS) provides training for people building a career in Artificial Intelligence (AI). In recent years, AI has taken off and become a topic that frequently makes the news. But why is that? AI research began in the mid-twentieth century, when mathematician Alan Turing asked the question "Can machines think?" in a famous 1950 paper. However, it was not until the 21st century that Artificial Intelligence shaped real-world applications that impact billions of people and most industries across the globe.

AI Computing for Automotive: The Battle for Autonomy - EE Times Asia


The 2025 market for automotive AI, including ADAS and robotic vehicles, is estimated at $2.75 billion, of which $2.5 billion will be "ADAS only"... Artificial Intelligence (AI) is gradually invading our lives through everyday objects like smartphones, smart speakers, and surveillance cameras. The hype around AI has led some players to treat it as a secondary objective, more or less difficult to achieve, rather than as a central tool for reaching the real objective: autonomy. Who are the winners and losers in the race for autonomy? "AI is gradually invading our lives and this will be particularly true in the automotive world," asserts Yohann Tschudi, Technology & Market Analyst, Computing & Software at Yole Développement (Yole). "AI could be the central tool to achieve AD [autonomous driving], in the meantime some players are afraid of overinflated hype and do not put AI at the center of their AD strategy."


Communications of the ACM

Moritz Lipp is a Ph.D. candidate at Graz University of Technology, Graz, Austria. Michael Schwarz is a postdoctoral researcher at Graz University of Technology, Graz, Austria. Daniel Gruss is an assistant professor at Graz University of Technology, Graz, Austria. Thomas Prescher is a chief architect at Cyberus Technology GmbH, Dresden, Germany. Werner Haas is the chief technology officer at Cyberus Technology GmbH, Dresden, Germany.

Detecting/Preventing Infections, and Moving Instruction Online

Communications of the ACM

As of March 17, 2020, more than 188,297 people had been infected with COVID-19. How can technology help curtail the spread of infectious diseases that have the potential to create panic and sicken thousands of people? The Internet of Things (IoT), a network of interconnected systems, combined with advances in data analytics, artificial intelligence, and connectivity, can help by providing an early warning system to curb the spread of infectious diseases. China's efforts to control the coronavirus meant many residents stayed at home and factories shut down. That had an unintended effect: less air pollution.

Oracle BrandVoice: GPU Chips Are Poised To Rewrite (Again) What's Possible In Cloud Computing


At Altair, chief technology officer Sam Mahalingam is heads-down testing the company's newest software for designing cars, buildings, windmills, and other complex systems. The engineering and design software company, whose customers include BMW, Daimler, Airbus, and General Electric, is developing software that combines computer models of wind and fluid flows with machine design in the same process, so an engineer could design a turbine blade while simultaneously seeing its draft's effect on neighboring mills in a wind farm. What Altair needs for a job as hard as this, though, is a particular kind of computing power, provided by graphics processing units (GPUs) made by Silicon Valley's Nvidia and others. "When solving complex design challenges like the interaction between wind structures in windmills, GPUs help expedite computing so faster business decisions can be made," Mahalingam says.

[Image: An aerodynamics simulation performed with Altair ultraFluidX on the Altair CX-1 concept design, modeled in Altair Inspire Studio.]

Best AI Software in 2020 to Make Your Work Smarter


Artificial Intelligence, through its omnipresence, is capable of transforming various industries. AI can help develop smart systems not only for professionals but for personal use as well. Using AI, one can collect data from different sources more easily and turn it into valuable insight. The personalization the technology has introduced across various sectors has boosted revenue for brands and benefits for customers. And all of this can be put into action using AI software.

What Is Data Echoing And How Does It Make Training Faster


What is the solution for faster training of deep neural networks? Faster hardware is the obvious answer, and we already have GPUs and TPUs. But what if that speed is still not enough? Should we develop processors that are even faster? No, says a team of researchers from Google AI.
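Data echoing, the Google AI technique alluded to here, reuses ("echoes") outputs of the input pipeline so the accelerator isn't left idle waiting for fresh data. A minimal sketch of batch-level echoing (the `echo_batches` generator and its `echo_factor` parameter are illustrative, not the paper's actual implementation):

```python
def echo_batches(batch_source, echo_factor=2):
    """Yield each upstream batch `echo_factor` times, so a fast training
    step isn't starved by a slow data-loading/augmentation pipeline."""
    for batch in batch_source:
        for _ in range(echo_factor):
            yield batch

# If producing a batch takes ~2x as long as a training step,
# echo_factor=2 keeps the accelerator busy with repeated data.
batches = echo_batches(iter(["b0", "b1"]), echo_factor=2)
list(batches)  # ["b0", "b0", "b1", "b1"]
```

The trade-off is that repeated batches carry less fresh information per step, which the researchers weigh against the wall-clock time saved.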

Global Big Data Conference


Xilinx has developed a new processor for in-space and satellite applications that records a number of firsts: it's the first 20nm process rated for use in space, offering power and efficiency benefits, and it's the first to offer specific support for high-performance machine learning through neural-network-based inference acceleration. The processor is a field-programmable gate array (FPGA), meaning that customers can tweak the hardware to suit their specific needs, since the chip is essentially user-configurable hardware. On the machine learning side, Xilinx says the new processor will offer up to 5.7 tera operations per second of "peak INT8 performance optimized for deep learning," an improvement of as much as 25x over the previous generation. Xilinx's new chip has a lot of potential for the satellite market for a couple of reasons. First, it's a huge leap in process size, since the company's existing radiation-tolerant silicon was offered in a 65nm spec only. That means big improvements in size, weight, and power efficiency, all of which translate to very important savings for in-space applications, since satellites are designed to be as lightweight and compact as possible to help defray launch costs and in-space propellant needs, both of which represent major expenses in their operation.
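The "peak INT8 performance" figure refers to 8-bit integer inference, where trained floating-point weights and activations are quantized to 8-bit integers so the hardware can pack far more multiply-accumulate units into the same silicon area. A minimal sketch of symmetric per-tensor quantization (illustrative only, not Xilinx's actual toolflow):

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric per-tensor quantization: map floats onto [-127, 127]."""
    scale = float(np.max(np.abs(x))) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original floats."""
    return q.astype(np.float32) * scale

weights = np.array([0.5, -1.0, 0.25], dtype=np.float32)
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)  # close to `weights`, within scale/2 per entry
```

Because each INT8 multiply needs a fraction of the logic of a 32-bit floating-point multiply, an FPGA configured for INT8 inference can dedicate that saved area to parallelism, which is where headline TOPS numbers like 5.7 come from.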