Interesting AI/ML Articles On Medium This Week (Nov 22)

#artificialintelligence

Tommy Shrove looks at Apple's newly introduced M1 processor from a machine learning perspective. Machine learning workflows are closely tied to the available computational hardware, and the article details the components within the M1 processor and how they affect machine learning workloads. Tommy covers the performance and efficiency of these components and translates them into practical benefits for typical machine learning tasks. He also discusses how the M1 processor is expected to perform with the suite of conventional tools and applications commonly used by ML practitioners, such as VS Code and Jupyter.


Applications of Differential Privacy to European Privacy Law (GDPR) and Machine Learning

#artificialintelligence

Differential privacy is a data anonymization technique that's used by major technology companies such as Apple and Google. The goal of differential privacy is simple: allow data analysts to build accurate models without sacrificing the privacy of the individual data points. But what does "sacrificing the privacy of the data points" mean? Well, let's think about an example. Suppose I have a dataset that contains information (age, gender, treatment, marriage status, other medical conditions, etc.) about every person who was treated for breast cancer at Hospital X.
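
To make the idea of "not sacrificing individual privacy" concrete, here is a minimal sketch (my own illustration, not taken from the article) of the Laplace mechanism, the standard way to answer a counting query with differential privacy. The dataset, predicate, and epsilon value are hypothetical.

```python
import numpy as np

def private_count(data, predicate, epsilon=1.0):
    """Answer 'how many records satisfy predicate?' with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person changes the
    count by at most 1), so Laplace noise with scale 1/epsilon satisfies epsilon-DP.
    """
    true_count = sum(1 for record in data if predicate(record))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical records standing in for the Hospital X example in the excerpt.
patients = [{"age": 54, "stage": 2}, {"age": 61, "stage": 1}, {"age": 47, "stage": 3}]
print(private_count(patients, lambda r: r["age"] > 50, epsilon=0.5))
```

Smaller epsilon means more noise and stronger privacy; the analyst still gets an approximately correct count without learning whether any single patient is in the data.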


FPGAs could replace GPUs in many deep learning applications

#artificialintelligence

The renewed interest in artificial intelligence over the past decade has been a boon for the graphics card industry. Companies like Nvidia and AMD have seen a huge boost to their stock prices as their GPUs have proven very efficient for training and running deep learning models. Nvidia, in fact, has even pivoted from a pure GPU and gaming company to a provider of cloud GPU services and a competent AI research lab. But GPUs also have inherent flaws that pose challenges in putting them to use in AI applications, according to Ludovic Larzul, CEO and co-founder of Mipsology, a company that specializes in machine learning software. The solution, Larzul says, is field-programmable gate arrays (FPGAs), an area in which his company specializes. An FPGA is a type of processor that can be customized after manufacturing, which makes it more efficient than generic processors.


AMD Radeon RX 6800 and RX 6800 XT review: A glorious return to high end gaming

PCWorld

With the debut of the Radeon RX 6800 and RX 6800 XT, AMD is clearly on a roll. Mere weeks ago, the company's Ryzen 5000 processors seized the unquestionable performance lead from Intel--yes, even in games--for the first time in over a decade. On Wednesday, it's the graphics division's turn to shine with these two Radeon RX 6000-series "Big Navi" graphics cards powered by AMD's new RDNA 2 architecture. Rival Nvidia has largely been competing against itself in the high-end GPU space for several years now. AMD's Vega offerings showed up disappointingly late and disappointingly underpowered in 2017, followed by (awesome) first-gen RDNA cards that sadly topped out with the midrange Radeon RX 5700 XT in 2019.


In 2020, neural chips helped smartphones finally eclipse pro cameras

#artificialintelligence

When photographer Chase Jarvis coined the famous saying "The best camera is the one you have with you," he was revealing an unspoken truth: Even professionals carried point-and-shoot cameras despite owning DSLRs and dedicated video cameras. His message was that great photographers create compelling images with whatever they have on hand, but the sentiment wound up setting the stage for a massive disruption of traditional imaging -- one that saw famed portrait photographer Annie Leibovitz embrace Google's Pixel still cameras and filmmaker Steven Soderbergh start shooting movies with iPhones. The year 2020 will be remembered for many negative reasons, but it should also be marked as the time when technology caught up with and redefined Jarvis' saying. Thanks in large part to improved sensors and the neural cores in mobile processors made by Qualcomm and Apple, this was the year when standalone photo and video cameras were surpassed by smartphones in important ways, such that "the one you have with you" will now actually be either your best or most capable camera. Unlike single-purpose cameras, the latest smartphones now create 3D scans of objects and rooms, AI-optimized images, and cinema-quality Dolby Vision HDR videos that even professional cameras can't replicate.


Speeding Up AI With Vector Instructions

#artificialintelligence

A search is underway across the industry to find the best way to speed up machine learning applications, and optimizing hardware for vector instructions is gaining traction as a key element in that effort. Vector instructions are a class of instructions that enable parallel processing of data sets. An entire array of integers or floating-point numbers is processed in a single operation, eliminating the loop control mechanism typically required when processing arrays element by element. That, in turn, improves both performance and power efficiency. The approach works particularly well with the sparse matrix operations used on those data sets, which can achieve a substantial performance boost when vectorized, said Shubhodeep Roy Choudhury, CEO at Valtrix Systems. This is harder than it might appear, however.
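
As a rough illustration of the concept (my own sketch, not drawn from the article), the Python/NumPy example below contrasts an explicit element-by-element loop with a single vectorized array expression. The vectorized form removes per-element loop control and lets the underlying library use hardware vector instructions where they are available; the saxpy operation and array size are assumptions for demonstration.

```python
import numpy as np

# Scalar version: explicit loop control for every element.
def saxpy_loop(a, x, y):
    out = [0.0] * len(x)
    for i in range(len(x)):          # loop overhead paid on every element
        out[i] = a * x[i] + y[i]
    return out

# Vectorized version: one array expression, no per-element loop in Python.
def saxpy_vector(a, x, y):
    return a * x + y                 # the library applies the operation across the whole array

x = np.random.rand(100_000)
y = np.random.rand(100_000)

assert np.allclose(saxpy_loop(2.0, x, y), saxpy_vector(2.0, x, y))
```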


Some Facts About Deep Learning and its Current Advancements

#artificialintelligence

Advances in technology now allow us to program machines to perform tasks that normally require human intelligence, such as speech recognition, decision making, sound recognition, visual perception, and language translation. Deep learning is a subset of machine learning that uses deep artificial neural networks, in which the system learns to perform tasks by propagating signals through the network architecture. Deep neural networks are able to process enormous datasets to make highly accurate predictions. Deep learning models are also very versatile: different kinds of neural networks can be combined to suit the needs of a given problem.
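
As a hedged sketch of the "combining different kinds of networks" point (my own example, not from the article), the PyTorch snippet below composes a convolutional feature extractor with a fully connected classification head; the layer sizes, class count, and input shape are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SmallImageClassifier(nn.Module):
    """Convolutional feature extractor combined with a dense classification head."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                     # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                     # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SmallImageClassifier()
dummy_batch = torch.randn(4, 3, 32, 32)          # 4 RGB images, 32x32 pixels
print(model(dummy_batch).shape)                  # torch.Size([4, 10])
```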


Ambarella launches computer vision chips for edge AI

#artificialintelligence

Chip designer Ambarella has announced a new computer vision chip for processing artificial intelligence at the edge of computer networks, such as in smart cars and security cameras. The new CV28M camera system on chip (SoC) is the latest in the company's CVflow family. It combines advanced image processing, high-resolution video encoding, and computer vision processing in a single, low-power chip. Ambarella packed a lot of AI processing power into the chip to anticipate the way computer networks will evolve as everything gets connected to the internet. Since networks could become inundated with data traffic, self-driving cars, for example, will have to do their processing at the edge of the network, or in the car itself, rather than interacting heavily with data center processors.


What Do Experts Say About the Future of Machine Learning (and Python)?

#artificialintelligence

Is Python the best language for machine learning? Do you foresee any major changes to the popular ML software stack? In ML, 90% of the ideas you try fail, so iteration speed is critical. Python allows you to iterate faster (in ML) than any other language. I see many changes to the ML software stack, particularly on the infrastructure side, and possibly on the framework side as well (keep an eye on Jax), but I don't see Python being dethroned anytime soon.


What You Need to Know About Machine Learning Pipelines - InformationWeek

#artificialintelligence

Executives often treat the black-box nature of machine learning models as a mysterious act, a mystic art that seems more apropos to scenes from the Marvel movie Doctor Strange than to AI. As a result, they task IT managers as if they were the movie's title character, someone able to conjure processes so that the model performs well. The reality is that understanding machine learning pipeline basics can demystify the steps involved so that IT teams can better manage a technology vital to today's competitive business climate. Pipelines are essentially the development steps involved in building and automating a desired output from a program. Developers have long used the word "pipeline" to describe how software moves from source code into a production environment.
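
For a concrete, simplified illustration of the idea (not the specific pipelines discussed in the article), the sketch below chains a preprocessing step and a model into a single scikit-learn Pipeline object; the dataset and estimator choices are assumptions made for demonstration.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# A pipeline chains preprocessing and modeling so they can be fit,
# applied, and deployed as one object.
pipeline = Pipeline([
    ("scale", StandardScaler()),                  # preprocessing step
    ("model", LogisticRegression(max_iter=1000)), # modeling step
])

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline.fit(X_train, y_train)        # runs every step in order on the training data
print(pipeline.score(X_test, y_test)) # evaluates the whole pipeline end to end
```

Treating the whole sequence as one object is what makes pipelines easy to automate, version, and hand off between data science and IT teams.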