Next Raspberry Pi CPU Will Have Machine Learning Built In

#artificialintelligence

At the recent tinyML Summit 2021, Raspberry Pi co-founder Eben Upton teased the future of 'Pi Silicon', and it looks like machine learning could see a massive improvement thanks to Raspberry Pi's new in-house chip development team. It is safe to say that the Raspberry Pi Pico and its RP2040 SoC have been popular. The Pico has only been on the market for a few weeks, but it has already sold 250,000 units, with 750,000 on back order. There is demand for more boards powered by the RP2040, and partners such as Adafruit, Pimoroni and Sparkfun are releasing their own hardware, much of it with features not found on the Pico. Raspberry Pi's in-house application-specific integrated circuit (ASIC) team is working on the next iteration, and it seems to be focused on lightweight accelerators for ultra-low-power machine learning applications.


Use DirectML to add machine learning to C code

#artificialintelligence

The modern GPU is more than a graphics device. Technologies such as the open-standard OpenCL and Nvidia's CUDA turn the many small processors in a GPU into a parallel computing fabric, allowing desktop PCs to complete tasks that were once the sole purview of supercomputers. Those same GPUs are also capable of supporting many modern machine learning tasks, using GPU compute to build neural networks and to support model-building, data-parallel analytical, and processing tasks. Microsoft has been investing in simplifying GPU programming for a long time now, starting with its DirectX GPU tools, initially via the Direct3D graphics tools and later extending to GPU compute with DirectCompute. Recent developments have included tools to map OpenGL calls to Direct3D, related to work on building a graphical layer onto the WSL 2 Linux virtual machine system bundled with Windows 10. Although they make it easier to work with hardware, these remain low-level programming tools, using C to access hardware features.
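DirectML itself is consumed from C and C++, but a quick way to exercise the same DirectX 12 compute path from a higher-level language is ONNX Runtime's DirectML execution provider. The sketch below is an adjacent illustration rather than the article's approach, and it assumes the onnxruntime-directml package and a placeholder model.onnx file:

```python
# Minimal sketch: run an ONNX model through DirectML via ONNX Runtime.
# Assumes `pip install onnxruntime-directml` and a local "model.onnx"
# whose single input is a float32 tensor.
import numpy as np
import onnxruntime as ort

# Request the DirectML execution provider, falling back to CPU.
session = ort.InferenceSession(
    "model.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

input_meta = session.get_inputs()[0]
# Replace any dynamic dimensions with 1 for this demo input.
shape = [d if isinstance(d, int) else 1 for d in input_meta.shape]
dummy_input = np.random.rand(*shape).astype(np.float32)

outputs = session.run(None, {input_meta.name: dummy_input})
print("Output shapes:", [o.shape for o in outputs])
```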


The Decline of Computers as a General Purpose Technology

Communications of the ACM

Perhaps in no other technology have there been so many decades of large year-over-year improvements as in computing. It is estimated that a third of all productivity increases in the U.S. since 1974 have come from information technology [4], making it one of the largest contributors to national prosperity. The rise of computers is due to technical successes, but also to the economic forces that financed them. Bresnahan and Trajtenberg [3] coined the term general purpose technology (GPT) for products, like computers, that have broad technical applicability and where product improvement and market growth could fuel each other for many decades. But they also predicted that GPTs could run into challenges at the end of their life cycle: as progress slows, other technologies can displace the GPT in particular niches and undermine this economically reinforcing cycle. We are observing such a transition today as improvements in central processing units (CPUs) slow, and so applications move to specialized processors, for example, graphics processing units (GPUs), which can do fewer things than traditional universal processors but perform those functions better. Many high-profile applications are already following this trend, including deep learning (a form of machine learning) and Bitcoin mining. With this background, we can now be more precise about our thesis: "The Decline of Computers as a General Purpose Technology." We do not mean that computers, taken together, will lose technical abilities and thus 'forget' how to do some calculations.


Adafruit Making Machine Learning USB Stick for Raspberry Pi

#artificialintelligence

This chip could increase the processing power of your next Raspberry Pi-powered machine learning project.


Steve Nouri on LinkedIn: #innovation #artificialintelligence #machinelearning

#artificialintelligence

This clip is nostalgic for me and for many gamers who have waited for the latest graphics card every year. GPUs have another use these days: training deep learning algorithms! And I am still following the latest trends in processing units, for a totally different reason.


To do in 2021: Get up to speed with quantum computing 101

#artificialintelligence

If "figure out quantum computing" is still in your future file, it's time to update your timeline. The industry is nearing the end of the early adopter phase, according to one expert, and the time is now to get up to speed. Denise Ruffner, the vice president of business development at IonQ, said that quantum computing is evolving much faster than many people realize. "When I started five years ago, everyone said quantum computing was five to 10 years away and every year after that I've heard the same thing," she said. "But four million quantum volume was not on the radar then and you can't say it's still 10 years away any more."


Adafruit BrainCraft HAT: Easy AI on Raspberry Pi

#artificialintelligence

Adafruit is very well known in the maker and electronics community. For 15 years the New York-based company has provided kits and boards for Arduino, Beaglebone and Raspberry Pi, and its latest board is the $39.95 BrainCraft HAT. Designed for use with the Raspberry Pi 4, this HAT is a hub of inputs and outputs, including a screen that shows image recognition, to facilitate machine learning projects on the Raspberry Pi. If you are keen to try out machine learning projects using TensorFlow Lite, the Raspberry Pi 4 is the ideal machine for taking your first steps. It is cheap to buy, has plenty of power and is adaptable to your needs.
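As an illustration of the kind of project the HAT targets, here is a minimal sketch of running inference with TensorFlow Lite on a Raspberry Pi. It assumes the tflite-runtime package and a placeholder model.tflite file:

```python
# Minimal sketch: TensorFlow Lite inference on a Raspberry Pi.
# Assumes `pip install tflite-runtime` and a local "model.tflite".
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# A random tensor stands in for a camera frame here.
dummy = np.random.rand(*input_details["shape"]).astype(
    input_details["dtype"])
interpreter.set_tensor(input_details["index"], dummy)
interpreter.invoke()

result = interpreter.get_tensor(output_details["index"])
print("Top class index:", int(np.argmax(result)))
```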


Creating my First Deep Learning + Data Science Workstation

#artificialintelligence

Creating my own workstation has been a dream of mine. I knew the process involved, yet somehow I never got around to it; it might have been time, or money. But this time I just had to do it. I was fed up with setting up a server on AWS for every small personal project and fiddling with all the installations.


Adafruit BrainCraft HAT - Easy Machine Learning for Raspberry Pi

#artificialintelligence

The BrainCraft HAT has a 240×240 TFT IPS display for inference output, a slot for a camera connector cable for imaging projects, a five-way joystick and button for UI input, left and right microphones, stereo headphone output, a stereo 1 W speaker output, three RGB DotStar LEDs, two three-pin STEMMA connectors on PWM pins (so they can drive NeoPixels or servos), and a Grove/STEMMA/Qwiic I2C port.


How to Download, Install and Use an Nvidia GPU for TensorFlow on Windows

#artificialintelligence

This article was published as a part of the Data Science Blogathon. "Graphics has lately made a great shift towards machine learning, which itself is about understanding data," says Jefferson Han, founder and chief scientist of Perceptive Pixel. A CPU can fetch data quickly but cannot process much data at once, as it has to make many trips to main memory to perform even a simple task. CPUs execute jobs sequentially and have fewer cores, whereas GPUs come with hundreds of smaller cores working in parallel, making the GPU a highly parallel architecture and thereby improving performance. TensorFlow's GPU support works only if you have a CUDA-enabled graphics card. All newer Nvidia graphics cards from the past three or four years have CUDA enabled.
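Once the CUDA toolkit and cuDNN are installed, a minimal sanity check that TensorFlow can actually see and use the GPU looks like this:

```python
# Minimal sketch: confirm TensorFlow can see a CUDA-enabled GPU.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus)

if gpus:
    # Place a small computation on the first GPU to prove it works.
    with tf.device("/GPU:0"):
        a = tf.random.uniform((1000, 1000))
        b = tf.random.uniform((1000, 1000))
        c = tf.matmul(a, b)
    print("Matrix multiply ran on:", c.device)
else:
    print("No GPU found; check the CUDA/cuDNN installation.")
```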