
Modern Computing: A Short History, 1945-2022

#artificialintelligence

Inspired by A New History of Modern Computing by Thomas Haigh and Paul E. Ceruzzi. But the selection of key events in the journey from ENIAC to Tesla, from Data Processing to Big Data, is mine. Among them is the Apple I, the first computer made by Apple Computer Inc., which became one of the fastest-growing companies in history, launching a number of innovative and influential computer hardware and software products. Most home computer users in the 1970s were hobbyists who designed and assembled their own machines. The Apple I, devised in a bedroom by Steve Wozniak, Steven Jobs, and Ron Wayne, was a basic circuit board to which enthusiasts would add display units and keyboards. April 1945: John von Neumann's "First Draft of a Report on the EDVAC," often called the founding document of modern computing, defines the stored-program concept. July 1945: Vannevar Bush publishes "As We May Think," in which he envisions the "Memex," a memory-extension device serving as a large personal repository of information that could be instantly retrieved through associative links.


Jensen Huang press Q&A: Nvidia's plans for the Omniverse, Earth-2, and CPUs

#artificialintelligence

Nvidia CEO Jensen Huang recently hosted yet another spring GTC event that drew more than 200,000 participants. And while he didn't succeed in acquiring Arm for $80 billion, he did have a lot to show off to those gathering at the big event. He gave an update on Nvidia's plans for Earth-2, a digital twin of our planet that -- with enough supercomputing simulation capability within the Omniverse -- could enable scientists to predict climate change for our planet. The Earth-2 simulation will require the best technology, like Nvidia's newly announced Hopper graphics processing unit (GPU) and its upcoming Grace central processing unit (CPU).

Huang fielded questions about the ongoing semiconductor shortage, the possibility of investing in manufacturing, competition with rivals, and Nvidia's plans in the wake of the collapse of the Arm deal. He conveyed a sense of calm that Nvidia's business is still strong (Nvidia reported revenue of $7.64 billion for its fourth fiscal quarter, ended January 30, up 53% from a year earlier). The gaming, datacenter, and professional visualization market platforms each achieved record revenue for the quarter and the year. He also talked about Nvidia's continuing commitment to the self-driving vehicle market, which has been slower to take off than expected. Huang held a Q&A with the press during GTC, and I asked him about Earth-2 and the Omniverse (I also moderated a panel on the industrial metaverse at GTC). I was part of a large group of reporters asking questions.

Question: With the war in Ukraine and continuing worries about chip supplies and inflation in many countries, how do you feel about the timeline for all the things you've announced? For example, in 2026 you want to do DRIVE Hyperion. With all the things going into that, is there even a slight amount of worry?

Jensen Huang: There's plenty to worry about. I have to observe, though, that in the last couple of years, the facts are that Nvidia has moved faster than in potentially its last 10 years combined. It's quite possible that we work better, actually, when we allow our employees to choose when they're most productive, and let mature people optimize their work environment, their time frame, and their work style around what best fits them and their families. It's very possible that all of that is happening. It's also true, absolutely true, that it has forced us to put a lot more energy into the virtual work that we do. For example, the work around Omniverse went into light speed in the last couple of years because we needed it. Instead of being able to come into our labs to work on our robots, or go to the streets and test our cars, we had to test in virtual worlds, in digital twins.


NVIDIA's more powerful 'AI brain' for robots is available now for $1,999

Engadget

If you've been eager to use NVIDIA's more powerful robotics 'brain' for your projects, you now have your chance -- provided you're willing to pay a premium. The company is now selling the Jetson AGX Orin developer kit for $1,999. The palm-sized computing device is billed as eight times more powerful than the Jetson AGX Xavier (275 trillion operations per second, or TOPS) thanks to its 12-core Arm Cortex-A78AE CPU, Ampere-based GPU, and upgrades to its AI accelerators, interfaces, memory bandwidth, and sensor support. You'll have to wait a while longer for production-ready units. They'll be available in the fourth quarter of the year starting at $399 for a 'basic' Orin NX kit with six CPU cores, a 1,792-core GPU, 8GB of RAM, and 70 TOPS of performance.


Raspberry Pi just turned 10. Celebrate by learning how it works.

Mashable

TL;DR: As of March 20, you can get The 2022 Complete Raspberry Pi & Arduino Developer Bundle -- worth $1,800 -- for just $39.99, which saves you 97%. So, you've got a Raspberry Pi. The bundle features nine courses and 61 hours of content on the hands-on programming basics you'll need to get started with your Raspberry Pi. Your instructor will be Edouard Renard, a software engineer and entrepreneur who co-founded a robotics startup in 2016. He's built a complete robotic arm from scratch with Arduino, Raspberry Pi, Ubuntu, and ROS and will help you work your way towards building fun projects of your own.


What Are the Best Quantum Computing Stocks to Buy?

#artificialintelligence

We've reached a point where 1980s-90s sci-fi buzzwords are turning into reality. A few examples are nanotechnology, the metaverse, and quantum computing. In the past few years, all three of these concepts have turned into full-fledged industries. In particular, quantum computing could be incredibly valuable over the coming decade. By exploiting quantum effects such as superposition and entanglement, quantum computers promise to tackle certain computation-heavy problems, in areas like optimization, materials simulation, and cryptography, far faster than classical machines can.


This Raspberry Pi and Arduino bootcamp bundle is on sale for 96% off

Mashable

TL;DR: The Raspberry Pi and Arduino Bootcamp Bundle is on sale for £22.40, saving you 96% on the list price. Even if you have no experience, the five-course Raspberry Pi and Arduino bootcamp will help you get started with programming and robotics. It's designed for complete beginners and walks you through Robot Operating System (ROS) basics first, so that you can create powerful and scalable robot applications. Then you can apply those skills in the Raspberry Pi For Beginners and Arduino for Beginners courses. Each course is hands-on and takes you step by step through the basics of your first projects.
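For a sense of the ROS basics such a course starts with, here is a minimal sketch of a ROS 1 publisher node in Python; the node name, topic, and message are illustrative placeholders, not anything taken from the bundle's curriculum.

# Minimal ROS 1 publisher node of the kind a beginner bootcamp typically starts with.
# Assumes a ROS 1 installation (rospy) and a running roscore; all names are illustrative.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node("hello_robot")                         # register the node with the ROS master
    pub = rospy.Publisher("chatter", String, queue_size=10)
    rate = rospy.Rate(1)                                   # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data="hello from my Raspberry Pi"))
        rate.sleep()

if __name__ == "__main__":
    main()

Run it on a machine where ROS is sourced, and rostopic echo /chatter should show the messages arriving once per second.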


3 Stocks To Tap The Semiconductor-General Industry Potential

#artificialintelligence

Companies in the Semiconductor-General industry are at the forefront of the ongoing technological revolution based on HPC, AI, automated driving, IoT, and so on. These semiconductors also enable the cloud to function and help turn raw data into actionable insights that companies can use to operate more efficiently. If anything, the pandemic has strengthened the conviction that these technological changes are required and inevitable, because it was these technology platforms that enabled us to function when it was unsafe to go to work or meet people. Even with the reopening of the economy, the race toward digitization, cloud, AI, and so on is expected to continue at an accelerated rate, driving strong demand for semiconductors. NVIDIA has pioneered and built a great deal of this cutting-edge technology, so it remains a top recommendation.


Why scrapping Nvidia Arm deal is ultimately bad for the industry

ZDNet

The largest proposed semiconductor acquisition in IT history -- Nvidia merging with Arm -- was called off today due to significant regulatory challenges, with antitrust issues being the main hurdle. The $40 billion deal was initially announced in September 2020, and there has been wide speculation that this would eventually be the outcome, based on several factors that I believed were either not true or overblown. Before I get into that, it's important to understand why this deal was so important. Nvidia's core product is the graphics processing unit, or GPU, which was initially used to improve graphics capabilities on computers for applications such as gaming. It just so happens that the architecture of a GPU makes it ideal for other tasks that require accelerated computing, such as real-time graphics rendering, virtual reality, and artificial intelligence.
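As a rough illustration of that last point (not anything from the article itself), the same dense matrix multiply can be pushed onto a GPU with a one-line device switch; PyTorch and the matrix sizes here are assumptions made purely for the sketch.

# Illustrative only: the same math runs on CPU or GPU, and a GPU's thousands of cores
# execute it in parallel. The library and sizes are assumed for this sketch.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b            # one large matrix multiply, parallelized across GPU cores when available
print(c.device, c.shape)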


GEMEL: Model Merging for Memory-Efficient, Real-Time Video Analytics at the Edge

arXiv.org Artificial Intelligence

Video analytics pipelines have steadily shifted to edge deployments to reduce bandwidth overheads and privacy violations, but in doing so, face an ever-growing resource tension. Most notably, edge-box GPUs lack the memory needed to concurrently house the growing number of (increasingly complex) models for real-time inference. Unfortunately, existing solutions that rely on time/space sharing of GPU resources are insufficient as the required swapping delays result in unacceptable frame drops and accuracy violations. We present model merging, a new memory management technique that exploits architectural similarities between edge vision models by judiciously sharing their layers (including weights) to reduce workload memory costs and swapping delays. Our system, GEMEL, efficiently integrates merging into existing pipelines by (1) leveraging several guiding observations about per-model memory usage and inter-layer dependencies to quickly identify fruitful and accuracy-preserving merging configurations, and (2) altering edge inference schedules to maximize merging benefits. Experiments across diverse workloads reveal that GEMEL reduces memory usage by up to 60.7%, and improves overall accuracy by 8-39% relative to time/space sharing alone.
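GEMEL's own artifact isn't reproduced here, but the underlying idea of sharing architecturally identical layers (and their weights) across models can be sketched in a few lines of PyTorch; the backbone shape, class names, and task heads below are illustrative assumptions, not the paper's configuration.

# Sketch of layer sharing between two edge vision models, in the spirit of model merging;
# this is an illustration of the idea, not GEMEL's implementation.
import torch
import torch.nn as nn

# One backbone object whose layers and weights will be shared by both models.
shared_backbone = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
)

class Detector(nn.Module):
    # Task-specific head stacked on a (possibly shared) backbone.
    def __init__(self, backbone, num_classes):
        super().__init__()
        self.backbone = backbone                 # same module object => weights stored once
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.head(self.backbone(x))

vehicle_model = Detector(shared_backbone, num_classes=5)
person_model = Detector(shared_backbone, num_classes=2)

# Both models reference one copy of the backbone weights, so the shared layers
# occupy memory only once instead of once per model.
assert vehicle_model.backbone is person_model.backbone
x = torch.randn(1, 3, 224, 224)
print(vehicle_model(x).shape, person_model(x).shape)

The hard part, which GEMEL automates, is deciding which layers can be shared without hurting each model's accuracy and scheduling inference so the merged models actually realize the memory savings.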


LG Electronics joins IBM Quantum Network to research AI, IoT, and more

ZDNet

IBM announced Monday that LG Electronics is joining the IBM Quantum Network. The two companies will work to explore how quantum computing can be used for a variety of applications, ranging from IoT and data to AI and robotics. The three-year deal will give LG Electronics access to IBM's quantum computing systems, its experts, and its Qiskit open-source quantum information software development kit. Quantum computing would be the harbinger of an entirely new medium of calculation, harnessing the powers of subatomic particles to obliterate the barriers of time in solving otherwise intractable problems. "Based on our open innovation strategy, we plan to use IBM Quantum to develop our competency in quantum computing," said Byoung-Hoon Kim, CTO and executive vice president of LG Electronics.
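For readers unfamiliar with Qiskit, a minimal two-qubit circuit gives a feel for the kind of tool LG gains access to; the Bell-state example below is a generic tutorial-style snippet and is not taken from the LG-IBM collaboration.

# Generic Qiskit sketch: build a two-qubit Bell state and inspect its statevector.
# Assumes qiskit is installed; this is a tutorial example, not LG/IBM code.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into an equal superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0
print(Statevector.from_instruction(qc))   # amplitudes of the Bell state (|00> + |11>)/sqrt(2)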