Coming Soon – Snowball Edge with More Compute Power and a GPU

#artificialintelligence

I never get tired of seeing customer-driven innovation in action! When AWS customers told us that they needed an easy way to move petabytes of data in and out of AWS, we responded with the AWS Snowball. Later, when they told us that they wanted to do some local data processing and filtering (often at disconnected sites) before sending the devices and the data back to AWS, we launched the AWS Snowball Edge, which allowed them to use AWS Lambda functions for local processing. Earlier this year we added support for EC2 Compute Instances, with six instance sizes and the ability to preload up to 10 AMIs onto each device. Great progress, but we are not done yet!


Computing to win: Addressing the policy blind spot that threatens national AI ambitions - Atlantic Council

#artificialintelligence

[Photo caption: Servers inside the Facebook New Albany Data Center, New Albany, Ohio, Thursday, February 6, 2020.] Artificial intelligence (AI) is causing significant structural changes to global competition and economic growth. AI may generate trillions of dollars in new value over the next decade, but this value will not be easily captured or evenly distributed across nations. Much of it will depend on how governments invest in the underlying computational infrastructure that makes AI possible. Yet early signs point to a blind spot: a lack of understanding, measurement, and planning.


4 IoT compute types for the Internet of Things

#artificialintelligence

IoT is about capturing micro-interactions and responding as fast as possible. Edge computing brings computation closest to the data source and allows us to apply machine learning right at the sensor. If you got caught up in the edge vs. fog computing discussions, the distinction is this: edge computing is about intelligence on the sensor nodes themselves, whereas fog computing is about local area networks that can provide computing power for data-heavy operations.
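The edge/fog split above can be sketched in a few lines. This is a hypothetical illustration, not code from any particular IoT framework: `edge_filter` stands in for logic running on the sensor node (forward only anomalous readings), and `fog_aggregate` stands in for a data-heavy summary computed on a local-network gateway. The function names and the threshold value are assumptions made for the example.

```python
def edge_filter(reading, threshold=75):
    """Runs on the sensor node itself: forward only anomalous readings."""
    return reading if reading > threshold else None

def fog_aggregate(readings):
    """Runs on a local-network gateway: heavier, data-wide aggregation."""
    forwarded = [r for r in readings if r is not None]
    if not forwarded:
        return None
    return {
        "count": len(forwarded),
        "max": max(forwarded),
        "mean": sum(forwarded) / len(forwarded),
    }

# The edge node drops normal readings locally, so only two values travel
# over the network; the fog layer summarizes whatever arrives.
stream = [70, 72, 81, 69, 90]
summary = fog_aggregate(edge_filter(r) for r in stream)
print(summary)  # {'count': 2, 'max': 90, 'mean': 85.5}
```

The point of the split: filtering happens where the data is born (minimizing latency and network traffic), while aggregation across many nodes happens one hop away, where more compute is available.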


Intel is ending development of its Compute Cards

Engadget

Intel is halting development on its line of Compute Cards, according to a report from Tom's Hardware. The company will continue to sell its existing line of Compute Cards for the time being and will continue to offer support for the current generation of products through 2019. Beyond that, Intel is essentially leaving behind the modular computing concept. Compute Cards were first introduced by Intel at Computex in 2017. The concept behind the product was to fit all of the necessary computing power a device may need (CPU, RAM, storage, etc.) onto a single card.


The Future of Compute: From Edge to Cloud

#artificialintelligence

Discover the future of high-performance compute. As business leaders seek to leverage breakthroughs in AI, HPC, and data science throughout their business, the demand for solutions that bring those capabilities to the edge continues to grow. Ahead of NVIDIA's GPU Technology Conference (GTC), Hewlett Packard Enterprise's Jeff Winterich spoke to us about the need to 'push the envelope of enterprise compute' and why HPE and NVIDIA are better together. GTC brings together developers, engineers, and innovators looking to gain a deeper understanding of how AI will transform their industry. Watch the HPE on-demand sessions.