Centec Networks, a leading innovator of Ethernet switching silicon and SDN white box solutions, today announced the TsingMa CTC7132 system-on-chip (SoC) device, its sixth-generation switching silicon, designed to help fuel the transformation from 4G to 5G deployment and from traditional cloud computing to edge computing with optimized cost, power, performance and features.

"Built upon our proven Transwarp switch architecture, the TsingMa chip is a complete SoC device that is purpose-built to address the growing demand for extremely low latency, comprehensive end-to-end tunnel security and rich network telemetry in the era of 5G and edge computing," said Tao Gu, vice president of business development at Centec Networks. "TsingMa is the first of a series of new chips we are rolling out for OEMs and ODMs so they can build a new class of network equipment for 5G transport and edge computing networks."

"Powered by Artificial Intelligence, network automation can effectively solve the operation and maintenance challenges of massive nodes in 5G deployment," said Tang Xiongyan, chief scientist of the Network Technology Research Institute at China Unicom. "TsingMa's capability of collecting comprehensive metadata of network flows, deep forwarding states and perceived behavior is a powerful tool for the development of intelligent network automation."
Having started out powering gaming computers, NVIDIA Corporation (NASDAQ:NVDA) has expanded its GPU business to powering advanced machine intelligence technologies. The NVIDIA DGX-1 is known as the world's first commercially available supercomputer designed specifically for deep learning. NVIDIA claims the DGX-1 delivers the computing power of 250 two-socket servers in a single box. The company states on its website that its NVIDIA NVLink implementation delivers a massive increase in GPU memory capacity, giving you a system that can learn, see, and simulate our world – a world with an infinite appetite for computing. NVIDIA also claims that models for tasks like image recognition can be trained on the DGX-1 significantly faster than on other servers.
As corporate bandwidth requirements continue to surge with every passing year, it becomes clear that bandwidth demands, as well as the business requirements of the modern digital workspace, are setting the stage for the implementation of new, advanced technologies. These technologies open up fresh possibilities and further fuel the demand for intelligent systems in our daily lives, along with a greater reliance on tech support, both at home and at work. With software trends emerging regularly in the IT scene, digital services and people are becoming further intertwined, characterizing everything that's new in the world of network technology this year. These recent advancements are likely to disrupt existing operations and foster an era of digitization and intelligence throughout the business sector. Let's see what's getting hot now in networking technology – and how these trends will be sizzling by the end of the year.
What a serverless deployment costs depends on a range of variables. The real question is whether it is more cost-effective than traditional means of software deployment. The key issues to bear in mind when considering the suitability of the serverless model for a software deployment are the nature of the application and the degree to which you draw upon third parties for code and for services such as hosting. Serverless computing is a highly modular deployment methodology, with code consisting of functions that behave in a particular way in response to a particular input. Among its core cost benefits is its fast spin-up and spin-down time. A function is invoked, does its work, and spins down again, so billing can be highly granular: you pay only for the time the function is working, and for the data it outputs.
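The invoke-work-spin-down cycle described above can be sketched as a single stateless function. This is a minimal illustration only, assuming an AWS Lambda-style handler signature (an `event` dict in, a response dict out); other platforms use different entry-point conventions, and the function name here is hypothetical.

```python
import json


def handler(event, context=None):
    """Hypothetical serverless function, Lambda-style signature assumed.

    The platform invokes it with an input event; it does its work and
    returns. Billing is granular: you are charged only for the time
    between invocation and return, not for idle server capacity.
    """
    # Pull the input out of the event; default keeps the sketch self-contained.
    name = event.get("name", "world")

    # Do the function's work and return a structured response.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Because the function holds no state between invocations, the platform can spin up as many copies as incoming events require and tear them all down when traffic stops, which is what makes the pay-per-invocation billing model possible.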
Edge computing allows data produced by internet of things (IoT) devices to be processed closer to where it is created instead of sending it across long routes to data centers or clouds. Doing this computing closer to the edge of the network lets organizations analyze important data in near real-time – a need of organizations across many industries, including manufacturing, health care, telecommunications and finance. "In most scenarios, the presumption that everything will be in the cloud with a strong and stable fat pipe between the cloud and the edge device – that's just not realistic," says Helder Antunes, senior director of corporate strategic innovation at Cisco. Edge computing is a "mesh network of micro data centers that process or store critical data locally and push all received data to a central data center or cloud storage repository, in a footprint of less than 100 square feet," according to research firm IDC. The term typically comes up in IoT use cases, where edge devices collect data – sometimes massive amounts of it – that would otherwise all be sent to a data center or cloud for processing.
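The local-processing pattern described above can be sketched as follows. This is a simplified illustration, not any vendor's implementation: the function name, the threshold parameter, and the summary fields are all assumptions chosen to show the idea of analyzing raw readings at the edge and forwarding only a compact result upstream.

```python
def process_at_edge(readings, threshold=90.0):
    """Hypothetical edge-node step (illustrative names and fields).

    Instead of streaming every raw sensor sample to a central data
    center, the edge node analyzes readings locally and forwards only
    a small summary plus any anomalous values.
    """
    # Flag readings above the (assumed) anomaly threshold.
    anomalies = [r for r in readings if r > threshold]

    # Build the compact payload that actually leaves the edge.
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings) if readings else 0.0,
        "anomalies": anomalies,  # only these raw values are forwarded
    }
    return summary
```

A node ingesting thousands of samples per second would thus push only a few bytes of summary per interval to the cloud, which is the bandwidth and latency saving that motivates processing near the data source.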