The phrase software-defined networking (SDN) was coined when it was necessary to distinguish the concept from the hardware-based variety. Since that time, "SDN" has come to mean the type of dynamic configuration that takes place whenever software-based services in a data center network are made accessible through an Internet Protocol (IP) address. More to the point, SDN is networking now. In the broadest sense, any software that manages a network of dynamically assigned addresses -- addresses which represent services provided or functions performed -- is utilizing some variety of SDN. The web gave rise to the idea of addressing a function by way of a name that resolves (represents, as in a directory) to an IP address. Originally, the web was intended to be a system for addressing content in this way, but engineers soon saw how efficient an evolved form of that system could be in addressing functionality.
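The name-to-address resolution described above is easy to see in practice. As a minimal sketch (the function name `resolve_service` and the use of `localhost` are illustrative assumptions, not from the original), this resolves a service's DNS name to whatever IP addresses it currently maps to:

```python
import socket

def resolve_service(hostname):
    """Resolve a service's DNS name to the IP addresses it currently maps to.

    In an SDN-style environment, the addresses behind a name can change
    dynamically, so clients re-resolve the name instead of hard-coding IPs.
    """
    # getaddrinfo returns (family, type, proto, canonname, sockaddr) tuples;
    # sockaddr[0] is the IP address string. Deduplicate and sort for stability.
    infos = socket.getaddrinfo(hostname, None, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

# "localhost" resolves to a loopback address such as 127.0.0.1 and/or ::1.
print(resolve_service("localhost"))
```

The point is the indirection: callers address a function by name, and the directory (DNS here) decides at lookup time which address actually serves it.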
Hewlett Packard Enterprise has announced its intention to acquire Plexxi, a small software-defined networking company, for an undisclosed amount. Plexxi provides a flexible software-defined networking fabric for hyper-converged infrastructure (HCI) and cloud applications. In announcing the acquisition, HPE specifically called out the benefit that Plexxi's software-defined, flexible networking capabilities will bring to its SimpliVity and Synergy solutions. While those product lines will see an immediate benefit, the acquisition is far more strategic in nature: converged and composable infrastructure are evolving at a rapid rate.
The center of gravity for data is continuing to shift from core datacenters to locations well beyond the walls of those facilities. Data and applications are being accessed and created in the cloud and at the network edge, where billions of smartphones and other smart, connected devices that make up the Internet of Things (IoT) reside and do their work. In an increasingly digital and data-centric world, that is where the action is: where data needs to be collected, stored, analyzed, and acted on. So it's not surprising that many of the established tech vendors that have made billions of dollars over the past few decades building systems for datacenters are now pushing those capabilities past the walls and toward the cloud and edge. At The Next Platform, we've been talking for more than a year about the shift to edge and distributed computing driven by changes in the enterprise IT space, with more focus on branch and remote offices, the cloud, and gateways. The amount of data generated in these places far outside the central datacenter will only skyrocket, and enterprises need to analyze it and make business decisions on it quickly and efficiently; shipping it all back to the datacenter or cloud for that makes no operational or financial sense.
Virtualization's roots reach back to 1960s mainframes, and when the technology arrived in modern data centers, it transformed them. CIOs who placed their bets on virtualization, which abstracted compute resources from the underlying hardware, won big: they improved resource utilization, reduced costs, and simplified server deployment and management. Today, hyper-converged infrastructure (HCI) offers IT leaders the next big win. HCI builds on the concepts of virtualization by converging not only compute resources, but also networking and storage resources, onto shared, widely available, lower-cost industry-standard servers.
Hyper-converged vendors Pivot3 and Scale Computing this week expanded their use cases with product launches. Scale formally unveiled its HE150 all-flash NVMe hyper-converged infrastructure (HCI) appliances for space-constrained edge environments. Scale sells the compute device as a three-node cluster, but it does not require a server rack. The new device is a tiny version of the Scale HE500 HCI appliances that launched earlier this year; Scale said select customers have already deployed proofs of concept.