We talk way too often about what a technology enables people to do. 5G's objective is to spread a very fast signal through the airwaves, using transmitters whose power curves stay just under the threshold of requiring artificial cooling. It needs to be faster than what we have now, for enough customers and enough providers to invest in it, so that it may achieve that main objective. Assuming 5G deployment proceeds as planned, and the various political conspiracies, small and large, fail to derail the telecommunications providers' plans, it will reach the peak of its goals once it has achieved the virtualization of its packet core (which began with 4G LTE), its radio access networks (RAN), and the customer-facing functions of its data centers. But it's from atop the highest peak, as any Everest climber or any great John Denver song might tell you, that one obtains the best view of oneself, and of one's own place in the world. The common presumption, when the topic of network functions virtualization (NFV) is brought up with respect to 5G, is that all this virtualization will take place on a single platform. Not only is this critical issue undecided, but there appears to be a dispute over whether it has been decided at all.
When you hear the bountiful promises of the internet of things, so poetically uttered by its prospective vendors, like the nice half of a pharmaceutical commercial, your brain is probably receiving two implicit messages. One is that connectivity is a virtue unto itself, like consciousness or the acquisition of a new sense. The other is that connectivity would render devices "smart." "I think we are going to be surrounded by smart devices," Internet Protocol co-inventor Vint Cerf told a Google-sponsored startups conference four years ago. "There's something really magic, to be able to assume that any device you have that has some programmability in it could be part of a communications network, and could communicate with any other random, programmable device." A study of any technology over the span of history, for as long as humans have been building machines, will demonstrate quite clearly that a technology tends to lose its sense of magic, or nirvana, in its implementation. The one quality that 4G wireless had five years ago that it lacks today is that certain "something really magic." What is never so obvious at the outset of a platform's or a system's adoption is that the shedding of false attire is for the better.
It is the fourth time in history that the world's telecommunications providers (the telcos) have acknowledged the need for a complete overhaul of their wireless infrastructure. This is why the ever-increasing array of technologies, listed by the 3rd Generation Partnership Project (3GPP) as "Release 15" and "Release 16" of their standards for wireless telecom, is called 5G. It is an effort to create a sustainable industry around the wireless consumption of data for all the world's telcos. One key goal of 5G is to dramatically improve quality of service, and extend that quality over a broader geographic area, in order for the wireless industry to remain competitive against the onset of gigabit fiber service coupled with Wi-Fi.
The phrase software-defined networking (SDN) was coined when it was necessary to distinguish the concept from the hardware-based variety. Since that time, "SDN" has come to mean the type of dynamic configuration that takes place whenever software-based services in a data center network are made accessible through an Internet Protocol (IP) address. More to the point, SDN is networking now. In the broadest sense, any software that manages a network of dynamically assigned addresses -- addresses which represent services provided or functions performed -- is utilizing some variety of SDN. The web gave rise to the idea of addressing a function by way of a name that resolves (represents, as in a directory) to an IP address. Originally, the web was intended to be a system for addressing content in this way, but engineers soon saw how efficient an evolved form of that system could be in addressing functionality.
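The idea described above, in which a name stands in for a service whose actual address may change underneath it, can be sketched with a toy service registry. This is a minimal illustration, not any real SDN controller's API; the service names and addresses are hypothetical.

```python
# Toy service registry: clients address a function by name, and the
# name resolves to whatever address the service currently lives at.
# All names and addresses below are hypothetical.

registry = {}

def register(service_name, address):
    """Bind a service name to its current location (IP:port)."""
    registry[service_name] = address

def resolve(service_name):
    """Look up where the named service can be reached right now."""
    return registry[service_name]

# A service comes up, then is dynamically reassigned; clients keep
# using the same name throughout, never the raw address.
register("billing", "10.0.0.5:8080")
print(resolve("billing"))              # the current address
register("billing", "10.0.1.9:8080")   # service moved
print(resolve("billing"))              # name now resolves to the new one
```

The point of the sketch is the indirection: the mapping, not the endpoint, is what the software manages, which is the sense in which any such system is "utilizing some variety of SDN."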
At present, edge computing is more of a prospect than a mature market -- more of a concept than a product. It is an effort to bring quality of service (QoS) back into the data center services discussion, as enterprises decide not just who will provide their services, but also where. "The edge" is a theoretical space where a data center resource may be accessed in the minimum amount of time. You might think the obvious place for the edge to be located, for any given organization, is within its own data center ("on-premises"). Or, if you've followed the history of personal computing from the beginning, you might think it should be on your desktop, or wherever you've parked your PC.
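That definition of "the edge" -- the location from which a resource can be reached in the minimum amount of time -- reduces to a simple comparison. The sketch below uses made-up round-trip latencies for hypothetical candidate sites; a real deployment would measure these, not assume them.

```python
# Hypothetical round-trip latencies (in milliseconds) from one client
# to candidate locations hosting the same resource. Illustrative only.
candidate_sites = {
    "on_premises": 2.0,
    "metro_edge_pop": 8.0,
    "regional_cloud": 45.0,
}

def pick_edge(latencies):
    """'The edge' for this client: the site reachable in minimum time."""
    return min(latencies, key=latencies.get)

print(pick_edge(candidate_sites))  # → on_premises
```

Note that which site wins depends entirely on where the client sits: for a roaming mobile user, the metro point of presence might beat the on-premises data center, which is exactly why the article treats "where" as an open question.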