These are phenomena brought about by the actual technology that inspired its existence in the first place -- phenomena which may not have been feasible at the time 4G was conceived. There's nothing about 4G which would have disabled its ability to be sped up, even with the millimeter-wave systems through which gigabit Internet service would be made available in dense, downtown areas. But wireless transmitter facilities are too costly to maintain, and run too hot. The virtualization of wireless networks' Evolved Packet Core (EPC) is already taking place with 4G LTE. There's no single way to do this -- indeed, EPC is a competitive market with a variety of vendors. Extended to the base stations and the antennas, that virtualization would eliminate the need for high-speed processors there, and dramatically reduce cooling costs. For many telcos throughout the world, it could make their networks profitable again.
We talk way too often about what a technology enables people to do. 5G's objective is to spread a very fast signal through the airwaves, using transmitters whose power curves stay just under the threshold of requiring artificial cooling. It needs to be faster than what we have now, for enough customers and enough providers to invest in it, so that it may achieve that main objective. Assuming 5G deployment proceeds as planned, and the various vast political conspiracies, small and large, fail to derail the telecommunications providers' plans, it will reach the peak of its goals once it has achieved the virtualization of its packet core (which was begun with 4G LTE), its radio access networks (RAN), and the customer-facing functions of its data centers. But it's from atop the highest peak, as any Everest climber or any great John Denver song might tell you, that one obtains the best view of oneself, and one's own place in the world. The common presumption, when the topic of network functions virtualization (NFV) is brought up with respect to 5G, is that all this virtualization will take place on a single platform. Not only is this critical issue undecided, but there would appear to be a dispute over the decided or undecided nature of the issue itself.
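The virtualization goal described above -- moving packet-core and RAN functions off purpose-built hardware and onto commodity servers, with no single platform yet settled -- can be illustrated with a toy placement routine. This is a conceptual sketch only: MME, SGW, PGW, and HSS are real 4G packet-core component roles, but `place_functions`, the host names, and the round-robin strategy are invented here for illustration and correspond to no vendor's product.

```python
# Illustrative-only sketch of network functions virtualization (NFV):
# functions that once ran on purpose-built boxes become software
# instances placed onto interchangeable commodity servers.

EPC_FUNCTIONS = ["MME", "SGW", "PGW", "HSS"]    # 4G LTE packet-core roles
RAN_FUNCTIONS = ["baseband-unit", "scheduler"]  # radio access network roles

def place_functions(functions, hosts):
    """Round-robin placement of virtualized functions onto hosts."""
    placement = {}
    for i, fn in enumerate(functions):
        placement[fn] = hosts[i % len(hosts)]
    return placement

# Two hypothetical commodity servers stand in for what used to be
# racks of specialized base-station hardware.
hosts = ["server-a", "server-b"]
placement = place_functions(EPC_FUNCTIONS + RAN_FUNCTIONS, hosts)
# e.g. placement["MME"] -> "server-a", placement["SGW"] -> "server-b"
```

The point of the sketch is that the placement map, not the hardware, defines the network: re-running it with different hosts re-homes every function without touching a transmitter.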
The most important promise made by the proprietors of 5G wireless technology -- the telecommunications service providers, the transmission equipment makers, the antenna manufacturers, and even the server manufacturers -- is this: Once all of 5G's components are fully deployed and operational, you will not need any kind of wire or cable to deliver communications or even entertainment service to your mobile device, to any of your fixed devices (HDTV, security system, smart appliances), or to your automobile. If everything works, 5G would be the optimum solution to the classic "last mile" problem: Delivering complete digital connectivity from the tip of the carrier network to the customer, without drilling another hole through the wall. The "if" in that previous sentence remains colossal. The whole point of "Gs" in wireless standards, originally, was to emphasize the ease of transition between one wireless system of delivery and a newer one -- or at least make that transition seem reasonably pain-free. Once complete, the 5G transition plan would constitute an overhaul of communications infrastructure unlike any other in history.
It is the fourth time in history that the world's telecommunications providers (the telcos) have acknowledged the need for a complete overhaul of their wireless infrastructure. This is why the ever-increasing array of technologies, listed by the 3rd Generation Partnership Project (3GPP) as "Release 15" and "Release 16" of their standards for wireless telecom, is called 5G. It is an effort to create a sustainable industry around the wireless consumption of data for all the world's telcos. One key goal of 5G is to dramatically improve quality of service, and extend that quality over a broader geographic area, in order for the wireless industry to remain competitive against the onset of gigabit fiber service coupled with Wi-Fi.
The phrase software-defined networking (SDN) was coined when it was necessary to distinguish the concept from the hardware-based variety. Since that time, "SDN" has come to mean the type of dynamic configuration that takes place whenever software-based services in a data center network are made accessible through an Internet Protocol (IP) address. More to the point, SDN is networking now. In the broadest sense, any software that manages a network of dynamically assigned addresses -- addresses which represent services provided or functions performed -- is utilizing some variety of SDN. The web gave rise to the idea of addressing a function by way of a name that resolves (represents, as in a directory) to an IP address. Originally, the web was intended to be a system for addressing content in this way, but engineers soon saw how efficient an evolved form of that system could be in addressing functionality.
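The name-to-address idea described above can be made concrete with a minimal sketch: a directory that resolves service names to dynamically assigned endpoints, the way a web name resolves to an IP address. `ServiceRegistry`, the service name, and the addresses are all hypothetical, invented for illustration rather than drawn from any real SDN product.

```python
# Toy sketch of the SDN-style pattern described above: names that
# resolve, directory-style, to dynamically assigned addresses, where
# each address represents a function performed rather than a fixed box.

class ServiceRegistry:
    """Maps service names to (IP, port) endpoints, like a directory."""

    def __init__(self):
        self._directory = {}

    def register(self, name, address):
        # A service announces itself; its address may change at any time.
        self._directory[name] = address

    def resolve(self, name):
        # Callers address the *function* by name, never the hardware.
        return self._directory[name]

registry = ServiceRegistry()
registry.register("packet-gateway", ("10.0.4.17", 2123))
registry.resolve("packet-gateway")   # -> ("10.0.4.17", 2123)

# Re-registration models dynamic reconfiguration: the function keeps
# its name while its underlying location moves.
registry.register("packet-gateway", ("10.0.9.3", 2123))
```

The design choice the sketch highlights is the one the paragraph describes: because callers only ever hold the name, the network can reassign addresses underneath them without breaking anything.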