Data infrastructures exist to support business, cloud and information technology (IT) applications, among others, that transform data into information or services. Their fundamental role is to provide a platform environment for applications and data that is resilient, flexible, scalable, agile, efficient and cost-effective. Put another way, data infrastructures exist to protect, preserve, process, move, secure and serve data, along with the applications that deliver information services. Technologies that make up data infrastructures include hardware, software, cloud or managed services, servers, storage, I/O and networking, together with people, processes and policies, as well as various tools spanning legacy, software-defined virtual, container and cloud environments.

Various Types and Layers of Data Infrastructures

Depending on your role or focus, you may have a different view than somebody else of what infrastructure is, or what an infrastructure is made of.
As the infrastructure world becomes saturated with progressively sophisticated digital technologies, public- and private-sector infrastructure leaders will be forced to adopt a new base of knowledge and a new set of skills. Many of these decision makers are accomplished engineers, but with mechanical, civil, structural or electrical backgrounds. Their expertise and experience remain valuable and relevant, but today they must be augmented by perspectives from computer science and software engineering to meet the demands and expectations of today's citizen-consumers. These changes also mean officials will have to source new partners and vendors -- ones like Xaqt, Rapid Flow Technologies and Pluto AI that can supplement traditional infrastructural intelligence with digital intelligence.
Docker announced today that it was open sourcing containerd (pronounced Container D), making a key infrastructure piece of its container platform available for anyone to work on. Containerd, which acts as the core container runtime engine, is a component within Docker that provides "users with an open, stable and extensible base for building non-Docker products and container solutions," according to the company. Leading cloud providers, including Alibaba, AWS, Google, IBM and Microsoft, have signed on to work on it. For now, Docker has not announced which foundation will house the open source project, but it intends to place containerd in a neutral foundation sometime during the first quarter of 2017. Solomon Hykes, Docker's founder and CTO, says a foundational principle of the company has always been to put the end user first and the plumbing second, and this move puts a critical piece of that plumbing into the hands of the open source community.
March Capital Partners, the Los Angeles-based venture capital firm, has raised $300 million for its latest fund. It is another indicator that the Los Angeles technology ecosystem is coming of age, but also a sign that March's core investment strategies -- investing in companies applying artificial intelligence to business use cases and in the next wave of transformative computing infrastructure -- are paying off. "We have two major areas and a couple of minor areas," said Sumant Mandal, a managing director with the firm. "We like data-driven businesses, and two thirds of our portfolio are AI driven. We also like infrastructure for the internet… the majority of the portfolio will be around those two themes."