The 48th IT Press Tour had the opportunity to meet with the management team at CloudFabrix. This is the team's fourth startup, the previous three having been sold to Cisco. They are doing a lot of interesting work automating IT operations. Multi-cloud challenges continue, as 73% of enterprises use two or more clouds, a figure projected to reach 81% by 2024.
This learning path consists of step-by-step tutorials and patterns that walk you through the process of streamlining data integration, data governance, analytics, and data virtualization on AWS. It also covers building models on Amazon SageMaker and monitoring these models using IBM OpenScale deployed on SageMaker and IBM Cloud Pak for Data for fairness, quality, and drift metrics, as well as creating data visualizations and dashboards on IBM Cognos Analytics. You can learn a no-code approach for building and deploying machine learning models on Cloud Pak for Data with minimal data science background. Public pandemic data is used to demonstrate how you can build an effective pandemic management system on AWS using Cloud Pak for Data.
Zscaler (NASDAQ: ZS) accelerates digital transformation so that customers can be more agile, efficient, resilient, and secure. The Zscaler Zero Trust Exchange is the company's cloud-native platform that protects thousands of customers from cyberattacks and data loss by securely connecting users, devices, and applications in any location. With more than 10 years of experience developing, operating, and scaling the cloud, Zscaler serves thousands of enterprise customers around the world, including 450 of the Forbes Global 2000 organizations. In addition to protecting customers from damaging threats, such as ransomware and data exfiltration, it helps them slash costs, reduce complexity, and improve the user experience by eliminating stacks of latency-creating gateway appliances. Zscaler was founded in 2007 with a mission to make the cloud a safe place to do business and a more enjoyable experience for enterprise users.
Key trend #1 -- Migration to the cloud will accelerate, although the rising scale of cloud costs will come under increasing scrutiny from finance teams as the economy tightens. Organizations will struggle with hybrid and multi-cloud architectures. Watch for new and innovative pricing schemes to help combat the rising costs (e.g., fee caps and fixed fees). Driving this migration is the flexibility and scalability the cloud offers to organizations: cloud computing allows companies to easily add or remove resources as needed, without having to invest in expensive hardware and infrastructure.
Computing pervades all aspects of society in ways once imagined by only a few. Within science and engineering, computing has often been called the third paradigm, complementing theory and experiment, with big data and artificial intelligence (AI) often called the fourth paradigm.[14] Spanning both data analysis and disciplinary and multidisciplinary modeling, scientific computing systems have grown ever larger and more complex, and today's exascale scientific computing systems rival global scientific facilities in cost and complexity. However, all is not well in the land of scientific computing. In the initial decades of digital computing, government investments and the insights from designing and deploying supercomputers often shaped the next generation of mainstream and consumer computing products. Today, that economic and technological influence has increasingly shifted to smartphone and cloud service companies. Moreover, the end of Dennard scaling,[3] slowdowns in Moore's Law, and the rising costs for continuing semiconductor advances have made building ever-faster supercomputers more economically challenging and intellectually difficult. As Figure 1 suggests, we believe current approaches to designing and constructing leading-edge high-performance computing (HPC) systems must change in deep and fundamental ways, embracing end-to-end co-design; custom hardware configurations and packaging; large-scale prototyping; and collaboration between the dominant computing companies, smartphone and cloud computing vendors, and traditional computing vendors.
While Google shuttered Stadia for good this week, other cloud gaming services are expanding their offerings. NVIDIA is upgrading its GeForce Now service with a bunch of features, thanks to the addition of new SuperPODs equipped with RTX 4080 GPUs. This seems to be the first truly high-end cloud gaming experience. The renamed Ultimate plan now includes support for refresh rates of up to 240Hz at full HD or 4K at 120 fps and an expanded set of usable widescreen resolutions (3,840x1,600, 3,440x1,440, and 2,560x1,080). NVIDIA is also adding better support for HDR on both Macs and PCs, along with the ability to use full ray tracing with DLSS 3 in supported games.
Analysts at Deloitte have unveiled their predictions for what they believe will be the most important tech trends of the next 12 months. To do this, they have framed core trends, including machine learning, cloud computing, and blockchain, against the dynamically shifting industry backgrounds where they operate, aiming to focus on real-world use cases rather than just high-level technology concepts. I took the chance to review the selection with their chief futurist, Mike Bechtel, in order to gain some insights into why the consulting giant believes they will be so influential, and how we are likely to see them impacting our lives as we move into 2023 and beyond. Predicting the impact that the much-talked-about metaverse will have on our lives, Deloitte's report focuses on the concept of "Immersive internet for the enterprise." The true value of the metaverse, it is suggested, will be the new business models that it makes possible.
DataStax, the driving force behind the ongoing development and commercialization of the open source NoSQL Apache Cassandra database, had been in business for nine years in 2019 when it made a hard shift to the cloud. The company had already been working with organizations whose businesses stretched into hybrid and multicloud environments, but its "cloud first" strategy was designed to make it easier for the company to grow and easier for customers to consume Cassandra. This cloud first approach is shared by many established and startup software companies alike. Back then, DataStax had just unveiled Constellation, a cloud data platform for developers to build new applications and for operations teams to manage them, with the first offering on the platform being DataStax Apache Cassandra as a Service. A year later, the company announced its Astra database cloud service, and in 2021 it released a new version of Astra for serverless deployments. The transition to the cloud was important in making it easier for enterprises to use Cassandra, according to Ed Anuff, chief product officer at DataStax.
Thanks for reading the web version; you can subscribe to the Ops In Dev newsletter here. Happy new year to everyone who celebrates it! In this newsletter, I'll cover the best learning pieces and invest in learning hot topics like AI/ML. I started my year early on January 2nd, and boom, a CI/CD pipeline failed with a fancy stack trace. Got me thinking - what if AI could assist with solving pipeline errors for better efficiency?
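Even before reaching for a full AI model, the idea can be sketched as a small rule-based triage step that scans a failed pipeline's log for known error signatures and suggests a next action. Everything below is a hypothetical illustration: the patterns, suggestions, and function name are my own assumptions, not part of any real CI tool.

```python
import re

# Hypothetical failure signatures and suggested next steps.
# In a real setup these rules (or an ML model) would be tuned
# to the errors your pipelines actually produce.
KNOWN_PATTERNS = [
    (re.compile(r"OutOfMemoryError|Killed"),
     "Increase the job's memory limit."),
    (re.compile(r"ModuleNotFoundError: No module named '\w+'"),
     "Add the missing dependency to the build environment."),
    (re.compile(r"Connection (timed out|refused)"),
     "Check network access from the runner; retry with backoff."),
]

def suggest_fixes(log_text: str) -> list[str]:
    """Return a suggestion for every known failure signature in the log."""
    return [advice for pattern, advice in KNOWN_PATTERNS
            if pattern.search(log_text)]

# Example: a log fragment from a failed job.
log = """
Traceback (most recent call last):
ModuleNotFoundError: No module named 'requests'
"""
for advice in suggest_fixes(log):
    print(advice)
```

An AI assistant could take this further by summarizing unfamiliar stack traces instead of relying on a fixed rule list, but even this simple pattern-matching pass catches the repetitive failures that eat up debugging time.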