NUREMBERG, Germany and SUNNYVALE, CA, USA, May 5, 2021 – Google Cloud and Siemens, an innovation and technology leader in industrial automation and software, today announced a new cooperation to optimize factory processes and improve shop-floor productivity by enabling the scaled deployment of artificial intelligence in manufacturing. Siemens intends to integrate Google Cloud's leading data cloud and artificial intelligence/machine learning (AI/ML) technologies with its factory automation solutions to help manufacturers innovate for the future. Data drives today's industrial processes, but many manufacturers continue to rely on legacy software and multiple disparate systems to analyze plant information, an approach that is resource-intensive and requires frequent manual updates to ensure accuracy. In addition, while many companies have deployed AI projects in "islands" across the plant floor, manufacturers have struggled to implement AI at scale across their global operations.
Japan's hot startup stocks have two things in common: they do business in areas that could be described as mundane, and they've pushed their founders into the league of the ultrawealthy. Take AI Inside Inc., which helps turn handwritten documents into electronic files. Or Rakus Co., whose goal is to help small and midsize enterprises with their bookkeeping and email services. Their shares have both more than doubled in the past year, enriching their founders and leading to talk of a burgeoning tech scene that's very different from Silicon Valley. While the companies are using technologies like artificial intelligence and cloud computing, they're applying them in less sexy ways.
SAN JOSE, Calif., Dec. 2, 2020 /PRNewswire/ — MetricStream, the independent market leader in enterprise cloud applications for Governance, Risk, and Compliance (GRC), announced enhancements to its cloud-native, intelligent-by-design M7 Integrated Risk Platform and to its audit, compliance, enterprise risk, third-party risk, and cybersecurity products, leveraging the power of Amazon Web Services (AWS). A recent IDC report states that enterprises increased their cloud usage by 60% in 2020. The increased volume and velocity of risks, cybersecurity incidents, and compliance regulations and updates have made it critical for organizations to gain a more holistic view of their governance, risk, compliance, and cyber programs. The reality of a
International Business Machines (IBM) and Advanced Micro Devices (AMD) said they began a development program focused on cybersecurity and artificial intelligence. The development agreement will build on "open-source software, open standards, and open system architectures to drive confidential computing in hybrid cloud environments," the companies said in a statement. The agreement also will "support a broad range of accelerators across high-performance computing and enterprise critical capabilities, such as virtualization and encryption," they said. AMD, of Santa Clara, Calif., is one of the world's biggest chipmakers and is thriving. IBM, the storied Armonk, N.Y., technology services company, has struggled to regain the glory of its past, when it led the computer-making industry.
ARMONK, NY and SANTA CLARA, CA, October 15, 2020 – IBM (NYSE: IBM) and ServiceNow (NYSE: NOW) today announced an expansion of their strategic partnership designed to help companies reduce operational risk and lower costs by applying AI to automate IT operations. Available later this year, a new joint solution will combine IBM's AI-powered hybrid cloud software and professional services with ServiceNow's intelligent workflow capabilities and market-leading IT service and operations management products. The solution is engineered to help clients derive deeper, AI-driven insights from their data, establish a baseline of a typical IT environment, and take recommended actions on anomalous behavior to help prevent and fix IT issues at scale. Together, IBM and ServiceNow can help companies free up valuable time and IT resources from maintenance activities to focus on the transformation projects necessary to support the digital demands of their businesses. "AI is one of the biggest forces driving change in the IT industry, to the extent that every company is swiftly becoming an AI company," said Arvind Krishna, Chief Executive Officer, IBM.
CloudInstitute.io Teams Up With Amazon Web Services as a Trusted Vendor on AWS Marketplace, Offering Customers Cloud Training and Certification Preparation for All the Top IT Cloud Providers and Vendors. CloudInstitute.io is a cloud workforce readiness platform that uses AI and adaptive learning to enable cloud adoption and provide certification training for employees. The platform assesses employees' cloud skills against their specific cloud goals and objectives, then curates a personalized learning path for each employee and gives managers insight into their employees' readiness for specific tasks and projects. To date, the AI-powered platform has been leveraged by organizations including California State University, the City of Costa Mesa, and the Motion Picture Association of America (MPAA) to fulfill their AWS training needs.
SAN FRANCISCO, Calif., July 31, 2020 – LF Edge, an umbrella organization within the Linux Foundation that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system, announced that its Fledge project has issued its 1.8 release and matured into a Stage 2, or "Growth Stage," project within the LF Edge umbrella. Fledge is an open source framework for the Industrial Internet of Things (IIoT), used to implement predictive maintenance, situational awareness, safety, and other critical operations. Fledge v1.8 is the first release since the project moved to the Linux Foundation, but the ninth release of the project code, which has over 60,000 commits, averaging 8,500 commits per month.
There are plenty of tools and point solutions that address bits and pieces of the challenge of delivering artificial intelligence (AI) and Internet of Things (IoT) applications. C3.ai's focus is on delivering an end-to-end platform for developing, deploying, and running these applications in production at scale. Whether or not customers use every aspect of the C3.ai platform, big enterprise-scale companies seem to be attracted by that promise of quickly developing and running innovative, data-driven applications at scale. There was plenty of evidence of that at C3.ai's February 25-27 Transform conference in San Francisco, where customers including Bank of America, Shell, 3M, and Engie detailed their deployments. C3.ai's cloud-first platform is comprehensive, addressing the needs of developers, data engineers, data scientists, and the operational teams challenged with bringing applications into production at scale.
Ben Horowitz resoundingly falls into the category of "needing no introduction": a highly successful entrepreneur who navigated a perilous situation with his business (Loudcloud, which became Opsware) to a $1.65B acquisition by HP, he is also a co-founder of the premier Silicon Valley venture capital firm Andreessen Horowitz (aka "a16z") and the best-selling author of two books: "The Hard Thing About Hard Things" and the newly released "What You Do Is Who You Are". It was a special treat to host Ben for a fireside chat at the most recent edition of Data Driven NYC – a great evening that also included two other terrific speakers: Amr Awadallah, now VP of Developer Relations at Google Cloud and previously co-founder and CTO of Cloudera (NYSE: CLDR), and Michael James, co-founder of the AI chip company Cerebras. We spent a good hour with Ben and covered a bunch of topics, loosely organized in two parts: first AI and data, then culture and his new book. Below are two videos covering each part, as well as a FULL TRANSCRIPT for anyone who prefers to read.
Deep learning models are being deployed in many mobile intelligent applications. End-side services, such as intelligent personal assistants, autonomous cars, and smart home services, often employ either simple local models on the mobile device or complex remote models in the cloud. However, recent studies have shown that partitioning the DNN computations between the mobile device and the cloud can improve both latency and energy efficiency. In this paper, we propose an efficient, adaptive, and practical engine, JointDNN, for collaborative computation between a mobile device and the cloud for DNNs in both the inference and training phases. JointDNN not only provides an energy- and performance-efficient method of querying DNNs for the mobile side but also benefits the cloud server by reducing its workload and communications compared to the cloud-only approach. Given the DNN architecture, we investigate the efficiency of processing some layers on the mobile device and some layers on the cloud server. We provide optimization formulations at layer granularity for forward and backward propagation in DNNs, which can adapt to mobile battery limitations, cloud server load constraints, and quality of service. JointDNN achieves up to 18x and 32x reductions in the latency and mobile energy consumption of querying DNNs, respectively, compared to the status-quo approaches.
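The layer-granularity partitioning idea behind such mobile-cloud engines can be illustrated with a toy cost model. This is only a minimal sketch, not JointDNN's actual formulation (which is graph-based and also models energy, training, and server-load constraints); the function name and all timing numbers here are hypothetical.

```python
def best_split(mobile_ms, cloud_ms, upload_ms):
    """Pick the split point in a linear DNN that minimizes inference latency.

    mobile_ms[i], cloud_ms[i]: hypothetical per-layer compute times (ms) on
    the mobile device and the cloud server, respectively.
    upload_ms[k]: time (ms) to transmit the activation produced after running
    k layers locally; upload_ms[0] is the cost of uploading the raw input
    (cloud-only case), and upload_ms[n] is typically 0 (fully local run).

    Returns (split, latency): layers before `split` run on the mobile device,
    the rest on the cloud.
    """
    n = len(mobile_ms)
    # Cloud-only baseline: upload the input, run every layer remotely.
    best_point, best_latency = 0, upload_ms[0] + sum(cloud_ms)
    for split in range(1, n + 1):
        latency = (sum(mobile_ms[:split])      # local compute
                   + upload_ms[split]          # ship the intermediate activation
                   + sum(cloud_ms[split:]))    # remote compute
        if latency < best_latency:
            best_point, best_latency = split, latency
    return best_point, best_latency

# Early conv layers often produce large activations, so shipping a later,
# smaller activation can beat both the cloud-only and mobile-only extremes:
print(best_split([5, 5, 5], [1, 1, 1], [50, 20, 2, 0]))  # → (2, 13)
```

In this toy example, splitting after layer 2 (10 ms local + 2 ms transfer + 1 ms remote = 13 ms) beats both cloud-only (53 ms) and mobile-only (15 ms) execution, which is exactly the kind of middle-ground partition the abstract describes.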