Well File:

Cloud Computing


Hyundai Motor Group Pilots Digital Twin Technology to Improve EV Battery Performance

#artificialintelligence

SEOUL, May 23, 2022 – Hyundai Motor Group (the Group) announced on April 29 that it recently carried out a project with Microsoft Korea to prove digital twin technology is effective at predicting an electric vehicle's battery lifespan and optimizing its battery management and performance. Using Microsoft's cloud service Azure, the Group created digital twins of actual electric vehicles (EVs) with the aim of improving the accuracy of battery lifespan prediction and customizing battery management systems for each EV model. Based on the project's success, the Group will implement digital twin technology as a way to improve battery performance going forward. Through this collaboration, the Group created digital twins of EVs in a virtual space based on various driving data collected from actual EVs in the real world, and used the virtual EVs to predict the battery lifespan of each vehicle. This high-level, data-integrated analysis model uses artificial intelligence (AI), machine learning and physical models to comprehensively analyze information such as charging and discharging cycles as well as parking and driving environments.
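The "physics plus machine learning" idea described above can be illustrated with a minimal sketch: a simple physical ageing curve gives a prior for battery state of health, and a data-driven model learns a correction from fleet telemetry. The ageing formula, feature names, and linear correction below are hypothetical placeholders, not Hyundai's or Microsoft's actual Azure-based pipeline.

```python
# Illustrative sketch only: a toy "digital twin" style battery state-of-health
# (SOH) estimator that blends a simple physical ageing model with a
# data-driven correction learned from simulated fleet telemetry.
import numpy as np
from sklearn.linear_model import LinearRegression

def physical_soh(cycles, depth_of_discharge, avg_temp_c):
    """Toy empirical ageing curve: capacity fade grows with cycle count,
    discharge depth and temperature (all coefficients are made up)."""
    fade = 1e-4 * cycles * (0.5 + depth_of_discharge) * (1 + 0.02 * (avg_temp_c - 25))
    return np.clip(1.0 - fade, 0.0, 1.0)

# Simulated telemetry from a small EV fleet: cycles, depth of discharge, temperature.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.integers(50, 1500, 200),   # charge/discharge cycles
    rng.uniform(0.2, 0.9, 200),    # average depth of discharge
    rng.uniform(5, 40, 200),       # average ambient temperature (C)
])
soh_measured = physical_soh(X[:, 0], X[:, 1], X[:, 2]) + rng.normal(0, 0.01, 200)

# Residual model: learn the gap between the physical prior and measurements.
residual = soh_measured - physical_soh(X[:, 0], X[:, 1], X[:, 2])
correction = LinearRegression().fit(X, residual)

def predict_soh(cycles, dod, temp_c):
    x = np.array([[cycles, dod, temp_c]])
    return physical_soh(cycles, dod, temp_c) + correction.predict(x)[0]

print(f"Predicted SOH after 800 cycles: {predict_soh(800, 0.6, 30):.3f}")
```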



SirionLabs Achieves SAP-Certified Integration with Cloud Solutions from SAP

#artificialintelligence

SirionLabs, the global leader in AI-powered contract lifecycle management (CLM), announced that its SirionOne platform has achieved SAP Ariba certification for integration with SAP's cloud solutions from the SAP Integration and Certification Center (SAP ICC). The integration enables new levels of contract intelligence for customers across legal, procurement, and sales teams by plugging into SAP Ariba and unlocking access to business-critical information in real time from within enterprise applications. With this certification, SAP customers who use SirionOne cut implementation times, lower integration costs, and benefit from full compatibility between the two solutions. It provides enterprises with a seamless experience by automating workflows such as the procure-to-pay cycle, purchase order (PO) creation, payment approvals, and spend analytics. "With this strategic integration, SAP customers can easily manage and optimize end-to-end processes such as sourcing, contracting and spending analysis," said Puneet Bhakri, SVP of Global Alliances & Partnerships at SirionLabs. "SirionLabs' interoperability with SAP cloud solutions allows even more enterprises to automate how they develop, manage, and measure contract performance across their organizations."


Manager - Integration Data Operations

#artificialintelligence

Gainsight is the leader in customer success and product experience software. The Gainsight Customer Cloud offers everything your business needs to retain customers and drive growth in the age of the customer. As the first cloud of its kind, Gainsight brings together the required technologies to deliver a superior post-sale experience, ensuring customers easily adopt the products they've purchased and achieve their desired business outcomes in partnership with their vendor. Gainsight joined the Vista Equity Partners portfolio in 2020. Leading companies such as LinkedIn, Adobe, Tableau, Splunk, and Box choose Gainsight, culminating in our recognition as one of the top 100 private cloud companies in the world by Forbes, one of the fastest-growing private companies in America by Inc. Magazine, and one of the 20 Great Workplaces in Tech by Fortune Magazine.


Multi-access edge computing spend to reach $23bn globally by 2027 – research – Gulf Business

#artificialintelligence

Increased demand for on-premises machine learning, together with the low-latency connectivity provided by 5G technology, is expected to fuel this growth.


Training a recommender model of 100 trillion parameters on Google Cloud

#artificialintelligence

A recommender system is an important component of Internet services today: billion-dollar businesses are directly driven by recommendation services at big tech companies. The current landscape of production recommender systems is dominated by deep learning based approaches, where an embedding layer is first adopted to map extremely large-scale ID-type features to fixed-length embedding vectors; the embeddings are then fed into complex neural network architectures to generate recommendations. The continuing advancement of recommender models is often driven by increasing model size: models with billions of parameters have been released, and very recently even trillion-parameter models. Every jump in model capacity has brought significant improvements in quality. The era of 100 trillion parameters is just around the corner.
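The embedding-then-DNN pattern described above can be shown with a minimal sketch. The table size, hashing trick, and MLP head below are illustrative assumptions only; production systems shard embedding tables that are many orders of magnitude larger across machines.

```python
# Minimal sketch of the embedding-layer pattern: sparse ID features are mapped
# to fixed-length vectors and fed to a small MLP. Sizes are purely illustrative.
import torch
import torch.nn as nn

class TinyRecommender(nn.Module):
    def __init__(self, num_buckets=100_000, embed_dim=16, num_id_features=3):
        super().__init__()
        # One shared, hashed embedding table standing in for huge per-feature tables.
        self.embedding = nn.Embedding(num_buckets, embed_dim)
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim * num_id_features, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )
        self.num_buckets = num_buckets

    def forward(self, raw_ids):
        # raw_ids: (batch, num_id_features) of arbitrarily large integer IDs.
        buckets = raw_ids % self.num_buckets      # hashing trick to bound table size
        emb = self.embedding(buckets)             # (batch, features, embed_dim)
        flat = emb.flatten(start_dim=1)           # concatenate the embeddings
        return torch.sigmoid(self.mlp(flat))      # e.g. click/engagement score

model = TinyRecommender()
sample = torch.tensor([[123_456_789, 42, 7_000_003]])  # user, item, context IDs
print(model(sample))
```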


Driving Intelligence in Power Management through Digitalization & IoT - Express Computer

#artificialintelligence

In this decade of the 21st century, the world is facing very different challenges in power management across generation, transmission, distribution, and consumption. As most countries take on sustainability goals, a significant energy transition is under way towards green energy. Many more sources of energy, such as wind and solar, are becoming viable and are being adopted. However, this has created a clear expectation of adding intelligence to existing products and solutions to manage the transition, as well as adopting new digital, software-based solutions. With the rapid advancement of IoT and cloud infrastructure, creating this intelligence is now feasible and economically viable.


Preparing for quantum: next steps for enterprise

#artificialintelligence

Investment in quantum technologies will grow from US$412mn in 2020 to US$8.6bn by 2027, according to research firm IDC. Organisations that get started now will have a significant competitive advantage over those that continue to wait until quantum computing is a proven technology. Nevertheless, the complexity of quantum hardware and software development is forcing organisations to invest significantly in elite quantum expertise just to explore quantum-possible use cases for their potential business value. Gordon Davey is Cloud Services (Microsoft) General Manager at SoftwareONE, a leading global provider of end-to-end software and cloud technology solutions. Davey said: "Quantum technologies within enterprises are expected to take off over the next five years, with forecasts estimating that the market will eventually be worth anywhere between $500mn and $29bn, according to IBM. The development of quantum computing is speeding up, and technology firms are partnering with businesses to work on bringing out the first commercial applications. A great example of this is Goldman Sachs, which recently assembled a 'full team dedicated to quantum computing', and JP Morgan, which is now looking to implement the use of quantum computers as well. Additionally, Willis Towers Watson has also partnered with Microsoft to develop the potential of quantum computing."


La veille de la cybersécurité

#artificialintelligence

The age of artificial intelligence (A.I.) is finally upon us. Consumer applications of A.I., in particular, have come a long way, leading to more accurate search results for online shoppers, allowing apps and websites to make more personalized recommendations, and enabling voice-activated digital assistants to better understand us. We all know there is tremendous potential value in data, which continues to grow exponentially. In fact, the world is creating 2.5 quintillion bytes of data every day (that's 2.5 followed by 18 zeros). To harness that potential, companies need A.I. to make sense of the data, and hybrid cloud computing platforms that can distribute it across organizations.


Computer Vision Pipeline with Kubernetes

#artificialintelligence

We produce a multitude of attributes (characteristics attached to an entity -- building, parcel, etc.) using various sources such as aerial imagery. The idea is to build Deep Learning models from a few thousand buildings using in-house-tagged labels or existing labels from open data. In a second step, the models are deployed over the whole French territory, which represents more than 35 million images to process (i.e. about 4 TB of data). This second step is the focus of this post. The challenge is to run inference at low cost and in a short amount of time (less than a day).
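One common way to spread such a batch-inference workload over a cluster is a Kubernetes Indexed Job, where each pod processes its own shard of the image list. The sketch below is a hypothetical worker along those lines; the manifest file, backbone model, sharding scheme, and output layout are placeholder assumptions, not the pipeline described in the post.

```python
# Hypothetical sketch of one inference worker pod in a Kubernetes Indexed Job:
# each pod handles the shard selected by its completion index, so the 35M-image
# workload can be spread over many short-lived workers.
import os
import json

import torch
from torchvision import models, transforms
from PIL import Image

SHARD = int(os.environ.get("JOB_COMPLETION_INDEX", "0"))  # set by Indexed Jobs
NUM_SHARDS = int(os.environ.get("NUM_SHARDS", "1"))       # total pods in the Job

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
model = models.resnet18(weights=None)  # placeholder backbone, untrained here
model.eval()

def image_paths():
    # Placeholder manifest: one image path per line, e.g. mounted from object storage.
    with open("images_manifest.txt") as f:
        for i, line in enumerate(f):
            if i % NUM_SHARDS == SHARD:  # static sharding across pods
                yield line.strip()

results = {}
with torch.no_grad():
    for path in image_paths():
        x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        results[path] = model(x).argmax(dim=1).item()

with open(f"predictions_shard_{SHARD}.json", "w") as f:
    json.dump(results, f)
```

Running the same container with `completionMode: Indexed` and a completion count equal to NUM_SHARDS lets the cluster scheduler fan the shards out across cheap preemptible nodes and retry any failed shard independently.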