cloud computing


Auriga Attends Intel Experience Day 2019

#artificialintelligence

Intel Experience Day 2019, organized by Intel, one of the world's major innovative hardware and technology corporations, took place in Moscow at the end of October. Intel and its partner companies presented the latest implementations of Intel hardware and software products advancing IoT, AI, computer vision, machine learning, object recognition, and more. Speakers including Al Diaz, Intel's Vice President; Natalya Galyan, Intel's Regional Director for Russia; and Marina Alekseeva, CEO of Intel R&D in Russia, shared their ideas and insights on trending industrial innovations such as cloud computing, Big Data, and analytics. Intel Experience Day 2019 attracted many IT market players who use Intel solutions in their daily work, and Auriga experts were among them. Several years ago, Auriga became a pioneer user of the Intel Multi-OS Engine tool, using it to develop an innovative iPad application for patient monitoring.


The transformation of healthcare with AI and machine learning

#artificialintelligence

AI and ML solutions are already being used by thousands of companies with the goal of improving the healthcare experience. For example, Babylon Health is changing the way we manage and understand health. Founder Ali Parsa developed the app in 2013 with a mission of providing accessible and affordable healthcare to every individual on earth. Babylon's AI system has been designed to understand and recognise the way humans express their medical symptoms; it can interpret symptoms and medical questions through a chatbot interface and match them to the most appropriate service. It can recognise most healthcare issues seen in primary care and provide information on the next steps to take.


Azure Functions for ML?

#artificialintelligence

Functional programming in its pure form implies no state and no side effects when a function is called (since there is no state). Azure Functions (much like AWS Lambda and Google Cloud Functions) is a neat concept: you don't need any explicit infrastructure, you just deploy a function and reference it via an endpoint (URI). During my search for the ultimate (aka, ultimately cheap/free) deployment for my ML models, I thought I would try this out, since the pricing docs state a high number of free executions. Setup is easy: go to the Azure Portal, run through the usual "Create" flow, and search for "Functions". Not well documented is that, if you are coding in Python in Visual Studio Code, you have to install both the node package (npm install -g azure-functions-core-tools) and the Python package via pip (pip install azure-functions). Some languages have an online editor within the Azure Portal, which is kind of nice (though kind of dangerous).
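For illustration, here is a minimal sketch of what an HTTP-triggered Python function serving a model could look like; the handler below, the "features" field, and the placeholder scoring logic are assumptions for the example, not something described in the post. In the Python programming model, the handler lives in an __init__.py alongside a function.json binding file.

    import json
    import azure.functions as func

    # Hypothetical HTTP-triggered entry point (__init__.py). A real model would be
    # loaded once at module import time (e.g. via joblib) so it is cached across calls.
    def main(req: func.HttpRequest) -> func.HttpResponse:
        try:
            payload = req.get_json()
        except ValueError:
            return func.HttpResponse("Expected a JSON body with a 'features' list.",
                                     status_code=400)

        features = payload.get("features", [])
        # Placeholder scoring logic standing in for a trained ML model.
        score = sum(features) / len(features) if features else 0.0

        return func.HttpResponse(json.dumps({"score": score}),
                                 mimetype="application/json")

Once deployed, the function is reachable at its HTTP endpoint, which matches the deploy-a-function-and-call-a-URI model the post describes.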


What does the future hold for your agency?

#artificialintelligence

Agency business models are changing. Whether it's building out new capabilities with emerging tech or helping clients get to grips with the current paradigm shifts of big data, personalisation and digital transformation, agencies are playing a bigger role than ever in brand disruption and innovation. Underpinning all of this change is the rapid development of cloud technologies. How agencies take advantage of cloud and integrate it into their products and services will determine their success or failure in this brave new world. Join Wirehive and the Microsoft Azure team to understand how agencies can work directly with Microsoft and the wider partner network to take advantage of the end-to-end capabilities of the Microsoft Cloud.


Microsoft & Nokia Come Together Yet Again, Aim To Work On AI, IoT, Cloud

#artificialintelligence

History exists so that man can learn from his mistakes. Around five years have passed since Microsoft's ill-fated $7 billion acquisition of Nokia's smartphone business, and now, having learned many lessons, both tech champions are coming together yet again. Microsoft has announced a strategic collaboration with Nokia to accelerate transformation and innovation across industries with Cloud, Artificial Intelligence (AI), and Internet of Things (IoT). "Bringing together Microsoft's expertise in intelligent cloud solutions and Nokia's strength in building the business and mission-critical networks will unlock new connectivity and automation scenarios," Microsoft Azure Executive Vice President Jason Zander said in a statement. "We're excited about the opportunities this will create for our joint customers across industries." The new partnership combines Microsoft's expertise in cloud computing and artificial intelligence with Nokia's 5G private wireless and mission-critical networking prowess.


Sainsbury's taps Google Cloud for trends insights

#artificialintelligence

Sainsbury's commercial and technology teams are working with Accenture to implement machine learning processes that they say are providing the retailer with better insight into consumer behaviour. The key aim of the collaboration, which uses the Google Cloud Platform (GCP), is to generate new insights into what consumers want and the trends driving their eating habits. By tapping into data from multiple structured and unstructured sources, the supermarket chain has developed predictive analytics models that it uses to adjust inventory based on the trends it spots. According to Alan Coad, managing director of Google Cloud in the UK and Ireland, the platform can "ingest, clean and classify that data", while a custom-built front-end interface for staff can be used "to seamlessly navigate through a variety of filters and categories" to generate the relevant insights. Phil Jordan, group CIO of Sainsbury's, said: "The grocery market continues to change rapidly. We know our customers want high quality at great value, and that finding innovative and distinctive products is increasingly important to them."


Ask the Expert: Pete Hirsch, CTO at BlackLine (EM360)

#artificialintelligence

This week's Ask the Expert is with Pete Hirsch, CTO at BlackLine. BlackLine develops cloud-based solutions to automate and control the entire financial close process. At the company, Pete is responsible for their product and technology groups, including product management, engineering, data centres and cloud operations, and enterprise IT. Pete also has experience in building, transforming, and operating large-scale cloud software businesses in fintech and procurement. In this episode, Pete explores trust in data and artificial intelligence (AI).


AI-driven storage soars on the pay-as-you-go current

#artificialintelligence

It's hard to remember now, but there was a time not so long ago when storage was perceived as a somewhat stodgy, innovation-challenged zone of the data center. In the past decade, storage has been rocked by one breakthrough after another, from cloud storage services to enterprise all-flash arrays. Businesses wanted storage with tighter connections to networking and compute, and vendors responded with hyperconverged infrastructure. Manufacturers found they were running out of space on the memory chip, so they added more layers vertically, giving us 3D NAND memory. Such trends propelled the rise of intelligent storage: combinations of storage hardware and services that leverage remote data collection and artificial intelligence to actively manage their environment, whether on premises or in the cloud.


How does AI improve grid performance? No one fully understands and that's limiting its use

#artificialintelligence

Just as power system operators are mastering data analytics to optimize hardware efficiencies, they are discovering how much more complex artificial intelligence tools can do, and how to choose which ones to use. With the deployment of advanced metering infrastructure (AMI) and smart sensor-equipped hardware, system operators are capturing unprecedented levels of data. Cloud computing and massive computational capabilities are allowing data analytics to make these investments pay off for customers, but it may take machine learning (ML) and artificial intelligence (AI) to address new power grid complexities. AI is a branch of computer science that could make power system management fully autonomous in real time, researchers and private-sector providers of power system services told Utility Dive.


New White Paper: High-Performance Virtualized Spark Clusters on Kubernetes for Deep Learning - VMware VROOM! Blog

#artificialintelligence

A new white paper is available showing the advantages of running virtualized Spark Deep Learning workloads on Kubernetes. Recent versions of Spark include support for Kubernetes. For Spark on Kubernetes, the Kubernetes scheduler provides the cluster manager capability provided by Yet Another Resource Negotiator (YARN) in typical Spark on Hadoop clusters. Upon receiving a spark-submit command to start an application, Kubernetes instantiates the requested number of Spark executor pods, each with one or more Spark executors. The benefits of running Spark on Kubernetes are many: ease of deployment, resource sharing, simplifying the coordination between developer and cluster administrator, and enhanced security.
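As a rough sketch of how this looks in practice (not taken from the white paper), a PySpark application can point its master at the Kubernetes API server and let the Kubernetes scheduler launch the executor pods; the API server address, container image, and executor count below are placeholders.

    from pyspark.sql import SparkSession

    # Hypothetical Spark-on-Kubernetes session: the k8s:// master URL tells Spark to
    # request executor pods from the Kubernetes scheduler instead of YARN.
    spark = (
        SparkSession.builder
        .appName("dl-training-sketch")
        .master("k8s://https://kubernetes.example.com:6443")      # placeholder API server
        .config("spark.executor.instances", "4")                  # number of executor pods
        .config("spark.kubernetes.container.image",
                "example/spark-py:latest")                        # placeholder Spark image
        .getOrCreate()
    )

    # Trivial job just to show the executors doing work.
    print(spark.sparkContext.parallelize(range(1000)).sum())
    spark.stop()

The equivalent spark-submit invocation in cluster mode passes the same k8s:// master URL and spark.kubernetes.* settings as command-line --conf options, which is the flow the paper refers to when Kubernetes instantiates the requested executor pods.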