Functional programming in its pure form implies no state and, consequently, no side effects when a function is called. Azure Functions (much like AWS Lambda and Google Cloud Functions) is a neat concept: you don't need any explicit infrastructure — you just deploy a function and reference it via an endpoint (URI). During my search for the ultimate (read: ultimately cheap/free) deployment for my ML models, I thought I would try this out, since the pricing docs advertise a high number of free executions. Setup is easy: go to the Azure Portal, run through the usual "Create" flow, and search for "Functions". What is not well documented is that if you are coding in Python in Visual Studio Code, you have to install both the Node package (npm install -g azure-functions-core-tools) and the Python package via pip (pip install azure-functions). Some languages have an online editor within the Azure Portal, which is kind of nice (though kind of dangerous).
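For Python in VS Code, the setup above boils down to a few commands. This is a rough sketch of the local workflow; the app and function names are placeholders, not anything from the original post:

```shell
# Install the Functions CLI (a Node package) and the Python bindings
npm install -g azure-functions-core-tools
pip install azure-functions

# Scaffold a Python function app with one HTTP-triggered function
func init MyFunctionApp --python
cd MyFunctionApp
func new --name HttpExample --template "HTTP trigger"

# Run locally; deployment can then be done from VS Code or the CLI
func start
```

Once `func start` is running, the function is reachable at a local URI such as http://localhost:7071/api/HttpExample, mirroring how the deployed endpoint works.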
Agency business models are changing. Whether it's building out new capabilities with emerging tech or helping clients get to grips with the current paradigm shifts of big data, personalisation and digital transformation, agencies are playing a bigger role than ever in brand disruption and innovation. Underpinning all of this change is the rapid development of cloud technologies. How agencies take advantage of cloud and integrate it into their products and services will determine their success or failure in this brave new world. Join Wirehive and the Microsoft Azure team to understand how agencies can work directly with Microsoft and the wider partner network to take advantage of the end-to-end capabilities of the Microsoft Cloud.
History exists so that we can learn from our mistakes. Around five years have passed since Microsoft's ill-fated $7 billion acquisition of Nokia's smartphone business, and now, lessons learned, the two tech giants are coming together again. Microsoft has announced a strategic collaboration with Nokia to accelerate transformation and innovation across industries with Cloud, Artificial Intelligence (AI) and Internet of Things (IoT). "Bringing together Microsoft's expertise in intelligent cloud solutions and Nokia's strength in building the business and mission-critical networks will unlock new connectivity and automation scenarios," Microsoft Azure Executive Vice President Jason Zander said in a statement. "We're excited about the opportunities this will create for our joint customers across industries." The new partnership combines Microsoft's expertise in cloud computing and artificial intelligence with Nokia's prowess in 5G, private wireless and mission-critical networking.
Sainsbury's commercial and technology teams are working with Accenture to implement machine learning processes that they say are providing the retailer with better insight into consumer behaviour. Using the Google Cloud Platform (GCP), the key aim of the collaboration is to generate new insights on what consumers want and the trends driving their eating habits. By tapping into data from multiple structured and unstructured sources, the supermarket chain has developed predictive analytics models that it uses to adjust inventory based on the trends it spots. According to Alan Coad, managing director of Google Cloud in the UK and Ireland, the platform can "ingest, clean and classify that data", while a custom-built front-end interface for staff can be used "to seamlessly navigate through a variety of filters and categories" to generate the relevant insights. Phil Jordan, group CIO of Sainsbury's, said: "The grocery market continues to change rapidly. We know our customers want high quality at great value and that finding innovative and distinctive products is increasingly important to them."
This week's Ask the Expert is with Pete Hirsch, CTO at BlackLine. BlackLine develops cloud-based solutions to automate and control the entire financial close process. At the company, Pete is responsible for their product and technology groups, including product management, engineering, data centres and cloud operations, and enterprise IT. Pete also has experience in building, transforming, and operating large-scale cloud software businesses in fintech and procurement. In this episode, Pete explores trust in data and artificial intelligence (AI).
It's hard to remember now, but there was a time not so long ago when storage was perceived as a somewhat stodgy, innovation-challenged zone of the data center. In the past decade, storage has been rocked by one breakthrough after another, from cloud storage services to enterprise all-flash arrays. Businesses wanted storage with tighter connections to networking and compute, and vendors responded with hyperconverged infrastructure. Manufacturers found they were running out of space on the memory chip, so they added more layers vertically, giving us 3D NAND memory. Such trends propelled the rise of intelligent storage: combinations of storage hardware and services that leverage remote data collection and artificial intelligence to actively manage their environment, whether on premises or in the cloud.
A new white paper is available showing the advantages of running virtualized Spark deep learning workloads on Kubernetes. Recent versions of Spark include support for Kubernetes. For Spark on Kubernetes, the Kubernetes scheduler provides the cluster-manager capability that Yet Another Resource Negotiator (YARN) provides in typical Spark on Hadoop clusters. Upon receiving a spark-submit command to start an application, Kubernetes instantiates the requested number of Spark executor pods, each running one or more Spark executors. The benefits of running Spark on Kubernetes are many: ease of deployment, resource sharing, simplified coordination between developers and cluster administrators, and enhanced security.
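As a concrete illustration of the spark-submit flow described above, a cluster-mode submission against a Kubernetes API server might look like the following. The API-server address, container image, executor count, and jar path are placeholders, not values from the white paper:

```shell
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:<port> \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=3 \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  local:///opt/spark/examples/jars/spark-examples.jar
```

Here the `k8s://` master URL tells Spark to ask the Kubernetes scheduler (rather than YARN) for executors; `spark.executor.instances` controls how many executor pods are created.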
This webinar will help you understand the benefits of, and process for, creating a file inventory:
• Cloud vendors now charge based on storage used; storage is no longer cheap
• Successful cloud migrations require planning and file cleanup
• Creating a file inventory helps you evaluate and choose the files that should be migrated to the cloud
• DocAuthority's Data Evolutional Artificial Intelligence (AI) automates the discovery, collection and categorization of documents into well-defined business categories, eliminating reliance on end-user
InfiniteIO, the world's fastest metadata platform to reduce application latency, today announced the new Application Accelerator, which delivers dramatic performance improvements for critical applications by processing file metadata independently from on-premises storage or cloud systems. The new platform provides organizations across industries the lowest possible latency for their mission-critical applications, such as AI/machine learning, HPC and genomics, while minimizing disruption to IT teams. "Bandwidth and I/O challenges have been largely overcome, yet reducing latency remains a significant barrier to improving application performance," said Henry Baltazar, vice president of research at 451 Research. "Metadata requests are a large part of file system latency, making up the vast majority of requests to a storage system or cloud. InfiniteIO's approach to abstracting metadata from file data offers IT managers a nondisruptive way to immediately accelerate application performance."
This post demonstrates a *basic* example of how to build a deep learning model with Keras, serve it as a REST API with Flask, and deploy it using Docker and Kubernetes. This is NOT a robust, production-ready example; it is a quick guide for anyone who has heard about Kubernetes but hasn't tried it yet. To that end, I use Google Cloud for every step of the process. The reason is simple — I didn't feel like installing Docker and Kubernetes on my Windows 10 Home laptop. An additional benefit for anyone following along is that reproducing my steps should be much easier, since you can run everything with the exact specifications I used.
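The serving piece of that pipeline can be sketched as a minimal Flask app. This is a stand-in, not the post's actual code: the route name and payload shape are my own choices, and the model is stubbed with a trivial function where a real app would call something like `keras.models.load_model("model.h5")`:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# A real app would load a trained Keras model here, e.g.:
#   model = keras.models.load_model("model.h5")
# For this sketch, prediction is stubbed with a trivial function.
def predict_fn(instances):
    return [sum(row) for row in instances]

@app.route("/predict", methods=["POST"])
def predict():
    # Expect JSON like {"instances": [[1.0, 2.0], ...]}
    payload = request.get_json(force=True)
    instances = payload.get("instances", [])
    return jsonify({"predictions": predict_fn(instances)})

# To serve locally: app.run(host="0.0.0.0", port=5000)
# In the Docker/Kubernetes setup, a container runs this app and a
# Kubernetes Service exposes the port to the outside world.
```

The Dockerfile then just needs to install Flask (and Keras), copy this file in, and start the server; the Kubernetes Deployment and Service wrap that container.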