Collaborating Authors

Cloud Computing

Classifying The Modern Edge Computing Platforms


A decade ago, edge computing meant delivering static content through a distributed content delivery network (CDN). Akamai, Limelight Networks, Cloudflare, and Fastly are examples of CDN services. They provide high availability and performance by distributing and caching content closer to the end user's location. The definition of the edge has changed significantly over the last five years. Today, the edge represents more than a CDN or a compute layer.
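The core CDN idea the excerpt describes, routing each request to the cache node nearest the user, can be sketched in a few lines. The node names, coordinates, and nearest-by-great-circle-distance policy below are illustrative assumptions, not any particular CDN's routing logic:

```python
import math

# Hypothetical edge cache nodes with (latitude, longitude) coordinates.
EDGE_NODES = {
    "us-east": (40.7, -74.0),   # New York area
    "eu-west": (51.5, -0.1),    # London area
    "ap-south": (19.1, 72.9),   # Mumbai area
}

def haversine_km(a, b):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def closest_node(user_location):
    """Pick the edge node nearest the user, as a CDN resolver might."""
    return min(EDGE_NODES, key=lambda n: haversine_km(EDGE_NODES[n], user_location))
```

A user in Paris would be served from the `eu-west` node rather than fetching content across the Atlantic, which is where the availability and latency gains come from.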

Google Moves to Secure the Cloud From Itself


Sensitive data needs to be encrypted both when it's at rest and in transit--that is, when it's passively stored and when it's being sent from one spot to another. Covering these two bases protects information a lot of the time, but still doesn't account for every scenario. Now Google Cloud Services--which counts PayPal, HSBC, and Bloomberg as customers--is working to fill a crucial gap. When you're storing tons of data in the cloud, you typically don't just move it into place and then leave it. Organizations generally want to actively process the information they hold--meaning cloud customers want to comb and index their data, train machine learning models with it, or otherwise crunch some numbers.
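The gap the article points at can be made concrete with a toy sketch. The XOR keystream below is not a real cipher (a production system would use something like AES-GCM); the point is that the data must sit decrypted in memory before it can be processed, and that in-use window is what confidential computing aims to protect:

```python
import itertools

KEY = b"demo-key"  # toy key for illustration only

def xor_cipher(data: bytes) -> bytes:
    """Toy symmetric 'encryption': XOR with a repeating key. Illustration only."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(KEY)))

# Data can be encrypted at rest and in transit...
ciphertext = xor_cipher(b"3,1,4,1,5")

# ...but to actually process it (here, summing the values), it must first be
# decrypted into plaintext memory -- the unprotected in-use window.
plaintext = xor_cipher(ciphertext)
total = sum(int(x) for x in plaintext.decode().split(","))
```

Encrypting at rest and in transit leaves exactly this middle step exposed, which is why processing-time protection needs a separate mechanism.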



The graph represents a network of 1,554 Twitter users whose tweets in the requested range contained "#cloudcomputing", or who were replied to or mentioned in those tweets. The network was obtained from the NodeXL Graph Server on Monday, 13 July 2020 at 10:38 UTC. The requested start date was Monday, 13 July 2020 at 00:01 UTC and the maximum number of days (going backward) was 14. The maximum number of tweets collected was 5,000. The tweets in the network were tweeted over the 1-day, 22-hour, 34-minute period from Saturday, 11 July 2020 at 01:18 UTC to Sunday, 12 July 2020 at 23:53 UTC.

Nvidia brings Ampere A100 GPUs to Google Cloud


Just over a month after announcing its latest-generation Ampere A100 GPU, Nvidia said this week that the powerhouse processor is now available on Google Cloud. The A100 Accelerator Optimized VM A2 instance family is designed for enormous artificial intelligence workloads and data analytics. Nvidia says users can expect substantial improvements over the previous generation, in this case up to a 20-fold performance boost. The Nvidia Ampere is the largest 7-nanometer chip ever constructed. It sports 54 billion transistors and offers features such as multi-instance GPU, automatic mixed precision, NVLink connectivity that doubles GPU-to-GPU direct bandwidth, and faster memory reaching 1.6 terabytes per second.
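As a back-of-the-envelope check on that memory figure: at 1.6 TB/s, sweeping the card's entire HBM once takes only tens of milliseconds. The 40 GB capacity assumed below is the launch configuration of the A100, while the bandwidth figure comes from the excerpt:

```python
MEM_BYTES = 40e9    # assumed 40 GB HBM2 configuration
BANDWIDTH = 1.6e12  # 1.6 TB/s peak memory bandwidth, per the announcement

# Time to read all of the GPU's memory once, in milliseconds.
sweep_ms = MEM_BYTES / BANDWIDTH * 1000
```

That works out to about 25 ms per full memory sweep, which is the kind of headroom that makes the claimed gains on bandwidth-bound AI workloads plausible.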

Nvidia EGX takes AI computing to the edge of the network


Nvidia is launching its EGX Platform to bring real-time artificial intelligence to the edge of the network. This means AI computing will happen where sensors collect data before it is sent to cloud-connected datacenters. "There's a massive change in the computing industry being driven by growth of [internet of things] sensors," said Justin Boitano, senior director of enterprise and edge computing, in a press briefing. "There are cameras for seeing the world, microphones for hearing the world, and devices being deployed so machines can detect what is happening in the real world." But this also means there's an exponential increase in the amount of raw data that has to be analyzed.
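The pattern described here, analyzing sensor data where it is produced and sending only what matters onward, can be sketched as a simple edge-side filter. The reading format, threshold, and anomaly-score idea below are hypothetical stand-ins for what an on-device AI model would output:

```python
def filter_at_edge(readings, threshold=0.8):
    """Keep only anomalous sensor readings for upload to the cloud.

    `readings` is an iterable of (sensor_id, score) pairs; in a real edge
    deployment the score would come from an on-device inference model.
    """
    return [(sid, score) for sid, score in readings if score >= threshold]

# Four raw readings come off the sensors; only the anomalies cross the network.
raw = [("cam-1", 0.2), ("cam-1", 0.95), ("mic-3", 0.1), ("mic-3", 0.83)]
to_upload = filter_at_edge(raw)
```

Dropping uninteresting readings at the sensor is one answer to the exponential growth in raw data the article describes: the cloud datacenter sees only the filtered stream.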

MadHive selects SADA to lead $50MN Google Cloud OTT initiative


Global technology consultancy SADA has closed a five-year, $50 million deal with ad tech provider MadHive to expand the over-the-top (OTT) ad solutions company's use of Google Cloud technologies to deliver new products and services. MadHive's end-to-end advertising solution -- based on cryptography, blockchain and AI to power modern media -- was first deployed on the Google Cloud Platform (Google Cloud) in 2017 with help from Google Cloud Premier Partner SADA. The challenge was to deliver MadHive's next-generation platform at scale with low latency while supporting a rapid, iterative development cycle, machine learning requirements, and a short go-to-market timeline. "SADA's first step with MadHive was analyzing the limits of the Kubernetes- and Docker-based implementation they had previously used for prototypes," said SADA director of cloud adoption Simon Margolis. MadHive said that from ideation to research, patent and deployment, Google Cloud's big data and machine learning tools were the only backend technologies capable of meeting its technical demands.

From COVID-19 vaccines to drugs and data analysis: How AWS is helping in the global pandemic response


Amazon Web Services held an online panel discussion Thursday that looked at how the company's cloud infrastructure is supporting the COVID-19 response, from outbreak prediction to vaccine development. The rapid progression from viral outbreak in China to full-blown global pandemic has magnified the role of clinical researchers, biotech companies and drug manufacturers in the global response to the virus. For the key players in this space, the COVID-19 pandemic has become a high-stakes data challenge and cloud technology case study. AWS customers BlueDot, Lifebit, AbCellera, Moderna, UC San Diego Health System, and Babylon are using a range of cloud technologies to increase the pace of innovation, accelerate development timelines and help improve outcomes during the COVID-19 pandemic. From cancelled conferences to disrupted supply chains, not a corner of the global economy is immune to the spread of COVID-19.

Google Cloud lands Renault as analytics, machine learning, AI customer


Automaker Renault said it will use Google Cloud's analytics, machine learning and artificial intelligence services on a wide range of initiatives to make the company more efficient. Specifically, Renault will use Google Cloud to improve manufacturing, supply chain, production and sustainability. The automaker wants to develop its data infrastructure and participate in so-called Industry 4.0.

IBM expands storage portfolio to drive AI deployments


IBM on Thursday unveiled new and updated storage products designed to help enterprises build AI-optimized infrastructure. The expanded product lineup includes the new IBM Elastic Storage System 5000, which is scalable to yottabyte configurations. In May, IBM CEO Arvind Krishna kicked off IBM's Think Digital 2020 conference by promising that IBM "will support your journeys" through hybrid cloud and AI and declaring that AI will be a key reason customers work with IBM. "Every company will become an AI company," he said. The new Elastic Storage System 5000 (ESS 5000) is a software-defined storage solution designed for the first stage of an organization's "AI journey" -- data collection.

The Role of AI in Augmenting Cloud Computing


The global artificial intelligence (AI) software market is expected to undergo major growth in the coming years, with revenues increasing from around ten billion U.S. dollars in 2018 to about 126 billion by 2025. As artificial intelligence continues to power cloud technology, cloud computing in turn fuels the growing scope and impact AI can have on the software market. The increasing adoption of digital assistants like Alexa, Siri, Google Home, and others points towards the ways in which the combination of AI and cloud computing is improving everyone's daily lives. Whether it is playing a song or making an online purchase, the fusion of these two technologies enables connected and intuitive experiences. On a macro level, AI capabilities are being merged with enterprise cloud computing infrastructure so organizations can be more strategic, agile, efficient, and insight-driven.
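The growth figures cited, roughly $10 billion in 2018 to $126 billion by 2025, imply a compound annual growth rate in the mid-40-percent range, which a quick calculation confirms:

```python
# Market revenue figures from the forecast cited above, in USD billions.
start, end, years = 10.0, 126.0, 7  # 2018 -> 2025

# Compound annual growth rate implied by those endpoints.
cagr = (end / start) ** (1 / years) - 1
```

This works out to roughly 44% per year, a useful sanity check on how aggressive the forecast actually is.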