As the coronavirus pandemic has pushed the workforce online, organizations are struggling to manage dynamic remote workloads. On Thursday, a team of data scientists led by Purdue University professor Somali Chaterji introduced a solution called OPTIMUSCLOUD. The new software, which runs alongside a database server, uses machine learning to build algorithms that improve virtual machine selection and configuration choices for database management systems. The system is designed to help organizations get the greatest benefit from cloud-based databases. Chaterji directs the Innovatory for Cells and Neural Machines and teaches agricultural and biological engineering.
Not only has 2020 been one of the most challenging years in recent history, but it has also completely transformed our ways of thinking and working. Many of the year's top technology trends helped us navigate the crisis. Throughout the year, COVID-19 hurt all areas of life, causing businesses to close, economies to collapse, and people to grow anxious. Amid panic and uncertainty, technology and connectivity became vital elements that gave people and companies hope, courage, optimism, and the capacity to carry on. The year was marked by technological trends surrounding the cloud, robotics, the Internet of Things (IoT), and augmented reality (AR), which led companies through the crisis.
The COVID-19 pandemic has affected almost every industry in some way. In most cases the effect has been negative, but several sectors have benefited from the new remote working and learning norms -- not least cloud computing. The increased dependence on the industry has produced a surge in revenues: the worldwide cloud market grew 33% in Q3 2020 to US$36.5 billion, up US$2.0 billion from the previous quarter and US$9.0 billion year-on-year, according to Canalys data. A new surge of COVID-19 cases in the United States and Europe will continue this trend as social distancing measures are put back in place, meaning cloud will remain vital for sustaining business operations, remote working and learning, and customer engagement. The report found that Amazon Web Services (AWS) was the leading cloud service provider in Q3 2020, growing its share of total spend to 32%, up from the previous quarter.
Google blasted through the coronavirus pandemic with gangbuster earnings, just a week after U.S. prosecutors sued the company for operating what they allege is an illegal monopoly in its flagship search business. Alphabet Inc. reported a third-quarter profit of $11.2 billion, well outstripping analyst estimates. Just as important, digital advertising revenue of $37.1 billion was up from a year earlier, marking a turnaround from the previous quarter, when the company recorded the first drop in the category in its history. Cogs across the Alphabet empire were clicking. Helped by stay-at-home trends, YouTube pulled in more than $5 billion in advertising for the first time, up 32% over the same period a year earlier.
An ongoing health crisis and a global recession: even for the most attuned analysts, the past months have brought a slew of unexpected events that make the coming years especially difficult to envision. Yet research firm CCS Insight has taken up the challenge and delivered a set of 100 tech predictions for 2021 and beyond. The exercise is an annual one for the company, which last year anticipated, among many other things, that the next decade could see the rise of deepfake-detection technology and the adoption of domestic robots in some households. One year later, many of those predictions have been affected in one way or another by the COVID-19 pandemic. "What we've seen in the last few months has completely transformed a lot of the areas we cover," Angela Ashenden, principal analyst at CCS Insight, told ZDNet.
Human augmentation conjures up visions of futuristic cyborgs, but humans have been augmenting parts of the body for hundreds of years. Glasses, hearing aids and prosthetics evolved into cochlear implants and wearables. Even laser eye surgery has become commonplace. But what if scientists could augment the brain to increase memory storage, or implant a chip to decode neural patterns? What if exoskeletons became a standard uniform for autoworkers, enabling them to lift superhuman weights?
This portion of a new special report from Data Center Frontier takes a look at some of the most prominent and growing examples of edge computing business cases in 2020 and amid the COVID-19 pandemic, ranging from AI and telehealth to autonomous cars and 5G infrastructure. For edge computing to succeed, advances in proximity and latency must translate into business value that can justify the considerable expense of creating a massively distributed network. Who are the users driving demand for edge computing? And what are the use cases that will deliver on this new architecture? Large enterprise users affirm their readiness to invest in edge strategies, establishing a market beyond the telcos and content players.
IBM announced it has reached a definitive agreement to acquire Brazilian robotic process automation (RPA) software provider WDG Soluções Em Sistemas E Automação De Processos LTDA (referred to as "WDG Automation"). The acquisition further advances IBM's AI-infused automation capabilities, spanning business processes to IT operations. Financial terms were not disclosed. In today's digital era, companies are looking for ways to create new business models, deliver new services, and lower costs. The need to drive this transformation is even greater now given the uncertainties of COVID-19.
Robotic process automation (RPA) software company UiPath has announced that it has added conversational AI capabilities to its end-to-end hyperautomation platform. The capabilities will fuel UiPath Robots and industry-tailored chatbots, and the new automation features are intended to enhance employee productivity and ensure always-on, scalable, best-in-class global support experiences. The company will also deploy its RPA with Oracle Cloud Infrastructure and business applications to streamline critical workflows and complex processes. "As a member of the Oracle Partner Network, combining our powerful RPA with Oracle's cloud infrastructure was a practical next step. As the COVID-19 pandemic continues, this collaboration allows users to quickly pivot and respond to the ongoing challenges today's environment presents," said Dhruv Asher, Senior Vice President of Alliances and Business Development at UiPath, speaking exclusively with Toolbox. "Demand for quality digital customer service has increased greatly as the COVID-19 pandemic has unfolded. By combining Druid's chatbot platform with our Hyperautomation Platform, enterprises can truly achieve end-to-end automation for customers and employees – improving both customer self-service and employee productivity."
Edge intelligence refers to a set of connected systems and devices that collect, cache, process, and analyse data close to where it is captured, using artificial intelligence. The aim of edge intelligence is to enhance the quality and speed of data processing and to protect the privacy and security of the data. Although the field only emerged around 2011, it has shown explosive growth over the past five years. In this paper, we present a thorough and comprehensive survey of the literature on edge intelligence. We first identify four fundamental components of edge intelligence, namely edge caching, edge training, edge inference, and edge offloading, based on theoretical and practical results from proposed and deployed systems. We then systematically classify the state-of-the-art solutions by examining research results and observations for each of the four components, and present a taxonomy that covers practical problems, adopted techniques, and application goals. For each category, we elaborate on, compare, and analyse the literature from the perspectives of adopted techniques, objectives, performance, advantages, and drawbacks. This survey provides a comprehensive introduction to edge intelligence and its application areas. In addition, we summarise the development of this emerging research field and the current state of the art, and discuss important open issues and possible theoretical and technical solutions.
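To make the edge-offloading component concrete, the sketch below models the basic trade-off an offloading policy weighs: running inference on the device avoids a network transfer, while sending the input to an edge server pays a transfer cost in exchange for faster compute. This is a minimal illustrative example, not taken from the survey; the function name, parameters, and all numbers are assumptions chosen only to show the latency comparison.

```python
# Hypothetical edge-offloading decision: pick the execution site that
# minimises end-to-end latency. All names and numbers are illustrative.

def offload_decision(input_size_mb: float,
                     local_time_per_mb: float,
                     edge_time_per_mb: float,
                     bandwidth_mbps: float) -> str:
    """Return "edge" or "local" depending on estimated latency.

    local_time_per_mb / edge_time_per_mb: seconds of compute per MB of input.
    bandwidth_mbps: uplink bandwidth to the edge server in megabits/s.
    """
    # On-device: no transfer, but slower compute.
    local_latency = input_size_mb * local_time_per_mb
    # Offloading: transfer first (MB -> megabits), then faster edge compute.
    transfer_s = input_size_mb * 8 / bandwidth_mbps
    edge_latency = transfer_s + input_size_mb * edge_time_per_mb
    return "edge" if edge_latency < local_latency else "local"

# A fast link favours the edge server: 0.4s transfer + 2.5s compute < 20s local.
print(offload_decision(50, 0.40, 0.05, 1000))  # edge
# A slow link keeps the work on-device: 40s transfer alone exceeds 20s local.
print(offload_decision(50, 0.40, 0.05, 10))    # local
```

Real offloading policies in the literature also factor in energy use, server load, and privacy constraints; the point here is only that the decision reduces to comparing estimated end-to-end costs per execution site.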