From drones for food delivery and robots for automation to COVID-19 contact-tracing apps and online learning platforms, we've seen a great acceleration in the adoption of new technologies over the past few months. Technology has been a pillar of strength during the pandemic, and it will also help redefine the post-COVID-19 world. Different businesses and industries will benefit from different technologies, but some common ones are likely to dominate the world after COVID-19. With that said, let's take a look at some of the tech trends likely to see a surge in adoption post-COVID-19. We know this first one is obvious, but for good reason: AI is playing a massive role in helping us all get through the pandemic, and it will see even greater adoption once the pandemic is over.
It has already been more than 60 years since the first video game was invented, and thanks to tremendous improvements in hardware capacity and innovations in game design, today's players have countless excellent options across countless game categories. The video game industry was worth US$139 billion in 2018, with a projected annual growth rate of 12 percent through 2025. As visual quality and gameplay become increasingly rich and sophisticated, leading video game companies are accelerating their investments in machine learning to take their games to the next level. Advanced computer vision technology is supercharging virtual and augmented reality, one of the latest milestones in video game design. Other AI technologies are enabling powerful enhancements not only in development processes, for example with animation generation and smarter non-player characters (NPCs), but also in breakthrough features such as infinite maps and character customization.
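The "infinite maps" mentioned above are commonly built with deterministic procedural generation: each map chunk is derived from a seed, so the world never needs to be stored in full. A minimal sketch of that idea (the chunk size, terrain names, and hashing scheme here are illustrative assumptions, not any particular game's implementation):

```python
import hashlib
import random

def chunk_tiles(world_seed: int, cx: int, cy: int, size: int = 4):
    """Deterministically generate one chunk of an 'infinite' tile map.

    The same (world_seed, cx, cy) always yields the same tiles, so the
    map can extend forever in any direction without being stored.
    """
    # Derive a stable per-chunk seed from the world seed and coordinates.
    key = f"{world_seed}:{cx}:{cy}".encode()
    seed = int.from_bytes(hashlib.sha256(key).digest()[:8], "big")
    rng = random.Random(seed)
    terrain = ["water", "sand", "grass", "forest", "rock"]
    return [[rng.choice(terrain) for _ in range(size)] for _ in range(size)]

# Revisiting a chunk reproduces it exactly -- no persistence needed.
a = chunk_tiles(42, 0, 0)
b = chunk_tiles(42, 0, 0)
```

Because generation is a pure function of the seed and coordinates, the engine only ever keeps the chunks near the player in memory.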
Artificial Intelligence is the technology of making a machine or robot fully autonomous. AI is the study of how a machine thinks, learns, decides and acts when it is trying to solve problems. Such problems arise in every field, including the ones emerging most rapidly in 2020 and beyond. The aim of Artificial Intelligence is to give machines capabilities related to human intelligence, such as reasoning, learning and problem-solving, along with the ability to manipulate objects. For example, virtual assistants and chatbots can offer expert advice.
While consumer-facing telecoms companies talk only about download speeds, for manufacturing the focus turns to ultra-reliable low-latency communication, device density and ubiquitous connectivity. It's these lesser-known features, beyond breakneck 5G speed, that will encourage industry to construct private 5G network infrastructure in industrial plants and warehouses. The sector is a production line for buzzwords: everything from the Industrial Internet of Things (IIoT) to Industry 4.0 is common currency, with 'smart factories' and 'edge computing' not far behind. From high-precision assembly lines and augmented reality overlays to cloud robotics and cable-free factories, here are 12 ways 5G could transform manufacturing. Although it's an overstated part of 5G, there is no getting away from the fact that the ability to download data much, much faster will be a major attraction of 5G for the manufacturing industry.
With numerous organisations rapidly adapting to the use of new technologies within and outside of the workplace, wearable devices could soon become a common sight in offices as a means of enforcing workplace social distancing. Many are still working from home, but for those unable to carry out their jobs remotely, or those who have chosen to return to the workplace, ensuring they can do so safely is of paramount importance. The UK government has advised businesses to carry out a Covid-19 risk assessment, develop hygiene procedures, maintain workplace social distancing and manage transmission risk. But in a busy workplace where employees attend meetings, collaborate on projects or simply socialise, keeping two metres apart is a challenge. With this in mind, robotics company Tharsus has come up with a technology-based solution to "get businesses working again". The company, which has already developed technology solutions for companies such as DHL, Ocado, Rolls Royce, Automata and Small Robot Co, has developed "Bump", a Fitbit-style personal motion system designed to be a "simple, intuitive and friendly" way of improving workplace safety during the pandemic.
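The core logic behind a distancing wearable like Bump is simple: estimate the distance between tagged workers and flag any pair closer than the two-metre guideline. A minimal sketch under assumed inputs (Tharsus has not published Bump's internals; the 2D coordinates here stand in for whatever ranging signal the real device uses):

```python
import math

def close_contacts(positions, min_distance=2.0):
    """Return pairs of workers closer than min_distance metres.

    positions: dict mapping a worker id to an (x, y) position in metres
    (a hypothetical input format, e.g. from an indoor positioning tag).
    """
    ids = sorted(positions)
    pairs = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            # Euclidean distance between the two tags.
            if math.dist(positions[a], positions[b]) < min_distance:
                pairs.append((a, b))
    return pairs

tags = {"w1": (0.0, 0.0), "w2": (1.5, 0.0), "w3": (10.0, 10.0)}
# w1 and w2 are 1.5 m apart, so that pair would trigger an alert.
flagged = close_contacts(tags)
```

In practice real devices use signal strength or ultra-wideband ranging rather than absolute coordinates, but the threshold check is the same.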
The graph represents a network of 4,223 Twitter users whose tweets in the requested range contained "#VR", or who were replied to or mentioned in those tweets. The network was obtained from the NodeXL Graph Server on Tuesday, 07 July 2020 at 18:36 UTC. The requested start date was Tuesday, 07 July 2020 at 00:01 UTC and the maximum number of days (going backward) was 14. The maximum number of tweets collected was 5,000. The tweets in the network were tweeted over the 1-day, 19-hour, 33-minute period from Sunday, 05 July 2020 at 04:27 UTC to Tuesday, 07 July 2020 at 00:01 UTC.
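A mention/reply network like the one NodeXL builds can be sketched in a few lines: each tweet's author gets a directed edge to every account mentioned in the text. The sample tweets and data format below are invented for illustration, not drawn from the #VR dataset described above:

```python
import re
from collections import defaultdict

MENTION = re.compile(r"@(\w+)")

def mention_graph(tweets):
    """Build a directed mention graph from (author, text) pairs.

    Returns {author: {mentioned_user: count}}, a simplified stand-in
    for the edge list a tool like NodeXL would visualise.
    """
    graph = defaultdict(lambda: defaultdict(int))
    for author, text in tweets:
        for target in MENTION.findall(text):
            graph[author][target] += 1
    return graph

sample = [
    ("alice", "Trying the new headset #VR @bob"),
    ("bob", "@alice agreed, the tracking is great #VR"),
]
g = mention_graph(sample)
```

With edges in hand, standard network metrics (degree, clusters) can then be computed to find the influential accounts in the hashtag conversation.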
Artificial Intelligence (AI) is growing in spite of COVID-19. Though AI is not new, it has recently made major advances in many fields. I will highlight five artificial intelligence trends for 2020. AI in digital marketing has ushered in unprecedented change on social media. It powers 24/7 chatbots, analyzes data and trends, manages custom feeds, searches for content topics, creates personalized content, and makes recommendations when required.
Researchers at University of Hawai'i at Mānoa, University of Illinois at Chicago, and Virginia Tech were awarded a $5 million National Science Foundation grant to synergize two complementary technologies -- large-scale data visualization and artificial intelligence -- to create the Smart Amplified Group Environment (SAGE3) open-source software. SAGE, soon to be on its third iteration as SAGE3, is the most widely used big-data visualization and collaboration software in the world. SAGE and SAGE2 are software to enable data-rich collaboration on high-resolution display walls. SAGE2 moved SAGE into cloud computing and SAGE3 ushers in the inclusion of artificial intelligence. Principal investigator Jason Leigh is a computer and information science professor at University of Hawai'i at Mānoa and the inventor of SAGE.
Edge computing can roughly be defined as the practice of processing and storing data either where it's created or close to where it's generated -- "the edge" -- whether that's a smartphone, an internet-connected machine in a factory or a car. The goal is to reduce latency, or the time it takes for an application to run or a command to execute. While that sometimes involves circumventing the cloud, it can also entail building downsized data centers closer to where users or devices are. Anything that generates a massive amount of data and needs that data to be processed as close to real time as possible can be considered a use case for edge computing: think self-driving cars, augmented reality apps and wearable devices.
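One common edge-computing pattern is to summarise raw sensor data on the device and send only the summaries, and any out-of-range alerts, to the cloud, cutting both bandwidth and the latency of the alerts that matter. A minimal sketch, with an assumed data format and thresholds chosen for illustration:

```python
def edge_summarise(readings, window=5, threshold=90.0):
    """Aggregate raw sensor readings on-device before any cloud upload.

    readings: a list of numeric samples (hypothetical sensor values).
    Each window of samples collapses to one summary dict, so a stream
    of thousands of readings becomes a handful of uploads.
    """
    uploads = []
    for start in range(0, len(readings), window):
        batch = readings[start:start + window]
        uploads.append({
            "mean": sum(batch) / len(batch),
            "max": max(batch),
            # Out-of-range values are forwarded individually as alerts.
            "alerts": [r for r in batch if r > threshold],
        })
    return uploads

data = [70, 72, 95, 71, 69, 68, 70, 73, 72, 71]
out = edge_summarise(data)
# Ten readings collapse to two summaries; the 95 surfaces as an alert.
```

The same filter-then-forward idea scales from a single wearable up to the downsized data centers mentioned above.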
Artificial Intelligence (AI) is already ubiquitous in our day-to-day lives, from maps that find the optimal route to Amazon, Netflix and Facebook, which curate content and make recommendations tailored specifically to us. Your smartphone even understands voice commands and can perform tasks prompted by you. The technology is pervasive and is increasingly being applied in the education sector, where, globally, AI powers tools that help develop learner skills, allow self-paced tailored learning, streamline assessment systems, and automate administrative activities.
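At its simplest, the self-paced tailored learning described above can be driven by a feedback loop that adjusts question difficulty to each learner. A toy sketch (the step size and difficulty scale are arbitrary assumptions, not any particular platform's algorithm):

```python
def next_difficulty(current, correct, step=1, lo=1, hi=10):
    """Adjust question difficulty after each answer.

    A correct answer nudges difficulty up, a wrong one nudges it down,
    so each learner settles at a level that keeps them challenged.
    """
    adjusted = current + step if correct else current - step
    # Clamp to the valid difficulty range.
    return max(lo, min(hi, adjusted))

level = 5
for answer in [True, True, False, True]:
    level = next_difficulty(level, answer)
# Trace: 5 -> 6 -> 7 -> 6 -> 7
```

Real adaptive-learning systems use richer learner models, but the principle is the same closed loop between performance and content selection.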