Could 2021 be the year for technology? Here are some trends to watch out for

#artificialintelligence

The icing on the cake is that the action takes place in the PUBG universe. Some of the most exciting TV innovations will arrive in 2021. LG has hinted at ditching the E-Series OLED in favour of a new Gallery Series, while Samsung might unveil a rotating Sero TV. This year will be bigger and mightier, with TV screens measuring above 75 inches becoming mainstream.


AI-fueled app Natural offers new interface for consumer transactions

ZDNet

The Mac's mouse, the iPod's click wheel, the iPhone's multitouch display, and the Apple Watch's digital crown are all part of Apple lore in which a new device class mandated a new user interface. But there's a significant exception to that established pattern: Siri. The voice agent emerged as a way to control some of the iPhone's features but was never a way to completely control it the way Alexa served as the Echo's main user interface. Rather, it could retrieve bits of information and complete simple tasks online. Now, an app called Natural seeks to go beyond what agents such as Siri and Alexa can achieve in terms of transactions while remaining wed to the smartphone's -- or any connected device's -- touchscreen.


What's the future of mobile app development?

#artificialintelligence

The mobile industry is constantly evolving -- what was a groundbreaking technology yesterday may be a forgotten thing of the past today. That's why you need to continuously observe the mobile market so that you don't miss opportunities for business growth. But you can't just analyse what's trending right now -- that would be too easy. The mobile app market is changing at breakneck speed, so if you want to reach a much wider audience, improve user retention and get more app downloads, you need to predict future trends. That's what I'm here to help with.


The Future of Enterprise Billing

#artificialintelligence

The connectivity benefits of 5G are expected to make businesses more competitive and give consumers access to more information faster than ever before. Connected cars, smart communities, industrial IoT, healthcare, immersive education -- they all will rely on the unprecedented opportunities that 5G technology will create. The enterprise market opportunity is driving many telecoms operators' strategies for, and investments in, 5G. Companies are accelerating investment in core and emerging technologies such as cloud, internet of things, robotic process automation, artificial intelligence and machine learning. IoT (Internet of Things), for example, improves connectivity and data sharing between devices and enables biometric-based transactions; blockchain enables use cases such as trade transactions, remittances, payments and investments; and deep learning and artificial intelligence bring advanced algorithms that deliver a high degree of personalisation.


Mobile Augmented Reality: User Interfaces, Frameworks, and Intelligence

arXiv.org Artificial Intelligence

Mobile Augmented Reality (MAR) integrates computer-generated virtual objects with physical environments for mobile devices. MAR systems enable users to interact with MAR devices, such as smartphones and head-worn wearables, and perform seamless transitions from the physical world to a mixed world with digital entities. These MAR systems support user experiences by using MAR devices to provide universal accessibility to digital content. Over the past 20 years, a number of MAR systems have been developed; however, the studies and design of MAR frameworks have not yet been systematically reviewed from the perspective of user-centric design. This article presents the first effort to survey existing MAR frameworks (count: 37) and further discusses the latest studies on MAR through a top-down approach: 1) MAR applications; 2) MAR visualisation techniques adaptive to user mobility and contexts; 3) systematic evaluation of MAR frameworks, including supported platforms and corresponding features such as tracking, feature extraction, and sensing capabilities; and 4) underlying machine learning approaches supporting intelligent operations within MAR systems. Finally, we summarise the development of emerging research fields and the current state of the art, and discuss the important open challenges and possible theoretical and technical directions. This survey aims to benefit researchers and MAR system developers alike.


The 20 technologies that defined the first 20 years of the 21st Century

The Independent - Tech

The early 2000s were not a good time for technology. After entering the new millennium amid the impotent panic of the Y2K bug, it wasn't long before the Dotcom Bubble burst the hopes of a new internet-based era. Fortunately, the recovery was swift, and within a few years brand-new technologies were emerging that would transform culture, politics and the economy. They have brought with them new ways of connecting, consuming and getting around, while also raising fresh Doomsday concerns. As we enter a new decade of the 21st Century, we've rounded up the best and worst of the technologies that have taken us here, while offering some clue of where we might be going. There was nothing much really new about the iPhone: there had been phones before, there had been computers before, there had been phones combined with computers before. There was also a lot that wasn't good about it: it was slow, its internet connection barely functioned, and it would be two years before it could even take a video.


Digital Voodoo Dolls

arXiv.org Artificial Intelligence

An institution, be it a body of government, commercial enterprise, or a service, cannot interact directly with a person. Instead, a model is created to represent us. We argue the existence of a new high-fidelity type of person model which we call a digital voodoo doll. We conceptualize it and compare its features with existing models of persons. Digital voodoo dolls are distinguished by existing completely beyond the influence and control of the person they represent. We discuss the ethical issues that such a lack of accountability creates and argue how these concerns can be mitigated.


Big video innovation is moving forward in 5G era - ZTE

#artificialintelligence

As a mature 5G application scenario, video has ever more promising prospects. With 5G networks, high-quality video content such as UHD, 4K, 8K, and 120-frame video will become popular with consumers, and VR, AR, interactive video, and AI-based video content will be the next hot formats. Paid video content will also shift toward intelligent distribution modes such as targeted advertising, where big data and AI can match video content to the right audience and thereby improve video revenues. As 5G is widely adopted, there will be more video applications, including 4K/8K video, immersive VR experiences, AR, ultra-low-latency live broadcast, high-speed mobile video communication, mobile communication in crowded environments, and multimedia and IoV. In the 5G era, UHD video and ultra-high speeds will go a long way toward meeting people's daily video-viewing needs.


A Survey on Edge Intelligence

arXiv.org Artificial Intelligence

Edge intelligence refers to a set of connected systems and devices that use artificial intelligence to collect, cache, process, and analyse data close to where it is captured. The aim of edge intelligence is to enhance the quality and speed of data processing and to protect the privacy and security of the data. Although it emerged only recently, around 2011, this field of research has shown explosive growth over the past five years. In this paper, we present a thorough and comprehensive survey of the literature surrounding edge intelligence. We first identify four fundamental components of edge intelligence, namely edge caching, edge training, edge inference, and edge offloading, based on theoretical and practical results pertaining to proposed and deployed systems. We then aim for a systematic classification of the state of the art by examining research results and observations for each of the four components, and we present a taxonomy that includes practical problems, adopted techniques, and application goals. For each category, we elaborate on, compare, and analyse the literature from the perspectives of adopted techniques, objectives, performance, advantages and drawbacks, etc. This survey article provides a comprehensive introduction to edge intelligence and its application areas. In addition, we summarise the development of this emerging research field and the current state of the art, and we discuss important open issues and possible theoretical and technical solutions.


The 84 biggest flops, fails, and dead dreams of the decade in tech

#artificialintelligence

The world never changes quite the way you expect. But at The Verge, we've had a front-row seat as technology has permeated every aspect of our lives over the past decade. Some of the resulting moments -- and gadgets -- arguably defined the decade and the world we live in now. But others we ate up with popcorn in hand, marveling at just how incredibly hard they flopped. This is the decade we learned that crowdfunded gadgets can be utter disasters, even if they don't outright steal your hard-earned cash. It's the decade of wearables, tablets, drones and burning batteries, and of ridiculous valuations for companies that were really good at hiding how little they actually had to offer. Here are 84 things that died hard, often hilariously, to bring us where we are today. Everyone was confused by Google's Nexus Q when it debuted in 2012, including The Verge -- which is probably why the bowling ball of a media streamer crashed and burned before it even came to market.