Artificial Intelligence (AI) vs. Machine Learning vs. Deep Learning

#artificialintelligence

Artificial intelligence, machine learning, deep learning … Technology is advancing by leaps and bounds, and it is easy to feel lost if you don't keep up with it. If until today you thought these were interchangeable concepts, we are sorry to tell you that you are mistaken. At Yeeply, our mission is to shed light on these three technologies so you can understand what they are, how they relate, and what applications they have. Artificial intelligence (AI) refers to the ability of a machine to imitate cognitive functions that were previously associated only with humans.


Get ready for really low-power AI: Synaptics and Eta Compute envision neural nets that will observe every sound, every motion

ZDNet

Eta Compute had already developed its own ASIC chip and system board for low-power applications. Now it will devote its efforts to making software tuned to Synaptics' chips. Smart buildings, smart cities, smart transportation -- such applications of the Internet of Things have been part of the lore of technology companies for over a decade now. But what does it really mean for there to be sensors constantly measuring the ambient noise of rooms, or watching people move about, day and night? That kind of constant surveillance may come to some built environments as soon as later this year, thanks to the arrival of chips and software that are dramatically more efficient at running algorithms within the tightest of energy constraints.


IBM is using light, instead of electricity, to create ultra-fast computing

ZDNet

To quench algorithms' seemingly limitless thirst for processing power, IBM researchers have unveiled a new approach that could mean big changes for deep-learning applications: processors that perform computations entirely with light, rather than electricity. The researchers have created a photonic tensor core that, exploiting the properties of light particles, is capable of processing data at unprecedented speeds, delivering AI applications with ultra-low latency. Although the device has only been tested at a small scale, the report suggests that as the processor develops, it could achieve one thousand trillion multiply-accumulate (MAC) operations per second per square millimeter -- more than twice as many, according to the scientists, as "state-of-the-art AI processors" that rely on electrical signals. IBM has been working on novel approaches to processing units for a number of years now. Part of the company's research has focused on developing in-memory computing technologies, in which memory and processing co-exist in some form.
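
To make the quoted metric concrete: every entry of a matrix product is accumulated through a chain of multiply-accumulate steps, and accelerator throughput is commonly quoted in MACs per second. The sketch below is a plain-Python/NumPy illustration of MAC counting, not IBM's photonic implementation; the matrix sizes are arbitrary.

    import numpy as np

    # Each output element of a matrix multiply is a chain of multiply-accumulate
    # (MAC) steps: acc += a * b. Sizes here are arbitrary illustration values.
    M, K, N = 64, 64, 64
    A = np.random.rand(M, K)
    B = np.random.rand(K, N)

    C = np.zeros((M, N))
    for i in range(M):
        for j in range(N):
            acc = 0.0
            for k in range(K):
                acc += A[i, k] * B[k, j]  # one MAC operation
            C[i, j] = acc

    print(M * K * N, "MAC operations")  # 262144 for this single 64x64x64 multiply

The article's claim is that carrying out these accumulations with light rather than electrical signals is what yields the projected throughput per square millimeter.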


A Decade Of AI: Most Defining Moments 2010-20

#artificialintelligence

People had been talking about, theorising about and experimenting with AI for a long time, but what happened in the last decade made AI far more tangible. This was the decade when AI went mainstream. Be it access to world-class courses, platforms, libraries, frameworks or hardware -- everything just fell into place. And it would not be an exaggeration to say that what was accomplished in the last ten years single-handedly fortified the foundations of our future. In this article, we look at a few of the most important breakthroughs that directly or indirectly made AI a household name.


Top 7 NLP Trends To Look Forward To In 2021

#artificialintelligence

Natural language processing, first studied in the 1950s, is one of the most dynamic and exciting fields of artificial intelligence. With the rise of technologies such as chatbots, voice assistants, and translators, NLP has continued to show some very encouraging developments. In this article, we attempt to predict what NLP trends will look like in the near future, beginning with 2021. An enormous amount of data is generated on social media at every moment, which creates the peculiar problem of making sense of all that information -- something that cannot possibly be done manually.
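
As a toy illustration of why this analysis must be automated, here is a minimal keyword-based sentiment tagger; the lexicon and tweets are invented for the example, and production NLP systems use learned models rather than hand-written word lists.

    # Toy sentiment tagging: lexicon and tweets are made up for illustration;
    # real systems learn these associations from data.
    POSITIVE = {"great", "love", "excellent"}
    NEGATIVE = {"bad", "hate", "broken"}

    def sentiment(text):
        words = set(text.lower().split())
        score = len(words & POSITIVE) - len(words & NEGATIVE)
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    tweets = ["I love this voice assistant", "the chatbot is broken again"]
    print([sentiment(t) for t in tweets])  # ['positive', 'negative']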


iiot ai_2020-12-25_03-33-58.xlsx

#artificialintelligence

The graph represents a network of 1,228 Twitter users whose tweets in the requested range contained "iiot ai", or who were replied to or mentioned in those tweets. The network was obtained from the NodeXL Graph Server on Friday, 25 December 2020 at 11:39 UTC. The requested start date was Friday, 25 December 2020 at 01:01 UTC and the maximum number of tweets (going backward in time) was 7,500. The tweets in the network were tweeted over the 2-day, 10-hour, 13-minute period from Tuesday, 22 December 2020 at 14:46 UTC to Friday, 25 December 2020 at 01:00 UTC. Additional tweets that were mentioned in this data set were also collected from prior time periods.
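
NodeXL is a proprietary tool, but the structure it builds is simply a directed graph of users. A minimal sketch with the networkx library, using hypothetical tweet data, shows how such a reply/mention network is assembled.

    import networkx as nx

    # Hypothetical miniature of the dataset: nodes are Twitter users; directed
    # edges point from a tweet's author to each user replied to or mentioned.
    tweets = [
        {"author": "alice", "mentions": ["bob", "carol"]},
        {"author": "bob",   "mentions": ["alice"]},
        {"author": "dave",  "mentions": ["carol"]},
    ]

    G = nx.DiGraph()
    for t in tweets:
        for target in t["mentions"]:
            G.add_edge(t["author"], target)

    # In-degree approximates how often each user was mentioned or replied to.
    print(G.number_of_nodes(), "users,", G.number_of_edges(), "edges")
    print(sorted(G.in_degree(), key=lambda pair: -pair[1]))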


Top 100 Artificial Intelligence Companies in the World

#artificialintelligence

Artificial Intelligence (AI) is not just a buzzword, but a crucial part of the technology landscape. AI is changing every industry and business function, which results in increased interest in its applications, subdomains and related fields. This makes AI companies the leaders driving the technology shift. AI helps us to optimise and automate crucial business processes, gather essential data and transform the world, one step at a time.

From Google and Amazon to Apple and Microsoft, every major tech company is dedicating resources to breakthroughs in artificial intelligence. While big enterprises are busy acquiring or merging with emerging ventures, small AI companies are also working hard to develop their own intelligent technology and services. By leveraging artificial intelligence, organizations gain an innovative edge in the digital age, and AI consultancies are working to provide companies with the expertise that can help them grow. In this digital era, AI is also a significant area for investment, and AI companies are constantly developing new products to provide the simplest solutions. Hence, Analytics Insight brings you the list of the top 100 AI companies that are leading the technology drive towards a better tomorrow.

AEye is an artificial perception pioneer and the creator of iDAR, a new form of intelligent data collection whose advanced vision hardware, software and algorithms act as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing; its mission is to acquire the most information with the fewest ones and zeros, allowing it to drive the automotive industry into the next realm of autonomy. Algorithmia invented the AI Layer.


AI Debate 2: Night of a thousand AI scholars

ZDNet

Gary Marcus, top, hosted presentations by sixteen AI scholars on what AI needs to "move forward." A year ago, Gary Marcus, a frequent critic of deep learning forms of AI, and Yoshua Bengio, a leading proponent of deep learning, faced off in a two-hour debate about AI at Bengio's MILA institute headquarters in Montreal. On Wednesday evening, Marcus was back, albeit virtually, to open the second installment of what has become a planned annual debate on AI, under the title "AI Debate 2: Moving AI Forward." Vincent Boucher, president of the organization Montreal.AI, who had helped to organize last year's debate, opened the proceedings before passing the mic to Marcus as moderator. Marcus said 3,500 people had pre-registered for the evening, and at the start 348 people were watching live on Facebook; last year's debate had reached 30,000 by the end of the night, he noted. Bengio was not in attendance, but the evening featured presentations from sixteen scholars: Ryan Calo, Yejin Choi, Daniel Kahneman, Celeste Kidd, Christof Koch, Luis Lamb, Fei-Fei Li, Adam Marblestone, Margaret Mitchell, Robert Osazuwa Ness, Judea Pearl, Francesca Rossi, Ken Stanley, Rich Sutton, Doris Tsao and Barbara Tversky. "The point is to represent a diversity of views," said Marcus, promising three hours that might be like "drinking from a firehose."


Interactive Visualization System that Helps Students Better Understand and Learn CNNs

#artificialintelligence

Artificial intelligence (AI) has grown tremendously in just a few years, ushering us into the AI era. We now have self-driving cars, contemporary chatbots, high-end robots, recommender systems, advanced diagnostic systems, and more. Almost every research field is now using AI.
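
For context, the models such visualization tools explain are stacks of convolution, activation, and pooling layers. Below is a minimal sketch of one in PyTorch; the framework choice and layer sizes are assumptions for illustration, not details of the system described.

    import torch
    import torch.nn as nn

    # A tiny CNN of the kind interactive visualizers walk students through.
    # Layer sizes are arbitrary illustration values.
    class TinyCNN(nn.Module):
        def __init__(self, num_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 8, kernel_size=3, padding=1),   # learnable 3x3 filters
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 32x32 -> 16x16
                nn.Conv2d(8, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 16x16 -> 8x8
            )
            self.classifier = nn.Linear(16 * 8 * 8, num_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = TinyCNN()
    logits = model(torch.randn(1, 3, 32, 32))  # one random 32x32 RGB "image"
    print(logits.shape)                        # torch.Size([1, 10])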


Developing smarter, faster machine intelligence with light

#artificialintelligence

Researchers at the George Washington University, together with researchers at the University of California, Los Angeles, and the deep-tech venture startup Optelligence LLC, have developed an optical convolutional neural network accelerator capable of processing large amounts of information, on the order of petabytes, per second. Global demand for machine learning hardware is dramatically outpacing the current supply of computing power. State-of-the-art electronic hardware, such as graphics processing units and tensor processing unit accelerators, helps mitigate this, but is intrinsically limited by serial, iterative data processing and by delays arising from wiring and circuit constraints. Optical alternatives to electronic hardware could help speed up machine learning by processing information in a non-iterative way. However, photonic machine learning is typically limited by the number of components that can be placed on photonic integrated circuits, which restricts interconnectivity, while free-space spatial light modulators are restricted to slow programming speeds.
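
The workload these accelerators target is dominated by convolutions. A plain-Python/NumPy sketch of a 2D convolution (the electronic reference computation, not the optical implementation) shows why the operation is so multiply-accumulate heavy.

    import numpy as np

    # 2D convolution (cross-correlation, as used in deep learning): every
    # output pixel costs kh*kw multiply-accumulates, so totals grow quickly.
    def conv2d(image, kernel):
        kh, kw = kernel.shape
        oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
        out = np.zeros((oh, ow))
        for i in range(oh):
            for j in range(ow):
                out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
        return out

    image = np.random.rand(28, 28)   # a small grayscale input
    kernel = np.random.rand(3, 3)    # one 3x3 filter
    out = conv2d(image, kernel)
    print(out.shape, out.size * kernel.size, "MACs")  # (26, 26) 6084 MACs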