ASU researcher shifts big data computing into high gear

#artificialintelligence

Every second, approximately 6,000 tweets are posted on Twitter. That's a significant amount of data -- and it represents only one social media platform out of hundreds. Social media offers an enormous volume of unstructured data that can be mined for knowledge and used to support better decisions at scale. But while humans are clearly efficient data generators, computers struggle to process and analyze that sheer volume of data. Arizona State University Associate Professor Ming Zhao leads the development of GEARS, a big data computing infrastructure designed for today's demanding big data challenges.
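To make that rate concrete, here is a back-of-the-envelope Python sketch converting the 6,000-tweets-per-second figure into daily totals; the average payload size is an assumed value for illustration, not a measured one.

TWEETS_PER_SECOND = 6_000
SECONDS_PER_DAY = 86_400
AVG_TWEET_BYTES = 2_000  # assumption: tweet text plus JSON metadata

tweets_per_day = TWEETS_PER_SECOND * SECONDS_PER_DAY         # ~518 million
gigabytes_per_day = tweets_per_day * AVG_TWEET_BYTES / 1e9   # ~1,037 GB

print(f"{tweets_per_day:,} tweets/day, ~{gigabytes_per_day:,.0f} GB/day")

At roughly a terabyte per day from one platform alone, single-machine analysis quickly becomes impractical, which is the kind of gap an infrastructure like GEARS is meant to address.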


The Economics and Benefits of Artificial Intelligence

#artificialintelligence

We see news about AI everywhere; sometimes we see the excitement around AI, and sometimes we see articles about how AI will replace or destroy our jobs. We also see the occasional article claiming that AI will destroy humanity. In this article, I will not discuss artificial general intelligence or an evil AI that wants to destroy humanity. I will focus on current AI, which is mostly based on algorithms that make predictions, and discuss how the economics of AI works and how it may affect business. I also want to mention that this article draws heavily on (and this author highly recommends for further reading) Prediction Machines: The Simple Economics of Artificial Intelligence and Human + Machine: Reimagining Work in the Age of AI.


Artificial intelligence: The king of disruptors

#artificialintelligence

He predicts computers will have human-level intelligence by 2029, and that by 2045 they will surpass human intelligence. He and I agree that artificial intelligence is a positive force that augments human capacity. Like eyeglasses and hearing aids, we will come to see AI as an extension of the human experience. AI may be the biggest disruptor society has ever experienced. But it's not just a disruptor; AI is also an accelerant, with the potential to enrich human learning, discovery, and productivity, both personally and professionally.


Five Things to Consider as Strata Kicks Off

#artificialintelligence

Today marks the start of the fall Strata Data Conference in New York City, which has traditionally been the big data community's biggest show of the year. It's been a wild ride for the big data crowd in 2018, one that's brought its share of highs and lows. Now it's worth taking some time to consider how far big data has come and where it may be headed. Here are five things to keep in mind as the Strata Data Conference kicks off. We've said this before, but it bears repeating: Hadoop is just one of many technologies angling for relevance in today's increasingly heterogeneous at-scale computing environment.


Machine Learning - Vistatec

#artificialintelligence

We have all heard this new buzzword in the world of technology. It is the new trend, everybody wants to jump on the bandwagon, and we are all making claims about how amazingly good and new it is: Machine Learning. But is it really that new? Machine Learning is just another fancy name for an area within a broader field called Predictive Analysis or Predictive Modelling. This branch of statistics was born in the 1940s, when governments invested in the area for military purposes.


How IoT with embedded AI is driving new revenue streams in the industrial world

#artificialintelligence

When I ask people what they think the Internet of Things (IoT) is all about, the majority say "smart homes", probably based on personal experience with Alexa or Siri. If I say that it's also about industries making use of sensor data, most think of manufacturing. Sensors have been used in manufacturing for a long time, and the concept of using data generated at the edge to monitor and run automated processes is well understood (a minimal sketch of that pattern follows below). But that view underestimates the potential of IoT. In practice, IoT can be applied anywhere.
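The Python below is a minimal sketch of that edge-monitoring pattern: a device polls a sensor and reacts on-device instead of shipping every raw reading upstream. read_temperature and the alarm threshold are hypothetical placeholders, not a real device API.

import random
import time

ALARM_THRESHOLD_C = 80.0  # assumed safety limit, for illustration only

def read_temperature() -> float:
    """Stand-in for a real sensor driver; returns degrees Celsius."""
    return 60.0 + random.random() * 30.0

def monitor(poll_seconds: float = 1.0, cycles: int = 10) -> None:
    """Poll the sensor and alert locally when the limit is exceeded."""
    for _ in range(cycles):
        reading = read_temperature()
        if reading > ALARM_THRESHOLD_C:
            # A real deployment might halt a machine or send a compact
            # alert upstream rather than streaming raw readings.
            print(f"ALERT: {reading:.1f} C exceeds {ALARM_THRESHOLD_C} C")
        time.sleep(poll_seconds)

monitor()

The point of the pattern is the one the article makes: the decision happens at the edge, and only the exceptional events need to travel over the network.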


Meet These Incredible Women Advancing A.I. Research

#artificialintelligence

A world-renowned pioneer in social robotics, Cynthia Breazeal splits her time between her role as an Associate Professor at MIT, where she received her PhD and founded the Personal Robots Group, and her role as Founder and Chief Scientist of Jibo, a personal robotics company with over $85 million in funding. While Breazeal's work has won numerous academic awards, industry accolades, and media attention, she had to fight early skepticism in the 1990s from other experts in robotics and AI. At the time, robots were seen as physical and industrial tools, not social or emotional companions. Her first social robot, Kismet, was unfairly called out in the popular press as "useless". Breazeal bucked the trend with a very different vision: "I wanted to create robots with social and emotional intelligence that could work in collaborative partnership with people. In 2-5 years, I see social robots helping families with things that really matter, like education, health, eldercare, entertainment, and companionship." She hopes her work and influence will inspire others to create robots "not only with smarts, but with heart, too."


Cisco Unveils Server for Artificial Intelligence and Machine Learning

#artificialintelligence

SAN JOSE, Calif. -- September 10, 2018 -- Artificial intelligence (AI) and machine learning (ML) are opening up new ways for enterprises to solve complex problems, but they will also have a profound effect on the underlying infrastructure and processes of IT. According to Gartner, "only 4% of CIOs worldwide report that they have AI projects in production." That number is bound to grow, and when it does, IT will struggle to manage new workloads, new traffic patterns, and new relationships within their business. To help enterprises address these emerging challenges, Cisco is unveiling its first server built from the ground up for AI and ML workloads.


How AI Could Destroy The Universe… With Paperclips!!!

#artificialintelligence

It took me 4 hours and 5 minutes to effectively annihilate the Universe by pretending to be an Artificial Intelligence tasked with making paperclips. Put another way, it took me 4 hours and 5 minutes to have an existential crisis. This was done by playing the online game "Universal Paperclips", which was released in 2017. Though the clip-making goal of the game is in itself simple, there are so many contemporary lessons to be extracted from the playthrough that a deep dive seems necessary. Indeed, the game explores our past, present and future in the most interesting way, especially when it comes to the technological advances Silicon Valley is currently oh so proud of.


One-shot Learning for iEEG Seizure Detection Using End-to-end Binary Operations: Local Binary Patterns with Hyperdimensional Computing

arXiv.org Machine Learning

This paper presents an efficient binarized algorithm for both learning and classification of human epileptic seizures from intracranial electroencephalography (iEEG). The algorithm combines local binary patterns with brain-inspired hyperdimensional computing to enable end-to-end learning and inference with binary operations. It first transforms the iEEG time series from each electrode into local binary pattern codes, then uses atomic high-dimensional binary vectors to construct composite representations of seizures across all electrodes. For the majority of our patients (10 out of 16), the algorithm quickly learns from one or two seizures (i.e., one-/few-shot learning) and generalizes perfectly to 27 further seizures. For the other patients, the algorithm requires three to six seizures for learning. Overall, our algorithm surpasses state-of-the-art methods in detecting 65 novel seizures, with higher specificity and sensitivity and a lower memory footprint.
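For readers who want a feel for the two binary ingredients the abstract names, here is a simplified Python sketch of local binary patterns plus hyperdimensional bundling and nearest-prototype classification. The dimensionality, LBP code length, and classification rule are illustrative assumptions, not the authors' exact configuration.

import numpy as np

D = 10_000    # HD vector dimensionality; large, as is typical in HD computing
LBP_BITS = 6  # LBP code length (illustrative; the paper's value may differ)

rng = np.random.default_rng(0)
# Item memory: a fixed random binary HD vector for each possible LBP code.
item_memory = rng.integers(0, 2, size=(2 ** LBP_BITS, D), dtype=np.uint8)

def lbp_codes(signal):
    """Bit i of each code is 1 when the signal rises between samples i and i+1."""
    rises = (np.diff(signal) > 0).astype(np.uint8)
    windows = np.lib.stride_tricks.sliding_window_view(rises, LBP_BITS)
    return windows @ (2 ** np.arange(LBP_BITS))

def encode(signal):
    """Bundle the HD vectors of all observed LBP codes by majority vote."""
    codes = lbp_codes(signal)
    ones = item_memory[codes].sum(axis=0)            # per-dimension count of 1s
    return (2 * ones > len(codes)).astype(np.uint8)  # majority rule

def hamming(a, b):
    return int(np.count_nonzero(a != b))

# One-shot "training": each prototype is the encoding of a single labeled
# segment (random stand-ins here); inference is nearest prototype in
# Hamming distance -- binary operations end to end.
seizure_proto = encode(rng.standard_normal(1024))
baseline_proto = encode(rng.standard_normal(1024))
segment = encode(rng.standard_normal(1024))
print("seizure" if hamming(segment, seizure_proto)
      < hamming(segment, baseline_proto) else "baseline")

The appeal of the approach is that item lookup, bundling, and Hamming comparison are all cheap bitwise-style operations, which is consistent with the low memory footprint the abstract reports.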