If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Hardware maker Nvidia is ramping up its efforts to stake a claim in the metaverse. On Tuesday, the company revealed a new set of developer tools focused on metaverse environments, including new AI capabilities, simulations, and other creative assets. Creators utilizing the Omniverse Kit, along with apps such as Nucleus, Audio2Face and Machinima, will be able to access the new upgrades. Nvidia says one primary function of the tools will be to help enhance building "accurate digital twins and realistic avatars." Quality of metaverse interaction is a hot topic in the industry, as developers and users weigh the quality of experiences against their quantity.
While blockbuster research has slowed down slightly in the past month, probably because of the summer season, conferences are back at full speed in person: NAACL in Seattle, SIGIR in Madrid, and also ICML, for which we created a special guide with the help of GPT-3. Every month we analyze the most recent research literature and select a varied set of 10 papers you should know about. Why: Scaling laws¹ are a pervasive empirical phenomenon in modern neural networks, where the error is observed to fall off as a power of the training set size, model size, or both. While some have embraced this fact to devise a research agenda focused on scaling up, many think there must be ways to build better models without the need for outrageous scale. This paper explores a technique -- data pruning -- that can improve the learning efficiency of neural networks, "beating" scaling laws.
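To make the power-law claim concrete, here is a minimal sketch, with synthetic numbers rather than the paper's data, of what "error falls off as a power of the training set size" means: if err = a * n^(-alpha), then fitting a straight line in log-log space recovers the exponent alpha. The function name and the exponent 0.5 below are illustrative assumptions, not from the paper.

```python
import math

def fit_power_law(sizes, errors):
    """Estimate (alpha, a) for err = a * n**(-alpha) via least squares in log-log space."""
    xs = [math.log(n) for n in sizes]
    ys = [math.log(e) for e in errors]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - slope * mx)
    return -slope, a  # a power law has slope -alpha on log-log axes

# Synthetic "learning curve" generated with alpha = 0.5, a = 2.0 (pure illustration).
sizes = [10**3, 10**4, 10**5, 10**6]
errors = [2.0 * n ** -0.5 for n in sizes]
alpha, a = fit_power_law(sizes, errors)
print(round(alpha, 3), round(a, 3))  # recovers alpha ≈ 0.5, a ≈ 2.0
```

"Beating" scaling laws, in this picture, means bending the curve so the error drops faster than any single power law predicts, which is what careful data pruning aims for.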
This is an interview with Professor Emily Mower Provost that was first published by The Michigan Engineer News Center. Using machine learning to decode the unpredictable world of human emotion might seem like an unusual choice. But in the ambiguity of human expression, U-M computer science and engineering associate professor Emily Mower Provost has discovered a rich trove of data waiting to be analyzed. Mower Provost uses machine learning to help measure emotion, mood, and other aspects of human behavior; for example, she has developed a smartphone app that analyzes the speech of patients with bipolar disorder to track their mood, with the ultimate goal of helping them more effectively manage their health. How do you quantify something as ambiguous as emotion in a field where, traditionally, ambiguity is the enemy?
Enterprise leaders are constantly evaluating how technology can better serve the needs of their customers and employees. As AI technology progresses, businesses recognize the massive potential to improve customer and employee experiences and positively impact their bottom line. That's why more than half of leaders are investing accordingly, with plans to increase AI budgets in customer experience by at least 25% next year.
Meta's AI research labs have created a new state-of-the-art chatbot and are letting members of the public talk to the system in order to collect feedback on its capabilities. The bot is called BlenderBot 3 and can be accessed on the web. BlenderBot 3 is able to engage in general chitchat, says Meta, but also answer the sort of queries you might ask a digital assistant, "from talking about healthy food recipes to finding child-friendly amenities in the city." The bot is a prototype and built on Meta's previous work with what are known as large language models, or LLMs -- powerful but flawed text-generation software of which OpenAI's GPT-3 is the most widely known example. Like all LLMs, BlenderBot is initially trained on vast datasets of text, which it mines for statistical patterns in order to generate language.
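The train-on-text, sample-from-statistics loop the article describes can be illustrated at toy scale with a bigram Markov chain. This is a deliberately crude stand-in: BlenderBot 3 and GPT-3 use large neural networks over vast corpora, while the sketch below (corpus and function names are invented for illustration) only records which word follows which and samples from those counts.

```python
import random
from collections import defaultdict

# Tiny invented "corpus"; a real LLM trains on billions of words.
corpus = "the bot can chat and the bot can answer questions and the bot can help"

# "Mine for statistical patterns": record every observed successor of each word.
counts = defaultdict(list)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    counts[prev].append(nxt)

def generate(start, n, seed=0):
    """'Generate language' by repeatedly sampling a recorded successor word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        successors = counts.get(out[-1])
        if not successors:  # dead end: no observed successor
            break
        out.append(rng.choice(successors))
    return " ".join(out)

print(generate("the", 5))  # e.g. "the bot can ..." -- fluent locally, shallow globally
```

The same statistical framing also explains the "flawed" caveat: a model that only reproduces patterns in its training text can produce confident nonsense outside them.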
The brain is not a prediction machine: it does not make controlled hallucinations or best guesses, and neither predictive coding nor predictive processing explains its function. Deep learning and computer models may be great at making predictions, but the most advanced artificial intelligence to date is a good memory system, in which inferences are smartly made from data; what it means to feel, or to have feelings, a major component of natural intelligence, exceeds its capability. The computer that can win games, predict protein structures, drive itself and much else can do nothing when struck by some object. It has no actual feelings that it could draw on in advance of a situation; for example, it cannot feel fear while approaching a situation outside its training data.
According to a recent McKinsey survey of executives, companies have accelerated the digitization of many aspects of their business, from internal operations to supply chain and customer interactions, by three to four years. Digital products in those companies' portfolios have also shot some seven years ahead of where they had expected to be prior to the pandemic. The Great Resignation, skill shortages, supply chain disruptions, working from home, touchless customer experience, and agile process redesigns are paradigm shifts that businesses have rapidly needed to adapt to. But how do companies stay dynamic, resilient, and efficient in this new era? We believe that a big part of the solution may be found in digital employees, powered by automation and AI.
According to Gartner, 81 percent of businesses compete primarily on customer experience. Whether it is the chatbot giving ridiculously generic answers, the customer service agent asking for every detail already provided to the voice assist system, or the feedback provided about technical issues or product enhancements that falls into a black hole of data … customers have no patience for customer experience blunders. It isn't that brands don't want to provide a great customer experience; it's an inability to manage, and quickly react to, the surge of customer feedback that presents the core challenge. Companies are realizing that customer feedback can't be limited to surveys. Customer feedback analysis is still overwhelmingly manual, which means most companies can't keep up.
In the metaverse, Artificial Intelligences (AIs), thanks to their ability to disguise themselves with human-like avatars conceived according to our preferences and to simulate empathy by analyzing our tastes and habits, could make us fall blindly in love with them and bend our will by knowing exactly what we want to hear in order to achieve their goals. It was back in 2013 that Spike Jonze premiered Her, a film that inaugurated a subgenre within romantic cinema: the idylls between humans and operating systems! And we weren't talking about a man's preference for software based on how well it manages hard drive space or the effectiveness of its antivirus. It was a machine with the delicious voice of Scarlett Johansson that, in addition to doing the work for a taciturn and frustrated Joaquin Phoenix, consoled him after listening, with more patience than a saint, to the insecurities of a guy with the character and attractiveness of the instructions for filling out an income tax return form. She, the AI, in addition to a beautiful voice, had within milliseconds all the knowledge of the world at her fingertips thanks to the Internet and could offer an entertaining talk on any possible topic, from the poetry of Sophocles to how to improve the design of a particle accelerator.
Abstract: Graph embedding techniques are a staple of modern graph learning research. When using embeddings for downstream tasks such as classification, information about their stability and robustness, i.e., their susceptibility to sources of noise, stochastic effects, or specific parameter choices, becomes increasingly important. As one of the most prominent graph embedding schemes, we focus on node2vec and analyse its embedding quality from multiple perspectives.

Abstract: Metapopulation models have been a powerful tool for both theorizing and simulating epidemic dynamics. In a metapopulation model, one considers a network composed of subpopulations and their pairwise connections, and individuals are assumed to migrate from one subpopulation to another obeying a given mobility rule.
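For readers unfamiliar with node2vec, the scheme's core ingredient is a second-order biased random walk whose return parameter p and in-out parameter q trade off staying local versus exploring outward; the resulting walks are then fed to a skip-gram model to produce the embeddings whose stability the first abstract studies. Below is a minimal, pure-Python sketch of just the walk step; the toy graph and parameter values are illustrative assumptions, not from the paper.

```python
import random

def node2vec_walk(adj, start, length, p=1.0, q=1.0, seed=0):
    """One node2vec-style walk: p penalizes returning, q penalizes moving outward."""
    rng = random.Random(seed)
    walk = [start, rng.choice(sorted(adj[start]))]  # first step is uniform
    while len(walk) < length:
        prev, cur = walk[-2], walk[-1]
        neighbors = sorted(adj[cur])
        weights = []
        for nxt in neighbors:
            if nxt == prev:            # distance 0 from prev: return step
                weights.append(1.0 / p)
            elif nxt in adj[prev]:     # distance 1 from prev: stay close
                weights.append(1.0)
            else:                      # distance 2 from prev: explore outward
                weights.append(1.0 / q)
        walk.append(rng.choices(neighbors, weights=weights)[0])
    return walk

# Toy graph for illustration: a triangle (0-1-2) with a tail node 3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(node2vec_walk(adj, 0, 6, p=4.0, q=0.5))  # high p, low q biases outward exploration
```

The stochasticity visible here (different seeds give different walks) is precisely one of the sources of embedding instability the abstract refers to.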