If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Neurological diseases, such as cerebrovascular disease, Parkinson's disease (PD), and Alzheimer's disease, have become the leading cause of death in China. Neurological function evaluation is crucial for the diagnosis and treatment of these diseases. Clinically, neurological function is evaluated with various scales, tests, and questionnaires. However, these methods rely on costly professional equipment and medical personnel, so they cannot serve as a means of daily evaluation of neurological diseases.
Researchers at the Massachusetts Institute of Technology (MIT) have used machine learning to reduce the processing power needed to render convincing holographic images, making it possible to generate them in near-real time on consumer-level computer hardware. Such a method could pave the way to portable virtual-reality systems that use holography instead of stereoscopic displays. Stereo imagery can present the illusion of three-dimensionality, but users often complain of dizziness and fatigue after long periods of use because there is a mismatch between where the brain expects to focus and the flat focal plane of the two images. Switching to holographic image generation overcomes this problem; it uses interference in the patterns of many light beams to construct visible shapes in free space that present the brain with images it can more readily accept as three-dimensional (3D) objects. "Holography in its extreme version produces a full optical reproduction of the image of the object. There should be no difference between the image of the object and the object itself," says Tim Wilkinson, a professor of electrical engineering at Jesus College of the U.K.'s University of Cambridge.
Houston-based hairstylist Taylor Crowley, 36, has built a reputation as a social media influencer and has been using augmented reality (AR) filters for the past few years as a "confidence booster." "I don't wear a ton of makeup because less is more," she explains. "I try to choose filters that aren't going to distort my face." Crowley is also "big into photography" and views image filters like a filter on a camera that can be used to change tonal qualities. In one Instagram post of her posing with a large fish, Crowley used Adobe Lightroom to turn everything grayscale "because we live in Houston and fish in Galveston, and honestly, not everything is very pretty," she says. "I thought grayscale made the fish pop out." Similarly, Crowley posted a picture of herself in a bathing suit on Cinco de Mayo and grayed out the background to accentuate the beer can she was drinking from and her green bathing suit. As someone who views content herself, "I think editing things that show a little more color or pop … grabs my attention a little bit more." Crowley is quick to add that she does not use filters when she posts client-related content. "Because I'm a hair stylist, I feel that it's cheating" to use filters, she says. "I also want people to have a reasonable expectation when they come to get their hair done."
Researchers from the Institute of Electrical and Electronics Engineers (IEEE) have developed a method to increase the realism of low-cost, projection-based augmented reality installations. Special glasses cause projected 3D images to go in and out of focus just as real objects would, overcoming a critical perceptual hurdle for the practical use of projection systems in controlled environments. The IEEE system recreates depth planes for projected real and CGI imagery superimposed into rooms. In one demonstration, three CGI Stanford bunnies are superimposed at the same depth plane as three real-world objects, and their blurriness is controlled by where the viewer is looking and focusing. The system uses electrically focus-tunable lenses (ETLs) embedded in the viewer's glasses (which are in any case necessary to separate the two image streams into a convincing, integrated 3D experience); the lenses communicate with the projection system, which automatically adjusts the blurriness of the projected image seen by the viewer. The ETLs report back information about the user's focal attention, and the system sets the level of blurriness on a per-plane basis when rendering the projected geometry.
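The idea of setting blur per depth plane from the viewer's focal state can be sketched with a simple circle-of-confusion model. This is an illustrative assumption, not the researchers' actual implementation: the function names, the dioptre-based blur model, and all constants below are hypothetical.

```python
# Illustrative sketch: pick a blur radius for each depth plane given the
# viewer's current focal depth, as reported by focus-tunable-lens glasses.
# Defocus blur is modeled as proportional to the dioptre difference
# (1/distance) between a plane and the focal depth -- an assumption.

def blur_radius(plane_depth_m: float, focal_depth_m: float,
                aperture_gain: float = 5.0, max_radius: float = 20.0) -> float:
    """Blur radius in pixels for one depth plane.

    The in-focus plane gets zero blur; planes farther (in dioptres) from
    the viewer's focal depth are rendered progressively blurrier, clamped
    to max_radius so extreme defocus stays bounded.
    """
    dioptre_error = abs(1.0 / plane_depth_m - 1.0 / focal_depth_m)
    return min(aperture_gain * dioptre_error, max_radius)

# Three depth planes (metres); the viewer is focusing on the middle one.
planes = [0.5, 1.0, 2.0]
radii = [blur_radius(d, focal_depth_m=1.0) for d in planes]
# The 1.0 m plane is sharp; the 0.5 m and 2.0 m planes are blurred.
```

Each frame, the renderer would re-query the glasses for the focal depth and re-blur each plane's imagery with its computed radius before projection.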
Given all the innovations taking place in the AI world, the possibilities that it creates are seemingly endless. Yet with increasingly prevalent concerns about ethics and the responsible use of AI, many business and information technology leaders are looking to better understand how this technology will affect organizations and society, both now and in the future. Looking at future scenarios for how AI could evolve can help IT leaders demystify this emerging technology and better understand both its possibilities and its limitations. Augmented intelligence is the idea of taking human intelligence as a starting point and supporting it through intuitive, yet non-invasive, user interfacing – such as holograms or interactive visualizations. Examples in practice already exist, including augmented diagnosis for medical specialists or contextual recommendations for call center agents.
In an increasingly competitive world, we need a deep understanding of the business in which we operate, how it is evolving, and the new innovations we could embrace or build to remain competitive and conquer new market segments. To do this, we must develop a clear vision of transformation that takes us to another level of performance. By embracing Digital Transformation, we will deal with artificial intelligence, machine and deep learning, virtual reality, and many other innovative technologies. At first sight, it might even seem daunting to lead the business in such a complex and intricate direction. With this in mind, we will consider some strategies to better understand, and take competitive advantage of, the huge stream of data in the current era of the digital revolution.
Facebook is reportedly preparing to unveil a new name as the company seeks to rebrand, and the internet has already come through with some pointed suggestions. The plans, first reported by The Verge on Tuesday, come at a time of upheaval for the company. In the last few months alone, Facebook has been served with a lawsuit from the Federal Trade Commission, was the subject of a congressional hearing after a whistleblower revealed worrying internal practices at the company, and is facing a walkout of moderators over working conditions. Ideas for how to overhaul its toxic image followed swiftly after the news broke. Reporter Katie Notopoulos at Buzzfeed put forward a number of options: BookFace, MySpace, Facey McBookface, Definitely NOT Facebook, Hellsite, Oops We Facilitated Genocide, and The Good And Nice Company, Not At All Evil.
It seems that the world is gradually turning everything into data, and we are storing it at lightspeed. How are banks capitalising on new sources of data and using them to develop new services? Our watches now track our sleep, our movement, our heart rate, and much more. With this data, they can "coach" us with training plans tailored to our individual capabilities. At the top end, smartwatches let you pay for things, listen to music, get guided GPS navigation, and more.
The world of artificial intelligence is only just beginning.
AI that understands the world from a first-person point of view could unlock a new era of immersive experiences, as devices like augmented reality (AR) glasses and virtual reality (VR) headsets become as useful in everyday life as smartphones. Imagine your AR device displaying exactly how to hold the sticks during a drum lesson, guiding you through a recipe, helping you find your lost keys, or recalling memories as holograms that come to life in front of you. To build these new technologies, we need to teach AI to understand and interact with the world like we do, from a first-person perspective -- commonly referred to in the research community as egocentric perception. Today's computer vision (CV) systems, however, typically learn from millions of photos and videos that are captured in third-person perspective, where the camera is just a spectator to the action. "Next-generation AI systems will need to learn from an entirely different kind of data -- videos that show the world from the center of the action, rather than the sidelines," says Kristen Grauman, lead research scientist at Facebook.