From helping humans live longer and hacking our performance to repairing the body and understanding the brain, WIRED Health will hear from the innovators transforming this critical sector. It's only a matter of time before robots replace surgeons in the operating theatre, according to cancer specialist and virtual reality surgery pioneer Shafi Ahmed. Ahmed is a leading proponent of virtual and augmented reality in operating theatres. In April 2016 he became the first surgeon to live-stream a surgical procedure in virtual reality, with millions of people worldwide watching him remove a tumour from the colon of a patient in his 70s. But this isn't a new form of gory entertainment: Ahmed hopes that virtual reality can revolutionise the way surgeons are trained, especially in the developing world.
TL;DR: Virtual reality can be immersive and fun, but add natural, deep artificial intelligence and you quite literally get a new world that, beyond the computer-generated environment around you, may not actually be so virtual. As virtual reality applications increase in quality and variety, early developers are introducing new interaction dynamics that expand the richness of immersive virtual worlds. By layering in aspects of natural artificial intelligence, experiences are emerging that lose the feeling of being so "unreal": distinct memories, interactions and relationships are being created that cause the user to ask, if it happens in real life, but inside a headset, does that not make it real? Of our five senses, head-mounted displays (HMDs) handle vision, and a solid pair of 3D headphones like OSSIC's handles sound; AxonVR and others are working on haptics and touch. Next up are smell and taste, which should be, well…interesting. But beyond the five senses that create the feeling of physical "presence" in a virtual space lies the "immersion" of having a real experience: encountering the unexpected and having the opportunity to create very real memories.
Professors and cognitive researchers frequently depend on test scores to determine how well students comprehend lessons. However, this practice ignores many critical aspects of learning, such as the engaging effect of classroom discussion or the interests and motivations of individual learners. Traditionally, a neutral observer would be needed to recognise these unquantifiable moments of a great teaching experience, but human observation is time-consuming and expensive. One could videotape classrooms, but that would be just as cumbersome and costly, requiring an expert to interpret and analyse the recordings afterwards. Thanks to advances in artificial intelligence, education researchers and computer scientists have devised smart systems that can watch and listen in on classrooms and instantaneously analyse the quality of a teacher's classroom delivery.