When the AI Professor Leaves, Students Suffer, Study Says


A study by researchers from the University of Rochester found that an exodus of artificial intelligence (AI) professors from North American universities to the private sector has reduced the likelihood that graduate students will found new AI companies. Those graduates who did start a company usually attracted less venture capital, with the field of deep learning especially affected, according to "Artificial Intelligence, Human Capital, and Innovation," by Michael Gofman and Zhao Jin. This academic attrition could hinder innovation and economic expansion over time, the researchers suggest. The technology industry mostly ignored deep learning's potential until 2010, but interest grew as the Internet produced more data and new computer chips reduced the analytical burden. Large tech companies have hired many academic specialists, including two recent recipients of the ACM A.M. Turing Award honored for their work on neural networks.

Microsoft Vision AI Developer Kit Simplifies Building Vision-Based Deep Learning Projects – Tech Check News


Computer vision is one of the most popular applications of artificial intelligence. Image classification, object detection and object segmentation are some of the use cases of computer vision-based AI. These techniques are used in a variety of consumer and industrial scenarios. From face recognition-based user authentication to inventory tracking in warehouses to vehicle detection on roads, computer vision is becoming an integral part of next-generation applications.
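The three tasks mentioned above differ mainly in the shape of their output: classification assigns one label to the whole image, detection returns a set of labeled boxes, and segmentation labels every pixel. A minimal sketch with toy data (no real model; the labels and boxes are invented for illustration):

```python
import numpy as np

# A toy 64x64 RGB "image" standing in for a camera frame.
image = np.zeros((64, 64, 3), dtype=np.uint8)

# Image classification: one label (and score) for the whole frame.
classification = {"label": "vehicle", "score": 0.97}

# Object detection: a list of boxes, each (x, y, width, height) plus a label.
detections = [
    {"box": (10, 12, 20, 15), "label": "vehicle"},
    {"box": (40, 30, 12, 25), "label": "person"},
]

# Semantic segmentation: a class index for every pixel.
segmentation = np.zeros(image.shape[:2], dtype=np.int64)
segmentation[12:27, 10:30] = 1  # pixels belonging to the "vehicle" class

print(classification["label"])  # vehicle
print(len(detections))          # 2
print(segmentation.shape)       # (64, 64)
```

In a real system the dictionaries and arrays above would be produced by a trained network; only the output structure is the point here.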

A method to introduce emotion recognition in gaming


Virtual Reality (VR) is opening up exciting new frontiers in the development of video games, paving the way for increasingly realistic, interactive and immersive gaming experiences. VR consoles, in fact, allow gamers to feel like they are almost inside the game, overcoming limitations associated with display resolution and latency issues. An interesting further integration for VR would be emotion recognition, as this could enable the development of games that respond to a user's emotions in real time. With this in mind, a team of researchers at Yonsei University and Motion Device Inc. has recently proposed a deep-learning-based technique that could enable emotion recognition during VR gaming experiences. Their paper was presented at the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces.
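The closed loop the researchers aim for, game state adapting to a recognised emotion, can be sketched generically. Everything here is a stand-in: in a real system the class scores would come from a trained deep network, and the game responses are invented for illustration:

```python
import numpy as np

EMOTIONS = ["neutral", "happy", "fear", "anger"]

def softmax(logits):
    """Convert raw model scores into a probability distribution."""
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

def adapt_game(logits):
    """Pick the most likely emotion and return a hypothetical game tweak."""
    probs = softmax(np.asarray(logits, dtype=float))
    emotion = EMOTIONS[int(np.argmax(probs))]
    responses = {
        "neutral": "no change",
        "happy": "raise difficulty slightly",
        "fear": "dim lights, slow enemy spawns",
        "anger": "offer an easier path",
    }
    return emotion, responses[emotion]

# Logits as a trained emotion classifier might emit them each frame.
emotion, action = adapt_game([0.2, 0.1, 2.3, 0.4])
print(emotion, "->", action)  # fear -> dim lights, slow enemy spawns
```

The design point is simply that recognition runs per frame and feeds a lookup the game loop can act on immediately, which is what "responding to emotions in real time" amounts to in practice.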

Neural implants and the race to merge the human brain with Artificial Intelligence


There is a new race in Silicon Valley involving Artificial Intelligence, and no, it's not HealthTech, FinTech, or Voice Commerce, nor does it involve Google, Facebook or Microsoft. This race involves the brain, and more specifically brain-computer interfaces. It also involves technology royalty, the US government, billion-dollar defence companies, a big connection to PayPal and years of medical research to better understand the human brain and implant devices that could make a consumer brain-computer interface a reality. The race is called "Neural implants, merging the human brain with AI." So what exactly are neural implants? Brain implants, often referred to as neural implants, are technological devices that connect directly to a biological subject's brain – usually placed on the surface of the brain, or attached to the brain's cortex. A common purpose of modern brain implants, and the focus of much current research, is establishing a biomedical prosthesis that circumvents areas of the brain that have become dysfunctional after a stroke or other head injury.[1]

Study says artificial intelligence may improve kidney disease diagnosis


WASHINGTON DC: Researchers discovered that modern machine learning, a branch of artificial intelligence, may augment the traditional way of diagnosing kidney disease. Pathologists often classify various kidney diseases on the basis of visual assessments of biopsies from patients' kidneys; however, machine learning has the potential to automate and augment the accuracy of classifications. In one study, a team led by Pinaki Sarder, PhD, and Brandon Ginley, BS (Jacobs School of Medicine and Biomedical Sciences at the University at Buffalo) developed a computational algorithm to detect the severity of diabetic kidney disease without human intervention. The algorithm examined a digital image of a patient's kidney biopsy at the microscopic level and extracted information on glomeruli, the small blood vessels of the kidney that filter waste from the blood for excretion. These structures are known to become progressively damaged and scarred over the course of diabetes, reported the study, published in the Journal of the American Society of Nephrology.
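As a rough illustration of the kind of pipeline described (not the authors' actual algorithm), one can imagine a system that detects glomeruli in a digitized biopsy and grades severity from the fraction that appear damaged; the thresholds and counts below are invented for the sketch:

```python
import numpy as np

def grade_biopsy(glomeruli_scarred, thresholds=(0.25, 0.5, 0.75)):
    """Map the fraction of scarred glomeruli to a severity grade.

    glomeruli_scarred: one boolean per detected glomerulus
    (True = scarred). The threshold values are hypothetical,
    chosen only to illustrate the grading step.
    """
    fraction = float(np.mean(glomeruli_scarred))
    for grade, cutoff in enumerate(thresholds):
        if fraction < cutoff:
            return grade, fraction
    return len(thresholds), fraction

# Suppose the detector found 8 glomeruli in a biopsy image, 3 scarred.
grade, fraction = grade_biopsy([True, True, True] + [False] * 5)
print(grade, fraction)  # 1 0.375
```

The automation the study describes lies upstream of this step, in segmenting and characterizing the glomeruli from pixels; once those per-structure features exist, the final severity call reduces to simple rules like the above.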

Police Use of Facial Recognition Is Accepted by British Court


In one of the first lawsuits to address the use of live facial recognition technology by governments, a British court ruled on Wednesday that police use of the systems is acceptable and does not violate privacy or human rights. The case has been closely watched by law enforcement agencies, privacy groups and government officials because there is little legal precedent concerning the use of cameras in public spaces that scan people's faces in real time and attempt to identify them from photo databases of criminal suspects. While the technology has advanced quickly, with many companies building systems that can be used by police departments, laws and regulations have been slower to develop. The High Court dismissed the case brought by Ed Bridges, a resident of Cardiff, Wales, who said his rights were violated by the use of facial recognition by the South Wales Police. Mr. Bridges claimed that he had been recorded without permission on at least two occasions -- once while shopping and again while attending a political rally.

Police robot can be flung through windows and distract suspects

New Scientist

Police robots thrown through a broken window could be used to distract suspects before officers enter a room. The idea is to add a device that produces a loud bang and a brilliant flash to the small robots already used by many US police departments. Weighing about half a kilogram, Throwbots can be tossed through windows or over walls and driven around to explore building interiors with video, audio and infrared sensors.

What's the deal with deep learning?


Facial recognition: controversial as it stands right now, facial recognition is still being introduced in a multitude of services and applications. Probably the most renowned one is Facebook's use of deep-learning-based recognition to tag images uploaded to the platform. Some of the tests being conducted right now use facial recognition as part of security systems in public spaces. The purpose is to use neural networks to aid in the search for missing persons, while also quickly identifying criminals and terrorists. In this case, deep learning is more than just the comparison of a face against a database, because the algorithms are capable of factoring in changes in hairstyles, minor surgeries, and even modifications due to the conditions of the place where the image is taken.
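The "more than a database comparison" point can be made concrete: systems of this kind typically map each face to an embedding vector, so matching becomes a nearest-neighbour search in embedding space rather than a pixel-by-pixel comparison, which is what lets a changed hairstyle still land near the same identity. A sketch with made-up embeddings (real vectors would come from a trained network; the names and threshold are invented):

```python
import numpy as np

def cosine_similarity(a, b):
    """Angle-based similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query, database, threshold=0.8):
    """Return the closest identity, or None if nothing is similar enough."""
    scores = {name: cosine_similarity(query, vec) for name, vec in database.items()}
    name, score = max(scores.items(), key=lambda kv: kv[1])
    return (name, score) if score >= threshold else (None, score)

rng = np.random.default_rng(0)
database = {"alice": rng.normal(size=128), "bob": rng.normal(size=128)}

# A new photo of "alice" after a haircut: same embedding plus small noise.
query = database["alice"] + 0.1 * rng.normal(size=128)
name, score = best_match(query, database)
print(name)  # alice
```

The threshold is what separates "identified" from "unknown person": a face absent from the database should score below it against every stored identity.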

UK court backs police use of face recognition, but fight isn't over

New Scientist

Ed Bridges, a man from Cardiff, UK, says the police breached his human rights when they used facial recognition technology, but today a court ruled that the police's actions were lawful. That is, however, hardly the end of the matter. South Wales Police has been trialling automated facial recognition (AFR) technology since April 2017. Other forces around the country are trialling similar systems, including London's Metropolitan Police. Bridges may have been snapped during a pilot called AFR Locate.

Tesla was on Autopilot when it slammed into a firetruck in California, NTSB says

USATODAY - Tech Top Stories

This Jan. 22, 2018, file still frame from video provided by KCBS-TV shows a Tesla Model S electric car that crashed into a fire engine on Interstate 405 in Culver City, Calif.

DETROIT – A government report says the driver of a Tesla that slammed into a firetruck near Los Angeles last year was using the car's Autopilot system when a vehicle in front of him suddenly changed lanes and he didn't have time to react. The National Transportation Safety Board said Tuesday the driver never saw the parked firetruck and didn't brake. The report raises further questions about the effectiveness of Tesla's system, which was in operation during several other crashes, including two fatal ones in Florida and one in Silicon Valley. Tesla warns drivers that the system is not fully autonomous and that they must be ready to intervene.