The AI algorithm, whose name can be translated as either Dragon Eye or Dragonfly Eye, was developed by Shanghai-based tech firm Yitu. It draws on China's national database, which covers all 1.3 billion residents of the country as well as 500 million more people who have entered it at some point. Dragon Eye interfaces with the database to detect and identify the faces of individuals. Yitu chief executive and co-founder Zhu Long told the South China Morning Post (SCMP) that the purpose of the algorithm is to fight crime and make the world a safer place. "Let's say that we live in Shanghai, a city of 24 million people.
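The core operation such a system performs is comparing a face from a camera against every face in a stored database. A minimal sketch, assuming the common design of representing each face as a numeric embedding vector and matching by cosine similarity (Yitu's actual method is not public; the function names and threshold here are illustrative, and a real billion-face system would use an approximate nearest-neighbor index rather than a brute-force scan):

```python
import numpy as np

def match_face(query_embedding, database, threshold=0.6):
    """Return the index of the closest database face, or None if no
    candidate clears the similarity threshold. Hypothetical sketch:
    the threshold and the brute-force scan are illustrative only.
    """
    # Normalize rows so the dot product equals cosine similarity.
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    q = query_embedding / np.linalg.norm(query_embedding)
    sims = db @ q                     # one similarity score per person
    best = int(np.argmax(sims))       # most similar enrolled face
    return best if sims[best] >= threshold else None
```

A query embedding close to a stored one returns that person's index; an unenrolled face falls below the threshold and returns None, which is why the threshold choice trades false matches against missed ones.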
Apple will let you unlock the iPhone X with your face - a move likely to bring facial recognition to the masses. But along with the roll-out of the technology come concerns over how it could be used. Despite Apple's safeguards, privacy activists fear the widespread use of facial recognition would 'normalise' the technology. This could open the door to broader use by law enforcement, marketers or others of a largely unregulated tool, creating a 'surveillance technology that is abused', experts have warned.
Apple's new facial recognition software for unlocking the iPhone X has raised questions about privacy and the technology's susceptibility to hacking. The iPhone X is set to go on sale on Nov. 3, and the world waits with bated breath as Apple prepares to release a slew of new features, including a facial scan: a user can simply look at the phone to unlock it. This convenient new technology is set to replace numeric and pattern locks and comes with a number of privacy safeguards.
WASHINGTON – Apple will let you unlock the iPhone X with your face -- a move likely to bring facial recognition to the masses, along with concerns over how the technology may be used for nefarious purposes. Apple's newest device, set to go on sale on Friday, is designed to be unlocked with a facial scan with a number of privacy safeguards -- as the data will only be stored on the phone and not in any databases. Unlocking one's phone with a face scan may offer added convenience and security for iPhone users, according to Apple, which claims its "neural engine" for FaceID cannot be tricked by a photo or hacker. While other devices have offered facial recognition, Apple is the first to pack the technology allowing for a three-dimensional scan into a hand-held phone. But despite Apple's safeguards, privacy activists fear the widespread use of facial recognition would "normalize" the technology and open the door to broader use by law enforcement, marketers or others of a largely unregulated tool.
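The privacy property Apple emphasizes is architectural: the enrolled face data lives only on the phone, so there is no central database to query or breach. A toy sketch of that on-device design (the class, method names, and threshold are illustrative inventions, not Apple's implementation; FaceID's actual matching runs on dedicated hardware against a 3-D depth map):

```python
import numpy as np

class OnDeviceFaceLock:
    """Toy model of on-device face matching: the enrolled template is
    held only in local memory and is never transmitted anywhere.
    All names and the similarity threshold are hypothetical."""

    def __init__(self, threshold=0.9):
        self._template = None          # stays on the device, never uploaded
        self.threshold = threshold

    def enroll(self, scan):
        # Store a normalized copy of the owner's scan embedding locally.
        scan = np.asarray(scan, dtype=float)
        self._template = scan / np.linalg.norm(scan)

    def unlock(self, scan):
        # Compare a fresh scan against the local template only.
        if self._template is None:
            return False
        scan = np.asarray(scan, dtype=float)
        sim = float(self._template @ (scan / np.linalg.norm(scan)))
        return sim >= self.threshold
```

The contrast with a system like Dragon Eye is the data flow, not the matching math: here the comparison never leaves the handset, whereas a centralized system must ship face data to a server-side database.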
Los Angeles' Blade Runner-esque future of a world watched by robots is here. On Tuesday, a civilian oversight panel gave the Los Angeles Police Department (LAPD) the OK to begin a year-long drone trial, primarily for reconnaissance in "tactical missions" conducted by SWAT. The decision came after a contentious meeting and a protest by privacy advocates who oppose the use of drones by law enforcement. The LAPD is the nation's third largest police force, behind New York and Chicago, and the trial makes it the largest police force in the nation to use drones: the Chicago PD and New York PD confirmed in official statements to Mashable that neither force deploys them.
Today, however, the convergence of complex algorithms, big data, and exponential increases in computational power has produced a world where AI raises significant ethical and human rights dilemmas, involving rights ranging from privacy to due process. Although less dramatic than military applications, the development of AI in the domestic sector also opens the door to significant human rights issues such as discrimination and systemic racism. Police forces across the country, for example, are increasingly turning to automated "predictive policing" systems that ingest large amounts of data on criminal activity, demographics, and geospatial patterns to produce maps of where algorithms predict crime is likely to occur.
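At their simplest, such systems bin historical incident locations into grid cells and rank the cells, which is where the bias concern originates: the map reflects where past data was collected, not where crime actually occurs. A deliberately simplified sketch of that core step (real products layer demographic and temporal models on top; this function and its parameters are illustrative only):

```python
from collections import Counter

def crime_heatmap(incidents, cell_size=1.0):
    """Bin past incident coordinates into grid cells and rank cells
    by count - the core of a 'predictive policing' map. Hypothetical
    sketch: feeding in historically over-policed areas simply sends
    more patrols back to those same cells, a feedback loop.
    """
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in incidents
    )
    # Most frequently reported cells first.
    return counts.most_common()
```

Because the input is reported incidents rather than actual crime, neighborhoods with heavier historical enforcement dominate the output regardless of underlying crime rates, which is the discrimination mechanism critics point to.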
The latter is, given its importance, protected not only by national health legislation but also by Article 8 of the European Convention on Human Rights, the right to privacy. As already mentioned, medicine is a profession that requires a certain level of secrecy around confidential information, and according to the Court's previous decisions that secrecy is even more important in cases involving psychiatric records. The involvement of robots in medical treatment on the one hand, and their easy access to the information gained during treatment on the other, call into question the effectiveness of the provisions of Article 8 of the European Convention on Human Rights. Legislation in countries around the world currently pays little attention to this particular area, even though modern robotic approaches have already been introduced and widely accepted.
Just one week after the sheriff's department in Cecil County, Md., got its brand new drone up and running, it was asked to investigate a case of stolen construction equipment, so the sheriff sent his Typhoon H Pro to the scene. The sheriff's department in Somerset County, N.J., hopes its drones could help it find missing people. "Years ago, when we had people wander off, we would bring out the rescue department, the fire department, fire department volunteers, K-9 if we had it and we'd search and search and search and never find the person," said Somerset County Sheriff Frank Provensano.
Hamid Khan, founder of the Stop LAPD Spying Coalition, said that the drones could provide a 'backdoor' for sharing information with the police. Melanie Ochoa, staff attorney at the ACLU, said: 'We can't protect against mission creep because we don't know what the mission is to start with.'
The answer, according to some former NSA analysts, is that the agency routinely monitors much of its employees' computer activity. Employee monitoring is a $200 million-a-year industry, according to a study last year by 451 Research, a technology research firm, and is estimated to be worth $500 million by 2020. The practice recently came to light in a high-profile lawsuit involving Uber and Waymo, the self-driving car company owned by Google's parent firm, Alphabet. Privacy advocates have been pushing for years to have Congress review various communications privacy laws in light of updates to technology.