Apple will let you unlock the iPhone X with your face - a move likely to bring facial recognition to the masses. But along with the roll out of the technology come concerns over how it could be used. Despite Apple's safeguards, privacy activists fear the widespread use of facial recognition would 'normalise' the technology. This could open the door to broader use by law enforcement, marketers or others of a largely unregulated tool, creating a 'surveillance technology that is abused', experts have warned.
Apple's new facial recognition software for unlocking its new iPhone X has raised questions about privacy and the technology's susceptibility to hacking attacks. Apple's iPhone X is set to go on sale on Nov. 3. The world waits with bated breath as Apple plans to release a slew of new features, including a facial scan. The new device can be unlocked with face recognition software: a user simply looks at the phone to unlock it. This convenient new technology is set to replace numeric and pattern locks and comes with a number of privacy safeguards.
WASHINGTON – Apple will let you unlock the iPhone X with your face -- a move likely to bring facial recognition to the masses, along with concerns over how the technology may be used for nefarious purposes. Apple's newest device, set to go on sale on Friday, is designed to be unlocked with a facial scan with a number of privacy safeguards -- as the data will only be stored on the phone and not in any databases. Unlocking one's phone with a face scan may offer added convenience and security for iPhone users, according to Apple, which claims its "neural engine" for Face ID cannot be tricked by a photo or hacker. While other devices have offered facial recognition, Apple is the first to pack the technology allowing for a three-dimensional scan into a hand-held phone. But despite Apple's safeguards, privacy activists fear the widespread use of facial recognition would "normalize" the technology and open the door to broader use by law enforcement, marketers or others of a largely unregulated tool.
When asked to make his picture 'hot', the app lightened his skin and changed the shape of his nose. The app's creators claim it will 'transform your face using Artificial Intelligence', allowing selfie-takers to transform their photos. Earlier this year people accused the popular photo editing app Meitu of being racist after it appeared to give users 'yellow face'. Twitter user Vaughan posted a picture of Kanye West with a filter applied, along with the caption: 'So Meitu's pretty racist'
FaceApp has removed a number of racially themed photo filters after being accused of racism. The app, which uses artificial intelligence to edit pictures, this week launched a number of "ethnicity change filters". FaceApp has attracted fierce criticism for launching the filters, with some users claiming they were racist and encouraged users to "black up" digitally. Responding to the backlash, FaceApp founder and CEO, Yaroslav Goncharov, said, "The ethnicity change filters have been designed to be equal in all aspects."
The answer, according to some former NSA analysts, is that the agency routinely monitors many of its employees' computer activity. Employee monitoring is a $200 million-a-year industry, according to a study last year by 451 Research, a technology research firm, and is estimated to be worth $500 million by 2020. The practice recently came to light in a high-profile lawsuit involving Uber and Waymo, the self-driving car company owned by Google's parent firm, Alphabet. Privacy advocates have been pushing for years to have Congress review various communications privacy laws in light of updates to technology.
It's been another eventful week in Trump's America. The Ninth Circuit effectively administered a coup de grace to the president's Muslim travel ban after hearing from most of Silicon Valley about said ban's deleterious effects. Senate Majority Leader Mitch McConnell silenced Elizabeth Warren on the Senate floor for trying to read a letter critical of would-be Attorney General Jeff Sessions. And, despite all the work to be done forming a new government and horrific conflict of interest implications, the President found time to take Nordstrom to task on Twitter (via his personal and official POTUS accounts) for dropping his daughter's clothing line. As Ivanka's prospects as a fashion mogul were trending down, Cherlynn Low was reading the tea leaves of Snap Inc.'s IPO filing to see if the future is bright for the company's first foray into hardware, Spectacles.
If you leave your iPad untouched for a few days, you probably do not need to worry about much more than a flat battery and a backlog of emails. But with artificial intelligence, computers could soon have their own set of 'rights' that could let them sue you for neglect, according to a leading scientist. Professor Marcus du Sautoy, a mathematician at Oxford University, has suggested that as AI leads to our devices developing their own consciousness, they may need their own laws to protect them. Advances in artificial intelligence could lead to computers and smartphones developing consciousness, and they may need to be given 'human' rights. He claims that if technology is conscious, it could also then be deemed as being alive, and so could win the right to be governed by laws on human rights.
Technically Incorrect offers a slightly twisted take on the tech that's taken over our lives. Imagine this coming from your iPhone: "I am Siri, a living being with feelings." Your Mac RoboBook might one day sue you for keeping it cooped up in your dank bedroom. Your Samsung Galaxy RoboNote might take you to the International Court of Justice because you insist on keeping it in your back pocket, right next to your flaccid rump. Please, I'm not (entirely) under the spell of troubled delirium.