If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Many airports hope to start using biometric scanners in lieu of passports to identify travelers. The next time you go to the airport you might notice something different as part of the security process: a machine scanning your face to verify your identity. U.S. Customs and Border Protection (CBP) has been working with airlines to implement biometric face scanners in domestic airports to better streamline security. But how does the process work?
Privacy campaigners have warned of an "epidemic" of facial recognition use in shopping centres, museums, conference centres and other private spaces around the UK. An investigation by Big Brother Watch (BBW), which tracks the use of surveillance, has found that private companies are spearheading a rollout of the controversial technology. The group published its findings a day after the information commissioner, Elizabeth Denham, announced she was opening an investigation into the use of facial recognition in a major new shopping development in central London. Sadiq Khan, the mayor of London, has already raised questions about the legality of the use of facial recognition at the 27-hectare (67-acre) Granary Square site in King's Cross after its owners admitted using the technology "in the interests of public safety". BBW said it had uncovered that sites across the country were using facial recognition, often without warning visitors.
WIRED recently highlighted unacceptable levels of bias in facial recognition in the article The Best Algorithms Struggle to Recognize Black Faces Equally. It cited the poor test scores of leading facial recognition vendors, as reported by the National Institute of Standards and Technology (NIST) in its July 2019 results. WIRED specifically called out Idemia but generalized its concerns. "The NIST test challenged algorithms to verify that two photos showed the same face, similar to how a border agent would check passports. At sensitivity settings where Idemia's algorithms falsely matched different white women's faces at a rate of one in 10,000, it falsely matched black women's faces about once in 1,000 -- 10 times more frequently. A one in 10,000 false match rate is often used to evaluate facial recognition systems."
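The disparity quoted above can be made concrete with a short sketch. This is a hypothetical illustration of how a false match rate (FMR) comparison of the kind NIST reports might be computed from raw trial counts; the numbers simply mirror the rates in the article and are not actual NIST data.

```python
def false_match_rate(false_matches: int, impostor_comparisons: int) -> float:
    """Fraction of impostor pairs (photos of different people) that the
    system wrongly accepts as the same person, at a fixed sensitivity."""
    return false_matches / impostor_comparisons

# Illustrative counts at the same sensitivity threshold, mirroring the
# article's figures (not real measurements):
fmr_group_a = false_match_rate(1, 10_000)   # one false match in 10,000 trials
fmr_group_b = false_match_rate(10, 10_000)  # roughly one in 1,000 trials

disparity = fmr_group_b / fmr_group_a
print(disparity)  # 10.0 -- the "10 times more frequently" figure
```

The key point the example surfaces is that both rates are measured at the *same* threshold: tightening the threshold lowers both FMRs but does not, by itself, remove the ratio between them.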
We live in exponential times, and merely having a digital strategy focused on continuous innovation is no longer enough to thrive in a constantly changing world. To transform an organisation and contribute to building a secure and rewarding networked society, collaboration among employees, customers, business units and even things is increasingly becoming key. Especially with the availability of new technologies such as artificial intelligence, organisations now, more than ever before, need to focus on bringing together the different stakeholders to co-create the future. Big data empowers customers and employees, the Internet of Things creates vast amounts of data and connects all devices, and artificial intelligence creates new human-machine interactions. In today's world, every organisation is a data organisation, and AI is required to make sense of it all.
The UK's privacy watchdog has opened an investigation into the use of facial recognition cameras in a busy part of central London. The information commissioner, Elizabeth Denham, announced she would look into the technology being used in Granary Square, close to King's Cross station. Two days ago the mayor of London, Sadiq Khan, wrote to the development's owner demanding to know whether the company believed its use of facial recognition software in its CCTV systems was legal. The Information Commissioner's Office (ICO) said it was "deeply concerned about the growing use of facial recognition technology in public spaces" and was seeking detailed information about how it is used. "Scanning people's faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all," Denham said.
The mayor of London has written to the owner of the King's Cross development demanding to know whether the company believes its use of facial recognition software in its CCTV systems is legal. Sadiq Khan said he wanted to express his concern a day after the property company behind the 27-hectare (67-acre) central London site admitted it was using the technology "in the interests of public safety". In his letter, shared with the Guardian, the Labour mayor writes to Robert Evans, the chief executive of the King's Cross development, to "request more information about exactly how this technology is being used". Khan also asks for "reassurance that you have been liaising with government ministers and the Information Commissioner's Office to ensure its use is fully compliant with the law as it stands". The owner of King's Cross is one of the first property companies to acknowledge it is deploying facial recognition software, even though it has been criticised by human rights group Liberty as "a disturbing expansion of mass surveillance".
Members of the public have said there is no justification for the use of facial recognition technology in CCTV systems operated by a private developer at a 67-acre site in central London. It emerged on Monday that the property developer Argent was using the cameras "in the interests of public safety" in King's Cross, mostly north of the railway station across an area including the Google headquarters and the Central Saint Martins art school, but the precise uses of the technology remained unclear. "For law enforcement purposes, there is some justification, but personally I don't think a private developer has the right to have that in a public place," said Grant Otto, who lives in London. He raised possible legal issues around the collection of facial data by a private entity and said he was unaware of any protections, similar to the rights enshrined in GDPR, that would allow people to request their information be removed from a database. Jack Ramsey, a tourist from New Zealand, echoed his concerns.
Diagnosing rare genetic disorders is difficult. Because cases are few and far between, it is harder to train medical professionals in what to look for. This is precisely the kind of activity that artificial intelligence can make easier. A new app called Face2Gene is giving doctors a second opinion on their diagnoses, using machine learning and neural networks. It looks for certain tell-tale facial features and presents doctors with a likely list of congenital and neurodevelopmental disorders.