Data Engineer at Bosch Group - Denham, United Kingdom
Mobility is changing all over the world. Passenger cars and light commercial vehicles will continue to play a central role in the mobility of the future. Bosch is driving progress in this segment with innovative ideas and advanced technology for greater safety, efficiency, sustainability, and driving pleasure. Please upload your CV to apply for this position. You may also include a brief email explaining why you would be a great candidate, although it is not mandatory.
- Transportation > Ground > Road (0.95)
- Transportation > Passenger (0.58)
- Transportation > Freight & Logistics Services (0.58)
UK's data privacy watchdog may fine Clearview AI £17m
Clearview AI, the controversial startup known for scraping billions of selfies from people's public social network profiles to train a facial-recognition system, may be fined just over £17m ($22.6m) by the UK's Information Commissioner's Office (ICO). The watchdog on Monday publicly mulled punishing Clearview following an investigation launched last year with the Australian Information Commissioner. The ICO believes the US biz broke Britain's data-protection rules by, among other things, failing to have a "lawful reason" for collecting people's personal photos and info, and not being transparent about how the data was used and stored for its facial-recognition applications. Clearview harvests people's photos – 10 billion or more, it's thought – from their public social media profiles, and then builds a face-matching system so that if, say, the police upload a picture of someone from a CCTV still, the software can locate that person in its database and provide officers the corresponding name and online profiles. The images in Clearview AI Inc's database are likely to include the data of a substantial number of people from the UK and may have been gathered without people's knowledge from publicly available information online, including social media platforms.
- Europe > United Kingdom > England > Northamptonshire (0.06)
- Europe > United Kingdom > England > North Yorkshire (0.06)
- Information Technology > Security & Privacy (1.00)
- Information Technology > Communications > Social Media (0.80)
- Information Technology > Artificial Intelligence > Vision > Face Recognition (0.47)
US facial recognition firm faces £17m UK fine for 'serious breaches'
A US company that gathered photos of people from Facebook and other social media sites for use in facial recognition by its clients is facing a £17m fine after the Information Commissioner's Office found it had committed "serious breaches" of data protection law. Clearview AI, which describes itself as the "world's largest facial network", allows its customers to compare facial data against a database of over 10bn images harvested from the internet. The database is "likely to include the data of a substantial number of people from the UK and may have been gathered without people's knowledge from publicly available information online, including social media platforms", the ICO said. Clearview's technology had been offered on a "free trial basis" to UK law enforcement agencies, the data regulator added. It said Clearview had broken data protection law by failing to process the information of people in the UK in a way they were likely to expect or that was fair.
- Europe > United Kingdom (0.93)
- North America > United States (0.37)
- Information Technology > Security & Privacy (1.00)
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Vision > Face Recognition (0.64)
UK's ICO warns over 'Big Data' surveillance threat of live facial recognition in public – TechCrunch
The UK's chief data protection regulator has warned over reckless and inappropriate use of live facial recognition (LFR) in public places. Publishing an opinion today on the use of this biometric surveillance in public -- to set out what is dubbed the "rules of engagement" -- the information commissioner, Elizabeth Denham, also noted that a number of investigations already undertaken by her office into planned applications of the tech have found problems in all cases. "I am deeply concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively or even recklessly. When sensitive personal data is collected on a mass scale without people's knowledge, choice or control, the impacts could be significant," she warned in a blog post. "Uses we've seen included addressing public safety concerns and creating biometric profiles to target people with personalised advertising. It is telling that none of the organisations involved in our completed investigations were able to fully justify the processing and, of those systems that went live, none were fully compliant with the requirements of data protection law."
UK regulator to write to WhatsApp over Facebook data sharing
The UK's data regulator is writing to WhatsApp to demand that the chat app does not hand user data to Facebook, as millions worldwide continue to sign up for alternatives such as Signal and Telegram to avoid forthcoming changes to its terms of service. Elizabeth Denham, the information commissioner, told a parliamentary committee that in 2017, WhatsApp had committed not to hand any user information over to Facebook until it could prove that doing so respected GDPR. But, she said, that agreement was enforced by the Irish data protection authority until the Brexit transition period ended on 1 January. Now that Britain is fully outside the EU, ensuring that those promises are being kept falls to the Information Commissioner's Office. "The change in the terms of service, and the requirement of users to share information with Facebook, does not apply to UK users or to users in the EU," Denham told the digital, culture, media and sport sub-committee on online harms and disinformation, "and that's because in 2017 my office negotiated with WhatsApp so that they agreed not to share user information and contact information until they could show that they complied with the GDPR."
- Information Technology > Services (1.00)
- Information Technology > Security & Privacy (1.00)
- Government > Regional Government > Europe Government > United Kingdom Government (0.71)
Code of practice call over facial recognition
A code of practice should govern when police forces deploy facial recognition technology, the information commissioner has said. It comes after South Wales Police was found to have acted lawfully following a complaint from a shopper that his human rights were breached when he was photographed. An investigation by commissioner Elizabeth Denham has raised "serious concerns" over use of the technology. Ms Denham called on the government to introduce a statutory code of practice. Ed Bridges had brought a legal challenge after he was photographed shopping in Cardiff in 2017, and the following year at a peaceful protest against the arms trade.
UK police need to slow down with face recognition, says data watchdog
A legal code of practice is needed before face recognition technology can be safely deployed by police forces in public places, says the UK's data regulator. In a blog post, the Information Commissioner's Office (ICO) said it has serious concerns about the use of the technology because it relies on large amounts of personal information. Current laws, codes and practices "will not drive the ethical and legal approach that's needed to truly manage the risk that this technology presents," said information commissioner Elizabeth Denham. She called for police forces to be compelled to show justification that face recognition is "strictly necessary, balanced and effective" in each case it is deployed. Face recognition can map faces in a crowd by measuring the distance between facial features, then compare results with a "watch list" of images, which can include suspects, missing people and persons of interest. South Wales Police and the Met Police have been trialling face recognition as a possible way to reduce crime, but the move has been divisive.
ICO opens investigation into use of facial recognition in King's Cross
The UK's privacy watchdog has opened an investigation into the use of facial recognition cameras in a busy part of central London. The information commissioner, Elizabeth Denham, announced she would look into the technology being used in Granary Square, close to King's Cross station. Two days ago the mayor of London, Sadiq Khan, wrote to the development's owner demanding to know whether the company believed its use of facial recognition software in its CCTV systems was legal. The Information Commissioner's Office (ICO) said it was "deeply concerned about the growing use of facial recognition technology in public spaces" and was seeking detailed information about how it is used. "Scanning people's faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all," Denham said.
- Law (1.00)
- Information Technology > Security & Privacy (1.00)
- Government > Regional Government > Europe Government > United Kingdom Government (0.58)
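The article above describes the matching step in general terms: reduce each face to measurements (distances between facial features) and compare against a watch list. A minimal sketch of that idea follows, assuming faces have already been reduced to fixed-length feature vectors; the names, vectors, and threshold are invented for illustration and do not reflect any real deployment.

```python
import math

def euclidean(a, b):
    """Distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_watchlist(probe, watchlist, threshold=0.5):
    """Return the closest watch-list name within threshold, else None."""
    best_name, best_dist = None, float("inf")
    for name, features in watchlist.items():
        d = euclidean(probe, features)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical watch list: name -> feature vector (e.g. inter-feature distances)
watchlist = {
    "suspect_a": [0.31, 0.62, 0.18],
    "missing_b": [0.74, 0.22, 0.55],
}

print(match_against_watchlist([0.30, 0.60, 0.20], watchlist))  # prints suspect_a
print(match_against_watchlist([0.99, 0.99, 0.99], watchlist))  # prints None
```

Real systems use learned embeddings rather than hand-measured distances, but the comparison step — nearest neighbour under a distance threshold — has the same shape.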
Innocence lost: What did you do before the internet?
In moments of digital anxiety I find myself thinking of my father's desk. Dad was a travelling furniture salesman in the 1980s, a job that served him well in the years before globalisation hobbled the Canadian manufacturing sector. He was out on the road a lot, but when he worked from home he sat in his office, a small windowless study dominated by a large teak desk. And yet every day Dad spent hours there, making notes, smoking Craven "A"s, drinking coffee and yakking affably to small-town retailers about shipments of sectional sofas and dinette sets. This is what I find so amazing.
- Leisure & Entertainment (1.00)
- Information Technology (0.94)
- Media > Film (0.69)
- Health & Medicine > Therapeutic Area > Neurology (0.48)
- Information Technology > Artificial Intelligence (0.95)
- Information Technology > Communications > Networks (0.54)
Academic says he's being scapegoated in Facebook data case
An academic who developed the app used by Cambridge Analytica to harvest data from millions of Facebook users said Wednesday that he had no idea his work would be used in Donald Trump's 2016 presidential campaign and that he's being scapegoated in the fallout from the affair. Alexandr Kogan, a psychology researcher at Cambridge University, told the BBC that both Facebook and Cambridge Analytica have tried to place the blame on him for violating the social media platform's terms of service, even though Cambridge Analytica assured him that everything he did was legal. "My view is that I'm being basically used as a scapegoat by both Facebook and Cambridge Analytica," he said. "Honestly, we thought we were acting perfectly appropriately, we thought we were doing something that was really normal." Authorities in Britain and the United States are investigating the alleged improper use of Facebook data by Cambridge Analytica, a U.K.-based political research firm.
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.25)
- North America > United States > New York (0.05)
- North America > United States > California (0.05)