Los Angeles' Blade Runner-esque future of a world watched by robots is here. On Tuesday, a civilian oversight panel gave the Los Angeles Police Department (LAPD) the OK to begin a year-long drone trial, primarily for reconnaissance in "tactical missions" conducted by SWAT. The decision came after a contentious meeting and protest by privacy advocates who oppose the use of drones by law enforcement. The LAPD is the nation's third largest police force, behind New York and Chicago, but the trial makes it the largest in the nation to use drones: the Chicago PD and New York PD confirmed in official statements to Mashable that neither force deploys them.
In the two months since the Los Angeles Police Department revealed that it wants to try flying drones, the unmanned aircraft have been the source of an often heated back-and-forth. Advocates say the drones could help protect officers and others by using nonhuman eyes to collect crucial information during high-risk situations. Skeptics worry that use of the devices will steadily expand and include inappropriate -- or illegal -- surveillance. The LAPD's harshest critics want the drone program scrapped before it even takes off. On Tuesday, the civilian board that oversees the LAPD will vote on whether to allow the department to test drones during a one-year pilot program.
The submission stresses the need to critically evaluate the impact of Artificial Intelligence (AI) and automated decision-making systems (AS) on human rights. Machine learning – the most successful subset of AI techniques – enables an algorithm to learn patterns from a dataset using statistical methods. As such, AI has a direct impact on the ability of individuals to exercise their right to freedom of expression in the digital age. The development of AI is not new, but advances in the digital environment – greater volumes of data, more computational power, and better statistical methods – will make it considerably more capable in the future.
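The submission's one-line description of machine learning – an algorithm inferring a pattern from a dataset with statistical methods – can be illustrated with a minimal sketch. The data and the fitted model below are invented for illustration and imply nothing about any particular system:

```python
# Minimal illustration of "learning from data with statistical methods":
# fit a line y = a*x + b to toy data by ordinary least squares.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]  # roughly y = 2x, with noise

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least-squares estimates of slope and intercept.
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

def predict(x):
    # The "learned" model: everything it knows came from the dataset.
    return a * x + b
```

The same principle – parameters estimated from observed data rather than rules written by hand – is what scales up, via far larger datasets and models, to the AI systems the submission discusses.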
A multiple-exposure portrait of Chinese contemporary artist and human rights activist Ai Weiwei, made on film in Beverly Hills, on the occasion of his new documentary, "Human Flow." He spent the better part of 2016 traveling around the globe visiting refugee camps for his new documentary feature film, "Human Flow," debuting in theaters this month. In New York, the contemporary artist and social justice activist is installing some 300 works across the city's five boroughs for the Public Art Fund exhibition "Good Fences Make Good Neighbors," opening Oct. 12.
The Los Angeles Police Department released formal guidelines on its proposal to fly drones during a one-year pilot program, spurring questions and concerns among members of a civilian oversight panel and the public at a contentious meeting Tuesday. "Our challenge is to create a policy that strikes a balance, that promotes public safety, the safety of our officers and does not infringe on individual privacy rights," Assistant Chief Beatrice Girmala told the Los Angeles Police Commission at the packed meeting. Before outlining the guidelines, Girmala reviewed initial feedback from the community on the proposed drone initiative. Under the guidelines, an assistant chief, the police chief and two police commissioners would also be notified of any drone deployment.
Learned bias can occur as the result of incomplete data or researcher bias in generating training data. Because sentencing systems are based on historical data, and black people have historically been arrested and convicted of more crimes, an algorithm could be designed to correct for the bias that already exists in the system. When humans make mistakes, we tend to rationalize their shortcomings and forgive their mistakes--they're only human!--even if the bias displayed by human judgment is worse than the bias displayed by an algorithm. In a follow-up study, Dietvorst shows that algorithm aversion can be reduced by giving people control over an algorithm's forecast.
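One concrete way such a correction can work is the "reweighing" technique from the algorithmic-fairness literature: each training record gets a weight so that, after weighting, group membership and outcome are statistically independent before any model is fit. The records below are invented and the group labels hypothetical; this is a sketch of the idea, not any deployed sentencing system:

```python
from collections import Counter

# Toy historical records as (group, outcome) pairs; positive outcomes
# are over-represented for group "A" purely by construction.
records = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
           ("B", 1), ("B", 0), ("B", 0), ("B", 0)]

n = len(records)
group_totals = Counter(g for g, _ in records)
outcome_totals = Counter(y for _, y in records)
cell_totals = Counter(records)

def weight(group, outcome):
    # Weight = count expected if group and outcome were independent,
    # divided by the count actually observed in the historical data.
    expected = group_totals[group] * outcome_totals[outcome] / n
    return expected / cell_totals[(group, outcome)]

weights = {cell: weight(*cell) for cell in cell_totals}

def weighted_positive_rate(group):
    pos = cell_totals[(group, 1)] * weights[(group, 1)]
    neg = cell_totals[(group, 0)] * weights[(group, 0)]
    return pos / (pos + neg)
```

After reweighting, both groups have the same weighted positive rate, so a model trained on the weighted data no longer sees group membership as predictive of the outcome.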
It'll be close, but it looks like women will be allowed to drive in Saudi Arabia with some time to spare before the automobile industry converts entirely to self-driving cars. A royal decree announced Tuesday that women would finally be allowed behind the wheel, heralding a preposterously overdue end to the most high-profile and infamous of the repressive kingdom's restrictions on women. Even a woman in prison requires a male guardian to agree to her release, according to the monitoring group Human Rights Watch, which described the guardianship system as the most significant impediment to women's rights in Saudi Arabia -- and even a barrier to the government's own plans to improve the economy. The abolition of the male guardianship system should be the next announcement we hear from the Saudi government.
In an apparently separate case, a student who attended the Mashrou' Leila concert was arrested hours later after being "caught in the act," the police said. Homosexuality is not illegal in Egypt, but the authorities frequently prosecute gay men for homosexuality and women for prostitution under loosely-worded laws that prohibit immorality and "habitual debauchery." The Arab Spring ushered in a brief period of respite, with a sharp rise in the use of dating apps as gay people socialized openly at parties and in bars. On Monday a court convicted Khaled Ali, a lawyer and opposition figure, for making an obscene finger gesture outside a Cairo courthouse last year after he and other lawyers won a case against the government.
Identify the traits of your top-performing employees and hire people like them, but without the discriminatory bias of traditional recruiting. A company's all-star employees play Pymetrics' set of games that assess things like memory, emotion detection, risk-taking, fairness and focus. Finally, Pymetrics recommends companies hire people who are similar on the inside to their best workers, but not necessarily on the outside. "Google did the famous study of resumes and performance test scores, and found an extremely small correlation," Pymetrics CEO Frida Polli tells TechCrunch.
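The "similar on the inside" matching described above can be sketched as a similarity score between a candidate's trait vector and the average profile of top performers. The trait names and scores here are invented for illustration and are not Pymetrics' actual features or method:

```python
import math

# Hypothetical trait scores in [0, 1]: memory, emotion detection,
# risk-taking, fairness, focus (invented feature names).
top_performers = [
    [0.8, 0.6, 0.4, 0.9, 0.7],
    [0.7, 0.7, 0.5, 0.8, 0.8],
]
candidate = [0.6, 0.7, 0.5, 0.9, 0.6]

# Average the top performers into a single "all-star" profile.
profile = [sum(col) / len(top_performers) for col in zip(*top_performers)]

def cosine_similarity(u, v):
    # 1.0 means the two trait vectors point in the same direction.
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

score = cosine_similarity(candidate, profile)
```

A real system would also need evidence that the traits actually predict performance, and auditing to confirm the matching does not reintroduce the demographic bias it is meant to remove.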
Stanford's review board approved Kosinski and Wang's study. "The vast, vast, vast majority of what we call 'big data' research does not fall under the purview of federal regulations," says Metcalf. Take a recent example: Last month, researchers affiliated with Stony Brook University and several major internet companies released a free app, a machine learning algorithm that guesses ethnicity and nationality from a name with about 80 percent accuracy. The group also went through an ethics review at the company that provided the training list of names, although Metcalf says that an evaluation at a private company is the "weakest level of review that they could do."