In the letter, an internal petition, the employees asked for more transparency and oversight of Project Dragonfly, the project's internal title. "We urgently need more transparency, a seat at the table and a commitment to clear and open processes: Google employees need to know what we're building," the letter, seen by the Reuters news agency, reads. The employees are reportedly worried that Google would be kowtowing to China by implementing the government's censorship requests. China heavily restricts internet users by blocking websites, censoring words and clamping down on free speech. In the letter, the employees say Google would be validating China's restrictions on freedom of expression and violating the "don't be evil" clause in its own code of conduct.
The internal dissent over Dragonfly comes on the heels of the employee protests over Google's involvement in a Pentagon project to develop artificial intelligence technology for drones. After Google said it would not renew its contract with the Pentagon, it unveiled a series of ethical principles governing its use of A.I. In those principles, Google publicly committed to use A.I. only in "socially beneficial" ways that would not cause harm and promised to develop its capabilities in accordance with human rights law. Some employees have raised concerns that helping China suppress the free flow of information would violate these new principles. In 2010, Google said it had discovered that Chinese hackers had attacked the company's corporate infrastructure in an attempt to access the Gmail accounts of human rights activists.
In May 2018, Amnesty International, Access Now, and a handful of partner organizations launched the Toronto Declaration on protecting the right to equality and non-discrimination in machine learning systems. The Declaration is a landmark document that seeks to apply existing international human rights standards to the development and use of machine learning systems (or "artificial intelligence"). Machine learning (ML) is a subset of artificial intelligence. It can be defined as "provid[ing] systems the ability to automatically learn and improve from experience without being explicitly programmed." How is this technology relevant to human rights?
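The definition above, learning a rule from experience rather than being explicitly programmed, can be illustrated with a minimal sketch. The example below is not from the Declaration; it is a toy least-squares fit with made-up numbers, showing a program that estimates its decision rule from example data instead of having the rule hand-coded:

```python
# Minimal illustration of "learning from experience": instead of hand-coding
# the rule y = 2x + 1, the program estimates it from example data.
def fit_line(xs, ys):
    # Ordinary least squares for y = a*x + b, computed from the examples.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# "Experience": observed input/output pairs (hypothetical data, y = 2x + 1).
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
a, b = fit_line(xs, ys)
print(a, b)  # the learned rule's slope and intercept
```

The human-rights concern arises precisely here: whatever patterns, including discriminatory ones, are present in the example data become the learned rule.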
Mr. Pichai, speaking Thursday at a weekly all-hands meeting in Mountain View, Calif., was responding to criticism from employees, human rights groups and others who in recent days have voiced concerns over the Alphabet Inc. unit's work with the Chinese government. Google is developing services for Chinese citizens, including a search engine that would comply with China's strict censorship requirements, The Wall Street Journal and others reported last week. At the meeting, Google co-founder and Alphabet president Sergey Brin sounded optimistic about doing more business in China, while cautioning that progress in the country is "slow-going and complicated." Mr. Brin was instrumental in Google's decision in 2010 to withdraw its search engine from China to protest the government's censorship regime and attempts to hack into the Gmail accounts of Chinese human rights activists. At the time, he described the government as having the "earmarks of totalitarianism" of the Soviet Union, where he was born.
Google's workforce is demanding answers over the company's secretive plans to build a search engine that will comply with censorship in China. More than 1,000 employees have signed a letter demanding more transparency over the project so they do not unwittingly suppress freedom of speech. In a version of the letter obtained by the New York Times, the employees say they lack the "information required to make ethically-informed decisions about our work, our projects, and our employment." China's censorship requirements "raise urgent moral and ethical issues," it adds. The letter, which has circulated through Google's internal communications, has gained more than 1,400 signatures, according to the Times.
Air pollution in cities can be an acute problem, with damaging effects on people, animals, plants and property. It is an important topic that is getting increased attention as the human population of cities continues to grow. It was the subject of the 2018 KDD Cup, the annual data mining and knowledge discovery competition organized by ACM SIGKDD. The burning of fossil fuels for transport and home heating is a major contributor to air pollution in urban environments, creating the pollutant nitrogen dioxide (NO2). This is a secondary pollutant produced by the oxidation of NO and a major contributor to respiratory problems. In the European Union, the Cleaner Air For Europe (CAFE) Directive 2008/50/EC established an hourly limit of 200 μg/m³ and an annual mean limit of 40 μg/m³ in respect of NO2.
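The Directive's two NO2 limit values can be checked mechanically against measurements. A minimal sketch, using made-up hourly readings (the function name and data are illustrative, not from the KDD Cup task):

```python
# Limit values for NO2 under Directive 2008/50/EC, as cited above.
HOURLY_LIMIT = 200.0   # μg/m³, hourly limit value
ANNUAL_LIMIT = 40.0    # μg/m³, annual mean limit value

def assess_no2(hourly_readings):
    """Return (hours above the hourly limit, mean concentration in μg/m³)."""
    exceedances = sum(1 for c in hourly_readings if c > HOURLY_LIMIT)
    mean = sum(hourly_readings) / len(hourly_readings)
    return exceedances, mean

readings = [35.0, 48.0, 210.5, 19.0, 55.0]  # hypothetical μg/m³ values
exceedances, mean = assess_no2(readings)
print(exceedances, mean)  # one hour above 200 μg/m³; mean above 40 μg/m³
```

In practice the hourly assessment is a count rather than a single flag because the Directive tolerates a limited number of exceedance hours per calendar year for NO2.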
SAN FRANCISCO – Google's plan to launch a censored search engine in China requires more "transparency, oversight and accountability," hundreds of employees at the Alphabet Inc. unit said in an internal petition seen by Reuters on Thursday. Hoping to gain approval from the Chinese government to provide a mobile search service, the company plans to block some websites and search terms, Reuters reported this month, citing two people familiar with the matter. Disclosure of the secretive effort has disturbed some Google employees and human rights advocacy organizations. They are concerned that by agreeing to censorship demands, Google would validate China's prohibitions on free expression and violate the "don't be evil" clause in the company's code of conduct. After employees petitioned this year, Google announced it would not renew a project to help the U.S. military develop artificial intelligence technology for drones.
We are deeply concerned by the way in which our friend and colleague Professor Francisco Ayala has been forced to resign from the University of California, Irvine (UCI), after being accused of sexual harassment ("Prominent geneticist out at UC Irvine after harassment finding," M. Wadman, News, 29 June, https://scim.ag/AyalaResignation). The charges that have been raised against him have had appalling consequences. Those of us who are well acquainted with Professor Ayala know that he is an honorable person, who throughout his career has treated his friends, co-workers, and students in a respectful, egalitarian way. His lifelong commitment to teaching, research, and outreach on biological evolution has won him worldwide recognition. He has been a generous benefactor to the University of California and throughout his fruitful career has opened new fields of biological research, promoted mutual respect and independence between evolutionary studies and religious perspectives, played a key role in several major scientific organizations, and helped many Spanish-speaking female scholars and Hispanic students, in particular, both in the United States and throughout the world.
Three of Tinder's founders and a handful of current executives say the popular dating app's parent companies cheated them out of as much as $2 billion by manipulating financial information to undermine its valuation, according to a lawsuit filed Tuesday. The co-founders and executives claim that Match Group Inc. and IAC/InterActiveCorp hid projections of Tinder's rapid growth in order to reduce payments to the holders of stock options, which were based on the company's valuation. The suit, filed by 10 plaintiffs in New York Supreme Court, also says that Greg Blatt, a longtime executive of IAC who served as interim chief executive of Tinder, groped and sexually harassed Tinder's vice president of marketing and communications, Rosette Pambakian, during the Los Angeles-based company's 2016 holiday party. Mr. Blatt didn't immediately respond to a request for comment.
Currently, algorithms are used to make life-altering financial and legal decisions: who gets a job, what medical treatment people receive, and who is granted parole. In theory, this should lead to fairer decision-making. In reality, AI tech can be just as biased as the humans who create it. We are living in the age of the algorithm. More and more, we are handing decision-making over to mathematical models.