We develop tools for utilizing correspondence experiments to detect illegal discrimination by individual employers. Employers violate US employment law if their propensity to contact applicants depends on protected characteristics such as race or sex. We establish identification of higher moments of the causal effects of protected characteristics on callback rates as a function of the number of fictitious applications sent to each job ad. These moments are used to bound the fraction of jobs that illegally discriminate. Applying our results to three experimental datasets, we find evidence of significant employer heterogeneity in discriminatory behavior, with the standard deviation of gaps in job-specific callback probabilities across protected groups averaging roughly twice the mean gap. In a recent experiment manipulating racially distinctive names, we estimate that at least 85% of jobs that contact both of two white applications and neither of two black applications are engaged in illegal discrimination. To assess the tradeoff between type I and II errors presented by these patterns, we consider the performance of a series of decision rules for investigating suspicious callback behavior under a simple two-type model that rationalizes the experimental data. Though, in our preferred specification, only 17% of employers are estimated to discriminate on the basis of race, we find that an experiment sending 10 applications to each job would enable accurate detection of 7-10% of discriminators while falsely accusing fewer than 0.2% of non-discriminators. A minimax decision rule acknowledging partial identification of the joint distribution of callback rates yields higher error rates but more investigations than our baseline two-type model. Our results suggest illegal labor market discrimination can be reliably monitored with relatively small modifications to existing audit designs.
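The two-type logic behind these calculations can be sketched numerically. The snippet below is not the paper's estimator; the mixture share and callback probabilities are invented placeholders, used only to illustrate how a suspicious callback pattern raises the implied share of discriminators, and how a fixed investigation rule trades off type I and type II errors.

```python
# Illustrative two-type mixture sketch (hypothetical parameters, not the
# paper's estimates): non-discriminators contact every applicant with the
# same probability; discriminators contact white applicants more often.

p_disc = 0.17   # share of jobs that discriminate (placeholder)
q = 0.10        # non-discriminators: callback prob. for any applicant
q_white = 0.30  # discriminators: callback prob. for white applicants
q_black = 0.02  # discriminators: callback prob. for black applicants

def pattern_prob(n_white, n_black, pw, pb):
    """Prob. that ALL white applications are contacted and NO black ones are,
    given per-application callback probabilities pw (white) and pb (black)."""
    return pw ** n_white * (1 - pb) ** n_black

# (i) Among jobs contacting both of 2 white applications and neither of
# 2 black applications, what share are discriminators? (Bayes' rule.)
lik_disc = pattern_prob(2, 2, q_white, q_black)
lik_non = pattern_prob(2, 2, q, q)
share_disc = p_disc * lik_disc / (p_disc * lik_disc + (1 - p_disc) * lik_non)

# (ii) A rule that investigates a job iff all 5 of 5 white applications are
# contacted and 0 of 5 black applications are: detection rate among
# discriminators vs. false-accusation rate among non-discriminators.
detect_rate = pattern_prob(5, 5, q_white, q_black)  # 1 - type II error
false_accuse = pattern_prob(5, 5, q, q)             # type I error

print(f"share of discriminators given the 2/2-vs-0/2 pattern: {share_disc:.3f}")
print(f"detection rate of the 5/5-vs-0/5 rule: {detect_rate:.4f}")
print(f"false-accusation rate of the rule: {false_accuse:.6f}")
```

Even with these made-up numbers, the pattern is informative: the posterior share of discriminators far exceeds the 17% prior, and the stringent all-or-nothing rule accuses almost no non-discriminators, at the cost of catching only a small fraction of discriminators.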
Amazon is sticking to its guns in the fight to protect customer data. The tech titan has filed a motion to quash the search warrant for recordings from an Amazon Echo in the trial of James Andrew Bates, who is accused of murdering his friend Victor Collins in Bentonville, Arkansas, in November 2015. As part of that motion, it argues that the responses of Alexa, the voice of the Echo, have First Amendment protection. The case first came to light in December, when it emerged that Amazon was contesting an order to provide audio from the Echo device, recorded during the 48-hour period from November 21 through 22, 2015, alongside subscriber and account information.
A free speech debate has erupted over Amazon's efforts to prevent prosecutors from obtaining audio recorded by one of the company's voice-activated devices. Prosecutors in Arkansas say the audio could be important to proving the first-degree murder charge they filed against James Andrew Bates, who is accused of killing a friend, Victor Collins. Bates' home had an Amazon Echo, a device containing speakers and microphones that can answer questions and perform simple functions, such as playing music, through Amazon's Alexa voice assistant. Seattle-based Amazon says that the data recorded by the device, and the responses from the Alexa software, are protected by the First Amendment.
Prosecutors in an Arkansas murder trial have been attempting to obtain possible voice recordings from an Amazon Echo device that may be key evidence in the case. But Amazon claims that Alexa, the voice of the Echo, has First Amendment rights, and, in the interest of protecting the customer's privacy, it refuses to give up the recordings without a legal requirement to do so, according to Forbes. The Echo in question belongs to James Andrew Bates, whose friend Victor Collins was found dead in Bates's hot tub in November 2015, according to The Northwest Arkansas Democrat-Gazette. Bates pleaded not guilty to first-degree murder. In its motion to quash the warrant for the recordings, Amazon argues that the information Alexa receives and gives can reveal the intricacies of a user's personal life and should therefore be protected.
Most people use Google's search-by-image feature either to look for copyright infringement or to shop. See some shoes you like on a frenemy's Instagram? Search will pull up all the matching images on the web, including from sites that will sell you the same pair. To do that, Google's computer vision algorithms had to be trained to extract identifying features such as colors, textures, and shapes from a vast catalogue of images. Luis Ceze, a computer scientist at the University of Washington, wants to encode that same process directly in DNA, making the molecules themselves carry out the computer vision work. And he wants to do it using your photos.