WOMEN


New AI can work out whether you're gay or straight from a photograph

#artificialintelligence

Artificial intelligence can accurately guess whether people are gay or straight based on photos of their faces, according to new research that suggests machines can have significantly better "gaydar" than humans. The study from Stanford University – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and between gay and straight women 74% of the time – has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology, and the potential for this kind of software to violate people's privacy or be abused for anti-LGBT purposes. The research found that gay men and women tended to have "gender-atypical" features, expressions and "grooming styles", essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.


AI in HR: Artificial intelligence to bring out the best in people

#artificialintelligence

Cornerstone OnDemand's main AI and HR analytics product is Cornerstone Insights, which CTO Mark Goldin called "machine learning in a box." The dispassionate analysis that AI brought to Expedia's recruiting practices can also be applied to performance management, which Holger Mueller, vice president and principal analyst at Constellation Research, considers talent management's core function -- and the part that's most broken. HR is a good target for AI because many HR practices are "handcrafted," cultural in nature and could be better at handling data, according to Josh Bersin, principal and founder of consulting firm Bersin by Deloitte. "The applications of AI basically are analytics applications, where the software is using history and algorithms and data to be smarter and smarter over time," Bersin explained.


Machine Learning Reveals Systematic Sexism in Astronomy

#artificialintelligence

Now, a quantitative study published on Friday in Nature Astronomy demonstrates that gender bias in astronomical research extends even to journal citations, which are an indicator of academic prestige and are linked with better access to grant money, speaking engagements, and professional advancement. Led by Neven Caplar, a PhD student at ETH Zürich's Institute of Astronomy, the new research found that papers with male lead authors were cited 10 percent more frequently than papers led by women, even after controlling for non-gender-specific disparities such as seniority, team size, publication date, field, and academic institution. The team reached this conclusion after using machine learning to analyze a dataset of over 200,000 papers published between 1950 and 2015 in five influential journals: Astronomy & Astrophysics, The Astrophysical Journal, Monthly Notices of the Royal Astronomical Society, Nature, and Science. In cases where first authors used their initials (a tactic women researchers disproportionately use to avoid gender bias), Caplar's team took extra measures, searching publishing records for entries that revealed the authors' full names.
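How does one "control for" such covariates? One common approach, sketched below in Python on synthetic data (an illustration only, not the study's actual pipeline), is to predict citations from the non-gender variables alone and then compare the residual gap between male-led and female-led papers.

```python
# Toy sketch: controlling for non-gender covariates when comparing citations.
# All data here is synthetic; this is not the authors' published method.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 5000
# Synthetic covariates: seniority (years), team size, publication year
X = np.column_stack([
    rng.uniform(0, 40, n),        # seniority
    rng.integers(1, 20, n),       # team size
    rng.integers(1950, 2016, n),  # publication year
])
gender = rng.integers(0, 2, n)    # 0 = male lead author, 1 = female (synthetic label)
citations = rng.poisson(10, n)    # synthetic citation counts

# Predict citations from the non-gender covariates alone
model = RandomForestRegressor(n_estimators=100).fit(X, citations)
expected = model.predict(X)

# Residual gap by gender: with this random data the gap should be near zero;
# a persistent nonzero gap is the kind of signal the study reports.
residual = citations - expected
gap = residual[gender == 0].mean() - residual[gender == 1].mean()
print(f"residual citation gap (male - female leads): {gap:.2f}")
```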


How Bayesian Inference Works

@machinelearnbot

Since there are 25 long-haired women and 2 long-haired men, guessing that the ticket owner is a woman is a safe bet. To lay our foundation, we need to quickly mention four concepts: probabilities, conditional probabilities, joint probabilities and marginal probabilities. The probability of a thing happening is the number of ways that thing can happen divided by the total number of things that can happen. Combining these by multiplication gives the joint probability: P(woman with short hair) = P(woman) * P(short hair | woman).
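The arithmetic behind that formula is easy to check. Here is a minimal Python sketch using the counts from the excerpt (25 long-haired women, 2 long-haired men); the per-gender totals are an assumption for illustration, since the excerpt does not give them.

```python
# Minimal sketch of the Bayesian reasoning in the excerpt.
n_women, n_men = 50, 50            # assumed totals (not given in the excerpt)
long_women, long_men = 25, 2       # counts given in the excerpt

total = n_women + n_men

# Marginal and conditional probabilities
p_woman = n_women / total                  # P(woman)
p_long_given_woman = long_women / n_women  # P(long hair | woman)

# Joint probability: P(woman and long hair) = P(woman) * P(long hair | woman)
p_woman_and_long = p_woman * p_long_given_woman

# Marginal probability of long hair (summed over both genders)
p_long = (long_women + long_men) / total

# Bayes' rule: P(woman | long hair) = P(woman and long hair) / P(long hair)
p_woman_given_long = p_woman_and_long / p_long
print(f"P(woman | long hair) = {p_woman_given_long:.3f}")  # 25/27 ~ 0.926, so 'woman' is the safe bet
```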


Lazy coders are training artificial intelligences to be sexist

New Scientist

Employers: do the ladies on your payroll have any "female weaknesses" that would make them mentally or physically unfit for the job? The question comes to you courtesy of the year 1943. It was posed in a guide to hiring women, published in Transportation Magazine for the flummoxed male supervisors tasked with integrating a new female workforce during a wartime shortage of manpower. Back then, you wouldn't be surprised to see logical reasoning like "men are to programmers as women are to homemakers", or "men are to surgeons as women are to nurses".
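That kind of analogy is precisely what word-embedding models trained on human-written text have been shown to reproduce. Below is a sketch of how one might probe for it with gensim; the pretrained Google News vectors and the local file path are assumptions for illustration, not something the article specifies.

```python
# Sketch: probing a word-embedding model for learned gender analogies.
# Assumes the pretrained Google News word2vec vectors have been downloaded;
# the local file path is an assumption for illustration.
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# Vector arithmetic for "man is to programmer as woman is to ?":
# programmer - man + woman, then take the nearest neighbors.
print(vectors.most_similar(positive=["woman", "programmer"], negative=["man"], topn=3))
print(vectors.most_similar(positive=["woman", "surgeon"], negative=["man"], topn=3))
# Embeddings trained on biased text tend to surface stereotyped completions.
```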


Google researchers develop a test for machine learning bias - SiliconANGLE

#artificialintelligence

A team of researchers at Google Inc. has developed a method for testing whether machine learning algorithms inject bias, such as gender or racial bias, into their decision-making processes. For some time, concerns have been raised about the possibility that machine learning algorithms are injecting bias into applications such as advertising, credit, education, employment and justice. Recent examples include a crime prediction algorithm that targeted black neighborhoods and an online advertising platform that was found to show highly paid executive jobs to men more often than women. "Decisions based on machine learning can be both incredibly useful and have a profound impact on our lives," said Moritz Hardt, a senior research scientist at Google, who co-authored the paper, "Equality of Opportunity in Supervised Learning." "Despite the demand, a vetted methodology for avoiding discrimination against protected attributes in machine learning is lacking."
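The criterion the paper proposes, equality of opportunity, is simple to state: among people who truly qualify for the favorable outcome, the classifier's acceptance rate (true positive rate) should not depend on the protected attribute. Here is a minimal sketch of that check on synthetic data; it illustrates the criterion itself, not Google's test code.

```python
# Minimal sketch of an equality-of-opportunity check on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)   # ground-truth outcome (1 = qualified)
y_pred = rng.integers(0, 2, size=1000)   # classifier decision (1 = accepted)
group = rng.integers(0, 2, size=1000)    # protected attribute (0 or 1)

def true_positive_rate(y_true, y_pred, mask):
    """Acceptance rate among truly qualified members of the masked group."""
    qualified = (y_true == 1) & mask
    return (y_pred[qualified] == 1).mean()

tpr_a = true_positive_rate(y_true, y_pred, group == 0)
tpr_b = true_positive_rate(y_true, y_pred, group == 1)
print(f"TPR group 0: {tpr_a:.2f}, TPR group 1: {tpr_b:.2f}")
# A large gap between the two rates signals a violation of equality of opportunity.
```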


What Female Body Part Do Men Find Sexiest On Women? Guys Prefer Chests Over Behinds, Study Says

International Business Times

Straight men find curvy women more attractive, according to a Dec. 13 survey conducted by fitness equipment review site FitRated. The survey, which asked over 2,000 straight and gay people what they found to be the sexiest body part on a person, also found that straight men were most attracted to a woman's chest whether she had an average or curvy build, but preferred a nice butt on a thin woman. As for the ladies, both straight and gay women found average body types more attractive. Straight women found arms to be the sexiest feature on men, whether they were of average build, muscular or thin. Similar to straight men, gay women found a woman's chest to be her sexiest feature, but found both the chest and stomach to be the sexiest body parts on thin women.


How We Teach CS2All, and What to Do About Database Decay

Communications of the ACM

For many years I have been part of discussions about how to diversify computing, particularly about how we recruit and retain a more diverse cohort of computer science (CS) students. I wholeheartedly support this goal, and spend a considerable amount of my effort as chair of ACM-W helping to drive programs that focus on one aspect of this diversification, namely encouraging women students to stay in computing. Of late I have become very concerned about how some elements of the diversity argument are being expressed and then implemented in teaching practices.
Problem 1. Women are motivated by social relevance, so when we teach them we have to discuss ways in which computing can contribute to the social good.
Problem 2. Students from underrepresented minorities (URM) respond to culturally relevant examples, so when we teach them we have to incorporate these examples into course content.


2017: Rise of the Women (in Tech) – The Mission

#artificialintelligence

How the biggest disruption next year will come from women, not robots. Every year, we make predictions in tech about what we will see more of in the following year, whether it is the rise of virtual reality, chatbots, robots or (can it still possibly be) mobile. One thing is for sure: not many are banking on it being women.