New AI can work out whether you're gay or straight from a photograph


Artificial intelligence can accurately guess whether people are gay or straight based on photos of their faces, according to new research that suggests machines can have significantly better "gaydar" than humans. The study from Stanford University – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and 74% for women – has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology, and the potential for this kind of software to violate people's privacy or be abused for anti-LGBT purposes. The research found that gay men and women tended to have "gender-atypical" features, expressions and "grooming styles", essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.

AI in HR: Artificial intelligence to bring out the best in people


Its main AI and HR analytics product is Cornerstone Insights, which CTO Mark Goldin called "machine learning in a box." The dispassionate analysis that AI brought to Expedia's recruiting practices can also be applied to performance management, which Holger Mueller, vice president and principal analyst at Constellation Research, considers talent management's core function -- and the part that's most broken. HR is a good target for AI because many HR practices are "handcrafted," cultural in nature and could be better at handling data, according to Josh Bersin, principal and founder of consulting firm Bersin by Deloitte. "The applications of AI basically are analytics applications, where the software is using history and algorithms and data to be smarter and smarter over time," Bersin explained.

Machine Learning Reveals Systematic Sexism in Astronomy


Now, a quantitative study published on Friday in Nature Astronomy demonstrates that gender bias in astronomical research extends even to journal citations, which are an indicator of academic prestige and are linked with better access to grant money, speaking engagements, and professional advancement. Led by Neven Caplar, a PhD student at ETH Zürich's Institute of Astronomy, the new research found that papers with male lead authors were cited 10 percent more frequently than papers led by women, even after controlling for non-gender-specific disparities such as seniority, team size, publication date, field, and academic institution. The team reached this conclusion after using machine learning to analyze a dataset of over 200,000 papers published between 1950 and 2015 in five influential journals: Astronomy & Astrophysics, The Astrophysical Journal, Monthly Notices of the Royal Astronomical Society, Nature, and Science. In cases where first authors used their initials (a tactic women researchers disproportionately use to avoid gender bias), Caplar's team took extra steps to recover the authors' full names from their publishing records.

How Bayesian Inference Works


Since there are 25 long-haired women and 2 long-haired men, guessing that the ticket owner is a woman is a safe bet. To lay our foundation, we need to quickly mention four concepts: probabilities, conditional probabilities, joint probabilities and marginal probabilities. The probability of a thing happening is the number of ways that thing can happen divided by the total number of things that can happen. Combining these by multiplication gives the joint probability: P(woman with short hair) = P(woman) * P(short hair | woman).
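These four quantities can be computed directly from counts. A minimal Python sketch, assuming a toy population of 50 women and 50 men chosen to match the article's long-hair counts (the 50/50 split and the short-hair numbers are assumptions for illustration):

```python
# Assumed toy population: 50 women (25 long-haired) and 50 men (2 long-haired).
women, men = 50, 50
long_w, long_m = 25, 2
total = women + men

p_woman = women / total                          # probability
p_long_given_woman = long_w / women              # conditional probability
p_woman_and_long = p_woman * p_long_given_woman  # joint probability
p_long = (long_w + long_m) / total               # marginal probability

# Bayes' rule: P(woman | long hair) = P(long | woman) * P(woman) / P(long hair)
p_woman_given_long = p_long_given_woman * p_woman / p_long
print(round(p_woman_given_long, 3))  # prints 0.926
```

Note that the final answer, 25/27 ≈ 0.926, depends only on the long-hair counts the article gives; the assumed totals cancel out in Bayes' rule.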

Why is Russia so good at encouraging women into tech?

BBC News

Irina Khoroshko, from Zelenograd near Moscow, had learned her times tables by the age of five. Her precocious talent, encouraged by a maths-mad family and a favourite female teacher who transformed every lesson into one giant problem-solving game, led to a degree in mathematical economics at Plekhanov Russian University of Economics. "My lecturer instilled in me the power of numbers and calculation, how it gives you the ability to predict things; in that sense the subject always felt magical," she says. Now Irina, 26, is a data scientist at the Russian online lender ID Finance, enjoying a lucrative career devising analytical models to determine loan eligibility. And this isn't an unusual story in Russia.

Why We Need More Women Taking Part In The AI Revolution


In 2011, entrepreneur and investor Marc Andreessen wrote his famous "Why Software Is Eating the World" essay in the Wall Street Journal. Today, that story would more likely read, "Why Artificial Intelligence Is Eating the World." The market for artificial intelligence (AI) technologies, from voice and image recognition to chatbots to self-driving cars, is hot. A Narrative Science survey found last year that 38% of enterprises are already using AI, and that number will grow to 62% by 2018. Like the tech industry at large, the field of artificial intelligence is dominated by white men.

Demystify the technology that creates AI


Beware of relying uncritically on big data computer systems, warns a St. Mary's University professor undertaking a five-year research project dubbed Where Science Meets Fiction: Social Robots and the Ethical Imagination. "There are real dangers now with big data," said Dr. Teresa Heffernan, the professor leading the project. "Algorithms have the same biases as humans." With the project, Heffernan hopes to demystify the technology that creates artificial intelligence and to bring together experts from all walks of life to begin a dialogue about how humans and these machines should interact -- what to do and what not to do. "I want to shift the conversation that has been shaped by Silicon Valley . . . to make it more open and question the rhetoric, to demystify the technology and expose how the technology works rather than be dominated by it," said Heffernan.

Oops. You've been picking the wrong dating profile pics all this time.


Think you've obsessed over your dating profile pics and finally managed to get them just right? Sorry, but you'd better think again. The dating app Hinge sifted through thousands of its users' photos and figured out what gets likes and what gets ignored. The results are pretty surprising. Hinge's profile requires you to add six photos, and people can like them and start conversations based on them individually.

Even artificial intelligence can acquire biases against race and gender


Computers learning from human writing automatically see certain occupational words as masculine and others as feminine. One of the great promises of artificial intelligence (AI) is a world free of petty human biases. Hiring by algorithm would give men and women an equal chance at work, the thinking goes, and predicting criminal behavior with big data would sidestep racial prejudice in policing. But a new study shows that computers can be biased as well, especially when they learn from us. When algorithms glean the meaning of words by gobbling up lots of human-written text, they adopt stereotypes very similar to our own.
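The stereotype pickup described here can be made concrete with word vectors: an occupation word counts as gender-associated when it sits closer, by cosine similarity, to one gendered word than the other. A minimal sketch, using hand-made 2-dimensional vectors as stand-ins (real studies of this kind use learned embeddings with hundreds of dimensions, and the words and values below are illustrative assumptions):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical toy "embeddings": the first axis loosely tracks contexts
# shared with "he", the second contexts shared with "she".
vec = {
    "he":       (1.0, 0.1),
    "she":      (0.1, 1.0),
    "engineer": (0.9, 0.3),
    "nurse":    (0.2, 0.9),
}

# Positive bias score = closer to "he"; negative = closer to "she".
for job in ("engineer", "nurse"):
    bias = cosine(vec[job], vec["he"]) - cosine(vec[job], vec["she"])
    print(job, round(bias, 2))
```

In embeddings trained on real human-written text, occupation words end up skewed in exactly this way, which is how the study's measurements surface our stereotypes.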

Flipboard on Flipboard


Open up the photo app on your phone and search "dog," and all the pictures you have of dogs will come up. This was no easy feat. Your phone knows what a dog "looks" like. This and other modern-day marvels are the result of machine learning: programs that comb through millions of pieces of data and start making correlations and predictions about the world.