

Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings

Neural Information Processing Systems

The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with word embedding, a popular framework to represent text data as vectors which has been used in many machine learning and natural language processing tasks. We show that even word embeddings trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent. This raises concerns because their widespread use, as we describe, often tends to amplify these biases. Geometrically, gender bias is first shown to be captured by a direction in the word embedding. Second, gender neutral words are shown to be linearly separable from gender definition words in the word embedding. Using these properties, we provide a methodology for modifying an embedding to remove gender stereotypes, such as the association between the words receptionist and female, while maintaining desired associations such as between the words queen and female. Using crowd-worker evaluation as well as standard benchmarks, we empirically demonstrate that our algorithms significantly reduce gender bias in embeddings while preserving its useful properties such as the ability to cluster related concepts and to solve analogy tasks. The resulting embeddings can be used in applications without amplifying gender bias.
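The geometric idea in this abstract (gender bias as a direction in the embedding space, removed by projecting it out of gender-neutral words) can be sketched with toy vectors. The embedding values below are invented for illustration and are not from any real trained model:

```python
import numpy as np

# Toy 4-dimensional "embeddings" -- values invented for illustration only.
embeddings = {
    "he":           np.array([ 0.9, 0.1, 0.3, 0.2]),
    "she":          np.array([-0.9, 0.1, 0.3, 0.2]),
    "receptionist": np.array([-0.5, 0.4, 0.1, 0.6]),
}

# Step 1: estimate the gender direction from a definitional pair.
g = embeddings["he"] - embeddings["she"]
g = g / np.linalg.norm(g)

# Step 2: "neutralize" a gender-neutral word by subtracting its
# component along the gender direction.
def neutralize(v, direction):
    return v - np.dot(v, direction) * direction

debiased = neutralize(embeddings["receptionist"], g)

# After neutralization, the word has zero projection onto the gender direction.
print(np.dot(debiased, g))  # ~0.0
```

The full method in the paper also involves an "equalize" step for definitional pairs, but the projection above is the core of how stereotype associations are removed.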


Revamping Python for an AI World

Communications of the ACM

Python is one of the most popular programming languages in existence. Easy to learn and easy to use, it has been around for years, so there is a large community of Python developers to support each other, and it has built up an ecosystem of libraries that allow users to drop in the functionalities they need. It does, however, come with downsides: its programs tend to run slowly, and because it is inefficient at running processes in parallel, it is not well suited to some of the latest artificial intelligence (AI) programming. Hoping to overcome those difficulties, computer scientist Chris Lattner set out to create a new language, Mojo, which offers the ease of use of Python, but the performance of more complex languages such as C or Rust. He teamed up with Tim Davis, whom he had met when they both worked for Google, to form Modular in January 2022.


What would the world's first computer programmer do about bias in AI?

#artificialintelligence

The 11th of October marks Ada Lovelace Day, a special moment in the annual tech calendar. It's an International Day of Recognition that celebrates women in STEM, named after the woman widely recognised as the world's first computer programmer. So immense was Ada Lovelace's contribution in a short life -- she died of illness in 1852 at age 36 -- that her notes provided inspiration for Alan Turing's work on the first modern computers in the 1940s. Ada Lovelace Day provides an opportunity to reflect. Our minds have travelled back not as far as the 1800s but to May this year, when we hosted a panel at the Girls in Tech Australia Conference.


What Is It About Peter Thiel?

The New Yorker

Silicon Valley is not a milieu known for glamour and charisma. Still, Peter Thiel has cultivated a mystique. A billionaire several times over, Thiel was the first outside investor in Facebook; he co-founded PayPal, the digital-payment service, and Palantir, the data-intelligence company that has worked with the U.S. government. He has co-written a business best-seller, "Zero to One," and launched a hedge fund; he now runs three venture-capital firms. In 2018, citing a regional intolerance of conservative perspectives, he moved from Silicon Valley to Los Angeles; he recently purchased a mansion in Miami Beach.


No-code is code – TechCrunch

#artificialintelligence

Today, the release of OpenAI Codex, a new AI system that translates natural language to code, marks the beginning of a shift in how computer software is written. Over the past few years, there's been growing talk about "no code" platforms, but this is no new phenomenon. The reality is, ever since the first programmable devices, computer scientists have regularly developed breakthroughs in how we "code" computer software. The first computers were programmed with switches or punch cards, until the keyboard was invented. Coding became a matter of typing numbers or machine language, until Grace Hopper invented the modern compiler and the COBOL language, ushering in decades of innovation in programming languages and platforms.


12 Bytes by Jeanette Winterson review – how we got here and where we might go next

#artificialintelligence

In Mary Shelley's 1818 novel Frankenstein, a scientist creates life and is horrified by what he has done. Two centuries on, synthetic life, albeit in a far simpler form, has been created in a dish. What Shelley imagined has only now become possible. But as Jeanette Winterson points out in this essay collection, the achievements of science and technology always start out as fiction. Not everything that can be imagined can be realised, but nothing can be realised if it hasn't been imagined first.


New VA tool uses artificial intelligence to predict COVID-19 patient mortality - VAntage Point

#artificialintelligence

Tim Strebel is no stranger to the spirit of innovation. Currently a computer programmer focusing on health informatics at the Washington DC VA Medical Center, Strebel has been recognized by VA for his ingenuity. He's a two-time winner of the VA "Shark Tank" Award, which honors innovative practices, for developing two software packages to automate the jobs of those who work in prosthetics. Both products are used at many VA medical centers. For the eyeglass tool, he also won VA's Gears of Government Award, which recognizes federal employees and teams whose dedication supports exceptional delivery of key outcomes for the American people.


'It's good coding': Computer science students drawn to classes on Sanskrit, a 3,500-year-old language

#artificialintelligence

The course typically attracts students majoring in the study of religion, who are learning the language to further their research into Hinduism, Buddhism and Sikhism. Reading through her class list, however, Mills found that six of the 40 enrolled students were actually computer science majors. "I'm always excited when there are students from an unexpected place," she says. The lingual connection between Sanskrit and computer science, it turns out, has been the subject of interest for quite some time. The first well-known publication that examined the relationship was in 1985, when NASA scientist Rick Briggs published a research paper in which he argued that the 3,500-year-old language was the best candidate for programming artificial intelligence technology – namely because of its adherence to rigid grammatical rules.


How to Remove Gender Bias in Machine Learning Models: NLP and Word Embeddings

#artificialintelligence

Most word embeddings in use are glaringly sexist; let us look at some ways to de-bias such embeddings. Note - This article provides a review of the arguments made by Bolukbasi et al. in the paper "Man is to Computer Programmer as Woman is to Homemaker?" All graphical drawings are made using draw.io. Word embeddings are the core of NLP applications, and often they end up being biased towards a gender due to the inherent stereotypes present in the large text corpora they are trained on. Such models, when deployed to production, can result in further widening of gender inequality and can have far-reaching consequences on our society as a whole. To get a gist of what I'm talking about, here is a snippet from Bolukbasi et al., 2016, "Man is to Computer Programmer as Woman is to Homemaker?"
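The titular analogy is produced by the standard vector-arithmetic analogy test: find the word whose vector is closest to programmer - man + woman. A minimal sketch of that mechanism, using toy vectors invented for illustration rather than values from a real trained model:

```python
import numpy as np

# Toy embeddings -- invented values that mimic the biased geometry
# described in the article, not taken from any real model.
vecs = {
    "man":        np.array([ 1.0, 0.0, 0.2]),
    "woman":      np.array([-1.0, 0.0, 0.2]),
    "programmer": np.array([ 0.8, 0.9, 0.1]),
    "homemaker":  np.array([-0.9, 0.8, 0.2]),
    "queen":      np.array([-0.8, 0.1, 0.9]),
}

def cosine(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# "man is to programmer as woman is to ?": take the nearest neighbour
# of programmer - man + woman, excluding the query words themselves.
target = vecs["programmer"] - vecs["man"] + vecs["woman"]
exclude = {"man", "woman", "programmer"}
answer = max((w for w in vecs if w not in exclude),
             key=lambda w: cosine(vecs[w], target))
print(answer)  # -> homemaker
```

With a real embedding (e.g. word2vec trained on Google News, as in the paper), the same nearest-neighbour query is what surfaces the stereotyped completion; the de-biasing techniques the article reviews aim to change the geometry so it no longer does.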