I Teach Computer Science, and That Is Not All
"I teach computer science, and that is all," wrote Boaz Barak, of Harvard University, in a recent op-ed in The New York Times. The main point of the op-ed was to protest the growing politicization of U.S. higher education, especially at elite universities, where many faculty members have proceeded from scholarship to advocacy. But in spite of the provocative title, the content of Barak's op-ed is much more nuanced. "We should not normalize bringing one's ideology to the classroom," wrote Barak, and I could not agree more. But he also wrote, "The interaction of computer science and policy sometimes arises in my classes, and I make sure to present multiple perspectives." Here, Barak is advocating fairness and balance rather than neutrality and avoidance of non-technical topics.
How one elite university is approaching ChatGPT this school year
The big thing this year seems to be the same one that defined the end of last year: ChatGPT and other large language models. Last winter and spring brought a flood of headlines about AI in the classroom, with some panicked schools going as far as to ban ChatGPT altogether. My colleague Will Douglas Heaven wrote that it wasn't time to panic: generative AI, he argued, is going to change education, not destroy it. Now, with the summer months having offered some time for reflection, some schools seem to be reconsidering their approach. For a perspective on how higher education institutions are now approaching the technology in the classroom, I spoke with Jenny Frederick.
How Compute Divide Leads To Discrimination In AI Research
Science doesn't discriminate, but technology probably does, at least in terms of accessibility. New research has found that the unequal distribution of compute power in academia is promoting inequality in the era of deep learning. The study, conducted jointly by AI researchers from Virginia Tech and Western University, found that this de-democratisation of AI has pushed people to leave academia for high-paying industry jobs. The study found that the amount of compute power at elite universities, those ranked in the top 50 of the QS World University Rankings, far exceeds that at mid-to-low-tier institutions. For the research, the authors analysed over 170,000 papers presented at 60 prestigious computer science conferences, such as ACL, ICML, and NeurIPS, in categories like computer vision, data mining, NLP, and machine learning.
Where is the accountability for AI ethics gatekeepers?
Elite institutions, the self-appointed arbiters of ethics, are guilty of racism and unethical behavior but face zero accountability. In July 2020, MIT took a frequently cited and widely used dataset offline after two researchers found that the '80 Million Tiny Images' dataset used racist, misogynistic terms to describe images of Black and Asian people. According to The Register, Vinay Prabhu, a data scientist of Indian origin working at a startup in California, and Abeba Birhane, an Ethiopian PhD candidate at University College Dublin, discovered that thousands of images in the MIT database were "labeled with racist slurs for Black and Asian people, and derogatory terms used to describe women." This problematic dataset was created back in 2008; left unchecked, it would have continued to spawn biased algorithms and introduce prejudice into AI models that used it as a training dataset. The incident also highlights a pervasive tendency in this space to put the onus of solving ethical problems created by questionable technologies back on the marginalized groups negatively impacted by them. IBM's recent decision to exit the facial recognition industry, followed by similar measures by other tech giants, was in no small part due to the foundational work of Timnit Gebru, Joy Buolamwini, and other Black women scholars.