In the early days of digital computing, it was not uncommon to find a radio receiver tuned to a particular frequency (I don't recall which one, sigh) so that the RF emitted by the computer could be picked up and played through the radio. You could tell when a program went into a loop and sometimes you could tell roughly where a computation had reached by the sounds coming from the radio monitor. Fast-forward to the 21st century and we are seeking a different kind of sound: the sound of programming. Bootstrap World has developed online courses in programming, among other subjects, but what makes Bootstrap World so memorable for me is that the team has focused heavily on accessibility. The programming environment is extremely friendly to screen readers so that a blind programmer can navigate easily through complex programs using keyboard navigation coupled with oral descriptions/renderings of the program text and structure.
A look under the hood of any major search, commerce, or social-networking site today will reveal a profusion of "deep-learning" algorithms. Over the past decade, these powerful artificial intelligence (AI) tools have been increasingly and successfully applied to image analysis, speech recognition, translation, and many other tasks. Indeed, the computational and power requirements of these algorithms now constitute a major and still-growing fraction of datacenter demand. Designers often offload much of the highly parallel calculations to commercial hardware, especially graphics-processing units (GPUs) originally developed for rapid image rendering. These chips are especially well-suited to the computationally intensive "training" phase, which tunes system parameters using many validated examples.
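The "training" phase mentioned above can be sketched in miniature. The following is a minimal illustration, not any production system: plain gradient descent tuning a single parameter against a handful of validated examples (the data and learning rate here are made up for the example). Real deep-learning training runs the same loop over millions of parameters, which is why it maps so well onto GPUs.

```python
# A tiny sketch of the "training" phase: tune a model parameter so the
# model's outputs match validated examples. Here the model is y = w * x
# and the examples are made-up (input, expected output) pairs.

examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # validated examples

w = 0.0    # the single tunable parameter
lr = 0.05  # learning rate (step size)

for _ in range(200):
    # Gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in examples) / len(examples)
    w -= lr * grad  # step downhill

print(round(w, 3))  # converges toward 2.0
```

Each pass nudges `w` to reduce the average error on the examples; after enough passes the parameter settles at the value that best explains the data.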
Lucas asserted "... technology-enhanced teaching and learning can dramatically improve the quality and success of higher education ..." His Figure 1 and Figure 2, in outlining traditional versus technology-enhanced courses, suggested traditional teaching methods deliver a low-quality result, while professional (Hollywood) production methods deliver a high-quality result, with, again, no evidence provided. The idea of universities as "content producers" giving students "content" consisting of "course materials and exercises" gave me an analogous idea. Families give food and clothing to their children, but families are inefficient and can involve bloated administrations (parents). Just as parents do more than feed (they try to create an environment where their children can develop and thrive), universities likewise try to create a learning environment for students. Indispensable elements include laboratory work, fieldwork, real essays marked by real scholars (not against a list of bullet points), and project work.
Anyone who has been frustrated asking questions of Siri or Alexa--and then annoyed at the digital assistant's tone-deaf responses--knows how dumb these supposedly intelligent assistants are, at least when it comes to emotional intelligence. "Even your dog knows when you're getting frustrated with it," says Rosalind Picard, director of Affective Computing Research at the Massachusetts Institute of Technology (MIT) Media Lab. "Siri doesn't yet have the intelligence of a dog," she says. Yet developing that kind of intelligence--in particular, the ability to recognize human emotions and then respond appropriately--is essential to the true success of digital assistants and the many other artificial intelligences (AIs) we interact with every day. Whether we're giving voice commands to a GPS navigator, trying to get help from an automated phone support line, or working with a robot or chatbot, we need them to really understand us if we're to take these AIs seriously.
The ability to manipulate and understand data is increasingly critical to discovery and innovation. As a result, we see the emergence of a new field--data science--that focuses on the processes and systems that enable us to extract knowledge or insight from data in various forms and translate it into action. In practice, data science has evolved as an interdisciplinary field that integrates approaches from such data-analysis fields as statistics, data mining, and predictive analytics and incorporates advances in scalable computing and data management. But as a discipline, data science is only in its infancy. The challenge of developing data science in a way that achieves its full potential raises important questions for the research and education community: How can we evolve the field of data science so it supports the increasing role of data in all spheres? How do we train a workforce of professionals who can use data to its best advantage? What should we teach them? What can government agencies do to help maximize the potential of data science to drive discovery and address current and future needs for a workforce with data science expertise?
Across diverse fields, investigators face problems and opportunities involving data. Scientists, scholars, engineers, and other analysts seek new methods to ingest data, extract salient patterns, and then use the results for prediction and understanding. These methods come from machine learning (ML), which is quickly becoming core to modern technological systems, modern scientific workflow, and modern approaches to understanding data. The classical approach to solving a problem with ML follows the "cookbook" approach, one where the scientist shoehorns her data and problem to match the inputs and outputs of a reliable ML method. This strategy has been successful in many domains--examples include spam filtering, speech recognition, and movie recommendation--but it can only take us so far.
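The "cookbook" strategy described above can be made concrete with a small sketch. The example below is illustrative only: the messages, labels, and helper names (`fit`, `predict`) are invented for this illustration. It shows the shoehorning step, recasting "is this message spam?" into the fixed input/output mold of a standard off-the-shelf method, here a tiny naive Bayes classifier.

```python
from collections import Counter
import math

# Made-up training data: (message, label) pairs shoehorned into the
# standard classification mold that a reliable ML method expects.
train = [
    ("win money now", "spam"),
    ("free prize claim now", "spam"),
    ("meeting at noon", "ham"),
    ("lunch with the team", "ham"),
]

def fit(examples):
    """Count word frequencies per class (the 'training' step)."""
    word_counts = {"spam": Counter(), "ham": Counter()}
    class_counts = Counter()
    for text, label in examples:
        class_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, class_counts

def predict(model, text):
    """Pick the class with the highest smoothed log-probability."""
    word_counts, class_counts = model
    vocab = {w for counts in word_counts.values() for w in counts}
    total_examples = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label in class_counts:
        total_words = sum(word_counts[label].values())
        score = math.log(class_counts[label] / total_examples)
        for w in text.split():
            # Add-one smoothing so unseen words don't zero out a class
            score += math.log(
                (word_counts[label][w] + 1) / (total_words + len(vocab))
            )
        if score > best_score:
            best, best_score = label, score
    return best

model = fit(train)
print(predict(model, "claim your free money"))  # -> spam
```

Once the problem is reshaped to fit the method's mold, the method does the rest; the limits of this strategy appear when a problem resists being reshaped into any standard input/output form.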
I was fortunate to enter computing in the era of site funding by the Defense Advanced Research Projects Agency (DARPA). "DARPA sites" from the 1960s through the mid-1990s had sustained investment of $10M/year (inflation adjusted). The talent and vision combined with sustained funding at scale enabled the pursuit of bold, transformative ideas, such as timesharing, entire new operating systems, novel computing system architectures, new models of networking, and a raft of exciting artificial intelligence technologies--in robotics, self-driving cars, computer vision, and more. For example, I joined Arvind's Tagged-Token Dataflow computing project as an MIT graduate student. This single project, which involved a dozen graduate and undergraduate students plus staff and faculty, garnered $13.5M in support with a run rate exceeding $4M/year.
Deep learning might be a booming field these days, but few people remember its time in the intellectual wilderness better than Yann LeCun, director of Facebook Artificial Intelligence Research (FAIR) and a part-time professor at New York University. LeCun developed convolutional neural networks while a researcher at Bell Laboratories in the late 1980s. Now, the group he leads at Facebook is using them to improve computer vision, to make predictions in the face of uncertainty, and even to understand natural language. Your work at FAIR ranges from long-term theoretical research to applications that have real product impact.