Reading medieval literature, it's hard not to be impressed with how much the characters get done--as when we read about King Harold doing battle in one of the Sagas of the Icelanders, written in about 1230. The first sentence bristles with purposeful action: "King Harold proclaimed a general levy, and gathered a fleet, summoning his forces far and wide through the land." By the end of the third paragraph, the king has launched his fleet against a rebel army, fought numerous battles involving "much slaughter in either host," bound up the wounds of his men, dispensed rewards to the loyal, and "was supreme over all Norway." What the saga doesn't tell us is how Harold felt about any of this, whether his drive to conquer was fueled by a tyrannical father's barely concealed contempt, or whether his legacy ultimately surpassed or fell short of his deepest hopes. Compare that with modern fiction: in David Foster Wallace's short story "Forever Overhead," the 13-year-old protagonist takes 12 pages to walk across the deck of a public swimming pool, wait in line at the high diving board, climb the ladder, and prepare to jump.
Our future will be bright, fast--and full of robots. It'll be more Asimov than Terminator: servant robots, more or less similar to us. Some will be upright androids, but most will be boxes filled with computer chips running software agents. And there will be a lot of them. Forecasts suggest that, within just three years, we'll have 1.7 million robots in industry, 32 million in our households, and 400,000 in professional offices.1 Robots will begin to run our factories.
In early 1999, during the halftime of a University of Washington basketball game, a time capsule from 1927 was opened. Among the contents of this portal to the past were some yellowing newspapers, a Mercury dime, a student handbook, and a building permit. The crowd promptly erupted into boos. One student declared the items "dumb." Such disappointment in time capsules seems to be endemic, suggests William E. Jarvis in his book Time Capsules: A Cultural History.
People too often forget that IQ tests haven't been around that long. Indeed, such psychological measures are only about a century old. Early versions appeared in France with the work of Alfred Binet and Theodore Simon in 1905. However, these tests didn't become associated with genius until the measure moved from the Sorbonne in Paris to Stanford University in Northern California. There Professor Lewis M. Terman had it translated from French into English, and then standardized it on a sufficient number of children, to create what became known as the Stanford-Binet Intelligence Scale. The original motive behind these tests was diagnostic: to select children at the lower end of the intelligence scale who might need special education to keep up with the school curriculum. But then Terman got a brilliant idea: Why not study a large sample of children who score at the top end of the scale?
The sun formed 4.5 billion years ago, but it's got around 6 billion years more before its fuel runs out. It will then flare up, engulfing the inner planets. And the expanding universe will continue--perhaps forever--destined to become ever colder, ever emptier. To quote Woody Allen, eternity is very long, especially toward the end. Any creatures witnessing the sun's demise won't be human--they'll be as different from us as we are from a bug.
The science-fiction writer Robert Heinlein once wrote, "Each generation thinks it invented sex." He was presumably referring to the pride each generation takes in defining its own sexual practices and ethics. But his comment hit the mark in another sense: Every generation has to reinvent sex because the previous generation did a lousy job of teaching it. In the United States, the conversations we have with our children about sex are often awkward, limited, and brimming with euphemism. At school, if kids are lucky enough to live in a state that allows it, they'll get something like 10 total hours of sex education.1
Mattel's AI nanny, called Aristotle, recently gained the notorious distinction of being the subject of a bipartisan protest in the US Congress, as well as a petition against it with over 15,000 signatures. The Campaign for a Commercial-Free Childhood, which organized the petition, argued that Aristotle is a consumerist ploy. It "attempts to replace the care, judgment and companionship of loving family members with faux nurturing and conversation from a robot designed to sell products and build brand loyalty." Aristotle, designed to interact with kids, was based on the same technologies as virtual assistants such as Amazon's Alexa.
When Adrian Owen, a neuroscientist at the University of Western Ontario, asked Scott Routley to imagine playing a game of tennis, any acknowledgement would have been surprising. After all, Routley had been completely unresponsive for the 12 years since his severe traumatic brain injury. He was thought to be in a vegetative state: complete unawareness of self or environment. But, as Owen watched Routley's brain inside a functional magnetic resonance imaging (fMRI) scanner, he saw a region of the motor cortex called the supplementary motor area--thought to play a role in movement--light up with activity. When he told Routley to relax, the activity ceased.
Our online personality is now as measurable as our carbon footprint. In addition to some rather obvious statistics, such as how often we tweet, how many others we follow, and how many others follow us, we principally reveal ourselves in our choice of words. How often I refer to "I," "me," "myself," "mine," and "my" can tell you a good deal about my propensity for self-absorption, while a frequent use of "we" and "our" indicates a willingness to share either credit or blame. The frequency with which I use "you" or "your" is just as revealing of a desire to channel my feelings outward, and if I also show a partiality for negative words, that pairing of observables is strongly indicative of hostility. Frequent use of "LOL," "OMG," and the exclamation point reveals an excitable personality, while emoticons and hashtags such as #irony and #sarcasm make explicit not just my feelings but a playful stance toward the content of my own tweets.
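The word-counting approach described above can be sketched in a few lines of code. This is a toy illustration, not any specific published instrument: the word lists and the `pronoun_profile` function are hypothetical stand-ins for the categories named in the text (self-directed, group-directed, and other-directed words).

```python
import re
from collections import Counter

# Hypothetical word lists matching the categories discussed above.
SELF_WORDS = {"i", "me", "myself", "mine", "my"}
GROUP_WORDS = {"we", "our", "ours", "us"}
OTHER_WORDS = {"you", "your", "yours"}

def pronoun_profile(text):
    """Return the share of self-, group-, and other-directed words.

    A minimal sketch assuming simple lowercase tokenization;
    real text-analysis tools handle far more linguistic detail.
    """
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words) or 1  # avoid division by zero on empty input
    return {
        "self": sum(counts[w] for w in SELF_WORDS) / total,
        "group": sum(counts[w] for w in GROUP_WORDS) / total,
        "other": sum(counts[w] for w in OTHER_WORDS) / total,
    }

profile = pronoun_profile("I think my work speaks for itself, and you should read it.")
```

On this sample sentence, the "self" share ("I," "my") comes out higher than the "other" share ("you"), the kind of imbalance the analysis treats as a signal.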
Errol Morris feels that Thomas Kuhn saved him from a career he was not suited for--by having him thrown out of Princeton. In 1972, Kuhn was a professor of philosophy and the history of science at Princeton, and author of The Structure of Scientific Revolutions, which gave the world the term "paradigm shift." As Morris tells the story in his recent book, The Ashtray, Kuhn was antagonized by Morris' suggestions that Kuhn was a megalomaniac and The Structure of Scientific Revolutions was an assault on truth and progress. To say the least, Morris, then 24, was already the iconoclast who would go on to make some of the most original documentary films of our time. After launching the career he was suited for with Gates of Heaven in 1978, a droll affair about pet cemeteries, Morris earned international acclaim with The Thin Blue Line, which led to the reversal of a murder conviction of a prisoner who had been on death row. In 2004, Morris won an Academy Award for The Fog of War, a dissection of former Secretary of Defense Robert McNamara, a major architect of the Vietnam War. His 2017 film, Wormwood, a miniseries on Netflix, centers on the mystery surrounding a scientist who in 1953 worked on a biological warfare program for the Army, and suspiciously fell to his death from a hotel room. The Ashtray--Morris explains the title in our interview below--is as arresting and idiosyncratic as Morris' films.