When the Dutch arrived in New York Harbor in 1609, Staten Island--or Staaten Eylandt, as they named it--was a wild wonderland, woodland in the middle and tidal salt marsh on the edges, populated by the local Lenape tribe, plus an embarrassment of natural riches: eels, bluefish, bitterns, herons, muskrats, ducks, clams, crabs, wild turkeys, porpoises, and more. Jutting midway into the island from the west coast, like a hook in the island's side, was the Fresh Kills estuary, a tidal wetland thriving with plants and critters, created by the retreat of the Wisconsin Ice Sheet some 17,000 years ago. After World War II, the bursting city of New York found itself with a trash problem. In 1948, the city started officially dumping its trash into the marshes and waters of Fresh Kills. The dump was meant to be temporary, but it stuck. By 1955, it was the biggest landfill in the world--indeed, at 2,700 acres, it was the biggest human-made structure in the world. By 1991, the landfill contained 150 million tons of tightly packed garbage, a volume greater than that of the Great Wall of China. Fresh Kills was wetland no more.
To death and taxes, Benjamin Franklin's binary list of life's certainties, add the expectation that a familiar six-note sequence must resolve. Although we ponder ways to avoid or evade Franklin's pair of unavoidable events, we generally accept this more benign certainty as immutable. The penultimate note of such a tune generates anticipation so strong and specific that you may find it difficult to continue reading without resolving the sequence in your head. That anxious pause is key to composition and to music's power. It creates a sense of prophetic certainty that allows musicians to play against expectations, thwarting what listeners assume must come. The controlled manipulation of certainty and likelihood lurks behind those magical moments in which music has caused a shiver or a tear to fall. By infusing uncertainty or surprise into the mix, musicians literally play on our emotions.
My high school biology teacher, Mr. Whittington, put a framed picture of a primate ancestor in the front of his classroom--a place of reverence. In a deeply religious and conservative community in rural America, this was a radical act. Evolution, among the best-supported scientific theories in human history, was then, and still is, deliberately censored from biological science education. But Whittington taught evolution unapologetically, as "the single best idea anybody ever had," as the philosopher Dan Dennett described it. Whittington saw me looking at the primate in wonder one day and said, "Cristine, look at its hands."
After the fall of the Berlin Wall, East German citizens were offered the chance to read the files kept on them by the Stasi, the much-feared Communist-era secret police service. To date, it is estimated that only 10 percent have taken the opportunity. In 2007, James Watson, the co-discoverer of the structure of DNA, asked that he not be given any information about his APOE gene, one allele of which is a known risk factor for Alzheimer's disease. Most people tell pollsters that, given the choice, they would prefer not to know the date of their own death--or even the future dates of happy events. Each of these is an example of willful ignorance.
Now tell me: How much time has passed since you first logged on to your computer today? Time may be a property of physics, but it is also a property of the mind, which ultimately makes it a product of the brain. Time measures out and shapes our lives, and how we live our lives in turn affects how we perceive the passage of time. Your sense of time is malleable and subjective--it changes in response to changing contexts and input, and it can be distorted when the brain is damaged, or affected by drugs, disease, sleep deprivation, or naturally altered states of consciousness. However, a new set of neuroscience research findings suggests that losing track of time is also intimately bound up with creativity, beauty, and rapture.
Asked at the start of the final 1988 presidential debate whether he would support the death penalty if his wife were raped and murdered, Michael Dukakis, a lifelong opponent of capital punishment, quickly and coolly said no. It was a surprising, deeply personal, and arguably inappropriate question, and in demonstrating an unwavering commitment to his principles, Dukakis had, arguably, handled it well. Yet "the reporters sensed it instantly," wrote Roger Simon of the scene at the debate immediately after Dukakis gave his response. "Even though the 90-minute debate was only seconds old, they felt it was already over for Dukakis." Dukakis' poll numbers plummeted, his campaign never recovered, and George H. W. Bush became the 41st President of the United States.
Silicon Valley has a term for startups that reach the $1 billion valuation mark: unicorns. It suggests not only that hugely successful startups are rare, but also that there's something unreal about them. Few unicorns proved as unreal as Theranos. Founded by a 19-year-old Stanford dropout, Elizabeth Holmes, who went on to become the world's youngest self-made female billionaire, it raised nearly a billion dollars from investors and was valued at $10 billion at its peak. It claimed to have developed technology that dramatically increased the affordability, convenience, and speed of blood testing. It partnered with Safeway and Walgreens, which together spent hundreds of millions of dollars building in-store clinics that were to offer Theranos tests. Tens of thousands of Americans had their blood tested by its proprietary technology. The problem was that Theranos' technology was never close to ready.
Reprinted with permission from Quanta Magazine's Abstractions blog. When he talks about where his fields of neuroscience and neuropsychology have taken a wrong turn, David Poeppel of New York University doesn't mince words. "There's an orgy of data but very little understanding," he said to a packed room at the American Association for the Advancement of Science annual meeting in February. He decried the "epistemological sterility" of experiments that do piecework measurements of the brain's wiring in the laboratory but are divorced from any guiding theories about behaviors and psychological phenomena in the natural world. It's delusional, he said, to think that simply adding up those pieces will eventually yield a meaningful picture of complex thought.
Once you're called a "genius," what's left? Not much: getting called a "genius" is the final accolade, the last laudatory label for anyone. At least that's how several members of Mensa, an organization of those who've scored in the 98th percentile on an IQ test, see it. "I don't look at myself as a genius," LaRae Bakerink, a business consultant and a Mensa member, said. "I think that's because I see things other people have done, things they have created, discovered, or invented, and I look at those people in awe, because that's not a capability I have."
You die at the beginning of Mass Effect 2. It's 2183, and you--Commander Shepard--have just saved every space-faring species in the Milky Way from an extra-galactic threat. In the resulting explosion, you're flung into the void, drifting as you struggle to breathe. The military logs you as "killed in action." But of course, a deceased protagonist does not a sequel make. Your corpse is soon found and brought back to life.