The term AI, or "Artificial Intelligence," encompasses attempts to create machines that can match or surpass human intelligence. One way this is measured is through a machine's ability to hold a conversation. The Turing Test, devised by British mathematician Alan Turing in the 1950s, holds that a machine is intelligent if a person communicating with it is sufficiently fooled into thinking they are talking with another human (Du Sautoy 2019, 6). An increasingly popular alternative measures a machine's ability to create: the Lovelace Test. Named after another British mathematician, Ada Lovelace, this test holds that for a machine to be considered intelligent, it must be able to produce something beyond the boundaries of what was programmed into it (Du Sautoy 2019, 2).
More than a decade has passed since the British government issued an apology to the mathematician Alan Turing. The tone of pained contrition was appropriate, given Britain's grotesquely ungracious treatment of Turing, who played a decisive role in cracking the German Enigma cipher, allowing Allied intelligence to predict where U-boats would strike and thus saving tens of thousands of lives. Unapologetic about his homosexuality, Turing had made a careless admission of an affair with a man, in the course of reporting a robbery at his home in 1952, and was arrested for an "act of gross indecency" (the same charge that had led to a jail sentence for Oscar Wilde in 1895). Turing was subsequently given a choice between serving prison time and undergoing a hormone treatment meant to suppress the testosterone that (so the thinking went at the time) made him desire men. Turing opted for the latter and, two years later, ended his life by taking a bite from an apple laced with cyanide.
John McCarthy was the son of a penniless Irish immigrant from Kerry and perhaps the most important Irish American you have never heard of. He died in 2011, aged 84. A pioneering American computer scientist and inventor, he is known as the father of artificial intelligence (AI) after playing a key role in the development of the intelligent machines we now call computers. He won the Turing Award, often called the Nobel Prize of computing, in 1971. He coined the term artificial intelligence in a 1955 proposal for the Dartmouth College conference he chiefly organized, which was the first-ever AI conference.
AI has a long history. One can argue it began long before the term was coined, first in stories and later in actual mechanical devices called automata. This chapter covers only the events relevant to the periods of AI winters, without being exhaustive, in the hope of extracting knowledge that can be applied today. To aid understanding of the phenomenon of AI winters, the events leading up to them are examined. Many early ideas about thinking machines appeared in the late 1940s and 1950s, put forward by people such as Turing and von Neumann.
A workshop held in 1956 at Dartmouth College, Hanover, NH, is usually considered the beginning of artificial intelligence. Participants included John McCarthy and Marvin Minsky. Alan Turing and Konrad Zuse, who had already dealt with the topic in the 1940s, are also cited among the founders of the discipline. For decades, machine chess was considered the pinnacle of artificial intelligence; it was not until 1997 that IBM's Deep Blue program was able to beat then-world chess champion Garry Kasparov.
In his recently published book "Astounding," the author Alec Nevala-Lee brings American science fiction's Golden Age back into focus by following four key figures: John W. Campbell, Robert A. Heinlein, L. Ron Hubbard -- and Isaac Asimov, who officially turned 100 today (his exact birth date was unknown). Nevala-Lee's warts-and-all portrait paints Asimov -- known to his fans as the Good Doctor -- far more sympathetically than the genre's other founding fathers. But Nevala-Lee is clear about another aspect of Asimov's story: He was someone who unapologetically groped women. As recounted in "Astounding," Judith Merrill said Asimov was known in his younger days as "the man with a hundred hands." Harlan Ellison wrote, "Whenever we walked up the stairs with a young woman, I made sure to walk behind her so Isaac wouldn't grab her tush."
Every time I see this man I get sad and angry. I took a cryptology course two years ago, taught by a couple of seemingly very nice professors. At the end of the semester, we were given a list of important personalities and events in the history of cryptology. Alan Turing was listed there and, as a closeted gay male, I volunteered to do a presentation about him. I kept my presentation focused solely on Alan's achievements in cryptology; the man did crack the Nazi Enigma machine's code and shortened the war by quite a few years, after all.
According to an unofficial consensus, the birth of artificial intelligence as an independent research project can be dated to the summer of 1956, when John McCarthy, then a member of the mathematics department at Dartmouth College, persuaded the Rockefeller Foundation to finance an investigation: "The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it." In addition to McCarthy (who was a professor at Stanford University until 2000 and who coined the term "artificial intelligence"), several other participants took part in the historic workshop at Dartmouth: Marvin Minsky (longtime professor at MIT), Claude Shannon (founder of information theory), Herbert Simon (later a Nobel Prize winner in economics), Arthur Samuel (developer of a pioneering checkers-playing program), and half a dozen further experts from science and industry who dreamed that it might be possible to build a machine to cope with tasks that, by the prevailing view of the time, required human intelligence. The Dartmouth manifesto, written at the dawn of the AI age, is both puzzling and vague: it is not clear whether the conference participants believed that machines would one day actually think, or merely behave as if they could. The word "simulate" admits both interpretations.
In this series, The Week looks at the ideas and innovations that permanently changed the way we see the world. Artificial intelligence (AI), sometimes referred to as machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence of humans. AI is the ability of a computer program or a machine to think and learn, so that it can work on its own without being explicitly given commands. The term was first coined by American computer scientist John McCarthy in 1955. Human intelligence is "the combination of many diverse abilities", says Encyclopaedia Britannica.