As a combat veteran and, more recently, an industry technologist and university professor, I have observed with concern the increasing automation--and dehumanization--of warfare. Sarah Underwood's discussion of autonomous weapons in her news story "Potential and Peril" (June 2017) highlighted this trend and reminded me of the current effort to update the ACM Code of Ethics, which says nothing about the responsibilities of ACM members in defense industries building the software and hardware in weapons systems. Underwood said understanding the limitations, dangers, and potential of autonomous and other warfare technologies must be a priority for those designing such systems, in order to minimize the "collateral damage" of civilian casualties and the destruction of property and infrastructure. Defense technologists must be aware of, and follow, appropriate ethical guidelines for creating and managing automated weapons systems of any kind.
This has changed in the last two decades due to progress in Satisfiability (SAT) solving, which, by adding brute reason, turns brute force into a powerful approach that deals with many problems easily and automatically. This combination of enormous computational power with "magical brute force" can now solve very hard combinatorial problems, as well as prove the safety of systems such as railways. This performance boost resulted in the SAT revolution:3 encode problems arising from many interesting applications as SAT formulas, solve these formulas, and decode the solutions to obtain answers for the original problems. To solve the Boolean Pythagorean Triples Problem, for example, it suffices to show the existence of a subset of the natural numbers such that any partition of that subset into two parts leaves one part containing a Pythagorean triple.
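The encode-solve-decode pipeline can be illustrated at toy scale on the Pythagorean triples question itself. The sketch below is my own illustration, with a naive backtracking solver standing in for the industrial solvers behind the actual 2016 result (which showed {1,...,7824} can be partitioned triple-free, while {1,...,7825} cannot); all function names here are invented for the example.

```python
from itertools import combinations

def pythagorean_triples(n):
    """All (a, b, c) with a < b < c <= n and a^2 + b^2 = c^2."""
    return [(a, b, c) for a, b, c in combinations(range(1, n + 1), 3)
            if a * a + b * b == c * c]

def encode(n):
    # Encode: variable i is True if integer i goes in the first part.
    # Each triple must not be monochromatic, giving two clauses:
    # (a or b or c) and (not a or not b or not c).
    clauses = []
    for a, b, c in pythagorean_triples(n):
        clauses.append([a, b, c])
        clauses.append([-a, -b, -c])
    return clauses

def solve(clauses, n):
    """Toy backtracking SAT solver; returns {var: bool} or None."""
    assignment = {}

    def violated(clause):
        # A clause is violated only when fully assigned with every literal false.
        return (all(abs(lit) in assignment for lit in clause)
                and not any(assignment[abs(lit)] == (lit > 0) for lit in clause))

    def backtrack(var):
        if var > n:
            return True
        for value in (True, False):
            assignment[var] = value
            if not any(violated(cl) for cl in clauses) and backtrack(var + 1):
                return True
        del assignment[var]
        return False

    return dict(assignment) if backtrack(1) else None

# Decode: a satisfying assignment is a triple-free 2-partition of {1..20}.
solution = solve(encode(20), 20)
part_a = sorted(i for i, v in solution.items() if v)
```

For n = 20 a solution exists, so the decoded partition places, for instance, 3 and 4 in a different part than 5. At n = 7825 the same encoding is unsatisfiable, which is exactly the famous 200-terabyte proof.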
I recently published and presented a paper at CHI 2017 (the annual ACM Conference on Human Factors in Computing Systems, https://chi2017.acm.org). This paper won an Honorable Mention award at the conference. Here's a summary of the project. There is now tremendous momentum behind initiatives to teach computer programming to a broad audience, yet many of these efforts (for example, Code.org) focus on young learners. In contrast, I wanted to study the other end of the age spectrum: how older adults aged 60 and over are now learning to code. This population is already significant and also quickly growing as we all (hopefully!) continue to live longer in the coming decades. The United Nations estimates that by 2030, 25% of North Americans and Europeans will be over 60 years old, and 16% of the worldwide population will be over 60. There has been extensive research on how older adults consume technology, and some studies of how they curate and produce digital content such as blogs and personal photo ...
Particular attention has been devoted to the purported connection between a "Universal Turing Machine" (UTM), as introduced in Turing's article of 1936,27 and the design and implementation in the mid-1940s of the first stored-program computers, with particular emphasis on the respective proposals of John von Neumann for the EDVAC30 and of Turing himself for the ACE.26 In some recent accounts, von Neumann's and Turing's proposals (and the machines built on them) are unambiguously described as direct implementations of a UTM, as defined in 1936.6 This and other similar testimonies have been cited repeatedly as solid historical evidence but are invariably vague and unsupported.a Similar is the case with the anecdotes about the purported early influence of Turing's paper on von Neumann; see, for example, Hodges.21 This article is intended as a further contribution to the ongoing historical debates about the actual role of Turing in the history of the modern electronic computer and, in particular, the putative connection between the UTM and the stored-program computer. Of particular interest is the actual, direct influence of Turing's paper on von Neumann at the time when the latter wrote his famous "First Draft." Moreover, I claim the very idea of a modern computer, in the sense of either von Neumann's "First Draft" or of Turing's "Proposed Electronic Calculator," was in 1936 beyond the scope not only of Turing's capabilities but also of his concerns.
Indeed, rather than simply being used to replace contact center workers, artificial intelligence (AI)-based technologies, including machine learning, natural language processing, and even sentiment analysis, are being strategically deployed to improve the overall customer experience by providing functionality that would be too time-consuming or expensive to deliver manually. The company uses AI "bots" to handle routine tasks by utilizing natural language processing to interpret what customers are asking, search the business knowledge base system for an answer, and then turn this raw data into an intelligent, human-friendly response. Burgess highlights the power of machine learning and natural language processing to quickly process front-end requests, which often make up a significant share of call volume and labor costs. The sheer number of possible phrases, words, and interactions does make it more challenging to automate the customer service experience, though with machine learning technology that can review thousands or millions of interactions, organizations can tailor responses based on what those systems learn.
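The interpret/search/respond loop described above can be sketched at its most minimal. In this hypothetical illustration, a bag-of-words overlap score stands in for real natural language processing, and the knowledge-base entries and all function names are invented for the example, not taken from any actual product:

```python
import re

# Hypothetical knowledge base: topic phrases mapped to canned answers.
KNOWLEDGE_BASE = {
    "reset password": "To reset your password, use the 'Forgot password' "
                      "link on the sign-in page.",
    "store hours": "Our stores are open 9am to 6pm, Monday through Saturday.",
    "return policy": "Items can be returned within 30 days with a receipt.",
}

FALLBACK = "Let me connect you with a human agent who can help."

def tokenize(text):
    """Lowercase word set; a crude stand-in for NLP interpretation."""
    return set(re.findall(r"[a-z]+", text.lower()))

def answer(question, kb):
    # Interpret the request, search the knowledge base for the best match,
    # and return a human-friendly response (or escalate to a person).
    q_tokens = tokenize(question)
    best, best_score = None, 0
    for topic, response in kb.items():
        score = len(q_tokens & tokenize(topic))
        if score > best_score:
            best, best_score = response, score
    return best if best is not None else FALLBACK
```

A request like "How do I reset my password?" matches the password entry, while an out-of-scope question falls through to the human-escalation response; a production system would replace the overlap score with a trained intent classifier.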
"Computer graphics are pictures and movies created using computers." Geometric modeling, a subfield of computer graphics, has been motivated by industrial needs at the advent of computer-aided manufacturing, aiming at increased productivity via a completely digital workflow from design to production. The possibilities for digital shape design are almost unlimited and highly effective for the creation of "pictures and movies." I am not convinced here, since purely geometry-driven shape modeling creates bottlenecks when moving toward engineering and fabrication.
The 34-year-old assistant professor of physics at the Massachusetts Institute of Technology is the architect of a new theory called "dissipative adaptation," which has helped to explain how complex, life-like function can self-organize and emerge from simpler things, including inanimate matter. Wittgenstein argued that a word's meaning depends on its context, a context determined by the people who are using it. "In the beginning, God created the heavens and the earth ..." Here, the Hebrew word for "create" is bara, the word for "heavens" is shamayim, and the word for "earth" is aretz; but their true meanings, England says, only come into view through their context in the following verses.
The total number of neurons in the human brain falls in the same ballpark as the number of galaxies in the observable universe. Researchers regularly use a technique called power spectrum analysis to study the large-scale distribution of galaxies. Based on the latest analysis of the connectivity of the brain network, independent studies have concluded that the total memory capacity of the adult human brain should be around 2.5 petabytes, not far from the 1-10 petabyte range estimated for the cosmic web!
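For readers unfamiliar with the technique, power spectrum analysis measures how strongly a signal varies at each spatial scale. The following is a minimal one-dimensional sketch assuming NumPy; real galaxy surveys analyze three-dimensional density fields, and the signal here is synthetic:

```python
import numpy as np

# Build a toy 1-D "density field": one large-scale wave plus small-scale noise.
n = 1024
x = np.linspace(0, 1, n, endpoint=False)
rng = np.random.default_rng(0)
field = np.sin(2 * np.pi * 8 * x) + 0.1 * rng.standard_normal(n)

# Power spectrum: squared magnitude of the Fourier transform per frequency bin.
fft = np.fft.rfft(field)
power = np.abs(fft) ** 2 / n
freqs = np.fft.rfftfreq(n, d=1.0 / n)  # cycles per unit length

# The dominant scale appears as the spectrum's peak (skip the k=0 mean term).
peak = freqs[1:][np.argmax(power[1:])]
```

The peak lands at the injected wave's frequency of 8 cycles per unit length, showing how structure at a preferred scale stands out from noise; in cosmology the same idea reveals the characteristic clustering scales of galaxies.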
Clocks don't measure the flow of time, they measure intervals of time. And the other thing people contemplate: They think denying the flow of time is denying time asymmetry of the world. John Wheeler believed in and wrote about this in the 1950s--that there might be some pre-geometry, that would give rise to geometry just like atoms give rise to the continuum of elastic bodies--and people play around with that.
So, instead of curing the patient and removing the dangerous sources, we're actually creating an environment inside the patient that selects for organisms that are dangerous. But if we add the bacteria back into the mouse, literally with probiotic formulations into its gut, it'll start to hide back in the boxes like a normal mouse would. And we've started to demonstrate this in humans by taking formulations of bacteria that produce gamma-Aminobutyric acid [GABA], a neurotransmitter, which produces a sense of calm. So if we took the bacteria from an obese person and we put it into a mouse, the mouse would put on more calories than a mouse that got bacteria from a thin person for the same calorific intake, and for the same exercise regimen.