UnKE: Unstructured Knowledge Editing in Large Language Models
Jingcheng Deng, Zihao Wei, Liang Pang, Hanxing Ding, Huawei Shen, Xueqi Cheng
Recent knowledge editing methods have primarily focused on modifying structured knowledge in large language models, relying heavily on the assumption that structured knowledge is stored as key-value pairs locally in MLP layers or specific neurons. However, this task setting overlooks the fact that a significant portion of real-world knowledge is stored in an unstructured format, characterized by long-form content, noise, and a complex yet comprehensive nature. The "knowledge locating" and "term-driven optimization" techniques that previous methods (e.g., MEMIT) derive from this assumption are ill-suited to unstructured knowledge. To address these challenges, we propose a novel unstructured knowledge editing method, UnKE, which extends previous assumptions along both the layer dimension and the token dimension. First, in the layer dimension, we discard the "knowledge locating" step and treat the first few layers as the key, expanding knowledge storage across layers and breaking the "knowledge is stored locally" assumption. Next, in the token dimension, we replace "term-driven optimization" with "cause-driven optimization" over all input tokens, directly optimizing the last layer of the key generator so that it produces the required key vectors. By utilizing key-value pairs at the layer level, UnKE effectively represents and edits complex, comprehensive unstructured knowledge, leveraging the potential of both the MLP and attention layers. Results on a newly proposed unstructured knowledge editing dataset (UnKEBench) and on traditional structured datasets demonstrate that UnKE achieves remarkable performance, surpassing strong baselines.
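The central move in the abstract's layer-level view is optimizing the key generator's last layer so it emits a required key vector, rather than locating individual neurons. The toy sketch below (plain NumPy; the shapes, variable names, and simple least-squares objective are illustrative assumptions, not the paper's implementation) shows how direct parameter optimization can make a linear layer produce an arbitrary target key:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden size (illustrative)

# Hypothetical stand-in for the last layer of the "key generator"
# (the first few transformer layers, viewed together as producing the key).
W = rng.normal(size=(d, d)) / np.sqrt(d)

h = rng.normal(size=d)        # hidden state entering that last layer
k_star = rng.normal(size=d)   # required key vector for the edited knowledge

# "Cause-driven optimization" in miniature: adjust W directly so the
# generated key matches the required key, with no neuron-locating step.
lr = 1.0 / (h @ h)            # step size chosen so the residual shrinks to zero
for _ in range(10):
    residual = W @ h - k_star
    W -= lr * np.outer(residual, h)   # gradient of 0.5 * ||W h - k*||^2 w.r.t. W

assert np.allclose(W @ h, k_star)
```

In the real method the optimized object is a transformer layer and the value side is updated as well; this sketch only illustrates that the key side can be edited by gradient descent on one layer's parameters.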
An Interview with Dana Scott
ACM Fellow Dana Stewart Scott, recipient jointly with Michael Rabin of the 1976 ACM A.M. Turing Award for the concept of nondeterministic finite automata, has made seminal contributions spanning computing science, mathematics, philosophy, automata theory, modal logic, model theory, set theory, and the theory of programming languages. After receiving a B.A. in mathematics from the University of California, Berkeley, in 1954, and a Ph.D. from Princeton University in 1958, he held faculty positions at the University of Chicago, UC Berkeley, and at Stanford, Princeton, Oxford, and Carnegie Mellon Universities. He retired as University Professor from CMU in 2003. The distinguished theoretical computer scientist Gordon Plotkin conducted a series of four oral history interviews with Scott between November 2020 and February 2021. The interviews, whose transcripts and videos are online, cover primarily the period leading up to the 1976 Turing Award. Presented here is a condensed and highly edited version, which includes some additional post-interview material provided by Scott.

I was born in 1932 in Berkeley, CA, where I am now in retirement. We lived on a farm near Susanville when I started first grade in a one-room schoolhouse.
Christopher Strachey's Nineteen-Fifties Love Machine
Overwrought love letters began turning up on the notice board at the University of Manchester's computer lab in August, 1953. Dripping with lustful vocabulary, they were all variations on a basic syntactic template: "YOU ARE MY [adjective] [noun]." And the signatory was always the same: "M.U.C.," for the Manchester University computer, a Ferranti Mark 1, the world's first commercially available general-purpose computer. But the real author of the letters (in the first instance, anyway) was Christopher Strachey, a pioneering programmer. As he confessed in an article the following year, "There are many obvious imperfections in this scheme (indeed very little thought went into its devising), and the fact that the vocabulary was largely based on Roget's Thesaurus lends a very peculiar flavor to the results." For Strachey, though, the interesting thing was how a simple setup, using only about seventy base words, could produce a combinatorial explosion of results--on the order of three hundred billion different letters. The lovelorn user could run the program over and over until his fingers seized up, and never see the same letter twice. Strachey was something of an outlier, according to Martin Campbell-Kelly, a historian of computing at the University of Warwick. While scientists and mathematicians of the day typically used computers strictly for numerical calculations, like analyzing weapons trajectories or seeking prime factors of huge numbers, his fascination was with non-numerical computations--what soon became known as artificial intelligence. "Strachey grabbed hold of that much more than anybody else," Campbell-Kelly told me. The results were not always lovey-dovey. Besides training the Mark 1 to churn out billets-doux, he also taught it to play checkers ("draughts," in British parlance). If M.U.C.'s opponent made too many mistakes, it would get crotchety and print out a reprimand: "I refuse to waste any more time."
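Strachey's scheme can be sketched in a few lines: slot randomly chosen words from small lists into a fixed syntactic template. The template and word lists below are invented for illustration (his actual vocabulary was drawn from Roget's Thesaurus), but they show how a handful of words multiplies into an enormous space of distinct letters:

```python
import random

# Illustrative stand-ins for Strachey's word lists.
ADJECTIVES = ["darling", "precious", "beloved", "adorable", "sweet"]
NOUNS = ["heart", "desire", "fancy", "longing", "treasure"]

def love_letter(rng: random.Random) -> str:
    # Five clauses, each filled from the basic "YOU ARE MY ..." template.
    lines = [
        f"YOU ARE MY {rng.choice(ADJECTIVES)} {rng.choice(NOUNS)}."
        for _ in range(5)
    ]
    return "\n".join(lines) + "\nYOURS, M.U.C."

print(love_letter(random.Random(1953)))

# With 5 adjectives and 5 nouns there are 25 possible clauses, hence
# 25**5 (nearly ten million) distinct five-line letters; scale the lists
# up toward seventy words and the count runs into the hundreds of billions.
```

Different seeds yield different letters, which is the whole trick: a tiny template plus random choice exhausts human patience long before it repeats itself.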
1950s electro
The earliest known recording of music produced by a computer - a machine operated by Alan Turing, no less - has finally been made to sound exactly as it did 65 years ago. The performance is halting and the tone reedy. It starts with a few bars of the national anthem, then a burst of Baa Baa Black Sheep, followed by a truncated rendition of Glenn Miller's swing hit In The Mood. ("The machine's obviously not in the mood," an engineer can be heard remarking when it stops mid-way.) But the rudimentary audio track is a landmark - the first time that music played on a computer is known to have been recorded. It was captured by the BBC in the Autumn of 1951 during a visit to the University of Manchester, where the Ferranti Mark 1 - the world's first commercially available general purpose computer - was based.
Logic and learning: Turing's legacy
S. Muggleton
Turing's best-known work is concerned with whether universal machines can decide the truth value of arbitrary logic formulae. However, in this paper it is shown that there is a direct evolution in Turing's ideas from his earlier investigations of computability to his later interests in machine intelligence and machine learning. Turing realised that machines which could learn would be able to avoid some of the consequences of Gödel's results on incompleteness and undecidability. Machines which learned could continuously add new axioms to their repertoire. Inspired by a radio talk given by Turing in 1951, Christopher Strachey went on to implement the world's first machine learning program.