
No free lunch in LLM watermarking: Trade-offs in watermarking design choices

AIHub

Advances in generative models have made it possible for AI-generated text, code, and images to mirror human-generated content in many applications. Watermarking, a technique that embeds information in the output of a model to verify its source, aims to mitigate the misuse of such AI-generated content. Current state-of-the-art watermarking schemes embed watermarks by slightly perturbing probabilities of the LLM's output tokens, which can be detected via statistical testing during verification. Unfortunately, our work shows that common design choices in LLM watermarking schemes make the resulting systems surprisingly susceptible to watermark removal or spoofing attacks--leading to fundamental trade-offs in robustness, utility, and usability. To navigate these trade-offs, we rigorously study a set of simple yet effective attacks on common watermarking systems and propose guidelines and defenses for LLM watermarking in practice. Here, we briefly introduce LLMs and LLM watermarks.
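To make that mechanism concrete, the sketch below illustrates one common instantiation of the idea, a "green-list" watermark: at each step a pseudorandom subset of the vocabulary is favored by slightly boosting its logits, and a verifier later counts how often the generated tokens land in that subset. This is an illustrative sketch only, not the exact scheme analyzed in the paper; the vocabulary size, the boost DELTA, the green-list fraction, and the helper names are assumptions chosen for demonstration.

    import hashlib
    import math
    import random

    VOCAB_SIZE = 50_000     # assumed vocabulary size (illustrative)
    GREEN_FRACTION = 0.5    # fraction of the vocabulary placed on the "green" list
    DELTA = 2.0             # logit boost applied to green-list tokens

    def green_list(prev_token):
        """Pseudo-randomly partition the vocabulary, seeded by the previous token."""
        seed = int(hashlib.sha256(str(prev_token).encode()).hexdigest(), 16)
        rng = random.Random(seed)
        return set(rng.sample(range(VOCAB_SIZE), int(VOCAB_SIZE * GREEN_FRACTION)))

    def watermark_logits(logits, prev_token):
        """Perturb the model's output distribution: slightly favor green-list tokens."""
        green = green_list(prev_token)
        return [l + DELTA if i in green else l for i, l in enumerate(logits)]

    def detect(token_ids):
        """Statistical test: z-score of how often tokens fall on their green list."""
        n = len(token_ids) - 1                  # number of scored positions
        hits = sum(cur in green_list(prev) for prev, cur in zip(token_ids, token_ids[1:]))
        mean = n * GREEN_FRACTION
        std = math.sqrt(n * GREEN_FRACTION * (1 - GREEN_FRACTION))
        return (hits - mean) / std              # large z => likely watermarked

In such a scheme, watermark_logits would be applied to the model's logits before sampling each token, and detect would be run over the token ids of a suspect text, with a large z-score (say, above 4) indicating the watermark is present.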


Introduction to Artificial Intelligence.

#artificialintelligence

Modern information technology traces its starting point to the year 1945, to the ENIAC, to the machines that defeated the Enigma code, and to the English mathematician and cryptanalyst Alan Turing, who posed the original question: "Can machines think?" Decades of development, starting from ENIAC, led to IBM's supercomputer Deep Blue. In 1985, Garry Kasparov became world chess champion; that same year, he played 32 opponents simultaneously. Deep Blue's predecessor, "Deep Thought", lost twice to world chess champion Garry Kasparov in 1989.


Becoming Universal

Communications of the ACM

How to fit the history of computing into a book that can be picked up without needing a forklift truck? That was my challenge in writing A New History of Modern Computing5 (hereafter the "new history") with Paul Ceruzzi. Now we had to tell the story of billions of computers, drawing on the work of an ever-expanding research community to help us find a story hiding among all the model numbers. I should be clear up front that this is an academic history of computing. Trade books are the ones that get stocked in bookstores, reviewed in newspapers, and so on. Their editors will select and rewrite manuscripts with a mass audience in mind. Trade publishers appear to have decided, perhaps correctly, that the only way to sell books on the history of computing is to stuff them with people and stories that readers already know about while nevertheless insisting they are tragically forgotten. Their books feature a lot of Charles Babbage, Alan Turing, and other "geniuses." They obsess over the question of the "first computer" and spend a lot of time in the 1940s laboriously weighing evidence for the primacy of one invention or another before awarding the crown.


Roots of 'Program' Revisited

Communications of the ACM

Today, it is a widely accepted thesis amongst historians and computer scientists that the modern notion of computer programs has its roots in the work of John von Neumann. This is symptomatic of a general tendency amongst academic computer scientists to search for the foundations of their field in logic and mathematics and, accordingly, to locate its historical roots there as well. This is a distorted view of what happened: at best, the modern computer was driven by concerns of applied mathematics and developed by a collective of people (mathematicians, engineers, physicists, (human) computers, and so forth). We will not repeat why, in computing, history is reshaped in the service of disciplinary identity.2,15 Instead, we will revisit the origins of the word "program" and argue for the need for a deeper historical understanding, not just for the sake of academic history, but for the sake of the field itself.


Introduction to Machine Learning for Beginners

#artificialintelligence

Machine Learning has been a buzzword for the past few years; the reasons for this include the huge amounts of data produced by applications, the increase in computing power over the same period, and the development of better algorithms. Machine Learning is used for everything from automating mundane tasks to offering intelligent insights, and industries in every sector try to benefit from it. You may already be using a device that utilizes it. But there are many more examples of ML in use. It was in the 1940s that ENIAC (Electronic Numerical Integrator and Computer), one of the first electronic computers, which had to be operated and programmed by hand, was invented.


Female Pioneers in Data Science You May Not Know

#artificialintelligence

Whilst many will be familiar with our Women in AI lists, which include those currently pushing boundaries in the present day, we thought we would put together a list of women who have been instrumental in the advancement of Computer Science and Data Science, providing the foundations for AI in the 21st Century. How many of the below are you familiar with? Dame Mary Cartwright was a University of Oxford graduate in mathematics at a time when women had only just been allowed to take degree classifications at the prestigious school. Mary then obtained a Yarrow fellowship at Cambridge University, later pursuing research in the theory of functions until her retirement in 1968 and becoming one of the first to study what would later become known as chaos theory. Cartwright had a distinguished career in analytic function theory and university administration, publishing over 100 papers on classical analysis, differential equations and related topological problems.


Untold History of AI: Invisible Women Programmed America's First Electronic Computer

IEEE Spectrum Robotics

The history of AI is often told as the story of machines getting smarter over time. What's lost is the human element in the narrative, how intelligent machines are designed, trained, and powered by human minds and bodies. In this six-part series, we explore that human history of AI--how innovators, thinkers, workers, and sometimes hucksters have created algorithms that can replicate human thought and behavior (or at least appear to). While it can be exciting to be swept up by the idea of super-intelligent computers that have no need for human input, the true history of smart machines shows that our AI is only as good as we are. On 14 February 1946, journalists gathered at the Moore School of Engineering at the University of Pennsylvania to witness a public demonstration of one of the world's first general-purpose electronic digital computers: the Electronic Numerical Integrator and Computer (ENIAC).


10 Women in Science and Tech Who Should Be Household Names

WIRED

It's International Women's Day, a day to celebrate the achievements of women around the world and throughout history. But the day is also about recognizing the hardships women face, and the continued urgency of the fight for gender equality. That is true of WIRED's world, too--the world of technology and science, of media and innovation. Though this magazine was co-founded by a woman, and women have been key figures in every part of scientific and technological progress, men's narratives still dominate. Men still hold more STEM jobs.


The Rise Of Machines That Think

#artificialintelligence

This week's milestones in the history of technology include the end of life of one of the first examples of artificial intelligence or "giant brains" and its 50th anniversary; patents for the transistor, xerography, and carbon paper; and the first solar-powered mobile phone. At 11:45pm, the power to the Electronic Numerical Integrator and Computer (ENIAC) is removed. For a few years after it started calculating in 1946, it was "the only fully electronic computer working in the U.S." Thomas Haigh, Mark Priestley and Crispin Rope write in ENIAC in Action: Making and Remaking the Modern Computer: "Since 1955, when ENIAC punched its last card, its prominence has only grown… ENIAC was as much symbol as machine, producing cultural meanings as well as numbers… In its own small way, ENIAC has returned frequently to the forefront of public awareness over the decades as a symbol of a variety of virtues and vices." Among other things, the ENIAC was a symbol of the computer as a giant brain (see October 8 entry below), giving rise to today's warnings that artificial intelligence "will be able to do everything better than us." Walter H. Brattain and John Bardeen are granted a patent for a three-electrode circuit element utilizing semiconductive materials, otherwise known as the transistor.


Our duty to connect technology and humanity – Rohan Rajiv – Medium

#artificialintelligence

"Man," here, stands for the collective human race. But, why not use the latin word for "Wise woman" or "Wise person?" There was a movement in the tech world a few years ago to use female pronouns more often. Here's another question -- why do we call a list of bad things a "blacklist?" And, why is the opposite a "whitelist?" Why does white represent good and black represent bad?