
Collaborating Authors

Farquhar


Scott Farquhar thinks Australia should let AI train for free on creative content. He overlooks one key point

The Guardian

Farquhar, the Tech Council of Australia CEO, told ABC's 7.30 program on Tuesday: "all AI usage of mining or searching or going across data is probably illegal under Australian law and I think that hurts a lot of investment of these companies in Australia". Farquhar's claim overlooks that this is not a settled issue in the US, and that such a change could have devastating effects on creative industries. Farquhar's argument is that it is not theft of people's work unless the AI is used to "copy an artist directly", such as creating a song in their style. "I do think people would say that, hey, if people are going to sit down with a digital companion, an AI song creator and they collaboratively work with an AI to create something new to the world, that's probably fair use." Farquhar said the benefits of large language models outweigh the issues raised by AI models training on other people's work for free.


Scientists Develop New Algorithm to Spot AI 'Hallucinations'

TIME - Tech

An enduring problem with today's generative artificial intelligence (AI) tools, like ChatGPT, is that they often confidently assert false information. Computer scientists call this behavior "hallucination," and it's a key barrier to AI's usefulness. Hallucinations have led to some embarrassing public slip-ups. In February, Air Canada was forced by a tribunal to honor a discount that its customer-support chatbot had mistakenly offered to a passenger. In May, Google was forced to make changes to its new "AI overviews" search feature, after the bot told some users that it was safe to eat rocks. And last June, two lawyers were fined $5,000 by a U.S. judge after one of them admitted he had used ChatGPT to help write a court filing.


Farquhar

AAAI Conferences

Automatic extraction of heuristic estimates has been extremely fruitful in classical planning domains. We present a simple extension to the heuristic extraction process from the well-known HSP and FF systems which allows us to apply them to reward maximisation problems. These extensions involve computing an estimate of the maximal reward obtainable from a given state when ignoring delete lists, which is then used to guide the backward search in the FF system. The heuristics are evaluated in a simple robotic delivery task and shown to be effective in reducing the number of nodes evaluated. In this way we seek to apply recent advances in classical planning to a broader range of problems.
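The core idea above, estimating the maximal obtainable reward by ignoring delete lists, can be sketched as a simple fixed-point computation. The sketch below is illustrative only: the function name, the fact/action encoding, and the toy delivery domain are assumptions, not details from the paper. Under the delete relaxation, a fact once achieved is never removed, so repeatedly applying every action whose preconditions hold yields an optimistic upper bound on reachable reward.

```python
def relaxed_reward_estimate(state, actions, rewards):
    """Estimate the maximal reward reachable from `state` under the
    delete relaxation: actions add facts but never delete them.

    state:   iterable of initially true facts
    actions: list of (preconditions, add_effects) pairs of frozensets
    rewards: dict mapping a fact to the reward for achieving it
    """
    facts = set(state)
    total = sum(rewards.get(f, 0.0) for f in facts)
    changed = True
    while changed:                      # iterate to a fixed point
        changed = False
        for pre, add in actions:
            if pre <= facts:            # preconditions satisfied
                new = add - facts       # facts not yet achieved
                if new:
                    facts |= new
                    total += sum(rewards.get(f, 0.0) for f in new)
                    changed = True
    return total

# Toy delivery domain (hypothetical): pick up a package, then deliver it.
actions = [
    (frozenset({"at_depot"}), frozenset({"have_p1"})),
    (frozenset({"have_p1"}), frozenset({"delivered_p1"})),
]
rewards = {"delivered_p1": 10.0}
print(relaxed_reward_estimate({"at_depot"}, actions, rewards))  # 10.0
```

Because deletes are ignored, the estimate is admissible as an upper bound on the true maximal reward, which is what makes it usable for pruning during search.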


Letters

AI Magazine

At the risk of being scolded again for "employing universal truths and unarguable facts" in support of my position, I must point out that it is the responsibility of a scientist or engineer to document clearly the known limitations of any method he develops and publishes. In addition to truth in packaging, a clear and unblinking examination of the limitations of one's own work is an invaluable guide to further research. Akman observes, correctly, that QSIM is a purely mathematical formalism for expressing qualitative differential equation models of the world, and not a physical modeling methodology. Our research group has also been concerned with this limitation, so we have developed model-building methods which compile QDEs for QSIM to simulate, either from a component-connection description of a device (Franke and Dvorak 1989, 1990), or from a physical scenario description via qualitative views and processes (Crawford, Farquhar, and Kuipers 1990). These two model-building methods are important elements of the QSIM perspective on qualitative reasoning (Kuipers 1989).


Scientists think doomsday is on its way and governments won't be able to save us

#artificialintelligence

Catastrophic climate change, nuclear war and natural disasters such as super volcanoes and asteroids could also pose a deadly risk to mankind, researchers said. It may sound like the stuff of sci-fi films, but experts said these apocalyptic threats are more likely than many realise. The report Global Catastrophic Risks, compiled by a team from Oxford University, the Global Challenges Foundation and the Global Priorities Project, ranks dangers that could wipe out 10% or more of the human population. It warns that while most generations never experience a catastrophe, they are far from fanciful, as the bouts of plague and the 1918 Spanish flu that wiped out millions illustrated. Sebastian Farquhar, director at the Global Priorities Project, told the Press Association: "There are some things that are on the horizon, things that probably won't happen in any one year but could happen, which could completely reshape our world and do so in a really devastating and disastrous way. "History teaches us that many of these things are more likely than we intuitively think. "Many of these risks are changing and growing as technologies change and grow and reshape our world. But there are also things we can do about the risks."


Asteroids, robots and deadly viruses could kill millions, report warns

#artificialintelligence

The rise of robots and deadly viruses are among the threats that could wipe out swathes of humanity - but governments are failing to prepare properly for them, a new report warns. Catastrophic climate change, nuclear war and natural disasters such as super volcanoes and asteroids could also pose a deadly risk to mankind, researchers said. It may sound like the stuff of sci-fi films, but experts said these apocalyptic threats are more likely than many realise. The report Global Catastrophic Risks, compiled by a team from Oxford University, the Global Challenges Foundation and the Global Priorities Project, ranks dangers that could wipe out 10% or more of the human population. It warns that while most generations never experience a catastrophe, they are far from fanciful, as the bouts of plague and the 1918 Spanish flu that wiped out millions illustrated.




Asteroids, robots and deadly viruses could kill millions, report warns

#artificialintelligence

The rise of robots and deadly viruses are among the threats that could wipe out swathes of humanity - but governments are failing to prepare properly for them, a new report warns. Catastrophic climate change, nuclear war and natural disasters such as super volcanoes and asteroids could also pose a deadly risk to mankind, researchers said. It may sound like the stuff of sci-fi films, but experts said these apocalyptic threats are more likely than many realise. The report Global Catastrophic Risks, compiled by a team from Oxford University, the Global Challenges Foundation and the Global Priorities Project, ranks dangers that could wipe out 10% or more of the human population. It warns that while most generations never experience a catastrophe, they are far from fanciful, as the bouts of plague and the 1918 Spanish flu that wiped out millions illustrated.


Letters to the Editor

Kuipers, Benjamin J.

AI Magazine

Thus far, approximately 120 copies of QSIM have been distributed. The QSIM program is a research tool, not a product, so any commercial rights are retained, and I cannot warrant that it is free of bugs. Akman observes, correctly, that QSIM is a purely mathematical formalism for expressing qualitative differential equation models of the world, and not a physical modeling methodology (see Crawford, J.M.; Farquhar, A.; and Kuipers, B. 1990, QPC: A Compiler from Physical Models into Qualitative Differential Equations). University of Texas at Austin, Austin, Texas 78712.