Collaborating Authors

Australia Government


Artificial intelligence impact on society

#artificialintelligence

Three friends were having morning tea on a farm in the Northern Rivers region of New South Wales (NSW), Australia, when they noticed a drilling rig setting up on a neighbor's property on the opposite side of the valley. They had never heard of the coal seam gas (CSG) industry, nor had they previously considered activism. That drilling rig, however, was enough to push them into action. The group soon became instrumental in establishing the anti-CSG movement, a movement whose activism resulted in the NSW government suspending gas exploration licenses in the area in 2014. By 2015, the government had bought back a petroleum exploration license covering 500,000 hectares across the region.

Mining companies, like companies in many industries, have been struggling with the difference between having a legal license to operate and a moral one. The colloquial version of this is the distinction between what one could do and what one should do--just because something is technically possible and economically feasible doesn't mean that the people it affects will find it morally acceptable. Without the acceptance of the community, firms find themselves dealing with "never-ending demands" from "local troublemakers" hearing that "the company has done nothing for us"--all resulting in costs, financial and nonfinancial, that weigh projects down.

A company can have the best intentions, investing in (what it thought were) all the right things, and still experience opposition from within the community. It may work to understand local mores and invest in the community's social infrastructure--improving access to health care and education, upgrading roads and electricity services, and fostering economic activity in the region resulting in bustling local businesses and a healthy employment market--to no avail. Without the community's acceptance, without a moral license, the mining companies in NSW found themselves struggling.

This moral license is commonly called a social license, a phrase coined in the '90s, and represents the ongoing acceptance and approval of a mining development by a local community. Since then, it has become increasingly recognized within the mining industry that firms must work with local communities to obtain, and then maintain, a social license to operate (SLO). The concept of a social license to operate has developed over time and been adopted by a range of industries that affect the physical environment they operate in, such as logging or pulp and paper mills. What has any of this to do with artificial intelligence (AI)?


Lifeguards with drones keep humans and sharks safe

#artificialintelligence

A teenager in New South Wales recently died after a shark bite, adding to four other unprovoked shark-related deaths this year. These tragic events send shockwaves through the community and re-ignite our fear of sharks. They also fuel the debate around the best way to keep people safe in the water while minimising impacts on marine wildlife. This was the aim of a five-year trial of shark-mitigation technology--the Shark Management Strategy--which finished recently. The NSW government created this initiative in response to an unprecedented spike in shark bites in 2015, particularly on the north coast of NSW.


'Booyaaa': Australian Federal Police use of Clearview AI detailed

ZDNet

Earlier this year, the Australian Federal Police (AFP) admitted to using a facial recognition tool to help counter child exploitation, despite not having an appropriate legislative framework in place. The tool was Clearview AI, a controversial New York-based startup that has scraped social media networks for people's photos and created one of the biggest facial recognition databases in the world. It provides facial recognition software, marketed primarily at law enforcement. The AFP previously said that while it did not adopt Clearview AI as an enterprise product and had not entered into any formal procurement arrangements with the company, it did use a trial version. Documents published by the AFP under the Freedom of Information Act 1982 confirmed that the AFP-led Australian Centre to Counter Child Exploitation (ACCCE) registered for a free trial of the Clearview AI facial recognition tool and conducted a pilot of the system from 2 November 2019 to 22 January 2020.


Amazon to create thousands of jobs at robotic mega warehouse – IAM Network

#artificialintelligence

"We've got the roads, the rail and the airport to keep growing this nation, keep getting those products out of the warehouses and into people's shops and into people's homes," he said. Amazon's new hub is a "boost for this community," said NSW Premier Gladys Berejiklian. "People won't need to travel those longer distances to get the best jobs available. They'll be able to live and work near their communities, which is exactly what we want," Berejiklian said. Other retailers in Australia are gearing up for an increase in automation in their own logistics.


Australian Authorities Want an AI To Settle Your Divorce

#artificialintelligence

For better or worse, there's a good chance your current love life owes something to automation. Even if you're just hooking up with the occasional Tinder fling (which if you are, no judgment), you're still turning to Tinder's black-box algorithms to pick out that fling for you, before turning to more black-box algorithms to pick out the best dingy bar to meet them at, before turning to still more black-box algorithms to figure out what, exactly, should be your date night lewk. If things get serious further down the line, you might turn to another black-box algorithm to plan your entire damn wedding for you. And if it turns out you got married for all the wrong reasons, there's another set of black boxes you can plug your details into to settle the terms of your divorce. Known as "amica," the service was rolled out yesterday by the Australian government as a way to let soon-to-be-exes "make parenting arrangements" and "divide their money and property" without having to go through the hassle of hiring a lawyer to do the heavy lifting.


Australian government sinks AU$19 million into AI health research projects

ZDNet

The Australian government has announced it will invest AU$19 million over three years into artificial intelligence-based health research projects designed to prevent, diagnose, and treat a range of health conditions. There are five projects in total that will receive funding as part of this announcement. The Centre for Eye Research Australia and the University of New South Wales (UNSW) will each receive nearly AU$5 million for their research projects. The Centre for Eye Research Australia has developed an AI system to detect eye and cardiovascular diseases, while UNSW is focused on using AI to understand and improve the treatment of mental health, including stress, anxiety, and depression. Another AU$7 million is being put towards two projects developed by the University of Sydney (USyd).


Australia's National Intelligence Office seeks 'smart' satellites

ZDNet

The Australian government, through the Office of National Intelligence (ONI), is hoping to progress research on "smart" satellites. In a request for tender (RFT), ONI is seeking a provider of research and engineering services in order to develop, build, test, launch, and operate a prototype or proof-of-concept smart satellite to demonstrate the application of miniaturised satellite systems with on-board machine learning (ML) and artificial intelligence (AI) applications. ONI formally came into being on 20 December 2018 following the passage of the Office of National Intelligence Act 2018 a month prior. The National Intelligence Community (NIC) encompasses 10 Australian security and intelligence agencies: ONI, the Australian Signals Directorate, the Australian Geospatial-Intelligence Organisation, the Australian Secret Intelligence Service, the Australian Security Intelligence Organisation and the Defence Intelligence Organisation, as well as the Australian Criminal Intelligence Commission and the intelligence functions of the Australian Federal Police, the Australian Transaction Reports and Analysis Centre, and the Department of Home Affairs. ONI is responsible for enterprise-level management of the NIC, aiming to provide a single point of accountability to the prime minister and National Security Committee of Cabinet.


South Australia Health introduces bot to answer COVID-19 queries

ZDNet

The South Australian government has rolled out a chatbot, nicknamed Zoe, to help answer COVID-19 queries. The virtual agent, developed by Adelaide-based tech firm Clevertar, has initially been designed to provide users with answers and relevant links to further information. Currently, it's able to answer a set of pre-defined questions. "Zoe was specifically implemented in response to COVID-19 to help reduce the extra pressure on South Australia's hospital switchboards and the 000 line, which experienced a surge in demand as a result of COVID-19 enquiries," a SA Health spokesperson told ZDNet. "The primary objectives were to provide the public with an additional, reliable source of COVID-19 information, and ultimately allow our operational services to focus on delivering health and emergency services."
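
As a rough illustration of the "set of pre-defined questions" pattern described above, here is a minimal sketch that matches a user query against canned question-and-answer pairs by word overlap. It is a generic toy, not Clevertar's or SA Health's implementation; the questions, answers, fallback message, and matching rule are all hypothetical.

```python
# Toy FAQ matcher for a pre-defined-question chatbot (hypothetical content;
# not SA Health's or Clevertar's actual questions or answers).
FAQ = {
    "what are the symptoms of covid-19":
        "Common symptoms include fever, cough and fatigue. See the SA Health website for the full list.",
    "where can i get tested for covid-19":
        "Testing clinic locations are listed on the SA Health website.",
    "do i need to self-isolate":
        "Self-isolation requirements depend on your exposure and travel history.",
}

FALLBACK = "Sorry, I can't answer that yet. Please call the SA COVID-19 Information Line."

def answer(query: str) -> str:
    """Return the canned answer whose question shares the most words with the query."""
    query_words = set(query.lower().split())
    best = max(FAQ, key=lambda q: len(query_words & set(q.split())))
    # Fall back when the query shares no words with any pre-defined question.
    if not query_words & set(best.split()):
        return FALLBACK
    return FAQ[best]

print(answer("Where do I get tested?"))  # -> the testing-clinic answer
```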


Airlines take no chances with our safety. And neither should artificial intelligence

#artificialintelligence

You'd think flying in a plane would be more dangerous than driving a car. In reality, it's much safer, partly because the aviation industry is heavily regulated. Airlines must stick to strict standards for safety, testing, training, policies and procedures, auditing and oversight. And when things do go wrong, we investigate and attempt to rectify the issue to improve safety in the future. Other industries where things can go very badly wrong, such as pharmaceuticals and medical devices, are also heavily regulated.


Public Authorities as Defendants: Using Bayesian Networks to determine the Likelihood of Success for Negligence claims in the wake of Oakden

arXiv.org Artificial Intelligence

Several countries are currently investigating issues of neglect, poor quality care and abuse in the aged care sector. In most cases it is the State that licenses and monitors aged care providers, which frequently introduces a serious conflict of interest because the State also operates many of the facilities where our most vulnerable people are cared for. Where issues are raised with the standard of care being provided, the State is seen by many as a deep-pockets defendant and becomes the target of high-value lawsuits. This paper draws on cases and circumstances from one jurisdiction based on the English legal tradition, Australia, and proposes a Bayesian solution capable of determining the probability of success for citizen plaintiffs who bring negligence claims against a public authority defendant. Use of a Bayesian network trained on case audit data shows that even when the plaintiff's case meets all requirements for successful negligence litigation, success is far from assured. Only in around one-fifth of these cases does the plaintiff succeed against a public authority as defendant.
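
To make the approach concrete, here is a minimal sketch of how a Bayesian network of this kind can turn the elements of a negligence claim into a probability of success. The node names, network structure, and every probability below are illustrative assumptions, not the structure or parameters the authors learned from case audit data, and the sketch uses plain Python enumeration rather than a Bayesian network library.

```python
from itertools import product

# Binary nodes: 1 = the element is made out / the claim succeeds, 0 = it is not.
# Hypothetical priors for each element of a negligence claim.
P_DUTY      = {1: 0.7, 0: 0.3}   # a duty of care is owed
P_BREACH    = {1: 0.6, 0: 0.4}   # the duty was breached
P_CAUSATION = {1: 0.5, 0: 0.5}   # the breach caused the harm

# P(Success = 1 | duty, breach, causation). Even with every element satisfied,
# the value is set near 0.2 to echo the "around one-fifth" success rate the
# abstract reports; all numbers here are illustrative, not learned parameters.
P_SUCCESS = {
    (1, 1, 1): 0.20,
    (1, 1, 0): 0.05,
    (1, 0, 1): 0.03,
    (1, 0, 0): 0.01,
    (0, 1, 1): 0.02,
    (0, 1, 0): 0.01,
    (0, 0, 1): 0.01,
    (0, 0, 0): 0.00,
}

def p_success(duty=None, breach=None, causation=None):
    """P(Success = 1 | evidence), computed by enumerating the joint distribution."""
    evidence = {"duty": duty, "breach": breach, "causation": causation}
    numerator = denominator = 0.0
    for d, b, c in product((0, 1), repeat=3):
        state = {"duty": d, "breach": b, "causation": c}
        # Skip states inconsistent with the observed evidence.
        if any(v is not None and state[k] != v for k, v in evidence.items()):
            continue
        joint = P_DUTY[d] * P_BREACH[b] * P_CAUSATION[c]
        denominator += joint
        numerator += joint * P_SUCCESS[(d, b, c)]
    return numerator / denominator

if __name__ == "__main__":
    # Every element of the claim is made out, yet success is still only ~20%.
    print(f"P(success | all elements proven) = {p_success(1, 1, 1):.2f}")
    # No evidence observed: the overall prior probability of success.
    print(f"P(success)                       = {p_success():.2f}")
```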