An artificial intelligence system is capable of being an "inventor" under Australian patent law, the Federal Court has ruled, in a decision that could have wider intellectual property implications. University of Surrey professor Ryan Abbott has launched more than a dozen patent applications across the globe, including in the UK, US, New Zealand and Australia, on behalf of US-based Dr Stephen Thaler. They seek to have Thaler's artificial intelligence device known as Dabus (a device for the autonomous bootstrapping of unified sentience) listed as the inventor. The applications claimed Dabus, which is made up of artificial neural networks, invented an emergency warning light and a type of food container, among other inventions. Several countries, including Australia, had previously rejected the applications, stating that a human must be named as the inventor.
In digital marketing, good, clean, insightful data is a key pillar on which a business stands to drive growth and profit. Clear and precise data-driven outcomes should be a priority for all marketers. Used in tandem with well-defined marketing and sales goals, and with various marketing tools and techniques, companies will find their lead-to-sale conversion process far less cumbersome and more rewarding. Clean data helps marketers identify detailed segments based on user attributes, past behaviours, interactions, and other relevant data points. That data can then be leveraged for highly targeted campaigns which drive marketing return on investment (ROI).
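As a minimal sketch of the segmentation step described above: once user records are cleaned, simple rules over attributes and past behaviour can sort users into campaign segments. The field names ("region", "visits", "last_purchase_days") and the thresholds are illustrative assumptions, not drawn from any particular marketing platform.

```python
# Hypothetical segmentation sketch: field names and thresholds are
# illustrative only, not from any specific marketing tool.
from collections import defaultdict

# A small sample of cleaned user records.
users = [
    {"id": 1, "region": "AU", "visits": 14, "last_purchase_days": 6},
    {"id": 2, "region": "AU", "visits": 2,  "last_purchase_days": 90},
    {"id": 3, "region": "NZ", "visits": 9,  "last_purchase_days": 12},
]

def segment(user):
    """Assign a campaign segment from behavioural attributes."""
    if user["visits"] >= 8 and user["last_purchase_days"] <= 30:
        return "engaged_recent_buyers"  # candidates for upsell campaigns
    if user["last_purchase_days"] > 60:
        return "lapsed"                 # candidates for win-back offers
    return "general"

# Group user IDs by segment for downstream campaign targeting.
segments = defaultdict(list)
for u in users:
    segments[segment(u)].append(u["id"])

print(dict(segments))
```

Real pipelines would compute these attributes from interaction logs and score segments against campaign ROI, but the shape of the task, attributes in, segments out, is the same.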
Representatives from Google have told an Australian Parliamentary committee looking into foreign interference that the country has not been the target of coordinated influence campaigns. "We've not seen the sort of coordinated foreign influence campaigns targeted at Australia that we have with other jurisdictions, including the United States," Google director of law enforcement and information security Richard Salgado said. "Some of the disinformation campaigns that originate outside Australia, even if not targeting Australia, may affect Australia as collateral ... but not as a target of the campaign. "We have found no instances of foreign coordinated influence campaigns targeting Australia." While acknowledging campaigns that reach Australia do exist, he reiterated they have not specifically targeted Australia. "Some of these campaigns are broad enough that the disinformation could be, sort of, divisive in any jurisdiction in which it is consumed, even if it's not targeting that jurisdiction," Salgado told the Select Committee on Foreign Interference Through Social Media. "Google services, YouTube in particular, which is where we have seen most of these kinds of campaigns run, isn't really very well designed for the purpose of targeting groups to create the division that some of the other platforms have suffered, so it isn't actually all that surprising that we haven't seen this on our services." Appearing alongside Salgado on Friday was Google Australia and New Zealand director of government affairs and public policy Lucinda Longcroft, who told the committee her organisation has been in close contact with the Australian government as it looks to prevent disinformation from emerging in the lead-up to the next federal election. Additionally, the pair said that Google undertakes a "constant tuning" of the artificial intelligence and machine learning technology it uses.
Google said it also constantly adjusts policies and strategies to avoid moments of surprise, in which it could find itself unable to handle a shift in attacker strategy or in the volume of attacks. Appearing earlier in the week before the Parliamentary Joint Committee on Corporations and Financial Services, Google VP of product membership and partnerships Diana Layfield said her company does not monetise data from Google Pay in Australia. "I suppose you could argue that there are non-transaction data aspects -- so people's personal profile information," she added. "If you sign up for an app, you have to have a Google account.
The Interactive Games and Entertainment Association (IGEA), the Australian Mobile Telecommunications Association (AMTA), and machinery manufacturer John Deere have once again pushed back on the proposal that any right-to-repair changes be introduced in Australia. In its response to the Productivity Commission's right-to-repair draft report, IGEA knocked back several of the recommendations that were put forward. These include enabling the Australian Competition and Consumer Commission (ACCC) to develop and publish estimates of the minimum expected durability for products, such as video game consoles and devices, and requiring manufacturers to include additional mandatory warranty text stating that entitlements to consumer guarantees under Australian Consumer Law (ACL) do not require consumers to use authorised repair services or spare parts. Throughout its latest submission [PDF], IGEA argued that making such changes would "cause confusion for consumers", pointing out for instance that additional text may "erroneously cause consumers to believe that their entitlements under the voluntary warranty (as opposed to the guarantees) do not require consumers to use authorised repair services or spare parts (which may not necessarily be true)". In providing additional information to the Productivity Commission, IGEA added that if manufacturers were required to make available repair information that could bypass Trusted Platform Modules, the information could be "weaponised" by malicious actors, particularly as there are no licensing or certification schemes for electronic repairers that would help manufacturers discern between legitimate and illegitimate repairers. IGEA also took the opportunity to defend video game console manufacturers, saying that it is in the "financial interest" of console makers that customers have "well-functioning and reliable devices that last for years".
The government is wary of over-regulating new technologies such as artificial intelligence and will resist making ethics standards and codes mandatory for Australian businesses, Digital Economy Minister Jane Hume says. In an address to the Committee for Economic Development of Australia (CEDA), Senator Hume said the federal government would play an enabling role in accelerating the growth of artificial intelligence, along with setting standards in terms of ethics. "AI, along with other digital technologies, will play an increasingly important role in our economy and society over the next decade and beyond," Senator Hume said. "As we continue to vault forward in this space, government has a pivotal role to play as an enabler, and as a standard setter – particularly in regards to ethics. "The government has a significant responsibility … to ensure that AI, as an industry as well as a technology, has every chance to flourish, making sure we have the right settings, skills and expertise in place to ensure Australia is a global forerunner." The May budget allocated $124 million to artificial intelligence initiatives, including $50 million for a National AI Intelligence Centre within CSIRO and $34 million in grants for AI projects addressing national challenges. The Coalition has also unveiled eight guiding AI ethics principles "designed to help achieve safer and more reliable outcomes for all Australians". These principles and other standards around AI are currently entirely voluntary for Australian businesses, and Senator Hume said the government will avoid making them mandatory. "I obviously would rather have a voluntary code where industry has the input to what's in the code.
Australia's Minister for Superannuation, Financial Services and the Digital Economy Jane Hume has assured that the country's AI ethics framework will remain voluntary for the foreseeable future. In her address during the virtual CEDA AI Innovation in Action event on Tuesday, Hume explained there were sufficient regulatory frameworks already in place and that another would be unnecessary. "We already have a very strong regulatory framework; we already have privacy laws, we already have consumer laws, we already have a data commissioner, we already have a privacy commissioner, we have a misconduct regulator. We have all those guardrails that already sit around the way we run our businesses," she told ZDNet. "AI is simply a technology that's being imposed upon an existing business. It's important that technology is being used to solve problems. The problems themselves haven't really changed, so our regulations certainly have to be flexible enough to accommodate technology changes … we want to make sure that there's nothing in regulations and legislation that prevents the advancement of technology. "But at the same time, building new regulations for technology, unless we can see a use case for it, is something that we would be reluctant to do, to over legislate and overprescribe." The federal government developed the national AI ethics framework in 2019, following the release of a discussion paper by Data61, the digital innovation arm of the Commonwealth Scientific and Industrial Research Organisation (CSIRO). The discussion paper highlighted the need for AI development in Australia to be wrapped in a sufficient framework, so that nothing is imposed on citizens without appropriate ethical consideration.
Making up the framework are eight ethical principles: human, social and environmental wellbeing; human-centred values in respect to human rights, diversity, and the autonomy of individuals; fairness; privacy protection and security of data; reliability and safety in accordance with the intended purpose of the AI systems; transparency and explainability; contestability; and accountability. Hume believes the principles have been designed in a way that makes them "kind of universal" and that industry would therefore be willing to adopt them voluntarily. "There's nothing in there that people would feel uncomfortable with, there's nothing that's too prescriptive … these are all things that we would expect.
Wildfires across the world have been increasing in frequency and severity over the last five years. A number of records were broken in 2020 by Australia's bushfires as well as wildfires in Spain and the western United States. As fires grow more devastating, firefighters also face greater health risks from battling blazes. To help address this problem, the Linux Foundation has announced that it will host Pyrrha -- a solution created by the team behind the AI platform Prometeo that uses artificial intelligence and the internet of things to guard the safety of firefighters. Prometeo won IBM's 2019 Call for Code Global Challenge after designing the platform, which monitors and acts on firefighter health and safety in real time and over the long term.
Backpacks to track bees, bushfire modelling and sensors to detect broken water pipes are some of the technologies being developed in Australia as part of an artificial intelligence boom. Digital Economy Minister Jane Hume says artificial intelligence has the capacity to improve the lives of all Australians. Senator Hume will tell a Committee for Economic Development of Australia event on Tuesday that the government has two roles to play in terms of AI: enabler and standards setter, especially in terms of ethics. Earlier this year the government announced a $1.2 billion digital economy strategy, which included an additional $124 million commitment to AI initiatives. The CSIRO estimates AI technology will contribute $22 trillion to the global economy by 2030.
The one glaring gap in the Commonwealth government's AI strategy and action plan is a process to develop a coordinated governance framework around the development, use and procurement of AI services within Commonwealth government agencies. This is where the NSW Government has taken a clear lead, setting out a mandatory customer service circular to which all NSW Government agencies must adhere. It offers practical guidance on adhering to principles, assessing risk, managing data, sourcing AI solutions, meeting legal obligations and more.
Earlier this year, Australia's Productivity Commission released an interim report into vulnerable supply chains, focusing on imports. A final report is now sitting with the government and is expected to focus on exports. The Productivity Commission's work examines the nature and source of risks to the effective functioning of the Australian economy and Australians' wellbeing associated with disruptions to global supply chains, and aims to identify any significant vulnerabilities and possible approaches to managing them. "Improvements in technology and trade liberalisation have made it easier and cheaper to source many goods and services from overseas. This has brought benefits from specialisation and economies of scale. It has also lifted the complexity of supply chains -- modern supply chains often rely on inputs from across the globe and can consist of thousands of firms," the report [PDF] said, citing the Toyota supply chain, which consists of over 2,100 suppliers, as an example.