For better or worse, there's a good chance your current love life owes something to automation. Even if you're just hooking up with the occasional Tinder fling (no judgment), you're still relying on Tinder's black-box algorithms to pick out that fling for you, then on more black-box algorithms to pick the best dingy bar to meet them at, then on still more to figure out what, exactly, should be your date-night lewk. If things get serious further down the line, you might turn to another black-box algorithm to plan your entire damn wedding. And if it turns out you got married for all the wrong reasons, there's yet another set of black boxes you can plug your details into to settle your divorce. Known as "amica," the service was rolled out yesterday by the Australian government as a way to let soon-to-be-exes "make parenting arrangements" and "divide their money and property" without the hassle of hiring a lawyer to do the heavy lifting.
Alethea AI, a synthetic media company, is piloting "privacy-preserving face skins," or digital masks that counter facial recognition algorithms and help users preserve privacy in pre-recorded videos. The move comes as companies such as IBM, Microsoft, and Amazon announced they would suspend the sale of their facial recognition technology to law enforcement agencies. "This is a new technique we developed in-house that wraps a face with our AI algorithms," said Alethea AI CEO Arif Khan. "Avatars are fun to play with and develop, but these 'masks/skins' are a different, more potent, animal to preserve privacy." The Los Angeles-based startup launched in 2019 with a focus on creating avatars for content creators, which the creators could license out for revenue. The idea comes as deepfakes, or manipulated media that can make someone appear to be doing or saying anything, become more accessible and widespread. According to a 2019 report from Deep Trace, a company that detects and monitors deepfakes, there were over 14,000 deepfakes online in 2019, and over 850 people were targeted by them. Alethea AI wants to let creators use their own synthetic media avatars for marketing purposes, in a sense letting people leverage deepfakes of themselves for money. Khan compares the proliferation of facial recognition data today to the Napster-style explosion in music piracy in the early 2000s.
Companies like Clearview AI have already harvested large amounts of data from people for facial recognition algorithms, then resold this data to security services without consent, with all the bias inherent in facial recognition algorithms, which are generally less accurate on women and people of color. Clearview AI has marketed itself to law enforcement and scraped billions of images from websites like Facebook, YouTube, and Venmo; the company is currently being sued for doing so. "We will get to a point where there needs to be an iTunes sort of layer, where your face and voice data somehow gets protected," said Khan. One part of that is creators licensing out their likeness for a fee. Crypto entrepreneur Alex Masmej was the first such avatar: for $99 you can hire the avatar to say 200 words of whatever you want, provided the real Masmej approves the text. Alethea AI has also partnered with software firm Oasis Labs, so that all content generated for Alethea AI's synthetic media marketplace will be verified using Oasis Labs' secure blockchain, akin to Twitter's "verified" blue check mark. "There are a lot of Black Mirror scenarios when we think of deepfakes, but if my personal approval is needed for my deepfakes and it's then time-stamped on a public blockchain for anyone to verify the videos that I actually want to release, that provides a protection that deepfakes are currently lacking," said Masmej. The privacy pilot takes this idea one step further, not only creating a deepfake to license out, but preventing companies or anyone else from grabbing your facial data from a recording. There are two parts to the privacy component. The first, currently being piloted, involves pre-recorded videos.
Users upload a video and identify where and what face skin they would like superimposed on their own; Alethea AI's algorithms then map the key points on the face and wrap the mask around that key point map. The video is then sent back to the client. Alethea AI also wants to enable face masking during real-time communications, such as over a Zoom call, but Khan says computing power doesn't quite allow that yet, though he hopes it should be possible within a year. Alethea AI piloted one example of the tech with Crypto AI Profit, a blockchain and AI influencer, who used it in a YouTube video. Deepfakes, voice spoofing, and other tech-enabled mimicry seem here to stay, but Khan is still optimistic that we're not yet at the point of no return when it comes to protecting ourselves. "I'm hopeful that the individual is accorded some sort of framework in this entire emerging landscape," said Khan. "It's going to be a very interesting ride. I don't think the battle is fully decided, although existing systems are oriented towards preserving larger, more corporate input."
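The key-point mapping step described above can be sketched in a few lines. This is an illustrative reconstruction, not Alethea AI's actual code: given landmarks detected on a face and the corresponding landmarks on a mask image, a least-squares fit recovers the affine transform that warps the mask onto the face.

```python
import numpy as np

# Illustrative sketch only -- not Alethea AI's implementation. Given
# matching key points detected on a face and the corresponding points
# on a mask ("face skin"), estimate the 2x3 affine transform that
# warps the mask onto the face via least squares.
def fit_affine(mask_pts, face_pts):
    """Solve A @ [x, y, 1] ~= [x', y'] for each key-point pair."""
    mask_pts = np.asarray(mask_pts, dtype=float)
    face_pts = np.asarray(face_pts, dtype=float)
    ones = np.ones((len(mask_pts), 1))
    M = np.hstack([mask_pts, ones])      # (n, 3) homogeneous coordinates
    coeffs, *_ = np.linalg.lstsq(M, face_pts, rcond=None)
    return coeffs.T                      # (2, 3) affine matrix

# Three corresponding landmarks: two eyes and the nose tip.
mask = [(0, 0), (10, 0), (5, 8)]
face = [(100, 50), (120, 50), (110, 66)]
A = fit_affine(mask, face)
warped = A @ np.array([5, 8, 1.0])       # map the mask's nose tip
print(warped)                            # lands on the face's nose tip
```

In a real pipeline the landmarks would come from a face detector and the warp would be applied per frame to every mask pixel; the least-squares fit above is only the geometric core of that step.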
Language is as important as expressions when reading emotion, a study has found -- meaning that being told someone looks 'grumpy' can make them seem grumpier. Researchers from Australia and the US asked volunteers to rate the emotions of people in either photographs or videos. The team found that when the participants were told that the subjects were feeling a specific emotion, this biased how they interpreted the expressions on show. The effect was most pronounced when dealing with angry, sad or scared faces -- as opposed to happy, disgusted, embarrassed, proud or surprised ones, the team found. 'The current studies demonstrate that language context alters the dimensional affective foundations that underlie our judgements of others' expressions,' the researchers wrote in their paper.
In a bid to protect beachgoers from the animals that live in the water they're entering, the New South Wales government will spend AU$8 million on a new strategy that includes a fleet of shark-spotting drones to patrol the state's coastline. Minister for Agriculture Adam Marshall is calling the strategy "shark management" and said it is based on five years of scientific research into shark behaviour and the most effective ways to protect beachgoers. "As a government, our number one priority is keeping people at our beaches safe and that's why we're rolling out a revamped strategy to reduce the risk of shark attacks," Marshall said on Wednesday. "Our world-leading research showed SMART drumlines and drones are the most effective detection and surveillance tools." The government, in partnership with Surf Life Saving NSW, will deploy new drones at 34 beaches across the state and 35 SMART drumlines in locations deemed high-risk along the state's north coast.
Artificial intelligence (AI) improved skin cancer diagnostic accuracy when used in collaboration with human clinical checks, an international study including University of Queensland researchers has found. The global team tested for the first time whether a 'real world', collaborative approach involving clinicians assisted by AI improved the accuracy of skin cancer clinical decision making. UQ's Professor Monika Janda said the highest diagnostic accuracy was achieved when crowd wisdom and AI predictions were combined, suggesting human-AI and crowd-AI collaborations were preferable to individual experts or AI alone. "This is important because AI decision support has slowly started to infiltrate healthcare settings, and yet few studies have tested its performance in real world settings or how clinicians interact with it," Professor Janda said. "Inexperienced evaluators gained the highest benefit from AI decision support, and expert evaluators confident in skin cancer diagnosis achieved modest or no benefit. These findings indicated a combined AI-human approach to skin cancer diagnosis may be the most relevant for clinicians in the future." Although AI diagnostic software has demonstrated expert-level accuracy in several image-based medical studies, researchers have remained unclear on whether its use improves clinical practice. "Our study found that good quality AI support was useful to clinicians but needed to be simple, concrete, and in accordance with a given task," Professor Janda said. "For clinicians of the future this means that AI-based screening and diagnosis might soon be available to support them on a daily basis."
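The "crowd plus AI" combination the study describes can be illustrated with a toy sketch. The numbers, weighting scheme, and function below are hypothetical illustrations, not the study's actual data or method: blend the fraction of clinicians voting "malignant" with an AI model's predicted probability, then threshold the result.

```python
# Toy illustration of combining crowd wisdom with an AI prediction --
# hypothetical numbers and weighting, not the study's actual method.
def combined_malignancy_score(crowd_votes, ai_probability, ai_weight=0.5):
    """Blend the fraction of clinicians voting 'malignant' (1) vs.
    'benign' (0) with the AI model's predicted probability."""
    crowd_probability = sum(crowd_votes) / len(crowd_votes)
    return ai_weight * ai_probability + (1 - ai_weight) * crowd_probability

# Seven clinicians vote; the AI model predicts 0.9 probability of malignancy.
votes = [1, 0, 1, 1, 0, 1, 0]
score = combined_malignancy_score(votes, ai_probability=0.9)
print(score >= 0.5)  # flag the lesion for follow-up if the blend crosses 0.5
```

Even this crude averaging shows the intuition the study points to: a confident AI prediction can tip a split crowd, while a unanimous crowd can override a shaky AI output.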
The Commonwealth Bank of Australia spent around $750 million and five years of work to convert its platform from COBOL to Java. Migrating an existing codebase to a modern or more efficient language like Java or C requires expertise in both the source and target languages, and is often costly. Usually, a transcompiler is deployed that converts source code from one high-level programming language (such as C or Python) to another. Transcompilers are primarily used for interoperability, and to port codebases written in an obsolete or deprecated language (e.g. COBOL). They typically rely on handcrafted rewrite rules, applied to the source code's abstract syntax tree.
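A handcrafted AST rewrite rule of the kind described above can be sketched with Python's built-in `ast` module. This is a deliberately tiny illustration of the technique, not how any production transcompiler works: it walks a Python arithmetic expression's syntax tree and emits a C-style expression, with one explicit rule rewriting `**` (which C lacks) into a `pow()` call.

```python
import ast

# Minimal illustrative transcompiler fragment: translate a Python
# arithmetic expression into a C-style expression string by walking
# the abstract syntax tree and applying handcrafted per-node rules.
C_OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/"}

def to_c(node):
    """Recursively rewrite a Python AST node as a C expression string."""
    if isinstance(node, ast.Expression):
        return to_c(node.body)
    if isinstance(node, ast.BinOp):
        if isinstance(node.op, ast.Pow):
            # C has no ** operator, so this rule rewrites it to pow()
            return f"pow({to_c(node.left)}, {to_c(node.right)})"
        return f"({to_c(node.left)} {C_OPS[type(node.op)]} {to_c(node.right)})"
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Constant):
        return repr(node.value)
    raise NotImplementedError(f"no rewrite rule for {type(node).__name__}")

tree = ast.parse("a * x ** 2 + b", mode="eval")
print(to_c(tree))  # prints ((a * pow(x, 2)) + b)
```

The brittleness is the point: every construct needs its own handwritten rule (note the `NotImplementedError` fallback), which is why real migrations of large legacy codebases remain so expensive.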
Global e-commerce giant Amazon has announced it will build what is touted to be its largest fulfilment centre in Australia. To be built at Kemps Creek in western Sydney, the new storage and distribution centre will measure almost 200,000 square metres, which according to Amazon is equivalent to the land size of Taronga Zoo or 22 rugby fields, and will be able to house up to 11 million items. Amazon said it would be the first centre in the southern hemisphere where the company's "latest robotics systems" are deployed. "The Amazon robotics fulfilment centre will more than double our operational footprint in Australia, enhance efficiency and safety for our associates while ultimately providing customers with wider selection and faster delivery," Amazon Australia director of operations Craig Fuller said. "We look forward to creating more than 1,500 jobs, the majority of which are permanent full-time jobs, with the opportunity to work alongside advanced robotics to deliver the ultimate in service for customers."
"We needed to invest in a building of that type of size and scale so we can deliver the convenience, in terms of delivery speed, to the Australian customer base." Mr Fuller said that while the centre would likely improve Amazon's delivery times for most of its Australian customers, the retailer would not know the material benefits of the centre until its completion in 2021. "When we launched in Australia there were lots of unknowns…we had to learn the nuances of the Australian marketplace," Fuller said. While Amazon operates around 30 robotic fulfilment centres internationally, this will be its first in Australia. The centre will still use humans to pick and pack items, but instead of workers walking to the shelves to pick the items, robotic units take the shelves to them, improving fulfilment time and reducing the amount of walking workers have to do. Amazon has faced criticism in the past over the treatment of its distribution centre workers, who have described working conditions at its Melbourne centre as a "hellscape" due to allegedly unrealistic performance targets. New South Wales Premier Gladys Berejiklian said the jobs created by the new centre come at a time the Australian economy …
The Australian government has introduced a new online service to help separating couples work out parenting arrangements, how to divide assets, and how agreements are recorded. Developed by the National Legal Aid (NLA), with AU$3 million in funding from the Australian government, the tool, known as Amica, uses so-called artificial intelligence technology to suggest how couples could split their assets by taking into account their circumstances, the kinds of agreements reached by couples in similar situations, and how courts have handled similar disputes. Attorney-General Christian Porter said Amica would enable users to negotiate and communicate online with their former partner at their own pace. "Amica will be a valuable tool to help many couples resolve their disputes between themselves and avoid court proceedings," he said. The federal government said Amica is suitable for couples whose relationship is "relatively amicable", and could also be used by separating parents to develop a parenting plan for their children.
Is your business ready to embrace artificial intelligence (AI)? At a recent event, Microsoft's head of AI urged business leaders to get their heads around the applications and ethics of the technology, saying that over the next decade, every company is going to become AI-led. Speaking at Australia's Future Briefing event in February 2020, Mitra Azizirad, corporate vice president of Microsoft AI, said that AI has the potential to be more of a game changer than any technological advance that has come before it; it is the next technology set to "run the world." "Software has transformed every industry; you hear it all the time -- every company became a software company," Azizirad said. "But that's really changing because AI is now a totally different way to create software."