Collaborating Authors

sutherland


Supreme Court to rule on 'paedophile hunters' case

BBC News

A convicted paedophile who was snared by a vigilante group is to have his case examined at the UK Supreme Court. Judges at the UK's highest court will consider whether prosecutions based on the covert operations of "paedophile hunters" breach the right to privacy. Mark Sutherland, 37, believed he was communicating with a 13-year-old boy on the dating app Grindr. But in reality it was a 48-year-old man who was part of a group called Groom Resisters Scotland. The Supreme Court will hold a virtual hearing to consider the case and will issue its judgement later.


12 Artificial Intelligence (AI) Milestones: 3. Computer Graphics Give Birth To Big Data

#artificialintelligence

The explosion of breakthroughs, investments, and entrepreneurial activity around artificial intelligence over the last decade has been driven exclusively by deep learning, a sophisticated statistical analysis technique for finding hidden patterns in large quantities of data. A term coined in 1955--artificial intelligence--was applied (or mis-applied) to deep learning, a more advanced version of an approach to training computers to perform certain tasks--machine learning--a term coined in 1959. The recent success of deep learning is the result of the increased availability of lots of data (big data) and the advent of Graphics Processing Units (GPUs), significantly increasing the breadth and depth of the data used for training computers and reducing the time required for training deep learning algorithms. The technology that animated movies like "Toy Story" and enabled a variety of special effects is the ... focus of this year's Turing Award, the technology industry's version of the Nobel Prize. The term "big data" first appeared in computer science literature in an October 1997 article by Michael Cox and David Ellsworth, "Application-controlled demand paging for out-of-core visualization," published in the Proceedings of the IEEE 8th conference on Visualization.


2020: The year the office finds its voice?

#artificialintelligence

While voice-based digital assistants such as Amazon Alexa, Apple Siri and Google Assistant are becoming increasingly common at home – and smartphones and wearables can be used handsfree via speech – the use of voice in the workplace is just getting started. That's likely to change in 2020 and beyond. The promised payoff: more efficient employees, "smarter" voice-based assistants, easier ways of completing routine tasks, and a digital experience in the office that matches what's used at home. A survey by 451 Research in 2019 indicated that voice UIs and digital assistants are among the most disruptive technologies for enterprises (IoT and AI are the top two), with four in 10 respondents planning to adopt voice technology within 24 months. "I expect 2020 will be the year when voice user interfaces will become prevalent in the workplace," said Raúl Castañón-Martínez, a senior analyst at 451 Research.


Oracle's holistic AI/ML strategy resonates with businesses

#artificialintelligence

Oracle has a holistic AI/ML strategy, embedding intelligence at every layer of the cloud. According to Oracle, AI/ML is resonating more with businesses now that AI has come of age. Andrew Sutherland, SVP Technology, Oracle EMEA and JAPAC, said there is a lot of focus on AI. He said: "We were a multi-disciplinary team. We worked in speech recognition. We digitised data and gave it to computers. We did learning of patterns. That was the beginning of AI in the form of machine learning. The key was the amount of data. The more data, the more accurate was the speech recognition. You start constructing a lot of new rules. The machine learns from the data presented."


Adopt AI or lose competitive edge, warns Oracle chief

#artificialintelligence

Businesses across the GCC must embrace artificial intelligence technology or risk losing out to competitors, according to Oracle's senior vice president for the region, Andrew Sutherland. Speaking on the sidelines of the Oracle OpenWorld event at the Dubai World Trade Centre yesterday (Tuesday), Sutherland argued that the maxim 'if it ain't broke, don't fix it' no longer applies for companies due to the emergence of new efficiency-driving technologies, which will give first movers a distinct advantage over those who are slow to adopt. "I am strongly of the opinion now that artificial intelligence (AI) in general, and autonomous databases in particular, are an inevitability," he told reporters. "This is not just airlines or card processing companies. Even a modest-sized organisation will need to handle more and more data and will want to spend more and more time getting value from that data – not administering it. "If your neighbour is getting 70 per cent less cost in managing their data, how are you going to compete?"


AI bias: 9 questions leaders should ask

#artificialintelligence

As the use of artificial intelligence applications – and machine learning – grows within businesses, government, educational institutions, and other organizations, so does the likelihood of bias. Researchers have studied and found significant racial bias in facial recognition technology, for example, and in particular in the underlying algorithms. That alone is a massive problem. When you more broadly consider the role AI and ML will play in societal and business contexts, the problem of AI bias becomes seemingly limitless – one that IT leaders and others need to pay close attention to as they ramp up AI and ML implementations. AI bias often begins with people, which runs counter to the popular narrative that we'll all soon be controlled by AI robot overlords.


The VA Wants to Use DeepMind's AI to Prevent Kidney Disease

WIRED

The human body is frail and people end up in intensive care units for all kinds of reasons. Whatever brings them there, more than half of adults admitted to an ICU end up sharing the same potentially life-threatening condition: kidney damage known as acute kidney injury. The Veterans Administration thinks artificial intelligence could reduce the toll. In a project that drew on roughly 700,000 medical records from US veterans, the agency worked with Google parent Alphabet's DeepMind unit to create software that attempts to predict which patients are likely to develop AKI. The VA hopes to test whether those predictions can help doctors prevent people from developing the condition.


Learning deep kernels for exponential family densities

arXiv.org Machine Learning

The kernel exponential family is a rich class of distributions, which can be fit efficiently and with statistical guarantees by score matching. Being required to choose a priori a simple kernel such as the Gaussian, however, limits its practical applicability. We provide a scheme for learning a kernel parameterized by a deep network, which can find complex location-dependent local features of the data geometry. This gives a very rich class of density models, capable of fitting complex structures on moderate-dimensional problems. Compared to deep density models fit via maximum likelihood, our approach provides a complementary set of strengths and tradeoffs: in empirical studies, the former can yield higher likelihoods, whereas the latter gives better estimates of the gradient of the log density, the score, which describes the distribution's shape.
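
To make the fitting procedure concrete, here is a minimal sketch of score matching for a kernel exponential family model, written in JAX. It is not the paper's code: it uses a fixed Gaussian kernel rather than the learned deep kernel the abstract describes, the unnormalised log-density is a finite expansion f(x) = sum_i alpha_i k(z_i, x) over a small set of inducing points, and the bandwidth, inducing points, and plain gradient descent are illustrative assumptions.

import jax
import jax.numpy as jnp

def gaussian_kernel(x, z, bandwidth=1.0):
    # k(x, z) = exp(-||x - z||^2 / (2 * bandwidth^2)); a fixed kernel stands in
    # for the paper's learned deep kernel.
    return jnp.exp(-jnp.sum((x - z) ** 2) / (2.0 * bandwidth ** 2))

def log_density(alpha, z, x):
    # Unnormalised log-density f(x) = sum_i alpha_i * k(z_i, x).
    k_vals = jax.vmap(lambda zi: gaussian_kernel(x, zi))(z)
    return jnp.dot(alpha, k_vals)

def score_matching_loss(alpha, z, data):
    # Hyvarinen score matching objective:
    # E_x[ 0.5 * ||grad_x f(x)||^2 + trace(Hessian_x f(x)) ],
    # which sidesteps the intractable normalising constant.
    grad_f = jax.grad(log_density, argnums=2)
    hess_f = jax.hessian(log_density, argnums=2)

    def per_point(x):
        g = grad_f(alpha, z, x)
        h = hess_f(alpha, z, x)
        return 0.5 * jnp.sum(g ** 2) + jnp.trace(h)

    return jnp.mean(jax.vmap(per_point)(data))

# Toy usage: fit the coefficients on 2-D samples, using a subset of the data
# as inducing points (an illustrative choice, not the paper's).
key = jax.random.PRNGKey(0)
data = jax.random.normal(key, (200, 2))
z = data[:20]
alpha = jnp.zeros(z.shape[0])

loss_grad = jax.jit(jax.grad(score_matching_loss))
for _ in range(200):
    alpha = alpha - 0.1 * loss_grad(alpha, z, data)

In the paper's setting, the Gaussian kernel above would be replaced by a kernel computed on deep-network features, with the network weights trained alongside the expansion coefficients; the finite expansion used here is only one practical choice, not the paper's exact parameterisation.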


Using AI to Analyze Brain Cells May Advance Parkinson's Research

#artificialintelligence

Computers may be taught to identify features in nerve cells that have not been stained or undergone other damaging treatments for microscope use, an approach with the potential to revolutionize the way researchers study neurodegenerative diseases such as Parkinson's. "Researchers are now generating extraordinary amounts of data. For neuroscientists, this means that training machines to help analyze this information can help speed up our understanding of how the cells of the brain are put together and in applications related to drug development," Margaret Sutherland, PhD, said in a press release. Sutherland is program director at the National Institute of Neurological Disorders and Stroke (NINDS), which helps fund the research. The study "In silico labeling: Predicting fluorescent labels in unlabeled images" was published in the journal Cell.
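
As a rough illustration of what "predicting fluorescent labels in unlabeled images" means computationally, below is a minimal JAX sketch that frames the task as per-pixel regression: a tiny two-layer convolutional network maps an unstained transmitted-light image to a predicted fluorescence image and is trained against the measured stain with a squared error. The architecture, image shapes, and loss are illustrative assumptions; the published model is a far deeper network trained on paired microscopy data.

import jax
import jax.numpy as jnp

def init_params(key, hidden=8):
    # Two small conv layers; weights use the OIHW layout expected by jax.lax.conv.
    k1, k2 = jax.random.split(key)
    w1 = 0.1 * jax.random.normal(k1, (hidden, 1, 3, 3))
    w2 = 0.1 * jax.random.normal(k2, (1, hidden, 3, 3))
    return w1, w2

def predict(params, x):
    # x: (N, 1, H, W) transmitted-light images -> (N, 1, H, W) predicted fluorescence.
    w1, w2 = params
    h = jax.nn.relu(jax.lax.conv(x, w1, (1, 1), 'SAME'))
    return jax.lax.conv(h, w2, (1, 1), 'SAME')

def loss(params, x, y):
    # Mean squared error against the measured fluorescence image y.
    return jnp.mean((predict(params, x) - y) ** 2)

# Toy usage with random stand-in "images".
key = jax.random.PRNGKey(0)
kx, ky, kp = jax.random.split(key, 3)
x = jax.random.normal(kx, (4, 1, 64, 64))   # stand-ins for unstained images
y = jax.random.normal(ky, (4, 1, 64, 64))   # stand-ins for fluorescence labels
params = init_params(kp)
grads = jax.grad(loss)(params, x, y)
params = tuple(w - 0.01 * g for w, g in zip(params, grads))   # one SGD step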