Brent is a PhD Candidate in the Department of Computer Science and a member of the Φ Lab and the Insight Lab. He was previously the instructor for MMASc 9251A: Professional Computing for Applied Scientists and is presently the Teaching Assistant for Unstructured Data. As of March 1st, 2019, Brent will also be a Mitacs Accelerate Intern, working with the Parkwood Institute and IBM on improving mental health resources for Canadian Veterans. His research interests are two-fold.
In May, the United Nations released a troubling report, arguing that female-sounding voices for AI assistants such as Apple's Siri and Amazon's Alexa perpetuate gender biases and encourage users to be sexist. Now, Google has come out to explain why it chose to give its Assistant a female-sounding voice -- and the search giant says it has nothing to do with gender biases and everything to do with the available technology. According to Google Assistant product manager Brant Ward, Google initially planned to launch Assistant with a male voice. The problem, he told Business Insider, was that the audio produced by text-to-speech systems was easier to understand if delivered in a higher-pitched, female-sounding voice. "At the time, the guidance was the technology performed better for a female voice," Ward said.
Four years ago to the day, I published an article on the challenges of ethics in AI, discussing the ethical blindness of algorithms. Since then we've seen a number of challenges that come down to the same issue: if the data is biased, the analysis is biased. The problem here is the same as was found in European data about car insurance: the reality was that women were a better risk than men. In that case the bias was driven by something individuals have control over, their driving style, which is partly why there has been a rise in insurance companies that now price based on driving style. However, the challenge is much greater, and more insidious, when the data is inherently biased against a given group. New Scientist covered five examples of AI being biased, and a large part of the cause is the system itself being biased, which produces biased data, which in turn means any AI trained on that data will be inherently biased.
Mayo Clinic has entered into a 10-year strategic partnership with Google to use the tech giant's cloud platform to accelerate innovation through digital technologies. Terms of the deal were not disclosed. The Rochester, Minn.-based hospital said it selected Google Cloud to be the cornerstone of its "digital transformation." As part of the collaboration, Mayo Clinic will store patient data in the cloud and use advanced cloud computing, data analytics, machine learning, and artificial intelligence to advance the diagnosis and treatment of disease, hospital executives said in a press release. "Data-driven medical innovation is growing exponentially, and our partnership with Google will help us lead the digital transformation in health care," Gianrico Farrugia, M.D., president and CEO of Mayo Clinic, said in a statement.
This is the web version of Data Sheet, Fortune's daily newsletter on the top tech news. Sign up here to get it delivered to your inbox. The number of transistors packed onto a modern chip inside your phone or PC runs into the billions, but it's still sometimes amazing to comprehend the computing power you can easily hold in the palm of your hand. When I met Intel vice presidents Gadi Singer and Carey Kloss on Wednesday, they showed me a new circuit board the company has created for speeding up artificial intelligence apps. The board is the size of an SSD drive, made to plug into a standard PC or server.
A new technology using artificial intelligence detects depressive language in social media posts more accurately than current systems and uses less data to do it. The technology, which was presented during the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, is the first of its kind to show that, to more accurately detect depressive language, small, high-quality data sets can be applied to deep learning, a commonly used AI approach that is typically data intensive. Previous psycholinguistic research has shown that the words we use in interaction with others on a daily basis are a good indicator of our mental and emotional state. Past attempts to apply deep learning techniques to detect and monitor depression in social media posts have been shown to be tedious and expensive, explained Nawshad Farruque, a University of Alberta PhD student in computing science who is leading the new study. He explained that a Twitter post saying that somebody is depressed because Netflix is down isn't really expressing depression, so someone would need to "explain" this to the algorithm.
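The labeling step Farruque describes — "explaining" to the algorithm that "depressed because Netflix is down" is not genuine depressive language — can be illustrated with a toy word-count (Naive Bayes) classifier. This is only a sketch of the general idea of learning from a small set of hand-labeled posts; it is not the study's deep-learning model, and all posts and labels below are invented for illustration.

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label) with label 1 = depressive, 0 = colloquial.
    Returns per-class word counts, per-class document totals, and the vocabulary."""
    counts = {0: Counter(), 1: Counter()}
    totals = Counter()
    for text, label in docs:
        counts[label].update(text.lower().split())
        totals[label] += 1
    vocab = {w for c in counts.values() for w in c}
    return counts, totals, vocab

def predict(text, counts, totals, vocab):
    """Pick the class with the highest log prior + add-one smoothed log likelihood."""
    n = sum(totals.values())
    scores = {}
    for label in counts:
        score = math.log(totals[label] / n)  # class prior
        denom = sum(counts[label].values()) + len(vocab)  # Laplace smoothing
        for w in text.lower().split():
            score += math.log((counts[label][w] + 1) / denom)
        scores[label] = score
    return max(scores, key=scores.get)

# A tiny hand-labeled dataset: the labels encode the distinction a human
# annotator would "explain" to the algorithm.
docs = [
    ("i feel hopeless and empty every day", 1),
    ("i can't get out of bed anymore", 1),
    ("nothing matters to me these days", 1),
    ("so depressed netflix is down tonight", 0),
    ("depressed that my team lost again", 0),
    ("ugh depressed the coffee machine broke", 0),
]
counts, totals, vocab = train(docs)
print(predict("depressed because netflix is down", counts, totals, vocab))  # 0: colloquial
print(predict("i feel so hopeless", counts, totals, vocab))                 # 1: depressive
```

The point is that the word "depressed" alone is not the signal: once a few colloquial uses are labeled, the model weights the surrounding context instead — which is why a small, carefully labeled dataset can outperform a large, noisy one.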
The News: Oracle is pushing the envelope on what NVIDIA GPUs can do in the cloud. Find out how, next week at Oracle OpenWorld and Code One in San Francisco, where NVIDIA and Oracle will showcase their growing collaboration to bring AI and GPU-accelerated applications to the enterprise. Integrating CUDA-X libraries into GraalVM applications, enhancing conversational AI with Oracle Digital Assistant, and accelerating data science pipelines through the Oracle Cloud Infrastructure Data Science service are a few examples of how enterprise customers and developers worldwide will benefit from GPU-accelerated computing. The companies first teamed up by bringing bare metal GPUs to the public cloud through Oracle Cloud Infrastructure, fueling innovation across a broad range of industries. Engineers, developers, data scientists and researchers are using these instances to power visualization, AI/machine learning, big data, database and HPC workloads.
Introducing artificial intelligence (AI) to sales training and coaching can provide a more individualized learning experience that can scale across the organization, according to Gartner, Inc. Creating a high-performing sales organization is difficult with traditional training and coaching technology, as coaching content and recommendations are generally delivered by role to the sales organization and do not account for individuals' learning styles. The use of complex machine learning algorithms and AI can guide reps and sales managers with recommendations for training and coaching based on their learning style. These technologies utilize branching, a method to guide an individual's learning through a module based on responses, as well as adaptive learning, where the system directs the learner to appropriate training or coaching based on their interaction with the system. In a Gartner survey of organizations that are piloting or deploying AI technologies, 61% of respondents reported the resulting value delivered to the organization as significant. When asked how AI will improve their sales organization, respondents cited increased efficiency, cost reduction and improved revenue streams.