


Full-Stack Machine Learning Engineer

#artificialintelligence

The Telehealth & Program Technology Team is responsible for technology support for Orbis programs, which includes the telehealth and distance-learning platform Cybersight. The team is also responsible for guiding and supporting partner organizations' efforts in healthcare technology management. The team focuses on integrating Cybersight with Orbis programs across the world and works to expand Orbis's clinical, educational, and technology initiatives while developing long-term relationships with physicians, telehealth-partnering hospitals, educators, and IT & communications specialists. The team also closely monitors trends and innovations in the healthcare technology and public health space that should be nurtured in order to revolutionize practice in the low-resource settings we serve. As an essential member of the Telehealth & Program Technology Team within the global program department, the Full-Stack Machine Learning Engineer will work with the Principal AI Architect in a global effort to develop and implement AI-based systems to detect sight-threatening conditions and support eye health professionals around the world in diagnosing and treating these conditions.


Research lab opens in India focused on deep learning

#artificialintelligence

Medical equipment manufacturer Wipro GE Healthcare has partnered with the Indian Institute of Science (IISc) to open a research lab focused on deep learning. The lab is located at the Department of Computational and Data Sciences (CDS) in Bangalore. Work will also be done on digital interfaces to produce sophisticated diagnostic and medical image reconstruction techniques. The research unit will initially involve around fifty students and three faculty members from IISc. They will work closely with clinicians as well as Wipro GE Healthcare to integrate computational models into clinical workflows and help doctors improve patient outcomes.


Dispelling myths around AI within cyber security - Information Age

#artificialintelligence

The perception that artificial intelligence (AI) can serve as a "silver bullet" against a developing threat landscape is rooted in the ongoing quest for technologies that can automate threat detection and response without the need for human intervention. In a report on AI and cyber security last year, Capgemini found that 69% of enterprise executives surveyed felt AI is essential for responding to cyber threats. But despite its promise, AI within cyber security should be approached with a discerning eye. Often, when firms discuss the promise of AI as well as their current capabilities, the reality is that they are practicing machine learning. Generally viewed as a subset of AI, machine learning algorithms build a mathematical model from sample data to detect behavioural patterns that identify variants of attacks, and to make predictions or decisions without being explicitly programmed to do so. In the field of cyber security, machine learning techniques are most applicable in detect-and-respond technologies and are utilised in SIEM, EDR, XDR, and sandboxing solutions.
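As a rough illustration of the kind of machine learning such detect-and-respond tools rely on, here is a minimal sketch that fits an unsupervised anomaly detector to a few made-up connection features. The feature names, values, and thresholds are invented for the example and do not correspond to any particular SIEM, EDR, or XDR product.

# Illustrative only: unsupervised anomaly detection over made-up network-flow
# features, loosely mirroring how detect-and-respond tools flag unusual behaviour.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical per-connection features: [bytes_sent, duration_s, failed_logins]
normal_traffic = rng.normal(loc=[5_000, 30, 0], scale=[1_500, 10, 0.5], size=(500, 3))
suspicious = np.array([[250_000.0, 2.0, 40.0]])  # exfiltration-like burst with many failed logins

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_traffic)

print(model.predict(suspicious))          # -1 means flagged as anomalous
print(model.predict(normal_traffic[:3]))  # mostly 1, i.e. treated as normal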


Researchers say they can predict epileptic seizures an hour in advance

Engadget

Researchers from Ben-Gurion University of the Negev in Israel have developed a wearable electroencephalogram (EEG) device they claim can predict epileptic seizures up to an hour before onset. The device, called Epiness, uses machine learning algorithms to analyze brain activity and detect potential seizures, and it can send a warning to a connected smartphone. Other devices on the market can detect seizures in real time but can't give advance warning. However, researchers from the University of Louisiana at Lafayette unveiled an AI prediction model of their own last year; it was said to offer a similar level of accuracy to Epiness, and it can also alert patients up to an hour before a seizure takes hold.
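The article does not describe Epiness's actual algorithms, but a common baseline for EEG-based seizure prediction is to compute band-power features over short signal windows and train an ordinary classifier to separate pre-seizure from normal activity. The sketch below is a generic illustration of that idea on synthetic signals, not a reconstruction of the Epiness model.

# Generic EEG-window classification sketch (not Epiness's method): band-power
# features from short windows plus a logistic-regression classifier.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
fs = 256  # assumed sampling rate in Hz

def band_power(window, lo, hi):
    freqs, psd = welch(window, fs=fs, nperseg=256)
    return psd[(freqs >= lo) & (freqs < hi)].mean()

def features(window):
    # Mean power in the classic delta, theta, alpha and beta bands.
    return [band_power(window, lo, hi) for lo, hi in [(1, 4), (4, 8), (8, 13), (13, 30)]]

def make_window(pre_seizure):
    # Synthetic stand-in: "pre-seizure" windows get an exaggerated 3 Hz component.
    t = np.arange(fs * 4) / fs
    sig = rng.normal(size=t.size)
    if pre_seizure:
        sig += 3 * np.sin(2 * np.pi * 3 * t)
    return sig

labels = [0] * 100 + [1] * 100
X = np.array([features(make_window(label)) for label in labels])
y = np.array(labels)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))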


How a Memory Quirk of the Human Brain Can Galvanize AI

#artificialintelligence

Take a two-year-old who first learns to recognize a dog and a cat at home, then a horse and a sheep at a petting zoo. The kid will then also be able to tell a dog and a sheep apart, even if he can't yet articulate their differences. This ability comes so naturally to us that it belies the complexity of the brain's data-crunching processes under the hood. To make the logical leap, the child first needs to remember the distinctions between his family pets. When confronted with new categories--farm animals--his neural circuits call upon those past remembrances and seamlessly incorporate them with new learnings to update his mental model of the world.


Overwatch, Call of Duty League teams can defer multimillion-dollar franchise fees due to covid-19

Washington Post - Technology News

Front offices around the CDL and OWL agree that the goal is to return to live, in-person events whenever it's safe to do so. Activision Blizzard created both leagues to pioneer a city-based model, built like traditional sports leagues, and that has always been a selling point for franchise spots. Team presidents and owners see attention to a local market as a surefire way to create a dedicated fan base with regional sponsors and packed stadiums.


AI Does Heavy Lifting Data Analysis for First Responders - Connected World

#artificialintelligence

The market for AI (artificial intelligence) technologies is going to expand tremendously in the next decade. Grand View Research says the global AI market will reach $733.7 billion by 2027, growing at a CAGR (compound annual growth rate) of 42.2%. One of the many sectors that will increasingly look to leverage AI technologies between now and 2027 (and beyond) is first response. In fact, in some cases, the first-response industry is already engaged in piloting AI technologies for use on the front lines. What AI-related innovations are to come, and how will they make first responders' jobs easier?
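As a quick sense check on how a CAGR figure compounds into a headline number like that, the snippet below works through the arithmetic; the starting value and base year are illustrative assumptions, not figures taken from the Grand View Research report.

# Compound annual growth: value_n = value_0 * (1 + CAGR) ** n_years.
cagr = 0.422            # 42.2% compound annual growth rate
start_value_bn = 62.0   # hypothetical base-year market size in $ billions (assumed)
years = 7               # hypothetical span from the base year to 2027

projected_bn = start_value_bn * (1 + cagr) ** years
print(f"projected market size after {years} years: ${projected_bn:,.1f} billion")
# With these illustrative inputs the projection lands near the quoted $733.7 billion.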


Applying AI Towards A Better World: GDP, Jobs Growth & Less Pollution

#artificialintelligence

The economic recession that follows as a consequence of the Covid-19 crisis, and in particular the demise of certain sectors of the economy (physical retail, the hospitality sector, etc.), means that there will be greater pressure on politicians around the world to consider how to stimulate GDP growth in the post-pandemic world. However, there is also increasing pressure on politicians to combat the threat posed by climate change. Are the desired objectives of GDP and employment growth at odds with reducing pollution? What if there is a pathway to GDP growth that creates new jobs while also reducing emissions of greenhouse gases (GHGs)? A report entitled "How AI can enable a sustainable future", produced by PwC and commissioned by Microsoft (lead authors Celine Herweijer of PwC and Lucas Joppa of Microsoft), estimates the impact of using AI for environmental applications across four sectors: agriculture, water, energy, and transport. The report estimates that such applications could contribute up to $5.2 trillion USD to the global economy in 2030, a 4.4% increase relative to business as usual.
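Putting the report's two headline figures side by side gives a useful sanity check: if $5.2 trillion corresponds to a 4.4% uplift over business as usual, the implied business-as-usual baseline for 2030 is roughly $118 trillion of global output.

# Consistency check on the quoted PwC/Microsoft figures.
uplift_trillion = 5.2       # estimated AI contribution to the global economy in 2030
relative_increase = 0.044   # stated 4.4% gain relative to business as usual

implied_baseline = uplift_trillion / relative_increase
print(f"implied 2030 business-as-usual baseline: ~${implied_baseline:.0f} trillion")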


Amazon Alexa: How developers use AI to help Alexa understand what you mean and not what you say

#artificialintelligence

How does Amazon help Alexa understand what people mean and not just what they say? And we couldn't be talking about Alexa, smart home tech, and AI at a better time. During this week's Amazon Devices event, the company made a host of smart home announcements, including a new batch of Echo smart speakers, which will include Amazon's new custom AZ1 Neural Edge processor. In August this year, I had a chance to speak with Evan Welbourne, senior manager of applied science for Alexa Smart Home at Amazon, about everything from how the company is using AI and ML to improve Alexa's understanding of what people say, to Amazon's approach to data privacy, the unique ways people are interacting with Alexa around COVID-19, and where he sees voice and smart tech going in the future. The following is a transcript of our conversation, edited for readability.

Bill Detwiler: So before we talk about maybe IoT, we talk about Alexa, and kind of what's happening with the COVID pandemic, as people are working more from home, and as they may have questions that they're asking about Alexa, about the pandemic, let's talk about kind of just your role there at Amazon, and what you're doing with Alexa, especially with AI and ML.

Evan Welbourne: So I lead machine learning for Alexa Smart Home. And what that sort of means generally is that we try to find ways to use machine learning to make Smart Home more useful and easier to use for everybody that uses smart home. It's always a challenge because we've got the early adopters who are tech savvy, they've been using smart home for years, and that's kind of one customer segment. But we've also got the people who are brand new to smart home these days, people who have no background in smart home, they're just unboxing their first light, and they may not be that tech savvy.


GloVe Word Embeddings on Movies Plot

#artificialintelligence

Every word can be represented as a point in an N-dimensional space after applying machine learning algorithms to documents. The most famous algorithms are Word2Vec, developed by Google, and GloVe, developed by Stanford University. We will work with the pre-trained GloVe model. The idea is to represent every movie plot summary in a 50-D space and, based on this vector, to find similar movies. Finally, we will apply the t-SNE algorithm for dimensionality reduction and represent the plot summaries in 2-D space.
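A minimal sketch of that pipeline might look like the following, assuming the pre-trained glove.6B.50d.txt file from the Stanford GloVe release and a few placeholder plot summaries; the movie titles, texts, and file path are illustrative stand-ins.

# Sketch: average 50-D GloVe vectors per plot summary, compare plots by cosine
# similarity, then project the plot vectors to 2-D with t-SNE.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.manifold import TSNE

# Load the pre-trained 50-D GloVe vectors (file path is a placeholder).
embeddings = {}
with open("glove.6B.50d.txt", encoding="utf-8") as f:
    for line in f:
        parts = line.split()
        embeddings[parts[0]] = np.asarray(parts[1:], dtype="float32")

def plot_vector(summary):
    # Represent a plot summary as the mean of its words' GloVe vectors.
    words = [w for w in summary.lower().split() if w in embeddings]
    return np.mean([embeddings[w] for w in words], axis=0)

plots = {
    "Movie A": "a detective hunts a serial killer through the city",
    "Movie B": "a police officer investigates a string of murders",
    "Movie C": "two robots fall in love on a distant planet",
}
vectors = np.stack([plot_vector(text) for text in plots.values()])

# Pairwise cosine similarity between plots; A and B should score highest.
print(cosine_similarity(vectors).round(2))

# 2-D projection for visualisation (perplexity must stay below the sample count).
coords = TSNE(n_components=2, perplexity=2, random_state=0).fit_transform(vectors)
print(dict(zip(plots, [tuple(c) for c in coords.round(2)])))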