Machine Learning


Machine Learning in Medicine

#artificialintelligence

In perinatal medicine, fetal weight is a sort of Goldilocks problem: it has to be just right. If the fetus is too small, it may not be developing properly; too large and the mother faces much greater risks in childbirth. The trouble is, there is no way to directly measure fetal weight. Instead, doctors must rely on estimates that are calculated using a formula that includes the fetus's head and abdominal circumference, and the length of the femur. Unfortunately, this formula isn't always as accurate as doctors would like.
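
For context, the kind of formula the article refers to is a regression on ultrasound biometry such as Hadlock's. The sketch below implements one commonly published Hadlock-type variant combining head circumference (HC), abdominal circumference (AC), and femur length (FL), all in centimetres; the coefficients are quoted as they are typically published and are included purely for illustration, not for clinical use.

```python
def estimated_fetal_weight_grams(hc_cm: float, ac_cm: float, fl_cm: float) -> float:
    """Hadlock-type estimated fetal weight (grams) from ultrasound biometry.

    Coefficients follow one widely cited published variant and are shown for
    illustration only -- verify against the original source before any real use.
    """
    log10_weight = (
        1.326
        - 0.00326 * ac_cm * fl_cm
        + 0.0107 * hc_cm
        + 0.0438 * ac_cm
        + 0.158 * fl_cm
    )
    return 10 ** log10_weight

# Hypothetical late-third-trimester biometry, giving an estimate of roughly 3.2 kg.
print(round(estimated_fetal_weight_grams(hc_cm=33.0, ac_cm=34.0, fl_cm=7.2)))
```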


Simple Eye Exam With Powerful Artificial Intelligence Could Lead to Early Parkinson's Disease Diagnosis

#artificialintelligence

An example of a fundus eye image taken from the UK Biobank. A simple eye exam combined with powerful artificial intelligence (AI) machine learning technology could provide early detection of Parkinson's disease, according to research being presented at the annual meeting of the Radiological Society of North America (RSNA). Parkinson's disease is a progressive disorder of the central nervous system that affects millions of people worldwide. Diagnosis is typically based on symptoms like tremors, muscle stiffness and impaired balance -- an approach that has significant limitations. "The issue with that method is that patients usually develop symptoms only after prolonged progression with significant injury to dopamine brain neurons," said study lead author Maximillian Diaz, a biomedical engineering Ph.D. student at the University of Florida in Gainesville, Florida. "This means that we are diagnosing patients late in the disease process."
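
The article does not describe the model itself, so purely as an illustration of the general approach, here is a minimal sketch of a convolutional classifier that maps a fundus photograph to a Parkinson's-versus-control prediction; the architecture, input size, and class layout are assumptions for demonstration, not the study's method.

```python
# Illustrative sketch only: a minimal convolutional classifier for fundus
# photographs (control vs. Parkinson's). Architecture, image size, and class
# layout are assumptions for demonstration, not the study's actual model.
import torch
import torch.nn as nn

class FundusClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)  # two classes: control / Parkinson's

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

model = FundusClassifier()
dummy_batch = torch.randn(4, 3, 224, 224)   # four fake RGB fundus images
logits = model(dummy_batch)
print(logits.shape)                          # torch.Size([4, 2])
```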


Use of Artificial Intelligence in 2020

#artificialintelligence

2020 is therefore set to be an important year for the AI space, setting the tone and sustaining the current momentum for the next decade of innovation. But what does this mean for organizations selling and buying AI solutions? In which areas should they invest? IDC and Forrester recently issued their forecasts for artificial intelligence (AI) in 2020 and beyond. While outside "market events" can make firms cautious about AI, says Forrester, "brave ones" will continue to invest and expand on the first "timid" steps they took in 2019.


5 Emerging AI And Machine Learning Trends To Watch In 2021

#artificialintelligence

Artificial intelligence and machine learning have been hot topics in 2020 as AI and ML technologies increasingly find their way into everything from advanced quantum computing systems and leading-edge medical diagnostic systems to consumer electronics and "smart" personal assistants. Revenue generated by AI hardware, software and services is expected to reach $156.5 billion worldwide this year, according to market researcher IDC, up 12.3 percent from 2019. But it can be easy to lose sight of the forest for the trees when it comes to trends in the development and use of AI and ML technologies. As we approach the end of a turbulent 2020, here's a big-picture look at five key AI and machine learning trends: not just the types of applications they are finding their way into, but also how they are being developed and the ways they are being used. Hyperautomation, an IT mega-trend identified by market research firm Gartner, is the idea that most anything within an organization that can be automated, such as legacy business processes, should be automated.


What is a Neural Network?

#artificialintelligence

Think back to the first time you heard the phrase "neural networks" or "neural nets" -- perhaps it's right now -- and try to remember what your first impression was. As an Applied Math and Economics major with a newfound interest in data science and machine learning, I remember thinking that whatever neural networks are, they must be extremely important, really cool, and very complicated. I also remember thinking that a true understanding of neural networks must be on the other side of a thick wall of prerequisite knowledge including neuroscience and graduate mathematics. Through taking a machine learning course with Professor Samuel Watson at Brown, I have learned that three of the previous four statements are true in most cases -- neural nets are extremely important, really cool, and they can be very complicated depending on the architecture of the model. But most importantly, I learned that understanding neural networks requires minimal prerequisite knowledge as long as the information is presented in a logical and digestible way.
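
In that spirit, here is a minimal sketch (layer sizes and activation chosen arbitrarily) showing that at its core a neural network is just matrix multiplications composed with simple nonlinearities.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0, x)

# A two-layer network: input -> hidden (ReLU) -> output.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # 4 inputs, 8 hidden units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # 8 hidden units, 1 output

def forward(x):
    hidden = relu(x @ W1 + b1)   # weighted sum, then nonlinearity
    return hidden @ W2 + b2      # weighted sum of hidden activations

x = rng.normal(size=(1, 4))      # one example with 4 features
print(forward(x))
```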


5 companies that are revolutionizing recruiting using Artificial Intelligence

#artificialintelligence

Artificial intelligence (AI), the use of human-like intelligence through software and machines, is disrupting the most diverse industries. After all, this is an industry that has grown an average of 20% per year for the past 5 years, according to a survey by BBC Research. Many organizations have already embraced this "future" and gained ground by applying AI efficiently in everyday activities. For example, some banks now perform financial services without the help of a human, and farms use drones capable of identifying points in a crop that need more irrigation and automatically triggering sprinklers. AI is not set to replace the recruiter's work, the importance of the interview, the empathy, or the sparkle in the eye that we sometimes feel when interviewing a candidate.


Machine learning approach could improve radar in congested environments - Military Embedded Systems

#artificialintelligence

Research being conducted by the U.S. Army Combat Capabilities Development Command (DEVCOM) is focused on a new machine learning approach that could improve radar performance in congested environments. Researchers from DEVCOM, Army Research Laboratory, and Virginia Tech have developed an automatic way for radars to operate in the congested and limited-spectrum environments created by commercial 4G LTE and future 5G communications systems. The researchers say they examined how future Department of Defense radar systems will share the spectrum with commercial communications systems. The team used machine learning to learn the behavior of ever-changing interference in the spectrum and to find clean spectrum that maximizes radar performance. Once clean spectrum is identified, waveforms can be modified to best fit into it.
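
The article does not specify the learning algorithm used. As a toy stand-in for the idea of finding clean spectrum, the sketch below averages recently observed interference power per frequency bin and picks the contiguous block of bins with the lowest total; the bin counts and data are invented for illustration.

```python
import numpy as np

def cleanest_band(interference_history: np.ndarray, band_width: int) -> int:
    """Toy stand-in for a learned interference model: average recent
    interference power per frequency bin, then return the start index of the
    contiguous block of `band_width` bins with the lowest total power."""
    avg_power = interference_history.mean(axis=0)                  # one value per bin
    window_sums = np.convolve(avg_power, np.ones(band_width), mode="valid")
    return int(window_sums.argmin())

# 50 past measurements over 128 frequency bins (synthetic data).
rng = np.random.default_rng(1)
history = rng.exponential(scale=1.0, size=(50, 128))
history[:, 40:60] += 5.0                                           # a busy LTE-like band
start = cleanest_band(history, band_width=16)
print(f"radar waveform could be placed in bins {start}..{start + 15}")
```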


Top 10 Data Science Career Options That are on The Hype

#artificialintelligence

Data science is one of the most appealing industries, with a wealth of features and opportunities. With humans generating 2.5 quintillion bytes of data per day, the data landscape is a dynamic space, almost mirroring real-world global connectivity. New technologies to tackle data overwhelm are introduced year after year, and the transformation is likely to continue into the coming decade. The rise in demand for data practitioners in this fast-moving world is very real. According to one report, data-related jobs are anticipated to add around 1.5 lakh (150,000) new openings after 2020.


Best and Worst Cases of Machine-Learning Models -- Part-1

#artificialintelligence

It's very important to know where our models work well and where they fail. If there is a low-latency requirement, KNN is definitely a poor choice; similarly, if the data is non-linear, logistic regression is not a good fit. So let's dive into the discussion and weigh the pros and cons of each model.
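
To make those two claims concrete, here is a small, self-contained comparison (dataset and hyperparameters chosen arbitrarily): logistic regression and k-nearest neighbours fit to scikit-learn's non-linearly-separable two-moons data, plus a rough timing of a single prediction from each.

```python
import time
import numpy as np
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

# Non-linearly-separable data: a linear model should struggle, KNN should not.
X, y = make_moons(n_samples=2000, noise=0.2, random_state=0)

log_reg = LogisticRegression().fit(X, y)
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

print("logistic regression accuracy:", log_reg.score(X, y))
print("KNN accuracy:                ", knn.score(X, y))

# Latency: KNN defers work to prediction time (it searches the training set),
# so a single query is noticeably slower than the linear model's dot product.
query = np.array([[0.5, 0.25]])
for name, model in [("logistic regression", log_reg), ("KNN", knn)]:
    start = time.perf_counter()
    model.predict(query)
    print(f"{name} single prediction: {time.perf_counter() - start:.6f} s")
```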


Research shows the intrinsically nonlinear nature of receptive fields in vision

#artificialintelligence

The receptive field (RF) of a neuron is the term applied to the region of space in which the presence of a stimulus alters the response of that neuron. The responses of visual neurons, as well as visual perception phenomena in general, are highly nonlinear functions of the visual input (in mathematics, nonlinear systems represent phenomena whose behavior cannot be expressed as the sum of the behaviors of their descriptors). Conversely, the vision models used in science are based on the notion of a linear receptive field; in artificial intelligence and machine learning, artificial neural networks, which are based on classical models of vision, also use linear receptive fields. "Modeling vision based on a linear receptive field poses several inherent problems: it changes with each input, it presupposes a set of basis functions for the visual system, and it conflicts with recent studies on dendritic computations," asserts Marcelo Bertalmío, first author of a study recently published in the journal Scientific Reports. The paper proposes modeling the receptive field in a nonlinear manner, introducing the intrinsically nonlinear receptive field, or INRF.
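
To see the distinction, the sketch below contrasts a classical linear receptive field (a weighted sum of the stimulus) with a simple nonlinear variant in which each input is first passed through a nonlinearity relative to a crude surround signal (here just the stimulus mean) before being summed. This nonlinear form is only an illustrative stand-in for, not a reproduction of, the INRF defined in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

stimulus = rng.normal(size=32)     # a 1-D stimulus, for simplicity
weights = rng.normal(size=32)      # receptive-field weights
surround = np.full(32, 1 / 32)     # uniform surround weights (just the stimulus mean)

def linear_rf(I, w):
    """Classical linear receptive field: the response is a weighted sum of the input."""
    return float(w @ I)

def nonlinear_rf(I, w, g, sigma=np.tanh):
    """Illustrative nonlinear receptive field: each input is passed through a
    nonlinearity relative to a surround signal before the weighted sum, so the
    response is no longer a linear function of the stimulus."""
    return float(w @ sigma(I - g @ I))

print("linear response:   ", linear_rf(stimulus, weights))
print("nonlinear response:", nonlinear_rf(stimulus, weights, surround))
```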