Rapid Visual Processing using Spike Asynchrony
We have investigated the possibility that rapid processing in the visual system could be achieved by using the order of firing in different neurones as a code, rather than more conventional firing rate schemes. Using SPIKENET, a neural net simulator based on integrate-and-fire neurones and in which neurones in the input layer function as analog-to-delay converters, we have modeled the initial stages of visual processing. Initial results are extremely promising. Even with activity in retinal output cells limited to one spike per neurone per image (effectively ruling out any form of rate coding), sophisticated processing based on asynchronous activation was nonetheless possible.
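The analog-to-delay idea above can be sketched in a few lines. This is an illustrative rank-order-coding toy, not SPIKENET itself; the names (`shrink`, `readout`) and the geometric desensitization scheme are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random(8)                  # analog input intensities in [0, 1]

# Analog-to-delay conversion: the stronger the input, the earlier the spike.
latencies = 1.0 - image
firing_order = np.argsort(latencies)   # neuron indices, earliest spike first
rank_of = np.argsort(firing_order)     # rank_of[i] = firing rank of neuron i

# One spike per neuron per image: the code is the order alone. A rank-order
# readout shrinks its sensitivity after each incoming spike, so earlier
# spikes count more.
shrink = 0.9

def readout(weights, order):
    """Accumulate weighted input spike by spike, desensitizing per spike."""
    return sum(weights[n] * shrink**k for k, n in enumerate(order))

# A readout whose weights decay with the expected rank responds more
# strongly to the matching order than to a mismatched (shuffled) one.
tuned = shrink ** rank_of
shuffled = rng.permutation(tuned)
assert readout(tuned, firing_order) >= readout(shuffled, firing_order)
```

The final assertion is the rearrangement inequality in disguise: pairing the largest weights with the earliest spikes maximizes the readout, which is what lets the order alone carry the signal.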
A comparison of learning algorithms for survival analysis with missing data
Dufossé, Paul, Benzekry, Sébastien
Survival analysis is an essential tool for the study of health data. An inherent component of such data is the presence of missing values. In recent years, researchers proposed new learning algorithms for survival tasks based on neural networks. Here, we studied the predictive performance of such algorithms coupled with different methods for handling missing values on simulated data that reflect a realistic situation, i.e., when individuals belong to unobserved clusters. We investigated different patterns of missing data. The results show that, without further feature engineering, no single imputation method is better than the others in all cases. The proposed methodology can be used to compare other missing data patterns and/or survival models. The Python code is accessible via the package survivalsim.
Formalising the Use of the Activation Function in Neural Inference
We investigate how activation functions can be used to describe neural firing in an abstract way, and in turn, why they work well in artificial neural networks. We discuss how a spike in a biological neurone belongs to a particular universality class of phase transitions in statistical physics. We then show that the artificial neurone is, mathematically, a mean field model of biological neural membrane dynamics, which arises from modelling spiking as a phase transition. This allows us to treat selective neural firing in an abstract way, and formalise the role of the activation function in perceptron learning. Along with deriving this model and specifying the analogous neural case, we analyse the phase transition to understand the physics of neural network learning. Together, it is shown that there is not only a biological meaning, but also a physical justification, for the emergence and performance of canonical activation functions; implications for neural learning and inference are also discussed.
- North America > United States > New York > Suffolk County > Stony Brook (0.04)
- North America > United States > Wisconsin > Dane County > Madison (0.04)
- North America > United States > Texas > Clay County (0.04)
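The mean-field picture invoked in the abstract above has a standard concrete instance: in the mean-field Ising model, the magnetisation m solves the self-consistency equation m = tanh(βJ·m), and the same tanh is a canonical activation function. The sketch below (illustrative, not taken from the paper) solves that equation by fixed-point iteration and exhibits the phase transition at βJ = 1.

```python
import numpy as np

def mean_field_magnetisation(beta_J, m0=0.5, iters=200):
    """Solve the self-consistency equation m = tanh(beta_J * m)
    by fixed-point iteration from an initial guess m0."""
    m = m0
    for _ in range(iters):
        m = np.tanh(beta_J * m)
    return m

# Below the critical coupling (beta_J < 1) the only fixed point is m = 0
# (disordered phase); above it, a nonzero solution appears (ordered phase).
# This is the kind of phase transition the abstract associates with the
# onset of neural firing.
assert abs(mean_field_magnetisation(0.5)) < 1e-6
assert mean_field_magnetisation(1.5) > 0.5
```

The qualitative point is that the sigmoidal shape is not an arbitrary modelling choice: it is what a mean-field treatment of an all-or-nothing (spiking) transition produces.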
A hybrid MGA-MSGD ANN training approach for approximate solution of linear elliptic PDEs
Dehghani, Hamidreza, Zilian, Andreas
We introduce a hybrid "Modified Genetic Algorithm-Multilevel Stochastic Gradient Descent" (MGA-MSGD) training algorithm that considerably improves the accuracy and efficiency of solving 3D mechanical problems, described in strong form by PDEs, via ANNs (Artificial Neural Networks). The presented approach allows the selection of a number of locations of interest at which the state variables are expected to fulfil the governing equations of the physical problem. Unlike classical PDE approximation methods such as finite differences or the finite element method, there is no need to establish and reconstruct the physical field quantity throughout the computational domain in order to predict the mechanical response at specific locations of interest. The basic idea of MGA-MSGD is to manipulate the components of the learnable parameters responsible for error explosion, so that the network can be trained with relatively large learning rates while avoiding entrapment in local minima. The proposed training approach is less sensitive to the learning rate, to the density and distribution of training points, and to the random initial parameters. The PDEs, together with any physical laws and boundary conditions, enter through the distance function to be minimised (the so-called physics-informed ANN). The genetic algorithm is modified to suit this type of ANN: a coarse-level stochastic gradient descent (CSGD) is exploited to decide offspring qualification. With the presented approach, a considerable improvement in both accuracy and efficiency is observed compared with standard training algorithms such as classical SGD and the Adam optimiser. The local displacement accuracy is studied and ensured by using Finite Element Method (FEM) results on a sufficiently fine mesh as the reference displacements. A slightly more complex problem is also solved to demonstrate feasibility.
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning > Gradient Descent (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Evolutionary Systems (1.00)
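The hybrid GA-plus-coarse-SGD idea above can be illustrated on a toy objective. This sketch is not the authors' MGA-MSGD code: the quadratic loss stands in for the PDE residual, crossover is omitted, and the specific qualification rule (keep an offspring only if a few coarse gradient steps improve it past its parent) is an assumption made for brevity.

```python
import numpy as np

rng = np.random.default_rng(2)

def loss(w):                      # toy stand-in for the PDE residual norm
    return float(np.sum((w - 3.0) ** 2))

def grad(w):
    return 2.0 * (w - 3.0)

def coarse_sgd(w, steps=10, lr=0.2):
    """Coarse-level SGD: a handful of cheap gradient-descent steps."""
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

pop = [rng.normal(size=2) for _ in range(8)]
for gen in range(10):
    # Mutation produces offspring candidates.
    offspring = [w + rng.normal(scale=0.3, size=2) for w in pop]
    # Qualification: an offspring joins the pool only if coarse SGD
    # refines it past its parent; then keep the best half of the pool.
    refined = [coarse_sgd(w) for w in offspring]
    pool = pop + [w for w, p in zip(refined, pop) if loss(w) < loss(p)]
    pop = sorted(pool, key=loss)[:8]

best = min(pop, key=loss)
assert loss(best) < 1e-3
```

The division of labour mirrors the abstract: the genetic layer explores globally and so tolerates large effective step sizes, while the coarse SGD both refines candidates and acts as the fitness gate for offspring qualification.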
Artificial neurons developed to fight disease
Scientists have made artificial nerve cells, paving the way for new ways to repair the human body. The tiny "brain chips" behave like the real thing and could one day be used to treat diseases such as Alzheimer's. A team from the University of Bath used a combination of maths, computation and chip design to come up with a way to replicate in circuit form what nerve cells (neurons) do naturally. Neurons carry signals to and from the brain and the rest of the body. Scientists are interested in replicating them, because of the potential that offers in treating diseases such as Alzheimer's, where neurons degenerate or die.
Mind-reading technology is everyone's next big security nightmare ZDNet
Technology allowing our thoughts and feelings to be translated into a digital form – and shared – is already a reality. Brain computer interfaces (BCI) allow us to connect our minds to computers for some limited purposes, and big tech companies including Facebook and many startups want to make this technology commonplace. For those of you terrified by the prospect of technology recording – and broadcasting – your opinions of the boss, your secret fears, or anything else – relax. BCIs are currently not sophisticated enough to collect such granular information.
- North America > United States > Utah (0.05)
- North America > United States > New York (0.05)
- Health & Medicine > Therapeutic Area > Neurology (1.00)
- Information Technology (0.69)
Who's Afraid of Machine Learning? Part 1 : What do they all talk about?!?
Human beings have, since forever, been fascinated by how nature works and how they can use it for their own benefit. For example: when humans wanted to create a machine that could fly through the air, what did they use for inspiration? The anatomy of birds' wings and chests! And when they wanted to create machines that could detect objects under water, or in the dark where they can barely see, what did they use for inspiration? They were inspired by bats and dolphins to create sonar-based detectors!
- North America > United States > California > Los Angeles County > Los Angeles (0.06)
- Asia > Japan (0.06)
Artificial Intelligence, Machine Learning and Deep Learning – Why should you care?
In the tech space, these terms have been used a lot, and sometimes interchangeably, without an understanding of what they mean. So what is all the fuss about? Before we get to why you should care, let us first clear up the confusion about what each is and how it came to be. AI is simply human intelligence expressed by a machine. Does that make it simple to achieve? Of course not: human intelligence is itself a complex thing, and replicating it is no easy task.
How Deep Learning AI Will Shape Asset Management - The Market Mogul
Everyone today talks about AI, big data and machine learning, yet most do not delve into the fundamental properties of how they will operate and how they might be an actual threat to asset managers. Some view these technologies as tools to assist them rather than as a threat, and it is worth considering both sides of the argument. Deep learning is a branch of machine learning that uses particular architectures of neural networks. These are artificial networks that attempt to replicate how the neural structures in human brains operate. Such methods have successfully been applied to areas such as computer vision – i.e. image processing and classification – as well as speech recognition. The techniques are readily available to any undergraduate student willing to learn the process.
- North America > United States (0.09)
- Asia > Japan (0.07)