TRKM: Twin Restricted Kernel Machines for Classification and Regression
Restricted kernel machines (RKMs) have considerably improved generalization in machine learning. Recent advances have explored various techniques within the RKM framework, integrating kernel functions with least squares support vector machines (LSSVM) to mirror the energy function of restricted Boltzmann machines (RBM), leading to enhanced performance. However, RKMs may face challenges in generalization when dealing with unevenly distributed or complexly clustered data. Additionally, as the dataset size increases, the computational burden of managing high-dimensional feature spaces can become substantial, potentially hindering performance on large-scale datasets. To address these challenges, we propose the twin restricted kernel machine (TRKM). TRKM combines the benefits of twin models with the robustness of the RKM framework to enhance classification and regression tasks. By leveraging the Fenchel-Young inequality, we introduce a novel conjugate feature duality, allowing the formulation of classification and regression problems in terms of dual variables. This duality provides an upper bound on the objective function of the TRKM problem, resulting in a new methodology under the RKM framework. The model uses an energy function similar to that of the RBM, incorporating visible and hidden variables corresponding to both classes. Additionally, the kernel trick is employed to map data into a high-dimensional feature space, where the model identifies an optimal separating hyperplane using a regularized least squares approach. Experiments on UCI and KEEL datasets confirm TRKM's superiority over baselines, showcasing its robustness and efficiency in handling complex data. Furthermore, we applied the TRKM model to a brain age dataset, demonstrating its efficacy in predicting brain age.
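The regularized least squares step the abstract describes can be illustrated in miniature. The sketch below is not the TRKM formulation (which involves twin hyperplanes and the conjugate feature duality); it only shows the generic kernel-trick-plus-regularized-least-squares idea, with an RBF kernel and hyperparameters (`lam`, `gamma`) chosen arbitrarily for illustration:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def fit_ls_kernel_classifier(X, y, lam=1e-2, gamma=1.0):
    # Regularized least squares in the kernel-induced feature space:
    # solve (K + lam*I) alpha = y for the dual coefficients alpha.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_test, gamma=1.0):
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha)

# Two well-separated noisy clusters with labels -1 / +1 (toy data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
alpha = fit_ls_kernel_classifier(X, y)
acc = (predict(X, alpha, X) == y).mean()
```

The linear system plays the role of the LSSVM-style dual problem: all the nonlinearity is carried by the kernel matrix, while the fit itself is a single regularized solve.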
Toward Model-Agnostic Detection of New Physics Using Data-Driven Signal Regions
Yi, Soheun, Alison, John, Kuusela, Mikael
In the search for new particles in high-energy physics, it is crucial to select the Signal Region (SR) so that it is enriched with signal events if they are present. While most existing search methods set the region relying on prior domain knowledge, such knowledge may be unavailable for a completely novel particle that falls outside the current scope of understanding. We address this issue by proposing a method built upon a model-agnostic but often realistic assumption about the localized topology of the signal events, namely that they are concentrated in a certain area of the feature space. Treating the signal component as a localized high-frequency feature, our approach employs the notion of a low-pass filter: we define the SR as the area most affected when the observed events are smeared with additive random noise. We overcome the challenges of density estimation in the high-dimensional feature space by learning the density ratio of events that potentially include a signal to a complementary set of events that closely resemble the target events but are free of any signal. By applying our method to simulated $\mathrm{HH} \rightarrow 4b$ events, we demonstrate that the method can efficiently identify a data-driven SR in a high-dimensional feature space in which a large portion of the signal events is concentrated.
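A one-dimensional toy version of the smearing idea: the signal is a narrow bump on a broad background, and the SR is taken to be where the density drops most after adding noise. All distributions and noise scales below are invented for illustration; the paper works with learned density ratios in high dimensions rather than histograms:

```python
import numpy as np

# Background: broad distribution; signal: narrow localized bump at x = 2.
rng = np.random.default_rng(1)
background = rng.normal(0.0, 3.0, 20000)
signal = rng.normal(2.0, 0.2, 1000)
events = np.concatenate([background, signal])

# Smear events with additive Gaussian noise: a low-pass filter on the density.
smeared = events + rng.normal(0.0, 0.5, events.size)

# Histogram density estimates before and after smearing.
bins = np.linspace(-10, 10, 201)
h_orig, _ = np.histogram(events, bins=bins, density=True)
h_smear, _ = np.histogram(smeared, bins=bins, density=True)

# The SR is where the density is most reduced by smearing: a narrow
# (high-frequency) bump loses the most height under the low-pass filter,
# while the broad background is barely changed.
drop = h_orig - h_smear
centers = 0.5 * (bins[:-1] + bins[1:])
sr_center = centers[np.argmax(drop)]
```

Even this crude histogram version recovers the bump location, because the broad background is nearly invariant under the smearing while the localized signal is strongly attenuated.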
Agile gesture recognition for low-power applications: customisation for generalisation
Liu, Ying, Guo, Liucheng, Makarov, Valeri A., Gorban, Alexander, Mirkes, Evgeny, Tyukin, Ivan Y.
Automated hand gesture recognition has long been a focal point in the AI community. Traditionally, research in this field has predominantly focused on scenarios with access to a continuous stream of hand images, a focus driven by the widespread use of cameras and the abundant availability of image data. However, there is an increasing demand for gesture recognition technologies that operate on low-power sensor devices, due to rising concerns about data leakage and end-user privacy, as well as the limited battery capacity and computing power of low-cost devices. Moreover, the challenge of collecting data for individually designed hardware also hinders the generalisation of a gesture recognition model. In this study, we unveil a novel methodology for pattern recognition systems using adaptive and agile error correction, designed to enhance the performance of legacy gesture recognition models on devices with limited battery capacity and computing power. The system comprises a compact Support Vector Machine as the base model for live gesture recognition, together with an adaptive agile error corrector that employs few-shot learning within the feature space induced by high-dimensional kernel mappings. The error corrector can be customised for each user, allowing dynamic adjustments to the gesture prediction based on their movement patterns while maintaining the agile performance of the base model on a low-cost, low-power microcontroller. The proposed system is distinguished by its compact size, rapid processing speed, and low power consumption, making it ideal for a wide range of embedded systems.
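A minimal sketch of the base-model-plus-corrector pattern described above, assuming a toy scheme in which each stored user-specific error overrides the base prediction for kernel-close inputs. The class name, threshold rule, and RBF similarity are illustrative inventions, not the authors' construction:

```python
import numpy as np

def rbf(a, B, gamma=2.0):
    # RBF similarity between one point a and a batch of points B.
    return np.exp(-gamma * ((B - a) ** 2).sum(1))

class AgileCorrector:
    """Few-shot error corrector layered on a frozen base model (toy scheme)."""

    def __init__(self, base_predict, gamma=2.0, threshold=0.5):
        self.base_predict = base_predict
        self.gamma = gamma
        self.threshold = threshold
        self.errors = np.empty((0, 2))  # few-shot stored inputs
        self.labels = np.empty((0,))    # their correct labels

    def add_correction(self, x, true_label):
        # One user-supplied example is enough to patch the base model locally.
        self.errors = np.vstack([self.errors, x])
        self.labels = np.append(self.labels, true_label)

    def predict(self, x):
        if len(self.errors):
            sim = rbf(x, self.errors, self.gamma)
            i = np.argmax(sim)
            if sim[i] > self.threshold:  # kernel-close to a known error
                return self.labels[i]    # override the base model
        return self.base_predict(x)

# Base model: a fixed rule that mislabels gestures near (1, 1) for this user.
base = lambda x: 0.0 if x[0] < 0 else 1.0
corr = AgileCorrector(base)
corr.add_correction(np.array([1.0, 1.0]), true_label=0.0)  # user-specific fix
```

The base model stays frozen, so the per-user adaptation costs only a handful of stored vectors and kernel evaluations, which is what makes this kind of corrector viable on a microcontroller.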
Max-Margin Markov Networks
In typical classification tasks, we seek a function which assigns a label to a single object. Kernel-based approaches, such as support vector machines (SVMs), which maximize the margin of confidence of the classifier, are the method of choice for many such tasks. Their popularity stems both from the ability to use high-dimensional feature spaces, and from their strong theoretical guarantees. However, many real-world tasks involve sequential, spatial, or structured data, where multiple labels must be assigned. Existing kernel-based methods ignore structure in the problem, assigning labels independently to each object, losing much useful information. Conversely, probabilistic graphical models, such as Markov networks, can represent correlations between labels by exploiting problem structure, but cannot handle high-dimensional feature spaces, and lack strong theoretical generalization guarantees.
An Introduction to Support Vector Machines
The authors believe that SVMs are a topic now sufficiently mature that it should be viewed as its own subfield of machine learning. SVMs, first introduced by Vladimir Vapnik, are a type of linear learning machine, much like the famous perceptron algorithm, and thus classify input patterns after being trained on labeled data sets (supervised learning). However, SVMs represent a significant enhancement in function over perceptrons. The power of SVMs lies in their use of nonlinear kernel functions that implicitly map the input into high-dimensional feature spaces. In these feature spaces, linear classification is possible; the resulting decision boundaries become nonlinear when mapped back to the original input space.
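The implicit mapping can be made concrete with an explicit degree-2 polynomial feature map: concentric rings that no line separates in the input plane become linearly separable after the mapping. This is a standard textbook illustration, not code from the book:

```python
import numpy as np

# Concentric rings: no straight line in the input plane separates them.
rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 200)
r = np.concatenate([np.full(100, 1.0), np.full(100, 3.0)])
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
y = np.array([-1.0] * 100 + [1.0] * 100)

# Explicit degree-2 feature map phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2);
# kernel methods apply this implicitly via k(x, z) = (x . z)^2.
Phi = np.column_stack([X[:, 0]**2, np.sqrt(2) * X[:, 0] * X[:, 1], X[:, 1]**2])

# A linear rule in the mapped space: threshold on x1^2 + x2^2 = r^2.
w = np.array([1.0, 0.0, 1.0])
pred = np.sign(Phi @ w - 5.0)  # plane at r^2 = 5, between the two rings
acc = (pred == y).mean()
```

The separating plane in the mapped space corresponds to a circle of radius sqrt(5) in the original plane, which is exactly the nonlinear boundary the kernel trick buys without ever computing `Phi` explicitly.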
Broiler chickens can benefit from machine learning: support vector machine analysis of observational epidemiological data
Broiler farmers have used data as an aid to health and production management for over 40 years [1,2]. Food and water consumption, growth and mortality have been used to construct standard production curves to monitor and improve performance. Daily flock data are plotted graphically on broiler house 'door charts' and deviations used as early indicators of flock health and welfare [3]. Increasingly, these and other sensor-recorded data are being collected electronically, giving birth to the concept of precision livestock farming [4]. Broiler flocks generate large datasets.