A Neural-Network Solution to the Concentrator Assignment Problem

Neural Information Processing Systems

This paper presents a neural-network solution to a resource allocation problem that arises in providing local access to the backbone of a wide-area communication network. The problem is described in terms of an energy function that can be mapped onto an analog computational network. Simulation results characterizing the performance of the neural computation are also presented.

INTRODUCTION

This paper presents a neural-network solution to a resource allocation problem that arises in providing access to the backbone of a communication network [1]. In the field of operations research, this problem was first known as the warehouse location problem, and heuristics for finding feasible, suboptimal solutions have been developed previously [2].
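To make the "energy function mapped onto an analog network" idea concrete, here is a minimal sketch of a Hopfield-style relaxation for a toy concentrator assignment: analog unit states V[i, j] represent "site i is assigned to concentrator j," and gradient descent on a penalized cost plays the role of the analog dynamics. The penalty weights, learning rate, and problem sizes are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy concentrator assignment by energy minimization (illustrative sketch).
# Energy = assignment cost
#        + A * (each site assigned to exactly one concentrator)
#        + B * (per-concentrator capacity limit).
rng = np.random.default_rng(0)
n_sites, n_conc, capacity = 12, 4, 3
cost = rng.uniform(size=(n_sites, n_conc))          # site-to-concentrator costs
A, B, lr = 2.0, 2.0, 0.05                           # penalty weights, step size (assumed)

V = rng.uniform(0.4, 0.6, size=(n_sites, n_conc))   # analog neuron states in (0, 1)
for _ in range(2000):
    row_excess = V.sum(axis=1, keepdims=True) - 1                        # one assignment per site
    col_excess = np.maximum(V.sum(axis=0, keepdims=True) - capacity, 0)  # capacity violation
    grad = cost + A * row_excess + B * col_excess                        # dE/dV
    V = np.clip(V - lr * grad, 0.0, 1.0)                                 # descend the energy

assignment = V.argmax(axis=1)                        # decode a discrete solution
print(assignment, "total cost:", cost[np.arange(n_sites), assignment].sum())
```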


Exploiting the Hidden Structure of Junction Trees for MPE

AAAI Conferences

The role of decomposition trees (also known as junction and clique trees) in probabilistic inference is widely known and has been the basis for many well-known inference algorithms. Recent approaches have demonstrated that such trees have a "hidden structure," which enables the characterization of tractable problem instances and leads to insights that boost the performance of inference algorithms. We consider the MPE problem on a Boolean formula in CNF where each literal in the formula is associated with a weight. We describe techniques for exploiting the junction-tree structure of these formulas in the context of a branch-and-bound algorithm for MPE.
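For readers unfamiliar with the base problem, here is a sketch of plain branch-and-bound for weighted MPE on a small CNF, without the junction-tree decomposition that is the paper's actual contribution. Variables, clause encoding, and the optimistic bound (each unassigned variable takes its better literal weight) are illustrative assumptions.

```python
from math import inf

def mpe(clauses, w, n):
    """Maximize the sum of log-weights w[v][value] over satisfying assignments.
    clauses: list of clauses, each a list of signed ints (e.g. -2 means "not x2").
    Plain branch-and-bound; no junction-tree structure is exploited here."""
    best = [-inf, None]

    def clause_status(assign, clause):
        # True if satisfied, None if still open, False if violated.
        open_lit = False
        for lit in clause:
            v = abs(lit)
            if v not in assign:
                open_lit = True
            elif (assign[v] == 1) == (lit > 0):
                return True
        return None if open_lit else False

    def search(assign, score):
        # Optimistic bound: every unassigned variable takes its best weight.
        bound = score + sum(max(w[v]) for v in range(1, n + 1) if v not in assign)
        if bound <= best[0]:
            return
        if any(clause_status(assign, c) is False for c in clauses):
            return                                   # a clause is violated: prune
        if len(assign) == n:
            best[0], best[1] = score, dict(assign)   # complete, satisfying, improving
            return
        v = next(x for x in range(1, n + 1) if x not in assign)
        for val in (1, 0):
            assign[v] = val
            search(assign, score + w[v][val])
            del assign[v]

    search({}, 0.0)
    return best

# Tiny usage: (x1 or ~x2) and (x2 or x3); w[v] = (weight if False, weight if True).
print(mpe([[1, -2], [2, 3]], {1: (0.0, 1.0), 2: (0.5, 0.2), 3: (0.1, 0.3)}, 3))
```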


Shallow Neural Network from scratch (deeplearning.ai assignment)

#artificialintelligence

I recently started taking the Deep Learning Specialization course on Coursera, and I'm enjoying it so much that I'm even having dreams about neural networks. I am currently on General Assembly London's Data Science Immersive course, so of course we have a lot of projects and lectures to catch up with, but we are on Christmas break, and I thought I could make the best use of this break in addition to preparing my final project.
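For context, the assignment in question builds a one-hidden-layer ("shallow") network by hand. A minimal NumPy sketch of that kind of network is below: tanh hidden layer, sigmoid output, binary cross-entropy, plain gradient descent, with the course's (features, examples) shape convention; the hyperparameters here are illustrative.

```python
import numpy as np

def train(X, Y, n_h=4, iters=5000, lr=1.0):
    """X: (n_x, m) inputs, Y: (1, m) binary labels. Returns learned parameters."""
    n_x, m = X.shape
    rng = np.random.default_rng(1)
    W1 = rng.standard_normal((n_h, n_x)) * 0.01; b1 = np.zeros((n_h, 1))
    W2 = rng.standard_normal((1, n_h)) * 0.01;  b2 = np.zeros((1, 1))
    for _ in range(iters):
        A1 = np.tanh(W1 @ X + b1)                  # hidden activations
        A2 = 1 / (1 + np.exp(-(W2 @ A1 + b2)))     # output probabilities
        dZ2 = A2 - Y                               # dL/dZ2 for cross-entropy + sigmoid
        dW2 = dZ2 @ A1.T / m; db2 = dZ2.mean(axis=1, keepdims=True)
        dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)         # tanh'(Z1) = 1 - A1^2
        dW1 = dZ1 @ X.T / m;  db1 = dZ1.mean(axis=1, keepdims=True)
        W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2
```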


Deep Learning in Neural Networks: An Overview

#artificialintelligence

What a wonderful treasure trove this paper is! Schmidhuber provides all the background you need to gain an overview of deep learning (as of 2014) and how we got there through the preceding decades: "Starting from recent DL results, I tried to trace back the origins of relevant ideas through the past half century and beyond." The main part of the paper runs to 35 pages, and then there are 53 pages of references. Now, I know that many of you think I read a lot of papers – just over 200 a year on this blog – but if I did nothing but review these key works in the development of deep learning, it would take me about 4.5 years to get through them at that rate! And when I'd finished, I'd still be about 6 years behind the then-current state of the art!


ClusterNet : Semi-Supervised Clustering using Neural Networks

arXiv.org Machine Learning

Clustering using neural networks has recently demonstrated promising performance in machine learning and computer vision applications. However, the performance of current approaches is limited either by unsupervised learning or by their dependence on a large set of labeled data samples. In this paper, we propose ClusterNet, which uses pairwise semantic constraints from very few labeled data samples (< 5% of total data) and exploits the abundant unlabeled data to drive the clustering approach. We define a new loss function that uses pairwise semantic similarity between objects combined with constrained k-means clustering to efficiently utilize both labeled and unlabeled data in the same framework. The proposed network uses a convolutional autoencoder to learn a latent representation that groups data into k specified clusters, while also learning the cluster centers simultaneously. We evaluate and compare the performance of ClusterNet on several datasets against state-of-the-art deep clustering approaches.
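A rough sketch of the kind of combined loss the abstract describes is given below: autoencoder reconstruction, a soft k-means term pulling latent codes toward learnable centers, and must-link / cannot-link penalties from the few labeled pairs. The term weights, the margin, and the exact form of each term are assumptions for illustration; the paper's formulation may differ.

```python
import torch
import torch.nn.functional as F

def clusternet_style_loss(x, x_rec, z, centers, must_link, cannot_link,
                          lam_cluster=0.1, lam_pair=1.0, margin=1.0):
    """x, x_rec: (batch, ...) inputs and reconstructions; z: (batch, d) latent codes;
    centers: (k, d) learnable cluster centers; *_link: lists of index pairs."""
    recon = F.mse_loss(x_rec, x)                    # reconstruction term
    d = torch.cdist(z, centers)                     # (batch, k) code-to-center distances
    cluster = d.min(dim=1).values.mean()            # pull each code to its nearest center
    ml = sum((z[i] - z[j]).pow(2).sum() for i, j in must_link)        # same class: close
    cl = sum(F.relu(margin - (z[i] - z[j]).norm()) ** 2
             for i, j in cannot_link)               # different class: pushed apart
    pair = (ml + cl) / max(len(must_link) + len(cannot_link), 1)
    return recon + lam_cluster * cluster + lam_pair * pair
```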