Industry
Lower Bounding Klondike Solitaire with Monte-Carlo Planning
Bjarnason, Ronald (Oregon State University) | Fern, Alan (Oregon State University) | Tadepalli, Prasad (Oregon State University)
Despite its ubiquitous presence, very little is known about the odds of winning the simple card game of Klondike Solitaire. The main goal of this paper is to investigate the use of probabilistic planning to shed light on this issue. Unfortunately, most probabilistic planning techniques are not well suited for Klondike due to the difficulties of representing the domain in standard planning languages and the complexity of the required search. Klondike thus serves as an interesting addition to the suite of probabilistic planning domains. In this paper, we study Klondike using several sampling-based planning approaches including UCT, hindsight optimization, and sparse sampling, and establish lower bounds on their performance. We also introduce novel combinations of these approaches and evaluate them in Klondike. We provide a theoretical bound on the sample complexity of a method that naturally combines sparse sampling and UCT. Our results demonstrate that there is a policy that, within tight confidence intervals, wins over 35% of Klondike games. This is the first reported lower bound on the win rate of an optimal Klondike policy.
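As background to the sampling-based approaches above, UCT selects moves by the UCB1 rule, balancing a move's average reward against an exploration bonus. A minimal, generic sketch of that rule (not the authors' Klondike implementation; the constant `c` and the `(mean_reward, visits)` encoding are illustrative assumptions):

```python
import math

def ucb1_select(children, c=1.4):
    """Pick the index of the child maximizing the UCB1 score used by UCT.

    `children` is a list of (mean_reward, visits) pairs for the moves
    available at a node; the parent's visit count is their sum.
    """
    total = sum(v for _, v in children)
    def score(child):
        mean, visits = child
        if visits == 0:
            return float("inf")  # always try unvisited moves first
        return mean + c * math.sqrt(math.log(total) / visits)
    return max(range(len(children)), key=lambda i: score(children[i]))

print(ucb1_select([(0.6, 100), (0.5, 2), (0.4, 50)]))
```

The under-explored second move wins here despite its lower mean, which is the exploration term at work.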
Multi-Goal Planning for an Autonomous Blasthole Drill
Elinas, Pantelis (The University of Sydney)
This paper presents multi-goal planning for an autonomous blasthole drill used in open pit mining operations. Given a blasthole pattern to be drilled and constraints on the vehicle's motion and orientation when drilling, we wish to compute the best order in which to drill the given pattern. Blasthole pattern drilling is an asymmetric Traveling Salesman Problem with precedence constraints specifying that some holes must be drilled before others. We wish to find the minimum-cost tour according to criteria that minimize the distance travelled while satisfying the precedence and vehicle motion constraints. We present an iterative method for solving the blasthole sequencing problem using the combination of a Genetic Algorithm and motion planning simulations that we use to determine the true cost of travel between any two holes.
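To make the objective concrete, a candidate tour in this setting can be scored against an asymmetric cost matrix and rejected if it violates a precedence pair. A minimal sketch (the cost values and precedence pairs are illustrative, not the authors' GA or their motion-planning costs):

```python
def tour_cost(tour, cost, precedence):
    """Cost of an open tour under an asymmetric cost matrix, or None if
    any precedence constraint (a must be drilled before b) is violated."""
    pos = {hole: i for i, hole in enumerate(tour)}
    if any(pos[a] > pos[b] for a, b in precedence):
        return None
    return sum(cost[tour[i]][tour[i + 1]] for i in range(len(tour) - 1))

cost = [[0, 2, 9],
        [1, 0, 6],
        [7, 3, 0]]           # asymmetric: cost[i][j] != cost[j][i]
prec = [(0, 2)]              # hole 0 must be drilled before hole 2
print(tour_cost([0, 1, 2], cost, prec))   # 2 + 6 = 8
print(tour_cost([2, 1, 0], cost, prec))   # violates precedence -> None
```

A GA fitness function would typically wrap exactly this kind of evaluation, penalizing or discarding infeasible orderings.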
Thinking Ahead in Real-Time Search
Nau, Dana S. (University of Maryland) | Kuter, Ugur (University of Maryland) | Sefer, Emre (University of Maryland)
We consider real-time planning problems in which some states are unsolvable, i.e., have no path to a goal. Such problems are difficult for real-time planning algorithms such as RTA*, in which all states must be solvable. We identify a property called k-safeness, in which the consequences of a bad choice become apparent within k moves after the choice is made. When k is not too large, this makes it possible to identify unsolvable states in real time. We provide a modified version of RTA* that is provably complete on all k-safe problems. We derive k-safeness conditions for real-time deterministic versions of the well-known Tireworld and Racetrack domains, and provide experimental results showing that our modified version of RTA* works quite well in these domains.
Exploiting N-Gram Analysis to Predict Operator Sequences
Muise, Christian (University of Toronto) | McIlraith, Sheila (University of Toronto) | Baier, Jorge A. (University of Toronto) | Reimer, Michael (University of Toronto)
N-gram analysis provides a means of probabilistically predicting the next item in a sequence. Due originally to Shannon, it has proven an effective technique for word prediction in natural language processing and for gene sequence analysis. In this paper, we investigate the utility of n-gram analysis in predicting operator sequences in plans. Given a set of sample plans, we perform n-gram analysis to predict the likelihood of subsequent operators, relative to a partial plan. We identify several ways in which this information might be integrated into a planner. In this paper, we investigate one of these directions in further detail. Preliminary results demonstrate the promise of n-gram analysis as a tool for improving planning performance.
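A bigram (n = 2) version of this idea can be sketched in a few lines: count operator pairs over the sample plans, then predict the most likely successor of the last operator in a partial plan. The operator names below are illustrative, not taken from the paper:

```python
from collections import Counter, defaultdict

def train_bigrams(plans):
    """Count operator bigrams over a set of sample plans."""
    counts = defaultdict(Counter)
    for plan in plans:
        for prev, nxt in zip(plan, plan[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, prev):
    """Most likely operator to follow `prev`, with its estimated probability."""
    successors = counts[prev]
    total = sum(successors.values())
    op, n = successors.most_common(1)[0]
    return op, n / total

plans = [["pickup", "stack", "pickup", "stack"],
         ["pickup", "putdown", "pickup", "stack"]]
model = train_bigrams(plans)
print(predict_next(model, "pickup"))  # ('stack', 0.75)
```

Higher-order n-grams follow the same pattern, keying the counts on the last n-1 operators instead of one.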
Clustering Based on Pairwise Distances When the Data is of Mixed Dimensions
In the context of clustering, we consider a generative model in a Euclidean ambient space with clusters of different shapes, dimensions, sizes and densities. In an asymptotic setting where the number of points becomes large, we obtain theoretical guarantees for a few emblematic methods based on pairwise distances: a simple algorithm based on the extraction of connected components in a neighborhood graph; the spectral clustering method of Ng, Jordan and Weiss; and hierarchical clustering with single linkage. The methods are shown to enjoy some near-optimal properties in terms of separation between clusters and robustness to outliers. The local scaling method of Zelnik-Manor and Perona is shown to lead to a near-optimal choice for the scale in the first two methods. We also provide a lower bound on the spectral gap to consistently choose the correct number of clusters in the spectral method.
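The first of these methods can be sketched directly: connect points closer than a scale eps and return the connected components of that graph as clusters. A minimal version (using a fixed global eps rather than the local scaling of Zelnik-Manor and Perona):

```python
import numpy as np

def connected_component_clusters(points, eps):
    """Cluster points as connected components of the eps-neighborhood graph."""
    n = len(points)
    # pairwise distances; edge whenever distance <= eps
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    adj = d <= eps
    labels = [-1] * n
    cluster = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        stack = [seed]
        while stack:            # depth-first flood fill of one component
            i = stack.pop()
            if labels[i] != -1:
                continue
            labels[i] = cluster
            stack.extend(j for j in range(n) if adj[i, j] and labels[j] == -1)
        cluster += 1
    return labels

pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
print(connected_component_clusters(pts, eps=0.5))  # [0, 0, 1, 1]
```

The choice of eps is exactly the scale parameter whose near-optimal selection the abstract discusses.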
A Nonconformity Approach to Model Selection for SVMs
Hardoon, David R., Hussain, Zakria, Shawe-Taylor, John
We investigate the issue of model selection and the use of the nonconformity (strangeness) measure in batch learning. Using the nonconformity measure we propose a new training algorithm that helps avoid the need for Cross-Validation or Leave-One-Out model selection strategies. We provide a new generalisation error bound using the notion of nonconformity to upper bound the loss of each test example and show that our proposed approach is comparable to standard model selection methods, but with theoretical guarantees of success and faster convergence. We demonstrate our novel model selection technique using the Support Vector Machine.
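As a generic illustration of how a nonconformity score is typically used for inference (the paper's model-selection algorithm itself is not reproduced here), the standard conformal p-value of a test example is the fraction of scores at least as strange as its own:

```python
def p_value(calibration_scores, test_score):
    """Conformal p-value of a test example: the fraction of nonconformity
    scores (calibration scores plus the test example's own score) that are
    at least as large, i.e. at least as strange, as the test score."""
    ge = sum(1 for s in calibration_scores if s >= test_score)
    return (ge + 1) / (len(calibration_scores) + 1)

# Higher nonconformity = stranger example = smaller p-value.
print(p_value([0.1, 0.5, 0.9], 0.7))  # 0.5
```

For an SVM, a natural nonconformity score for a labelled example is its negative signed margin, so confidently correct examples are the least strange.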
Resource Matchmaking Algorithm using Dynamic Rough Set in Grid Environment
Ataollahi, Iraj, Analoui, Mortza
The grid is a service-oriented infrastructure in which many heterogeneous resources participate to provide high-performance computation. One of the big issues in grid environments is the vagueness and uncertainty between advertised resources and requested resources. Furthermore, in an environment such as the grid, dynamicity is a crucial issue which must be dealt with. Classical rough set theory has been used to deal with uncertainty and vagueness, but it applies only to static systems and cannot support dynamicity. In this work we propose a solution, called Dynamic Rough Set Resource Discovery (DRSRD), for dealing with vagueness and uncertainty based on dynamic rough set theory, which takes the dynamic features of this environment into account. In this approach, requested resource properties are assigned priority weights, according to which resource matchmaking and ranking are performed. We also report simulation results obtained with the GridSim simulator, comparing DRSRD against an algorithm based on classical rough set theory and against a combined UDDI and OWL-S algorithm. DRSRD shows much better precision than either of these algorithms for cases with vagueness and uncertainty in a dynamic system such as the grid.
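The classical rough-set machinery that DRSRD builds on can be sketched briefly: given an indiscernibility partition of the resources, a requested concept is bracketed between its lower approximation (resources certainly matching) and its upper approximation (resources possibly matching). The data here is illustrative, not from the paper:

```python
def approximations(partition, target):
    """Classical rough-set lower and upper approximations of a target
    concept with respect to an indiscernibility partition."""
    lower, upper = set(), set()
    for block in partition:
        if block <= target:
            lower |= block          # block lies entirely inside the concept
        if block & target:
            upper |= block          # block overlaps the concept
    return lower, upper

partition = [{1, 2}, {3, 4}, {5, 6}]   # resources indiscernible by advertised properties
target = {1, 2, 3}                     # resources matching the request
lower, upper = approximations(partition, target)
print(sorted(lower), sorted(upper))    # [1, 2] [1, 2, 3, 4]
```

The gap between the two approximations (here resource 4) is exactly the boundary region of uncertain matches; the dynamic variant in the paper lets the partition evolve over time.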
On Ranking Senators By Their Votes
The problem of ranking a set of objects given some measure of similarity is one of the most basic in machine learning. Recently Agarwal proposed a method based on techniques in semi-supervised learning utilizing the graph Laplacian. In this work we consider a novel application of this technique to ranking binary choice data and apply it specifically to ranking US Senators by their ideology.
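One common way to rank items from pairwise similarities via the graph Laplacian is to order them along the Fiedler vector; this is a sketch in the spirit of, though not necessarily identical to, the method studied here, and the vote matrix and agreement-based similarity are illustrative assumptions:

```python
import numpy as np

def laplacian_rank(votes):
    """Order voters along the Fiedler vector of the agreement-graph Laplacian.

    `votes` is an (n_voters x n_bills) matrix of +1/-1 votes; similarity
    between two voters is the fraction of bills on which they agree.
    """
    n, m = votes.shape
    W = (votes @ votes.T + m) / (2 * m)   # agreement fraction, in [0, 1]
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W        # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)        # eigenvalues in ascending order
    fiedler = vecs[:, 1]                  # eigenvector of 2nd-smallest eigenvalue
    return np.argsort(fiedler)            # voters ordered along the spectrum

votes = np.array([[+1, +1, +1, -1],
                  [+1, +1, -1, -1],
                  [-1, -1, +1, +1],
                  [-1, -1, -1, +1]], dtype=float)
order = laplacian_rank(votes)
```

On this toy matrix the two like-voting pairs end up at opposite ends of the ordering, the one-dimensional "ideology" axis the abstract refers to (the Fiedler vector's sign, and hence the direction of the axis, is arbitrary).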
Structured Sparse Principal Component Analysis
Jenatton, Rodolphe, Obozinski, Guillaume, Bach, Francis
We present an extension of sparse PCA, or sparse dictionary learning, where the sparsity patterns of all dictionary elements are structured and constrained to belong to a prespecified set of shapes. This structured sparse PCA is based on a structured regularization recently introduced by [1]. While classical sparse priors only deal with cardinality, the regularization we use encodes higher-order information about the data. We propose an efficient and simple optimization procedure to solve this problem. Experiments with two practical tasks, face recognition and the study of the dynamics of a protein complex, demonstrate the benefits of the proposed structured approach over unstructured approaches.
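A familiar instance of a structured sparsity-inducing operation, shown here only to illustrate block-structured sparsity and not the exact regularization of [1], is the group-lasso proximal step, which zeroes whole prespecified groups of coefficients at once:

```python
import numpy as np

def group_soft_threshold(v, groups, lam):
    """Block soft-thresholding: shrink each group of coefficients jointly,
    zeroing any group whose Euclidean norm falls below lam."""
    out = np.zeros_like(v, dtype=float)
    for g in groups:
        norm = np.linalg.norm(v[g])
        if norm > lam:
            out[g] = v[g] * (1.0 - lam / norm)
    return out

v = np.array([3.0, 4.0, 0.1, 0.1])
groups = [[0, 1], [2, 3]]          # a prespecified sparsity pattern: two blocks
result = group_soft_threshold(v, groups, lam=1.0)  # the weak block is zeroed jointly
```

Unlike a plain L1 penalty, which treats each coefficient independently, the group structure forces coefficients to survive or vanish together, which is the kind of higher-order information the abstract contrasts with pure cardinality.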
A multiagent urban traffic simulation Part I: dealing with the ordinary
Tranouez, Pierrick, Langlois, Patrice, Daudé, Eric
We describe in this article a multiagent urban traffic simulation, as we believe individual-based modeling is necessary to capture the complex influence that the actions of an individual vehicle can have on the overall flow of vehicles. We first describe how we build a graph description of the network from purely geometric data, ESRI shapefiles. We then explain how we add traffic-related data to this graph. Next, we present the model of the vehicle agents: origin and destination, driving behavior, multiple lanes, crossroads, and interactions with the other vehicles in day-to-day, "ordinary" traffic. We conclude with the presentation of the resulting simulation of this model on the Rouen agglomeration.