Distributed Model Predictive Control for Heterogeneous Platoons with Affine Spacing Policies and Arbitrary Communication Topologies

Shaham, Michael H., Padir, Taskin

arXiv.org Artificial Intelligence

This paper presents a distributed model predictive control (DMPC) algorithm for a heterogeneous platoon using arbitrary communication topologies, as long as each vehicle is able to communicate with a preceding vehicle in the platoon. The proposed DMPC algorithm is able to accommodate any spacing policy that is affine in a vehicle's velocity, which includes constant distance or constant time headway spacing policies. By analyzing the total cost for the entire platoon, a sufficient condition is derived to guarantee platoon asymptotic stability. Simulation experiments with a platoon of 50 vehicles and hardware experiments with a platoon of four 1/10th scale vehicles validate the algorithm and compare performance under different spacing policies and communication topologies.
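The affine spacing policies mentioned in the abstract can be illustrated with a minimal sketch. An affine-in-velocity policy has the form gap = d0 + h·v, where d0 is a standstill distance and h a time headway; h = 0 recovers the constant-distance policy and h > 0 the constant time headway policy. The parameter values below are illustrative, not taken from the paper.

```python
def desired_gap(v, d0=5.0, h=0.0):
    """Affine spacing policy: desired inter-vehicle gap (m) as a
    function of the follower's velocity v (m/s).

    d0 (m) is the standstill distance; h (s) is the time headway.
    h = 0 gives a constant-distance policy; h > 0 gives a
    constant time headway policy.
    """
    return d0 + h * v

# Constant distance: the gap is independent of speed.
print(desired_gap(20.0, d0=5.0, h=0.0))  # 5.0
# Constant time headway: the gap grows linearly with speed.
print(desired_gap(20.0, d0=5.0, h=1.5))  # 35.0
```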



The Compressed Differential Heuristic

Goldenberg, Meir (Ben-Gurion University) | Sturtevant, Nathan (University of Denver) | Felner, Ariel (Ben-Gurion University) | Schaeffer, Jonathan (University of Alberta)

AAAI Conferences

The differential heuristic (DH) is an effective memory-based heuristic for explicit state spaces. In this paper we aim to improve its performance and memory usage. We introduce a compression method for DHs which stores only a portion of the original uncompressed DH, while preserving enough information to enable efficient search. Compressed DHs (CDH) are flexible and can be tuned to fit any size of memory, even smaller than the size of the state space. Furthermore, CDHs can be built without the need to create and store the entire uncompressed DH. Experimental results across different domains show that, for a given amount of memory, a CDH significantly outperforms an uncompressed DH.
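For context, a differential heuristic is built by precomputing exact shortest-path distances from a pivot state p; the triangle inequality then makes h(s, g) = |d(p, g) − d(p, s)| an admissible heuristic. The sketch below shows the uncompressed form on a toy graph (the graph and function names are illustrative, not from the paper); the paper's contribution is storing only part of such tables while keeping search efficient.

```python
from heapq import heappush, heappop

def dijkstra(graph, source):
    """Exact shortest-path distances from source in a weighted graph
    given as {node: [(neighbor, weight), ...]}."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heappush(pq, (nd, v))
    return dist

def differential_heuristic(dist_from_pivot, s, goal):
    """Admissible heuristic from one pivot's distance table:
    h(s, goal) = |d(pivot, goal) - d(pivot, s)|.
    With several pivots, take the max over their estimates."""
    return abs(dist_from_pivot[goal] - dist_from_pivot[s])

# Toy undirected graph: A-B (1), B-C (2), A-C (4).
graph = {
    "A": [("B", 1.0), ("C", 4.0)],
    "B": [("A", 1.0), ("C", 2.0)],
    "C": [("A", 4.0), ("B", 2.0)],
}
table = dijkstra(graph, "A")        # pivot = A
print(differential_heuristic(table, "B", "C"))  # 2.0 (here exact)
```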
