Building Predictive Models from Fractal Representations of Symbolic Sequences
Neural Information Processing Systems
We propose a novel approach to building finite-memory predictive models similar in spirit to variable memory length Markov models (VLMMs). The models are constructed by first transforming the n-block structure of the training sequence into a spatial structure of points in a unit hypercube, such that the longer the common suffix shared by any two n-blocks, the closer their point representations lie. This transformation embodies a Markov assumption: n-blocks with long common suffixes are likely to produce similar continuations. Finding a set of prediction contexts is then formulated as a resource allocation problem, solved by vector quantizing the spatial n-block representation. We compare our model with both classical and variable memory length Markov models on three data sets with different memory and stochastic components.
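The suffix-preserving spatial transformation described above can be sketched with a chaos-game-style iterated map: each symbol contracts the current point halfway toward a dedicated corner of the hypercube, so the most recently processed symbols (the suffix) dominate the final position. The alphabet, corner assignment, and contraction ratio 0.5 below are illustrative assumptions, not the paper's exact construction.

```python
# Sketch of a suffix-preserving fractal embedding of n-blocks.
# Symbols of a 4-letter alphabet are assigned to corners of the unit
# square (a hypothetical assignment for illustration).
CORNERS = {"a": (0.0, 0.0), "b": (0.0, 1.0), "c": (1.0, 0.0), "d": (1.0, 1.0)}

def embed(block):
    """Map a symbol block to a point in the unit square.

    Each symbol moves the current point halfway toward its corner, so
    blocks sharing a suffix of length L land within 2**-L of each other,
    while differing early symbols are contracted away.
    """
    x = (0.5, 0.5)  # start at the centre of the hypercube
    for s in block:
        cx, cy = CORNERS[s]
        x = (0.5 * x[0] + 0.5 * cx, 0.5 * x[1] + 0.5 * cy)
    return x

p = embed("abcd")
q = embed("bacd")  # same suffix "cd", different prefix
r = embed("abca")  # same prefix "abc", different last symbol
```

Here `p` and `q` lie close together despite differing prefixes, while `p` and `r` are far apart; vector quantizing such points with a fixed codebook budget then allocates prediction contexts to densely populated (frequently visited) regions of the hypercube.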