Discrete continuous optimization via representation
Generative representations evolve directions for constructing a solution rather than directly encoding the solution itself. This talk demonstrates generative methods for representing continuous, real-parameter optimization as a series of discrete actions. Inspired by the Nelder-Mead technique [4], the walking triangle representation [2] can index all of Euclidean space. A second representation, the rescaling vector jump technique, currently in review, is presented as well.
Changing to a discrete representation opens up a number of opportunities. Estimates of collections of real parameters are expressed as sequences of symbols over a finite alphabet, making it possible to store them in dictionaries. This in turn eases the task of enumerating the optima of a function [1]. When a discrete representation is available, game-playing techniques such as Monte Carlo tree search [3] can be used to perform optimization. A discrete representation also enables a new technique, called lightning optimization, which evaluates all intermediate steps in the expression of a discrete representation, thereby smoothing the search space.
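As a minimal sketch of the idea, the following code decodes a string over a finite alphabet into a point in the plane by "walking" a triangle, and scores the string by evaluating every intermediate point in the lightning-optimization style. The operator set here (reflect a vertex through the opposite side's midpoint; shrink toward the centroid) is an illustrative assumption inspired by Nelder-Mead moves, not the exact operators of [2]:

```python
# Illustrative walking-triangle-style decoder. The operator alphabet
# here is an assumption: '0','1','2' reflect that vertex through the
# midpoint of the opposite side (walking the triangle across the
# plane), and 'S' shrinks the triangle toward its centroid (refining
# the estimate). The decoded solution is a centroid of the triangle.

def centroid(tri):
    (ax, ay), (bx, by), (cx, cy) = tri
    return ((ax + bx + cx) / 3.0, (ay + by + cy) / 3.0)

def step(tri, symbol):
    if symbol == 'S':                     # shrink toward the centroid
        cx, cy = centroid(tri)
        return [((x + cx) / 2.0, (y + cy) / 2.0) for x, y in tri]
    k = int(symbol)                       # reflect vertex k through the
    others = [tri[i] for i in range(3) if i != k]  # opposite midpoint
    mx = (others[0][0] + others[1][0]) / 2.0
    my = (others[0][1] + others[1][1]) / 2.0
    new = list(tri)
    new[k] = (2.0 * mx - tri[k][0], 2.0 * my - tri[k][1])
    return new

def lightning_decode(gene, f):
    """Decode a symbol string, evaluating f at every intermediate
    centroid (the lightning-optimization idea): the fitness of the
    gene is the best value seen along the whole expression path."""
    tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]  # initial triangle
    best_pt, best_val = centroid(tri), f(*centroid(tri))
    for s in gene:
        tri = step(tri, s)
        pt = centroid(tri)
        val = f(*pt)
        if val < best_val:
            best_pt, best_val = pt, val
    return best_pt, best_val

# Example: minimize a shifted sphere function with a hand-picked gene.
sphere = lambda x, y: (x - 2.0) ** 2 + (y + 1.0) ** 2
pt, val = lightning_decode("000S1S2SS", sphere)
```

Because reflections can carry the triangle arbitrarily far and shrinks refine it arbitrarily finely, long enough strings can approximate any point in the plane, which is the sense in which such a representation indexes Euclidean space.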
References
[1] D. Ashlock, K. M. Bryden, and S. Gent. Multiscale feature location with a fractal representation. Intelligent Engineering Systems Through Artificial Neural Networks, 19:173–180, 2009.
[2] D. Ashlock and J. Gilbert. A discrete representation for real optimization with unique search properties. In Proceedings of the IEEE Symposium on the Foundations of Computational Intelligence, pages 54–61, 2014.
[3] D. Beard, P. Hingston, and M. Masek. Using Monte Carlo tree search for replanning in a multistage simultaneous game. In Proceedings of the IEEE Congress on Evolutionary Computation, pages 1–8, 2012.
[4] J. A. Nelder and R. Mead. A simplex method for function minimization. Computer Journal, 7(4):308–313, 1965.