Learning sequential and parallel runtime distributions for randomized algorithms

Files
O'Sullivan.pdf (737.71 KB)
Accepted Version
Date
2016-11
Authors
Arbelaez, Alejandro
Truchet, Charlotte
O'Sullivan, Barry
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Abstract
In cloud systems, computation time can be rented by the hour and for a given number of processors. Thus, accurate prediction of the behaviour of both sequential and parallel algorithms has become an important issue, in particular for costly methods such as randomized combinatorial optimization tools. In this work, our objective is to use machine learning to predict the performance of sequential and parallel local search algorithms. In addition to the classical instance features used by other machine learning tools, we consider data on the sequential runtime distribution of a local search method. This allows us to predict, with high accuracy, the parallel computation time of a large class of instances by learning the behaviour of the sequential version of the algorithm on a small number of instances. Experiments with three solvers on SAT and TSP instances indicate that our method works well, with a correlation coefficient of up to 0.85 for SAT instances and up to 0.95 for TSP instances.
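
The abstract rests on a standard property of independent multi-walk parallelism: running K independent copies of a randomized algorithm yields a parallel runtime equal to the minimum of K sequential runtimes, so the parallel runtime distribution is 1 - (1 - F(t))^K, where F is the sequential runtime distribution. The following Python sketch (not the authors' code; the runtime data and function names are illustrative) shows how a sequential runtime sample could be turned into parallel runtime estimates under that independence assumption.

import numpy as np

def empirical_cdf(runtimes, t):
    # Empirical sequential runtime distribution F(t) = P(runtime <= t).
    runtimes = np.asarray(runtimes)
    return np.mean(runtimes[:, None] <= np.asarray(t), axis=0)

def parallel_cdf(runtimes, t, k):
    # Runtime distribution of k independent parallel copies: 1 - (1 - F(t))^k.
    return 1.0 - (1.0 - empirical_cdf(runtimes, t)) ** k

def expected_parallel_runtime(runtimes, k, n_samples=10_000, seed=None):
    # Monte Carlo estimate of E[min of k runtimes] by resampling observed runs.
    rng = np.random.default_rng(seed)
    draws = rng.choice(np.asarray(runtimes), size=(n_samples, k), replace=True)
    return draws.min(axis=1).mean()

# Illustrative usage with synthetic (exponentially distributed) sequential runtimes.
seq_runtimes = np.random.default_rng(0).exponential(scale=100.0, size=500)
for k in (1, 4, 16, 64):
    print(k, expected_parallel_runtime(seq_runtimes, k, seed=1))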
Keywords
Runtime, Algorithm design and analysis, Prediction algorithms, Parallel algorithms, Machine learning algorithms, Approximation algorithms, Search problems
Citation
Arbelaez, A., Truchet, C. and O'Sullivan, B. (2016) 'Learning sequential and parallel runtime distributions for randomized algorithms', 2016 IEEE 28th International Conference on Tools with Artificial Intelligence (ICTAI), San Jose, CA, USA, 6-8 November. doi:10.1109/ICTAI.2016.0105
Copyright
© 2016, IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.