ABSTRACT
Parameter configuration is a common procedure used in large-scale network protocols to support multiple operational goals. It can be formulated as a black-box optimization problem and solved with an efficient search algorithm. This paper proposes a new heuristic search algorithm, Recursive Random Search (RRS), for large-scale network parameter optimization. The RRS algorithm builds on the initial high efficiency of random sampling and attempts to maintain that efficiency by repeatedly "restarting" random sampling with adjusted sample spaces. In addition to its high efficiency, the RRS algorithm is robust to random noise and to trivial parameters in the objective function because of its roots in random sampling. These features are very important for the efficient optimization of network protocol configuration. The performance of RRS is demonstrated with tests on a suite of benchmark functions. The algorithm has been applied to the configuration of several network protocols, such as RED, OSPF, and BGP; an example application to the OSPF routing protocol is presented.
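The restart-with-adjusted-sample-space idea from the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact procedure: the function name, the parameters (`n_samples`, `shrink`, `restarts`), and the rule of shrinking the sample space around the best point found so far are all assumptions made for illustration.

```python
import random

def recursive_random_search(f, bounds, n_samples=100, shrink=0.5, restarts=6):
    """Minimize black-box f over the box `bounds` by repeatedly
    restarting uniform random sampling with a smaller sample space.

    Illustrative sketch only; parameter names and the shrink rule
    are assumptions, not the published RRS algorithm.
    """
    best_x, best_y = None, float("inf")
    region = list(bounds)  # current sample space: [(lo, hi), ...]
    for _ in range(restarts):
        # Phase 1: uniform random sampling over the current region.
        # Random sampling is most efficient in its first few samples,
        # which is the property RRS tries to exploit.
        for _ in range(n_samples):
            x = [random.uniform(lo, hi) for lo, hi in region]
            y = f(x)
            if y < best_y:
                best_x, best_y = x, y
        # Phase 2: "restart" with a shrunken sample space centered on
        # the best point found so far, clipped to the original bounds.
        region = [
            (max(olo, c - (hi - lo) * shrink / 2),
             min(ohi, c + (hi - lo) * shrink / 2))
            for (lo, hi), (olo, ohi), c in zip(region, bounds, best_x)
        ]
    return best_x, best_y

# Example: minimize a 2-D sphere function over [-5, 5]^2.
random.seed(0)
x_best, y_best = recursive_random_search(lambda v: sum(t * t for t in v),
                                         [(-5.0, 5.0)] * 2)
```

Because each restart keeps the incumbent best point, the best objective value never worsens; the shrinking region trades exploration for exploitation, which is why the abstract emphasizes robustness to noise and trivial parameters rather than guaranteed convergence to the global optimum.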
REFERENCES
- Ratul Mahajan, David Wetherall, and Tom Anderson. Understanding BGP misconfiguration. In Proceedings of ACM SIGCOMM, 2002.
- Tao Ye et al. Traffic management and network control using collaborative on-line simulation. In Proceedings of IEEE ICC'01, pages 204--209, Helsinki, Finland, June 2001.
- W. L. Price. Global optimization by controlled random search. Journal of Optimization Theory and Applications, 40:333--348, 1978.
- J. H. Holland. Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, 1975.
- S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi. Optimization by simulated annealing. Science, 220:671--680, 1983.
- D. H. Wolpert and W. G. Macready. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1:67--82, 1997.
- Aimo Törn and Antanas Žilinskas. Global Optimization, volume 350 of Lecture Notes in Computer Science. Springer-Verlag, 1989.
- T. C. Hu, V. Klee, and D. Larman. Optimization of globally convex functions. SIAM Journal on Control and Optimization, 27(5):1026--1047, 1989.
- K. D. Boese, A. B. Kahng, and S. Muddu. On the big valley and adaptive multi-start for discrete global optimizations. Technical Report TR-930015, UCLA CS Department, 1993.
- K. D. Boese, A. B. Kahng, and S. Muddu. A new adaptive multi-start technique for combinatorial global optimizations. Operations Research Letters, 16:101--113, 1994.
- Justin A. Boyan and Andrew W. Moore. Learning evaluation functions to improve optimization by local search. Journal of Machine Learning Research, 1:77--112, 2000.
- Robert H. Leary. Global optimization on funneling landscapes. Journal of Global Optimization, 18(4):367--383, December 2000.
- P. Brachetti, M. De Felice Ciccoli, G. Di Pillo, and S. Lucidi. A new version of the Price's algorithm for global optimization. Journal of Global Optimization, 10:165--184, 1997.
- Zelda B. Zabinsky. Stochastic methods for practical global optimization. Journal of Global Optimization, 13:433--444, 1998.
- Melanie Mitchell. An Introduction to Genetic Algorithms. The MIT Press, 1996.
- V. P. Plagianakos and M. N. Vrahatis. A derivative-free minimization method for noisy functions. In A. Migdalas, P. M. Pardalos, and R. Burkard, editors, Advances in Combinatorial and Global Optimization, pages 283--296. 2001.
- L. Armijo. Minimization of functions having Lipschitz continuous first partial derivatives. Pacific Journal of Mathematics, 16:1--3, 1966.
- R. Hooke and T. Jeeves. "Direct search" solution of numerical and statistical problems. Journal of the ACM, 8(2):212--229, April 1961.
- Soraya Rana, L. Darrell Whitley, and Ronald Cogswell. Searching in the presence of noise. In H. Voigt, W. Ebeling, I. Rechenberg, and Hans-Paul Schwefel, editors, Parallel Problem Solving from Nature -- PPSN IV (Lecture Notes in Computer Science 1141), pages 198--207, Berlin, 1996. Springer.
- A. H. G. Rinnooy Kan and G. T. Timmer. Stochastic global optimization methods part I: Clustering methods. Mathematical Programming, 39:27--56, 1987.
- W. C. Davidon. Variable metric method for minimization. SIAM Journal on Optimization, 1:1--17, 1991. Originally published as an Argonne National Laboratory Research and Development Report, May 1959 (revised November 1959).
- J. A. Nelder and R. Mead. A simplex method for function minimization. Computer Journal, 7(4):308--313, 1965.
- H. Mühlenbein, M. Schomisch, and J. Born. The parallel genetic algorithm as function optimizer. In Richard K. Belew and Lashon B. Booker, editors, Proceedings of the Fourth International Conference on Genetic Algorithms, pages 271--278. Morgan Kaufmann, 1991.
- M. Ali, C. Storey, and A. Törn. Application of stochastic global optimization algorithms to practical problems. Journal of Optimization Theory and Applications, 95(3):545--563, 1997.
- J. Beveridge, C. Graves, and C. E. Lesher. Local search as a tool for horizon line matching. Technical Report CS-96-109, Colorado State University, 1996.
- D. S. Johnson and L. A. McGeoch. The travelling salesman problem: a case study in local optimization. In E. H. L. Aarts and J. K. Lenstra, editors, Local Search in Combinatorial Optimization. Wiley and Sons, 1997.
- A. Juels and M. Wattenberg. Stochastic hillclimbing as a baseline method for evaluating genetic algorithms. In D. S. Touretzky, M. C. Mozer, and M. E. Hasselmo, editors, Advances in Neural Information Processing Systems, volume 8, pages 430--436. 1996.
- Michael W. Trosset. On the use of direct search methods for stochastic optimization. Technical report, Department of Computational and Applied Mathematics, Rice University, 2000.
- Tao Ye and Shivkumar Kalyanaraman. A recursive random search for optimizing network protocol parameters. Technical report, ECSE Department, Rensselaer Polytechnic Institute, December 2002.
- Tao Ye, Hema T. Kaur, and Shivkumar Kalyanaraman. Automatic network performance management with on-line simulation. Technical report, ECSE Department, Rensselaer Polytechnic Institute, 2002.
- Bernard Fortz and Mikkel Thorup. Internet traffic engineering by optimizing OSPF weights. In Proceedings of IEEE INFOCOM 2000, pages 519--528, 2000.