Civil-Comp Proceedings
ISSN 1759-3433 CCP: 93
PROCEEDINGS OF THE TENTH INTERNATIONAL CONFERENCE ON COMPUTATIONAL STRUCTURES TECHNOLOGY
Paper 209
Convergence of Successive Regression Approximations for Solving Noisy Equations
I. Deák
Department of Computer Science, Corvinus University of Budapest, Hungary

I. Deák, "Convergence of Successive Regression Approximations for Solving Noisy Equations", in "Proceedings of the Tenth International Conference on Computational Structures Technology", Civil-Comp Press, Stirlingshire, UK, Paper 209, 2010. doi:10.4203/ccp.93.209
Keywords: equation solving, noisy functions, convergence, optimization.
Summary
The basic problem considered in this paper is finding the root of a one-dimensional function when only noisy function values are available. The iterative method we developed to solve this problem, called successive regression approximations (SRA), produces a sequence of points converging to the root. As the name indicates, the method relies on a series of regression approximations that are updated at each step of the iterative procedure. According to extensive numerical computations [1,2], the n-dimensional version of the method proved quite successful in solving different types of stochastic programming problems, including problems with probabilistic constraints, two-stage problems and mixed stochastic problems.
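The paper itself does not reproduce the algorithm, but the one-dimensional idea described above can be sketched roughly as follows: fit a least-squares line to all previously observed (point, noisy function value) pairs, and take the root of that line as the next iterate. The linear regression model, the test function and the Gaussian noise level below are illustrative assumptions, not details taken from the paper.

```python
import random

def sra_root(f, x0, x1, n_iter=200, noise=0.01):
    """Rough 1-D sketch of successive regression approximations (SRA).

    At each step, fit a least-squares line to ALL previous
    (x, noisy f(x)) pairs and take its root as the next iterate.
    Assumes the regression slope stays bounded away from zero.
    """
    xs = [x0, x1]
    ys = [f(x) + random.gauss(0.0, noise) for x in xs]
    for _ in range(n_iter):
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx          # slope of the fitted line
        a = my - b * mx        # intercept of the fitted line
        x_next = -a / b        # root of the line y = a + b*x
        xs.append(x_next)
        ys.append(f(x_next) + random.gauss(0.0, noise))
    return xs[-1]
```

Because every past observation enters the fit, the noise is averaged out over the whole history, which is the feature distinguishing SRA from methods that use only the most recent observations.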
The convergence of the basic method for solving one-dimensional root-finding problems when exact function values are available was proven in [3]. In this paper the stochastic convergence of the points produced by SRA is proved in the case of noisy function values. First we show that the step length (the distance between consecutive points) in our algorithm converges to zero, no matter what point sequence the algorithm produces. Then we use a simple random walk (with discrete parameter space and continuous state space) to construct two composite random processes that bracket the point sequence generated by our algorithm. Since the two composite random processes are ergodic, the bracketed process also possesses this property, from which the stochastic convergence of the point sequence generated by SRA to the root follows easily. Successively applying regression functions is a natural extension of the celebrated result of Robbins and Monro [4]. SRA is a simple extension of the so-called response surface methodology [5]: we use all previous points and function values. The main conclusion of this paper is that using successively updated regression approximations leads to a convergent sequence of points.
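For comparison, the classical Robbins-Monro scheme [4] that SRA extends keeps no regression model at all: it moves the current iterate by a decreasing multiple of the latest noisy observation. A minimal sketch, with an illustrative step-size sequence a_n = 1/n, test function and noise level (all assumptions, not taken from the paper):

```python
import random

def robbins_monro(f, x0, n_iter=5000, noise=0.1):
    """Classical Robbins-Monro iteration:
    x_{n+1} = x_n - a_n * Y_n, where Y_n is a noisy observation
    of f(x_n) and a_n = 1/n satisfies the usual step-size
    conditions (sum a_n = inf, sum a_n**2 < inf)."""
    x = x0
    for n in range(1, n_iter + 1):
        y = f(x) + random.gauss(0.0, noise)  # noisy function value
        x -= (1.0 / n) * y                   # shrinking corrective step
    return x
```

Unlike SRA, each step here uses only the most recent observation; the regression approach instead pools all past observations into one fitted model before choosing the next point.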