Maquelin, Eva (2018) Optimizing Parameters of Iterative Methods. Bachelor's Thesis, Mathematics.
Text: bMATH_2018_MaquelinEI.pdf (1 MB)
Text: Toestemming.pdf (permission form, 94 kB), restricted to registered users only
Abstract
Numerical optimization methods provide a way of computing the optimum of a function, even when the function is not differentiable. Many such methods exist, and it is important to choose the right one for the situation at hand. Not only functions, but also iterative methods can be optimized: if an algorithm depends on some parameters, then the optimal parameter values, i.e. those giving the minimum number of iterations required for convergence, can be found by applying a minimization method. This study discusses the idea behind, and the convergence behaviour of, the Downhill Simplex Method, the Powell Methods and Particle Swarm Optimization. The methods are applied first to the Rosenbrock and Rastrigin functions, two well-known test functions for numerical optimization, and thereafter to some numerical algorithms depending on various parameters. The convergence behaviour of a numerical optimization method can depend strongly on the given starting point. In this study, we see that Particle Swarm Optimization in particular works well for the test cases where function evaluations are computationally cheap. However, for other optimization problems another method might be preferred, as the quality of the convergence behaviour of a numerical optimization method ultimately depends on the problem being optimized.
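The abstract's central idea, treating an iterative solver's iteration count as a black-box objective for a derivative-free minimizer, can be sketched in a few lines. The sketch below is illustrative only: this page does not name the numerical algorithms tuned in the thesis, so SOR (successive over-relaxation) with relaxation parameter omega is an assumed stand-in, and SciPy's Nelder-Mead implementation stands in for the Downhill Simplex Method.

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function: a classic test case with global minimum at (1, 1).
def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

# Downhill Simplex Method (Nelder-Mead) on the Rosenbrock function.
res = minimize(rosenbrock, x0=[-1.2, 1.0], method="Nelder-Mead")
print("minimum found at", res.x)

# Hypothetical iterative method to tune: SOR for a linear system Ax = b.
# Returns the number of iterations needed to converge for a given omega,
# i.e. the kind of objective the thesis minimizes over parameter values.
def sor_iterations(omega, A, b, tol=1e-8, max_iter=10_000):
    n = len(b)
    x = np.zeros(n)
    for k in range(1, max_iter + 1):
        x_old = x.copy()
        for i in range(n):
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            return k
    return max_iter  # penalize parameter values that fail to converge

# A small tridiagonal test system (assumed here, not taken from the thesis).
n = 20
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)

# The iteration count is integer-valued and non-differentiable in omega,
# so a derivative-free method such as Nelder-Mead is a natural choice.
res = minimize(lambda w: sor_iterations(w[0], A, b), x0=[1.0], method="Nelder-Mead")
print("optimal omega approx.", res.x[0])
```

Because the objective is piecewise constant in omega, gradient-based minimizers would receive zero gradients almost everywhere; this is precisely why the derivative-free methods discussed in the thesis are suited to the task.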
Item Type: Thesis (Bachelor's Thesis)
Supervisor name: Luppes, R.
Degree programme: Mathematics
Thesis type: Bachelor's Thesis
Language: English
Date Deposited: 11 Jul 2018
Last Modified: 13 Jul 2018 08:34
URI: https://fse.studenttheses.ub.rug.nl/id/eprint/17777