What are the respective advantages of CST's four optimization algorithms?
Which of CST's four optimization algorithms is suited to which situation? There's a genetic algorithm, a particle swarm algorithm, some kind of Newton iteration, and so on. Which one generally gets to the optimal result fastest?
The CST help file explains this in detail:
Choose between the five optimizer types. The Interpolated Quasi Newton algorithm makes use of approximated gradient information, and the Powell optimizer of partial derivatives, to achieve faster convergence rates. However, these algorithms are sensitive to the choice of starting point in the parameter space. If the starting point is close to the desired optimum, or the (unknown) goal function is sufficiently smooth, then the local algorithms will converge quickly.
The Interpolated Quasi Newton optimizer is fast due to its support of interpolation of primary data, but in some cases it may not be as accurate as the slower Classic Powell optimizer.
The Nelder Mead Simplex Algorithm has a set of starting points and does not need gradient information to determine its search direction. This becomes an advantage over the other local algorithms as the number of variables grows.
If a non-smooth goal function is expected, the starting point is far away from the optimum, or a large parameter space is going to be explored, then a global algorithm should be preferred. For the featured global optimizers a maximal number of iterations can be specified. Therefore the maximal number of goal function evaluations, and thus the optimization time, can be determined a priori. Another advantage of the global optimizers is that the number of evaluations is independent of the number of parameters. Therefore the choice of a global optimizer over a local one can pay off if the optimization problem has a large number of parameters.
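To make the fixed-budget property concrete, here is a minimal generic particle swarm sketch in Python (not CST's implementation; the toy goal function, particle count, and coefficients are all illustrative assumptions). The total number of goal evaluations is fixed up front and does not grow with the number of parameters:

```python
import numpy as np

def pso(goal, bounds, n_particles=20, n_iter=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm minimizer. The evaluation budget,
    n_particles * (n_iter + 1), is fixed a priori and is independent
    of the number of parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T        # bounds: one (min, max) per parameter
    x = rng.uniform(lo, hi, (n_particles, len(lo)))   # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest = x.copy()                                  # per-particle best positions
    pbest_f = np.array([goal(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()            # swarm-wide best position
    for _ in range(n_iter):
        r1, r2 = rng.random((2,) + x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([goal(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# A non-smooth, multimodal toy goal (illustrative only): minimum 0 at the origin.
best, fbest = pso(lambda p: np.sum(np.abs(p)) + np.sum(np.sin(3 * p) ** 2),
                  bounds=[(-5, 5)] * 4)
print("best point:", np.round(best, 3), "goal value:", round(float(fbest), 4))
```

Note that the goal is evaluated exactly 20 x 51 times here regardless of whether it has 4 parameters or 40, which is the trade-off the help text describes.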
Genetic Algorithm: Selects the global genetic optimizer.
Particle Swarm Optimization: Selects the global particle swarm optimizer.
Nelder Mead Simplex Algorithm: Selects the local Simplex optimization algorithm by Nelder and Mead. This method is a local optimization technique. If N is the number of parameters, it starts with N+1 points distributed in the parameter space.
Interpolated Quasi Newton: Selects the local optimizer supporting interpolation of primary data. This optimizer is fast in comparison to the Classic Powell optimizer but may be less accurate. In addition, you can set the number N of optimizer passes (1 to 10) for this optimizer type. A number N greater than 1 forces the optimizer to start over (N-1) times. Within each optimizer pass the minimum and maximum settings of the parameters (see Optimizer Parameters) are changed, approaching the optimal parameter setting. Increase the number of passes to values greater than 1 (e.g., 2 or 3) to obtain more accurate results.
Classic Powell: Selects the local optimizer without interpolation of primary data. In addition, it is necessary to set the accuracy, which affects the accuracy of the optimal parameter settings and the termination time of the optimization process. For optimizations with more than one parameter, the Interpolated Quasi Newton or the Nelder Mead Simplex Algorithm should be preferred over this technique.
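As a rough illustration of the local methods listed above, SciPy ships generic Nelder-Mead and Powell implementations (these are the classic algorithms, not CST's versions; the quadratic goal function and starting point below are purely illustrative). Both are derivative-free and both depend on the starting point `x0`:

```python
import numpy as np
from scipy.optimize import minimize

# Smooth 3-parameter goal with a single minimum at (1, 2, 3).
def goal(p):
    return (p[0] - 1) ** 2 + (p[1] - 2) ** 2 + (p[2] - 3) ** 2

x0 = np.zeros(3)  # starting point: local methods are sensitive to this choice
results = {m: minimize(goal, x0, method=m) for m in ("Nelder-Mead", "Powell")}
for m, res in results.items():
    print(f"{m}: {res.nfev} evaluations, x = {np.round(res.x, 3)}")
```

On a smooth single-minimum goal like this, both converge to the optimum; the evaluation counts differ, which is the speed-versus-accuracy trade-off the help text hints at.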
I haven't used the Optimisation feature myself, so I have no experience to share. Going by the help text above, there are two global optimizers (Genetic Algorithm, Particle Swarm Optimisation) and three local optimizers (Nelder Mead Simplex Algorithm, Interpolated Quasi Newton, Classic Powell). IQN optimizes faster than Classic Powell, but may not be as accurate. If the optimization involves many parameters, it is best to also bring in a global optimizer.
Thanks!
不客氣??匆郧暗奶?,樓主現(xiàn)在應(yīng)該是在嘗試用MWS和DS做優(yōu)化仿真。個人建議你最好能將這幾種優(yōu)化算法都試一下,看看對于特定的案例(比如你現(xiàn)在正在做的例子),用什么優(yōu)化算法的組合可以得到最滿意的結(jié)果(兼顧速度和準(zhǔn)確性)。
希望樓主試驗(yàn)成功后可以專門開貼描述流程,總結(jié)經(jīng)驗(yàn)。
To be honest, my results have been poor and I am quite frustrated. The optimization takes a very long time, yet the improvement is not obvious. I keep suspecting my model or my parameter settings are wrong. If I gain any experience I will definitely share it!
Optimizing with CST's optimization tool takes a long time, especially when the structure is electrically large and the model is complex; the wait is agonizing. The genetic algorithm and the particle swarm algorithm are global methods: they need many iterations to converge reasonably, and the cost also depends on how many samples each variable takes. The latter two algorithms are local methods; with few variables a local method converges faster, while with many variables a global algorithm tends to converge faster, so choose accordingly. So far I have not found a hybrid optimization method in CST. In short, field-based optimization simply takes a long time. This is just my personal impression from using CST's optimizer.
In short, CST's optimization capability is relatively weak.
I rarely use the optimiser, so I have no strong opinion on this. Personally I prefer to tune parameters manually with a parameter sweep. Just my own habit, for reference only!
Sweep first, then optimize: that's my suggestion.
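That sweep-then-optimize workflow can be sketched outside CST with a cheap stand-in goal function (everything below is illustrative; in a real project each evaluation would be a full CST field simulation, so the coarse sweep is what keeps the expensive local refinement short):

```python
import itertools
import numpy as np
from scipy.optimize import minimize

# Stand-in for an expensive CST run; in practice every call here
# would trigger a full field simulation.
def goal(p):
    x, y = p
    return (x - 0.7) ** 2 + (y + 1.3) ** 2 + 0.3 * np.sin(5 * x) ** 2

# Step 1: parameter sweep, a coarse grid over the design space.
grid = list(itertools.product(np.linspace(-2.0, 2.0, 9), repeat=2))
start = min(grid, key=goal)

# Step 2: local optimization, refining from the best sweep point.
res = minimize(goal, start, method="Nelder-Mead")
print("sweep best:", start, "-> refined:", np.round(res.x, 3))
```

The sweep locates the promising basin so the local optimizer starts near the optimum, which is exactly the condition under which the help text says local methods converge quickly.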