• PSO in forth

    From melahi_ahmed@melahi_ahmed@yahoo.fr (ahmed) to comp.lang.forth on Sun Mar 31 09:31:57 2024

    Hi,
    Here is my attempt to write a PSO program and some applications. It is written for gforth.
    Save the code for PSO as pso.fs and include it in the application files.

    =======Here begins the code for PSO: pso.fs

    include random.fs
    \ : random choose ;   \ alternative for Forth systems that provide CHOOSE instead

    : frand ( f: -- r ) 1000000000000 random s>f 1e-12 f* ;   \ uniform random float in [0,1)
    : frandom ( f: xmin xmax -- x ) fover f- frand f* f+ ;    \ uniform random float in [xmin,xmax)

    20 value nvars_max                \ maximum number of optimization variables
    0 value nvars                     \ number of variables of the current problem
    0 value Cost                      \ execution token of the cost word
    0 value particle_size             \ size of one particle record, in bytes
    0 value population_size           \ number of particles in the swarm
    0 value population                \ base address of the current swarm
    0 value print_iter                \ flag: print progress every iteration?
    10 value iter_max                 \ maximum number of iterations

    10 value max_iter_without_change  \ stop after this many stagnant iterations
    0 value num_iter_without_change   \ stagnant-iteration counter

    2.05e fvalue c1       \ acceleration coefficient, global-best term
    2.05e fvalue c2       \ acceleration coefficient, personal-best term
    0.7298e fvalue ki     \ constriction coefficient

    fvariable Jgb         \ global best cost found so far
    1e9 Jgb f!

    fvariable Jgb_pre     \ global best cost of the previous iteration
    1e9 Jgb_pre f!

    1e-5 fvalue ftol      \ stagnation tolerance on the change of Jgb

    create X    nvars_max floats allot   \ scratch position passed to the cost word
    create Xgb  nvars_max floats allot   \ global best position
    create (r1) nvars_max floats allot   \ random vector for the c1 term
    create (r2) nvars_max floats allot   \ random vector for the c2 term
    create xmin nvars_max floats allot   \ lower bounds of the search box
    create xmax nvars_max floats allot   \ upper bounds of the search box

    : r1 ( -- ) nvars 0 do frand i (r1) swap floats + f! loop ;   \ refresh (r1)
    : r2 ( -- ) nvars 0 do frand i (r2) swap floats + f! loop ;   \ refresh (r2)
    : >xmin ( f: x1min ... xnmin -- ) nvars 0 do nvars 1- i - floats xmin + f! loop ;
    : >xmax ( f: x1max ... xnmax -- ) nvars 0 do nvars 1- i - floats xmax + f! loop ;

    : cr1th ( i -- ) ( f: -- c1*r1[i] ) (r1) swap floats + f@ c1 f* ;
    : cr2th ( i -- ) ( f: -- c2*r2[i] ) (r2) swap floats + f@ c2 f* ;

    \ usage:  4 problem: Cost1   50 10 1e-2 true >options   1000 particle_swarm: PS1
    : problem: ( n "name" -- ) dup to nvars 3 * 2 + floats to particle_size ' to Cost ;
    : >options ( iter_max max_stagnant print? -- ) ( f: ftol -- ) to print_iter to ftol to max_iter_without_change to iter_max ;
    : particle_swarm: ( n "name" -- ) to population_size create population_size particle_size * allot does> to population ;

    : (particle) ( addr n -- p ) particle_size * + ;

    : xth   ( addr i -- addr' ) floats + ;           \ i-th position component
    : vth   ( addr i -- addr' ) nvars + xth ;        \ i-th velocity component
    : J     ( addr -- addr' )   nvars 2* xth ;       \ current cost
    : Xlbth ( addr i -- addr' ) nvars 2* 1+ + xth ;  \ i-th personal-best component
    : Jlb   ( addr -- addr' )   nvars 3 * 1+ xth ;   \ personal-best cost

    : Xgbth ( i -- Xgbtha ) floats Xgb + ;           \ i-th global-best component


    : particle ( n -- p ) population swap (particle) ;
    : particleXth ( i n -- xtha ) particle swap xth ;
    : particleVth ( i n -- vtha ) particle swap vth ;
    : particleJ ( n -- Ja ) particle J ;
    : particleXlbth ( i n -- Xlbtha ) particle swap Xlbth ;
    : particleJlb ( n -- Jlba ) particle Jlb ;

    : updateXth ( i n -- ) 2dup particleXth f@ 2dup particleVth f@ f+ particleXth f! ;
    : updateX ( n -- ) nvars 0 do i over updateXth loop drop ;

    \ v[i] <- ki * ( v[i] + c1*r1[i]*(Xgb[i]-x[i]) + c2*r2[i]*(Xlb[i]-x[i]) )
    : updateVth ( i n -- )
      2dup particleVth f@
      over Xgbth f@ 2dup particleXth f@ f- over cr1th f* f+
      2dup particleXlbth f@ 2dup particleXth f@ f- over cr2th f* f+
      ki f*
      particleVth f! ;

    : updateV ( n -- )
      nvars 0 do
        r1 r2
        i over updateVth
      loop
      drop ;

    : updateJ ( n -- )
      dup particle X nvars floats move   \ copy the particle's position to the scratch buffer
      X Cost execute particleJ f! ;      \ evaluate the cost word and store the result

    : updateJlbXlb ( n -- )   \ update the personal best if the new cost is lower
      dup particleJlb f@
      dup particleJ f@ f>
      if
        dup particleJ f@ dup particleJlb f!
        nvars 0 do
          i over particleXth f@ i over particleXlbth f!
        loop
      then
      drop ;

    : update_particle ( n -- ) dup updateV dup updateX dup updateJ updateJlbXlb ;

    : update_population ( -- )
      population_size 0 do
        i update_particle
      loop ;

    : init_population ( -- )
      population_size 0 do
        i
        nvars 0 do
          0e i over particleVth f!                        \ zero initial velocity
          i floats xmin + f@ i floats xmax + f@ frandom   \ random position in the box
          fdup i over particleXth f!
          i over particleXlbth f!                         \ personal best starts here
        loop
        i particle X nvars floats move
        X Cost execute fdup
        particleJ f!
        i particleJlb f!
      loop ;

    : updateJgbXgb ( -- )   \ scan the personal bests for a new global best
      population_size 0 do
        i particleJlb f@ fdup Jgb f@ f<
        if
          Jgb f!
          i
          nvars 0 do
            i over particleXlbth f@ i Xgbth f!
          loop
          drop
        else
          fdrop
        then
      loop ;

    : solve ( -- )
      1e9 Jgb f!
      1e9 Jgb_pre f!   \ also reset the previous best, so repeated SOLVEs start clean
      0 to num_iter_without_change
      init_population
      updateJgbXgb
      iter_max 0 do
        update_population
        updateJgbXgb

        Jgb f@ Jgb_pre f@ f- fabs ftol f<
        if
          num_iter_without_change 1+ to num_iter_without_change
        else
          0 to num_iter_without_change
        then

        num_iter_without_change max_iter_without_change =
        if
          leave
        then

        Jgb f@ Jgb_pre f!

        print_iter
        if
          cr i . ." iteration: "
          nvars 0 do
            3 spaces
            i Xgbth f@ f.
          loop
          3 spaces Jgb f@ f.
        then
      loop

      cr cr cr ." The solution is: "
      nvars 0 do
        i Xgbth f@ f. 3 spaces
      loop
      cr ." The corresponding cost is: "
      Jgb f@ f.
      cr cr cr ;

    =======Here the code ends

    Some applications (tests) are given below:

    ========1st application: pso_test1.fs

    include pso.fs

    \ The true solution is x1 = 0.5e, x2 = 8.3e, x3 = 5.8e and x4 = 2.6e and the minimal cost is 0.5e
    : Cost1 ( x -- ) ( f: -- C )   \ (x1-0.5)^2 + (x2-8.3)^2 + (x3-5.8)^2 + (x4-2.6)^2 + 0.5
      dup 3 xth f@ 2.6e f- fdup f*  dup 2 xth f@ 5.8e f- fdup f* f+
      dup 1 xth f@ 8.3e f- fdup f* f+  f@ 0.5e f- fdup f* f+  0.5e f+ ;

    4 problem: Cost1
    1000 particle_swarm: PS1
    50 10 1e-2 true >options
    -10e -10e -10e -10e >xmin
    10e 10e 10e 10e >xmax

    PS1 solve

    \ cr ." Done. Bye"


    ========2nd application: pso_test2.fs

    include pso.fs

    \ The true solution is x1 = 8.3e, x2 = 5.8e and x3 = 2.6e and the minimal cost is 0.5e
    : Cost1 ( x -- ) ( f: -- C )   \ (x1-8.3)^2 + (x2-5.8)^2 + (x3-2.6)^2 + 0.5
      dup 2 xth f@ 2.6e f- fdup f*  dup 1 xth f@ 5.8e f- fdup f* f+
      f@ 8.3e f- fdup f* f+  0.5e f+ ;

    3 problem: Cost1
    1000 particle_swarm: PS1
    50 10 1e-2 false >options
    -10e -10e -10e >xmin
    10e 10e 10e >xmax

    PS1 solve

    \ cr ." Done. Bye"


    ========3rd application: pso_test3.fs

    include pso.fs

    \ The true solution is x1 = 5.8e and x2 = 2.6e and the minimal cost is 0.5e
    : Cost1 ( x -- ) ( f: -- C )   \ (x1-5.8)^2 + (x2-2.6)^2 + 0.5
      dup 1 xth f@ 2.6e f- fdup f*  f@ 5.8e f- fdup f* f+  0.5e f+ ;

    2 problem: Cost1
    1000 particle_swarm: PS1
    50 10 1e-2 false >options
    -10e -10e >xmin
    10e 10e >xmax

    PS1 solve

    \ cr ." Done. Bye"


    ========4th application: pso_test4.fs

    include pso.fs

    \ The true solution is x1 = 2e and x2 = 2e and the minimal cost is about -16.841358408616
    \ f3() is a sum of four exponential bumps; the tallest (height 15) is centered at (2,2)
    : f3_1() ( f: x1 x2 -- res ) fswap -2e f- fdup f* fswap -2e f- fdup f* f+ fsqrt 3e f/ fnegate fexp 5e f* ;   \ bump at (-2,-2)
    : f3_2() ( f: x1 x2 -- res ) fswap 2e f- fdup f* fswap 2e f- fdup f* f+ fsqrt 2e f/ fnegate fexp 15e f* ;    \ bump at (2,2)
    : f3_3() ( f: x1 x2 -- res ) fswap 2e f- fdup f* fswap -2e f- fdup f* f+ fsqrt 2e f/ fnegate fexp 12e f* ;   \ bump at (2,-2)
    : f3_4() ( f: x1 x2 -- res ) fswap -2e f- fdup f* fswap 2e f- fdup f* f+ fsqrt 2e f/ fnegate fexp -4e f* ;   \ dip at (-2,2)


    : f3() ( x -- ) ( f: -- res )
      dup f@ 1 xth f@            \ fetch x1 and x2
      fover fover f3_1() frot frot
      fover fover f3_2() frot frot
      fover fover f3_3() frot frot
      f3_4() f+ f+ f+ ;

    : Cost3 ( x -- ) ( f: -- C ) f3() fnegate ;   \ we maximize f3() by minimizing its negation


    2 problem: Cost3
    1000 particle_swarm: PS1
    100 10 1e-2 false >options
    -4e -4e >xmin
    4e 4e >xmax

    PS1 solve

    \ cr ." Done. Bye"



    ========5th application: pso_test5.fs

    include pso.fs

    \ The true solution is x = 0.56714329... and the minimal cost is 0e; x is the solution of x-exp(-x)=0 (or x*exp(x)=1)
    : f1() ( f: x -- y) fdup fnegate fexp f- ;
    : Cost1 ( x -- ) ( f: -- C) f@ f1() fabs ;

    1 problem: Cost1
    1000 particle_swarm: PS1
    500 10 1e-3 false >options
    1e >xmin
    2e >xmax

    PS1 solve

    \ cr ." Done. Bye"



    ========6th application: pso_test6.fs


    include pso.fs

    \ The true solution is x_i = i^2 for i = 3 ... 12 (i.e. 9, 16, ..., 144)

    \ Cost1 = sum over i of |x[i] - (i+3)^2|
    : Cost1 ( x -- ) ( f: -- c ) 0e 10 0 do dup i xth f@ i 3 + dup * s>f f- fabs f+ loop drop ;


    10 problem: Cost1
    100 particle_swarm: PS1
    500 10 1e-3 false >options

    0e 0e 0e 0e 0e 0e 0e 0e 0e 0e >xmin
    200e 200e 200e 200e 200e 200e 200e 200e 200e 200e >xmax

    PS1 solve

    \ cr ." Done. Bye"


    ========= Here the applications (tests) end

    Any comments and remarks are appreciated.

    Ahmed
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From mhx@mhx@iae.nl (mhx) to comp.lang.forth on Sun Mar 31 10:03:40 2024

    Can you describe the main idea of Particle Swarm Optimization in a few words? In particular: does it run quicker because of the swarm (and why), or is it just easier to write than a single particle trying all possibilities (and possibly marking unsuccessful areas like, e.g., Ant Colony Optimization)?

    -marcel
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From melahi_ahmed@melahi_ahmed@yahoo.fr (Ahmed) to comp.lang.forth on Sun Mar 31 10:43:01 2024

    Can you describe the main idea of Particle Swarm Optimization in a few words?

    A group of individuals (initially randomly positioned) searches randomly for the optimal (or suboptimal) solution to a problem by minimizing or maximizing a certain criterion, the cost. The search is informed: the individuals know the best solutions found by the others. The search is performed by moving the individuals towards the best ones (with random steps). Examples of problems: identification of dynamical systems (induction motors, PMSMs, ...), training neural nets, obtaining the best controllers, etc.
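
    Concretely, what pso.fs computes each iteration is the constriction-coefficient form of the update (this just restates UPDATEVTH and UPDATEXTH from the first post, not new functionality; c1, c2, ki, Xgb and Xlb name the same quantities as in the code):

        \ for each particle and each dimension i, with fresh r1, r2 uniform in [0,1):
        \   v[i] <- ki * ( v[i] + c1*r1*( Xgb[i] - x[i] )     \ pull toward the global best
        \                       + c2*r2*( Xlb[i] - x[i] ) )   \ pull toward the particle's own best
        \   x[i] <- x[i] + v[i]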


    In particular: does it run quicker because of the swarm (and why),

    Not in general; but yes when considering the random initialization of each individual.
    And surely yes when using parallel computing (GPUs): the PSO algorithm is parallelizable.


    or is it just easier to write than a single particle trying all possibilities

    Exhaustive search is impossible when the dimension of the search space is high (the number of unknowns grows: e.g. for an induction motor we have 7 parameters: Rs, Ld, Lq, f, J, P, and for a PMSM: Rs, Ld, Lq, phi, f, J, P, and neural nets have hundreds or thousands of weights). Even a coarse grid of only 10 points per dimension would already mean 10^7 cost evaluations for 7 parameters.



    (and possibly marking unsuccessful areas like, e.g., Ant Colony Optimization)?

    I think the PSO algorithm always converges given a good choice of its parameters (c1, c2 and ki) and a sufficient number of individuals and iterations.
    Another hint: we can restart the algorithm from the last obtained best solution, using it to randomly re-initialize the individuals in a neighborhood of that best solution; a small sketch of this in the style of pso.fs follows below.
    Also, we can avoid local traps (local minima or maxima): PSO searches for the globally best solution or solutions (optimal or suboptimal, to a given tolerance).
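
    A minimal sketch of that restart idea, assuming pso.fs from the first post is loaded and a swarm (e.g. PS1) has already been selected and solved once. RESTART-NEAR is a hypothetical helper of mine, not part of the posted code: it shrinks the search box to a +/- r neighborhood of the current global best Xgb and runs SOLVE again (SOLVE re-randomizes the population inside the new box).

        : restart-near ( f: r -- )
          nvars 0 do
            i Xgbth f@ fover f-  i floats xmin + f!   \ xmin[i] = Xgb[i] - r
            i Xgbth f@ fover f+  i floats xmax + f!   \ xmax[i] = Xgb[i] + r
          loop
          fdrop
          solve ;

        \ usage, after a first  PS1 solve :
        \   0.5e restart-near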

    Ahmed
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From mhx@mhx@iae.nl (mhx) to comp.lang.forth on Sun Mar 31 11:58:29 2024

    Ahmed wrote:

    Can you describe the main idea of Particle Swarm Optimization
    in a few words?
    [..]
    The search is performed by moving the individuals towards the best
    ones (with random steps).

    This is possible for a continuously differentiable goal function.

    How then does the method compare to globally convergent Newton-type
    methods for multi-objective optimization?

    Exhaustive search is impossible when the dimension of the search
    space is high (the number of unknowns grows: e.g. for an induction
    motor we have 7 parameters: Rs, Ld, Lq, f, J, P, and for a PMSM:
    Rs, Ld, Lq, phi, f, J, P, and neural nets have hundreds or
    thousands of weights).

    Knowing that one is searching for PMSM equations allows using the
    PMSM model structure. That is way more efficient than assuming
    arbitrary relationships. (I think this is the idea behind
    Physics-Informed NNs, or PINNs.)

    Another hint: we can restart the algorithm from the last obtained
    best solution, using it to randomly re-initialize the individuals
    in a neighborhood of that best solution.
    Also, we can avoid local traps (local minima or maxima): PSO
    searches for the globally best solution or solutions (optimal or
    suboptimal, to a given tolerance).

    That is also how these Newton methods work.
    They won't fit in the space of your posting though :--)

    -marcel
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From melahi_ahmed@melahi_ahmed@yahoo.fr (ahmed) to comp.lang.forth on Sun Mar 31 12:54:20 2024

    PSO can be used for problems that are not differentiable (no gradient or Hessian is needed).
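
    As a small illustration in the pso.fs conventions (a toy example of mine, not from the thread): a one-dimensional cost with a kink and a jump, where gradient-based methods have trouble but PSO applies directly.

        \ cost = |x-1| plus a discontinuous penalty of 1 when |x-1| > 0.5;
        \ non-differentiable at x = 1e, discontinuous at |x-1| = 0.5e,
        \ true minimum at x = 1e with cost 0e
        : CostKink ( x -- ) ( f: -- c )
          f@ 1e f- fabs
          fdup 0.5e f> if 1e f+ then ;

        \ 1 problem: CostKink
        \ 100 particle_swarm: PSK
        \ 100 10 1e-3 false >options
        \ -5e >xmin  5e >xmax
        \ PSK solve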

    See the web pages below: https://www.physicsforums.com/threads/particle-swarm-optimization-vs-newtons-method.659526/#:~:text=PSO%20is%20a%20population%2Dbased,function%20to%20find%20the%20minimum.

    For multi-objective optimization, I generally use a weighted sum of the individual costs to form the multi-objective cost (as in optimal control, predictive control, etc.); see the article linked below, and a small sketch after the link.

    https://www.tandfonline.com/doi/full/10.1080/23311916.2018.1502242#:~:text=The%20MOO%20or%20the%20multi,which%20consequently%20simplifies%20the%20problem.
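
    In the conventions of pso.fs, such a weighted sum might look like this (CostA, CostB and the weights w1, w2 are hypothetical placeholders of mine, not from the posted code; the weights are a problem-specific design choice):

        0.7e fvalue w1
        0.3e fvalue w2
        : CostA ( x -- ) ( f: -- c ) f@ 1e f- fdup f* ;   \ placeholder objective 1
        : CostB ( x -- ) ( f: -- c ) f@ 3e f- fabs ;      \ placeholder objective 2
        : CostAB ( x -- ) ( f: -- c )
          dup CostA w1 f*   \ w1 * first cost ( x still on the data stack )
          CostB w2 f* f+ ;  \ + w2 * second cost

        \ then as usual:  1 problem: CostAB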

    For the PMSM example, I've already done it that way in Matlab (I used the model of the PMSM and got the parameters without using neural nets or Scientific Machine Learning as proposed by the Julia language folks).

    And there is a paper using this approach (but with another nature-inspired meta-heuristic method, not PSO) to get the PMSM parameters online.

    Ahmed
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From minforth@minforth@gmx.net (minforth) to comp.lang.forth on Sun Mar 31 17:06:12 2024

    Thanks for this interesting example and for doing it in Forth!

    Many years ago, we investigated various optimisation concepts
    for adaptive controllers with parameter adaptation "in-the-loop"
    as a background task. In the end we settled on two concepts:
    stupid Monte Carlo for processes with too many or too non-linear
    characteristics, and Kalman filtering (combined with feed-forward
    controls) where (some) linear modelling was feasible.

    MMO or genetic algorithms required too much mathematics up front,
    but the main reason for non-applicability was: if something went
    wrong or didn't stabilise, the beasts were virtually impossible
    to debug/tune by third parties or service technicians.
    "Too academic" was the general feeling.

    I am not writing this to belittle your work. Far from it.
    I just wanted to add some real life experience of an old engineer.
    ;-)
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From melahi_ahmed@melahi_ahmed@yahoo.fr (Ahmed) to comp.lang.forth on Sun Mar 31 17:52:39 2024

    I wrote this program first in Matlab (really in Matlab: I hadn't used the optimtool toolbox, which contains PSO among its tools).
    Then I wrote it in Maple for a colleague (he uses Maple for his work). He used it to solve a system of nonlinear equations (15 nonlinear algebraic equations and 12 unknowns, with coefficients ranging from very small to very large). He needed a global solver, which he didn't have and couldn't get from Maplesoft without purchasing it.
    So the solver based on PSO worked fine.
    Then the idea came to me to do it in Forth.
    And this is the result.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From minforth@minforth@gmx.net (minforth) to comp.lang.forth on Sun Mar 31 19:13:03 2024

    Matlab and others are THE tools of the trade. They are fantastic for lab/desktop system design, analysis, simulation and optimisation.
    And if you can use available toolboxes, all the better. There are even
    compilers with hardware support for I/O and control.

    However, such controllers are extremely fat, expensive and power hungry,
    which more or less rules them out for use in the field. OTOH Forth is
    tiny and extremely efficient, making it a wonderful tool for building
    e.g. embedded devices.

    IMO your work is a good example of how, with enough understanding of
    the problem and dedication, even mathematically complex solutions can
    be brought down from those gigantic lab tools to the world of small
    systems where Forth shines. Congratulations!
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Ron AARON@clf@8th-dev.com to comp.lang.forth on Thu Apr 4 16:42:56 2024

    On 31/03/2024 12:31, ahmed wrote:
    Hi, Here is my attempt to write a PSO program and some applications.
    It is written for gforth.
    Save the code for PSO as pso.fs and include it in the application files.

    Thank you, Ahmed, you inspired me to come up with a version in 8th:

    https://8th-dev.com/forum/index.php?topic=2829.0

    As it's a generally useful tool, I've added it to the next version's libraries.
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From minforth@minforth@gmx.net (minforth) to comp.lang.forth on Thu Apr 11 15:17:23 2024

    For those with an FB account, see PSO live: https://www.facebook.com/pablohugo.reda/videos/388748543980891
    --- Synchronet 3.20a-Linux NewsLink 1.114