
## Intro to Particle Swarm Optimization

### #1 modi123_1

• Suitor #2

Reputation: 12583
• Posts: 49,569
• Joined: 12-June 08

Posted 28 August 2016 - 01:41 PM

Tools: Visual Studio 2015, .NET 4.5.2, console project

Note: This is a high-level overview and shouldn't be taken as the end-all, be-all gospel on PSO.

Introduction

Particle Swarm Optimization (PSO) is a way to optimize a measurable problem over a series of steps. That is to say, you have a solution goal and a set of unknown pieces that make up that goal, and you want to find the pieces that fit that goal.

A set of 'particles' (widgets, objects, etc.) are given values (position) and converge (via velocity) toward a local and global best solution each iteration.

You will need to do a bit of outside-the-box thinking on this one. A particle is one of many possible solution objects you are trying to push to a goal. The 'position' is just the data; be it two numbers, a picture, five variables, whatever. It is what you will be modifying with the 'velocity'. Velocity is the per-step adjustment, guided by fitness, that you use to modify the particle's data to reach the goal.

Folks tend to use a flock of birds as an example. You have a flock of birds and they are trying to find food. The birds are the particles, and the food is the goal.

The birds do not know exactly where the food is, but they know how close they are (fitness). This means each time slice a few things happen.
1. All the birds check how close they are to the food
2. The birds closest to each other (local neighborhood) say who is closest
3. Globally the closest bird to the food yells out their position.
4. All the birds also start moving (changing their data of x and y) towards the local and global best positions (velocity)
5. Repeat until the food is found.
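The five steps above map directly onto a loop. Here is a rough Python sketch of the idea (the `fitness` function, parameter values, and stopping threshold are my own picks for illustration, not part of PSO proper):

```python
import random

random.seed(1)
GOAL = (50.0, 50.0)  # where the "food" is

def fitness(pos):
    # how close a bird is to the food; lower is better
    return ((pos[0] - GOAL[0]) ** 2 + (pos[1] - GOAL[1]) ** 2) ** 0.5

# each bird (particle) is a position plus a velocity
birds = [{"pos": [random.uniform(0, 100), random.uniform(0, 100)],
          "vel": [0.0, 0.0]} for _ in range(10)]
gbest = min((b["pos"][:] for b in birds), key=fitness)  # best spot seen so far

for epoch in range(500):
    for b in birds:
        # 1-3. each bird checks its closeness; the closest "yells out" its position
        if fitness(b["pos"]) < fitness(gbest):
            gbest = b["pos"][:]
        # 4. everyone drifts toward the best known position
        for d in range(2):
            v = 0.7 * b["vel"][d] + 1.5 * random.random() * (gbest[d] - b["pos"][d])
            b["vel"][d] = max(-10.0, min(10.0, v))  # keep the velocity sane
            b["pos"][d] += b["vel"][d]
    # 5. repeat until the food is found
    if fitness(gbest) < 0.5:
        break
```

This only uses a global best to keep the sketch short; a fuller version would also track each bird's personal/local best, as the velocity equation below does.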

This is considered a 'global search metaheuristic', and keep in mind PSO does not guarantee an optimal solution. It may certainly find a solution, but nothing says it is the 'best of all solutions'. It is in the same vein as Genetic Algorithms, but unlike Genetic Algorithms, which operate with crossovers, mutations, etc., PSO continuously tweaks the data values it starts with.

VB.NET GA example

General PSO Guidelines
The three wiz-bangs behind PSO were Kennedy and Eberhart (who introduced it) and Shi (who added the inertia weight), and they provided three core principles to structure your code by.

1. Evaluate fitness of each particle
2. Update each individual's best (and the 'global best') fitness and position.
3. Update each particle's velocity and position.

In relation to the bird example:
1. Figure out each bird's closeness to the food.
2. The closest bird in each local neighborhood and the closest bird globally yell out their positions.
3. Each bird tweaks its velocity towards the global and local birds and uses that to tweak its position (aka its data).

The Velocity Equation

Here's the general velocity equation used:
V[t+1] = w∗V[t] + c1∗Rand1()∗(pb[t]-X[t]) + c2∗Rand2()∗(pg[t]-X[t])

V[t+1] : the updated velocity of this given particle for the next time epoch
w : inertia weight
V[t] : current velocity
c1 and c2 : constants, the acceleration coefficients (cognitive and social); typically c1 == c2 == 2. They help determine the size of the movement toward the best positions.
Rand1() and Rand2() : random numbers from 0 to 1
pb[t] : local neighborhood (or personal) best position
pg[t] : global best position
X[t] : current particle's position

The equation is seen as being three parts in one.
1. w∗V[t] is the inertia portion: inertia keeps the particle going in the same direction. The lower the w values, the faster the convergence to some solution; the higher the w values, the more of the solution space gets explored. From what I have seen the value is typically 0.8 to 1.2 and doesn't change during the run.

2. c1∗Rand1()∗(pb[t]-X[t]) is the "cognitive component". Since it tracks the 'local neighborhood best', it pulls the particle back toward the best spot it and its neighbors have been.

3. c2∗Rand2()∗(pg[t]-X[t]) is the "social part". The particle wants to head towards the best area found overall. Hang out with the cool kids.
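Putting the three parts together, the whole update is a one-liner. A small Python sketch of the equation (the function name and sample values are mine; w, c1, and c2 are the typical values mentioned above):

```python
import random

def new_velocity(v, x, pbest, gbest, w=1.0, c1=2.0, c2=2.0):
    """V[t+1] = w*V[t] + c1*Rand1()*(pb[t] - X[t]) + c2*Rand2()*(pg[t] - X[t])"""
    inertia   = w * v                               # 1. keep going the same way
    cognitive = c1 * random.random() * (pbest - x)  # 2. pull back toward own/local best
    social    = c2 * random.random() * (gbest - x)  # 3. pull toward the global best
    return inertia + cognitive + social

# a particle sitting above both bests gets a non-positive (downward) velocity
v = new_velocity(v=0.0, x=100.0, pbest=90.0, gbest=72.0)
```

With zero starting velocity and both bests below the current position, the result can never be positive, so the particle drifts down toward the bests.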

The Problem

Birds are great, but how does this apply to any problem you may want to solve? With most metaheuristics, half the battle is figuring out how to encode the problem in some data structure, and how to define the 'fitness' that drives velocity toward a goal.

When trying out a new search concept, I am always a fan of trying to solve pesky polynomial equations. You know: x + 4 = 8 and you have to solve for x. They can be made complex enough to have a large solution space, and are easy enough to verify that the math is valid.

In this case I am looking to find the five values that add up to 72 using the equation of: x + y + z + a + b = 72

The values could be positive or negative, or even repeat. All I am concerned with is getting five values that add up to 72.

The values make up the particle's "position" in the solution space; less the actual physics definition and more of a concept.

I could approach this problem in a few ways. I could start out with a bunch of small numbers and gradually grow them until I hit the goal, or start out with a bunch of large numbers and shrink them until I hit the goal. Here I opted to start large and drive towards a minimum.
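In code, the 'how close am I' measure for this problem is just the absolute distance of the particle's sum from the goal; zero means solved. A quick Python sketch of that idea (the C# code later does the equivalent with its GetResults and Math.Abs comparison):

```python
GOAL = 72

def fitness(values):
    # absolute distance of the particle's sum from the goal; 0 means solved
    return abs(sum(values) - GOAL)

fitness([64, 8])    # -> 0, a perfect hit: 64 + 8 == 72
fitness([68, 54])   # -> 50, since 122 is 50 away from 72
```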

A few constants to keep the problem in line:
• This is a simpler problem, so my number of particles will be low; like ten.
• Since the population is small, I am going to treat the global and local neighborhood best as the same.
• I do not want each particle to go crazy on velocity, so I cap the magnitude at 10. If a new velocity is too big, or too small, use +10 or -10 instead.
• My only other constraint is to set the initial random values - I have a range I want to start in before driving home to the goal.
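That velocity cap is just a clamp. A tiny Python sketch of the rule in the third bullet:

```python
V_MAX = 10

def clamp_velocity(v, v_max=V_MAX):
    # too big or too small gets pinned at +/- v_max
    return max(-v_max, min(v_max, v))

clamp_velocity(27)   # -> 10
clamp_velocity(-15)  # -> -10
clamp_velocity(3)    # -> 3
```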

Example Walk Through
A quick time out for a walk-through of how this would work in a simpler format.

Say I want to find a solution to: x + y = 72. Two unknown variables I want to drive towards a goal.

My constants:
goal: 72
max number of polynomial terms: 2
max number of particles: 10
initial min/max: 40, 110 (arbitrarily picked by me)
max time: 100

Epoch: 0

Here is my list of particles coming out of the gate.

This is read as particle's index: x + y = total (velocity applied to last epoch's values to get here)

```
  0:  65 +  98 = 163 (v:  0)
1:  71 +  76 = 147 (v:  0)
2: 106 +  80 = 186 (v:  0)
3: 101 +  91 = 192 (v:  0)
4:  68 +  54 = 122 (v:  0)
5:  63 +  80 = 143 (v:  0)
6:  76 +  64 = 140 (v:  0)
7: 100 +  44 = 144 (v:  0)
8:  74 +  66 = 140 (v:  0)
9: 102 + 109 = 211 (v:  0)
```

Remember I am looking for the best which, in this case, is the minimum; that happens to be particle 4. 122 is the closest to the goal of 72.

No one has velocities yet as this is the initial run.

Epoch: 1 happens. Here I apply the velocities to each value, assuming it differs from the best's value at that index.
-------
```
  0:  55 +  88 = 143 (v:-10)
1:  61 +  66 = 127 (v:-10)
2:  96 +  70 = 166 (v:-10)
3:  98 +  88 = 186 (v: -3)
4:  68 +  54 = 122 (v:  0)
5:  53 +  70 = 123 (v:-10)
6:  66 +  54 = 120 (v:-10)
7:  99 +  43 = 142 (v: -1)
8:  64 +  56 = 120 (v:-10)
9:  92 +  99 = 191 (v:-10)
```

Take particle 0.
Epoch 0:
0: 65 + 98 = 163 (v: 0)
The best:
4: 68 + 54 = 122 (v: 0)

The code compares data values to the current best's value in the same index position.

is 65 == 68? No, so applying the velocity of -10 gets us the new value of 55.
is 98 == 54? No, so applying the velocity of -10 gets us the new value of 88.

This gives us the new particle 0 (with the velocity applied) for Epoch 1:
0: 55 + 88 = 143 (v:-10)

In this case both values of particle's 0 data have been changed.

On the other hand - look at particle 6.
Epoch 0
6: 76 + 64 = 140 (v: 0)
The best:
4: 68 + 54 = 122 (v: 0)

is 76 == 68? No, so applying the velocity of -10 gets us the new value of 66.
is 64 == 54? No, so applying the velocity of -10 gets us the new value of 54. Note that 54 now matches the best's second value, so next epoch that guy's on the right track and will be left alone.

This gives us the new particle 6 (with the velocity applied) for Epoch 1:
6: 66 + 54 = 120 (v:-10)

This continues on for a few iterations until a best value is found.
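The index-by-index rule from the walk-through can be sketched like so (a toy Python version of the update step; the helper name is mine):

```python
def apply_velocity(particle, best, velocity):
    # for each index: a value that already matches the best stays put;
    # anything else gets nudged by the particle's velocity.
    return [val if val == best[i] else val + velocity
            for i, val in enumerate(particle)]

# particle 0 from Epoch 0 against the best (particle 4) with velocity -10:
apply_velocity([65, 98], [68, 54], -10)  # -> [55, 88], both values moved
# a particle whose second value already matches the best keeps it:
apply_velocity([66, 54], [68, 54], -10)  # -> [56, 54]
```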

Epoch: 9
-------
```
  0:  20 +  53 =  73 (v:  4)
1:  38 +  48 =  86 (v:  2)
2:  53 +  37 =  90 (v:  8)
3:  48 +  38 =  86 (v:  9)
4:  42 +  31 =  73 (v:-10)
5:  16 +  33 =  49 (v: -8)
6:  53 +  41 =  94 (v:  7)
7:  64 +   8 =  72 (v:-10)
8:  23 +  25 =  48 (v: -4)
9:  20 +  27 =  47 (v: -5)
!!Found!! at index 7
64 + 8 = 72
```

Fancy! Let's get back to our problem at hand.

The Code

Particle class
Small and compact. A few values: the particle's velocity, its best result so far, and the data (aka position).

```
class Particle
{
    public int pGlobalBest { get; set; } // best result this particle has seen (here doubling as the global best)
    public int Velocity { get; set; }
    public int[] Data { get; set; }

    // in this case treating global best as local.
    //public int pBestLocal { get; set; } // best local position

    public Particle()
    {
        pGlobalBest = 0;
        Velocity = 0;
    }

    public Particle(int maxPoly)
    {
        Data = new int[maxPoly];
        pGlobalBest = 0;
        Velocity = 0;
    }
}
```

PSO class

All of the constants mentioned in the problem break down, initialization, and operation.
```
class PSO
{
    private const int _lGoal = 72;
    private const int _lMaxPoly = 2;

    private const int _lMaxParticles = 10;
    private const int _lVelocityMaximum = 10;

    private const int _lInitRangeMin = 40;
    private const int _lInitRangeMax = 110;

    private const int _lMaxEpoch = 100; // time cut off.

    private const int _lW = 1;

    private List<Particle> _myParticles = null;

    private Random _r;

    public PSO()
    {
        int tempTotal = 0;
        Particle tempPart = null;

        _r = new Random();

        _myParticles = new List<Particle>();

        // randomize the input and make the list of particles.
        for (int i = 0; i < _lMaxParticles; i++)
        {
            tempTotal = 0;
            tempPart = new Particle(_lMaxPoly);

            for (int z = 0; z < _lMaxPoly; z++)
            {
                // note: Random.Next's upper bound is exclusive.
                tempPart.Data[z] = _r.Next(_lInitRangeMin, _lInitRangeMax);
                tempTotal += tempPart.Data[z];
            }
            tempPart.pGlobalBest = tempTotal;
            _myParticles.Add(tempPart); // don't forget to add the particle to the swarm.
        }
    }

    public void DoPSO()
    {
        bool bFound = false;
        int lEpoch = 0;
        int lBestIndex = 0;
        int lBestIndexTemp = 0;

        // run until found or time runs out.
        while ((!bFound) && (lEpoch < _lMaxEpoch))
        {
            //-- 1. Print out the data so far.
            Console.WriteLine("Epoch: " + lEpoch);
            Console.WriteLine("-------");
            PrintParticles(true);

            //-- 2. Check to see if a solution was found.
            for (int i = 0; i < _lMaxParticles; i++)
            {
                if (GetResults(i) == _lGoal)
                {
                    bFound = true;
                    Console.WriteLine("!!Found!! at index " + i.ToString());
                    PrintParticle(i);
                    break;
                }
            }

            //-- 3. If nothing is found then find the new best, update velocities, and update particle values.
            if (!bFound)
            {
                //-- 3.1. Get the new minimum.
                lBestIndexTemp = GetMinimumIndex();

                //-- 3.2. Compare the new minimum to the current best minimum.
                if (Math.Abs(_lGoal - GetResults(lBestIndexTemp)) < Math.Abs(_lGoal - GetResults(lBestIndex)))
                {
                    // if the new one is better, update who the best is.
                    lBestIndex = lBestIndexTemp;
                }

                Console.WriteLine("Current Best: " + lBestIndex.ToString());

                //-- 3.3. Using the new global best, update everyone's velocities.
                // -- note, for simplicity's sake 'local neighborhood best' is not used.. just global.
                SetNewVelocity(lBestIndex);

                //-- 3.4. Update all the particles' data to zero in on the goal.
                UpdateParticleValues(lBestIndex);

                //-- 3.5. Tick the clock.
                lEpoch += 1;
                Console.WriteLine(string.Empty);
            }
        }
    }

    // Tweak the particle values by the velocity to be closer to a result.
    // If the current particle's poly matches the current best's poly at that index don't update it..
    // this means some values can change while others stay the same.
    private void UpdateParticleValues(int lBestIndex)
    {
        int lTempCurrentTotal = 0;

        for (int i = 0; i < _lMaxParticles; i++)
        {
            for (int z = 0; z < _lMaxPoly; z++)
            {
                if (_myParticles[i].Data[z] != _myParticles[lBestIndex].Data[z])
                {
                    _myParticles[i].Data[z] += _myParticles[i].Velocity;
                }
            }

            lTempCurrentTotal = GetResults(i);

            // keep the new total as the particle's best if it is closer to the goal than the old best.
            if (Math.Abs(_lGoal - lTempCurrentTotal) < Math.Abs(_lGoal - _myParticles[i].pGlobalBest))
            {
                _myParticles[i].pGlobalBest = lTempCurrentTotal;
            }
        }
    }

    // Based on the PSO equation (see the top) - determine each particle's new velocity based on the best,
    // but keep it inside the constraints so it doesn't wander too far.
    // Note: to keep this simple 'w' is 1 and is not a factor.
    private void SetNewVelocity(int lBestIndex)
    {
        int lCurrentResult = 0;
        int lBestResult = 0;
        int lTempNewVelocity = 0;

        lBestResult = GetResults(lBestIndex);

        for (int i = 0; i < _lMaxParticles; i++)
        {
            lCurrentResult = GetResults(i);
            lTempNewVelocity = (int)(_lW * _myParticles[i].Velocity + // inertia
                2 * _r.NextDouble() * (_myParticles[i].pGlobalBest - lCurrentResult) + // cognitive
                2 * _r.NextDouble() * (lBestResult - lCurrentResult)); // social

            // keep the velocity constrained so it is not going wild.
            if (lTempNewVelocity > _lVelocityMaximum)
                _myParticles[i].Velocity = _lVelocityMaximum;
            else if (lTempNewVelocity < -_lVelocityMaximum)
                _myParticles[i].Velocity = -_lVelocityMaximum;
            else
                _myParticles[i].Velocity = lTempNewVelocity;
        }
    }

    // Unique to finding this solution because I want to narrow in from large numbers to the closest match to the goal.
    // In this stripped down example I am treating local neighborhood best and global best as the same thing.
    // This would be considered 'global best'. You could implement a 'neighborhood best' in a different function.
    private int GetMinimumIndex()
    {
        int lLowestIndex = 0;

        // examine all the particles and find the one now closest to the goal.
        for (int i = 0; i < _lMaxParticles; i++)
        {
            if (Math.Abs(_lGoal - GetResults(i)) < Math.Abs(_lGoal - GetResults(lLowestIndex)))
            {
                lLowestIndex = i;
            }
        }

        return lLowestIndex;
    }

    // Get the current 'fitness' based on the equation we are trying to solve.
    private int GetResults(int index)
    {
        int lResults = 0;

        for (int i = 0; i < _lMaxPoly; i++)
        {
            lResults += _myParticles[index].Data[i];
        }

        return lResults;
    }

    private void PrintParticles(bool lineNumbers)
    {
        string sTemp = string.Empty;

        for (int i = 0; i < _lMaxParticles; i++)
        {
            sTemp = string.Empty;

            if (lineNumbers)
                sTemp += i.ToString().PadLeft(3) + ": ";

            for (int z = 0; z < _lMaxPoly; z++)
            {
                sTemp += _myParticles[i].Data[z].ToString().PadLeft(3); // print the value itself..
                if (z != _lMaxPoly - 1)
                    sTemp += " + "; // ..and a plus sign between values.
            }

            sTemp += " = " + GetResults(i).ToString().PadLeft(3);
            sTemp += " (v:" + _myParticles[i].Velocity.ToString().PadLeft(3) + ")";
            Console.WriteLine(sTemp);
        }
    }

    private void PrintParticle(int index)
    {
        string sTemp = string.Empty;

        for (int z = 0; z < _lMaxPoly; z++)
        {
            sTemp += _myParticles[index].Data[z].ToString();
            if (z != _lMaxPoly - 1)
                sTemp += " + ";
        }

        sTemp += " = " + GetResults(index);
        Console.WriteLine(sTemp);
    }
}
```

A sample run:

Not too bad for applying some general concepts to a problem. I want to stress, again, that how you encode your problem and determine fitness is coupled to the solution you are looking for. This tutorial is not a set-in-stone boilerplate. Not all problems will be searching for a minimum, and many will use a local neighborhood best. The Particle class may grow bigger, have more parts, or have fewer.

Also I want to reiterate that basic PSO is not guaranteed to find the most optimal solution, but it will certainly try to find _a_ solution. You can certainly tweak your run to collect solutions and determine a 'most optimal', but that is on a problem-by-problem basis.

FYI - I made the code tweakable enough in the constants you can try for different goals, number of polynomials, ranges, etc.

https://github.com/m...armOptimization

Some ideas for where to take this next:
• Use local neighborhood best
• Try to make a visualization for it
• Use it for finding the best path in a graph
• Investigate a multi-objective problem

Reference: Swarm Intelligence by Russell C. Eberhart, Yuhui Shi, and James Kennedy.



## Replies To: Intro to Particle Swarm Optimization

### #2 ge∅

Reputation: 145
• Posts: 974
• Joined: 21-November 13

Posted 03 September 2016 - 10:57 AM

I didn't get everything. I will have to read it a couple more times, but I find it very interesting. And as a graphic designer I'm almost more interested in the steps than in the solution, for a graphical representation (or animation) of the process could make for nice visuals.

### #3 modi123_1


Posted 03 September 2016 - 11:43 AM

I'm not following... huh?

### #4 ge∅


Posted 03 September 2016 - 12:41 PM

I guess it's normal. I understand myself. Everything is fine.

The process of finding solutions by iteration is very exciting. There are some interesting properties that emerge when you try to represent the process (graphically or by means of sound or whatever). For example, you can find interesting visuals representing the process of finding solutions to the shortest path problem. I've also seen a lot of self-organising algorithms mimicking, for example, magnetism, which produce very impressive images when you watch them happen. I remember one which was modelling the emergence of vowels: there were independent cells emitting a sound at random but influenced by the sound emitted by their neighbours. You would think that, given some time, they would eventually all emit the same sound, but in fact clusters of "vowels" emerged, which is cool enough in itself. I've actually heard an audio representation of the different steps, allowing me to hear the pattern appearing, and it was awesome. On a side note, I was also very excited by Google DeepMind for similar reasons.

So, yes, it's completely off topic, but particle systems are just as cool as other algorithms for that matter. I think what makes all of these interesting artistically is that there is a transition between chaos and order, randomness and patterns, and art is all about making and disturbing patterns.

This post has been edited by ge∅: 03 September 2016 - 12:41 PM