School of Computing, Dublin City University.


Let us have n items. The "fitness" score of item i is f(i). Let the probability of picking item i, p(i), be defined as:

p(i) = e^{f(i)/T} / ( e^{f(1)/T} + e^{f(2)/T} + ... + e^{f(n)/T} )
Then by varying the parameter T, we can vary the selection from picking a uniformly random item (T infinite), to giving higher probabilities to items with higher fitness (T small and finite), to strictly picking the item with the best fitness (T → 0).
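This selection rule can be sketched in a few lines of Python (a minimal sketch of the rule above, assuming the probabilities p(i) ∝ e^{f(i)/T}; the function names are mine):

```python
import math
import random

def softmax_probs(fitness, T):
    """p(i) = e^(f(i)/T) / sum over j of e^(f(j)/T)."""
    weights = [math.exp(f / T) for f in fitness]
    total = sum(weights)
    return [w / total for w in weights]

def pick(fitness, T):
    """Pick an index at random according to the softmax probabilities."""
    return random.choices(range(len(fitness)), weights=softmax_probs(fitness, T))[0]
```

For example, `softmax_probs([1.0, 2.0, 3.0], 100.0)` is nearly uniform, while with T = 0.1 almost all of the probability sits on the best item.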
T is called "temperature", by analogy with cooling. See Simulated Annealing.
T exactly equal to 0 would cause a divide-by-zero error.
Also, in software, you have to figure out safe ranges of T and f for your system (e.g. if f(i)/T is too large, computing e^{f(i)/T} will overflow).
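One common way to widen the safe range (a standard trick, not something the notes above prescribe) is to subtract the maximum of the f(i)/T values before exponentiating; the constant cancels in the ratio, so the distribution is unchanged:

```python
import math

def softmax_probs_stable(fitness, T):
    """Same distribution as the naive formula, but math.exp never sees a
    large positive argument, so it cannot overflow."""
    scaled = [f / T for f in fitness]
    m = max(scaled)                      # subtracting m cancels in the ratio
    weights = [math.exp(s - m) for s in scaled]
    total = sum(weights)
    return [w / total for w in weights]
```

With this version even `softmax_probs_stable([1000.0, 2000.0], 1.0)` is fine, where the naive `e^{2000}` would overflow a double.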
We will be keeping T > 0, and looking at what happens as T → 0.
If T > 0, and all f(i) are finite (positive or negative), then each f(i)/T is finite (positive or negative), so every e^{f(i)/T} > 0.
Hence all probabilities are in the range 0 < p(i) < 1, and they sum to 1, i.e. this is a probability distribution.
As T → 0, a probability can tend to 0, but as long as T > 0, all probabilities are strictly > 0.
Prove: As T → 0, if there are m joint best items, we pick each of them with probability 1/m, and all others with probability 0.
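A quick numerical check of this claim (not a proof — just the formula evaluated at a small T, with an example fitness list of my choosing):

```python
import math

def softmax_probs(fitness, T):
    """p(i) = e^(f(i)/T) / sum over j, computed with the max subtracted for safety."""
    scaled = [f / T for f in fitness]
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    total = sum(weights)
    return [w / total for w in weights]

# Two joint-best items (f(i) = 5) among four: as T -> 0, each should approach 1/2.
fitness = [1.0, 5.0, 3.0, 5.0]
probs = softmax_probs(fitness, 0.01)
```

At T = 0.01 the two tied items each get essentially probability 1/2, and the rest essentially 0.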
To see what happens as T → 0, consider 2 items:

i      1   2
f(i)   1   2

If T=1, probabilities are:
e^{1} / (e^{1} + e^{2})
and:
e^{2} / (e^{1} + e^{2})
i.e. about 0.27 and 0.73.
If T=1/20, probabilities are:
e^{20} / (e^{20} + e^{40})
and:
e^{40} / (e^{20} + e^{40})
i.e. pretty much 0 and 1.
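These two cases can be checked directly (the helper function `p` is mine):

```python
import math

def p(fitness, i, T):
    """Probability of picking item i under softmax selection at temperature T."""
    weights = [math.exp(f / T) for f in fitness]
    return weights[i] / sum(weights)

fitness = [1.0, 2.0]
# T = 1 gives roughly 0.27 and 0.73; T = 1/20 gives pretty much 0 and 1.
```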
In general: let us have any two items, where one has a higher fitness value:

i      1   2
f(i)   c   d

where d > c. Write N = 1/T, so T → 0 means N → ∞. Then:
p(1) = e^{cN} / ( e^{cN} + e^{dN} )
= 1 / ( 1 + e^{(d-c)N} )
→ 0
as
N → ∞
since
(d-c) > 0
So p(1) → 0 as T → 0, and hence p(2) → 1.
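A quick numeric look at this limit, taking c = 1 and d = 2 as an example (so d - c = 1):

```python
import math

# p(1) = 1 / (1 + e^((d - c) N)) with c = 1, d = 2, evaluated at growing N = 1/T.
p1 = [1 / (1 + math.exp((2 - 1) * N)) for N in (1, 5, 10, 20)]
```

The values shrink rapidly towards 0 as N grows, i.e. as T → 0.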