# Fly algorithm

## History

The Fly Algorithm is a type of cooperative coevolution based on the Parisian approach.[1] It was first developed in 1999, in the scope of applying evolutionary algorithms to computer stereo vision.[2][3] Unlike the classical image-based approach to stereovision, which extracts image primitives and then matches them to obtain 3-D information, the Fly Algorithm is based on the direct exploration of the 3-D space of the scene. A fly is defined as a 3-D point described by its coordinates (x, y, z). Once a random population of flies has been created in a search space corresponding to the field of view of the cameras, its evolution (based on the evolution strategy paradigm) uses a fitness function that evaluates how likely the fly is to lie on the visible surface of an object, based on the consistency of its image projections. To this end, the fitness function uses the grey levels, colours and/or textures of the fly's calculated projections.
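As an illustration, this projection-consistency fitness can be sketched in a few lines of Python. The camera projection functions and images below are hypothetical placeholders for a calibrated stereo rig, not code from the cited papers:

```python
import numpy as np

def fly_fitness(fly, project_left, project_right, image_left, image_right):
    """Consistency of a fly's two image projections: a fly lying on the
    visible surface of an object should project onto pixels with similar
    grey levels in both cameras. Lower values indicate a better fly."""
    ul, vl = project_left(fly)      # pixel coordinates in the left image
    ur, vr = project_right(fly)     # pixel coordinates in the right image
    return abs(float(image_left[vl, ul]) - float(image_right[vr, ur]))
```

A real implementation would use the calibrated projection matrices of both cameras and typically compare small neighbourhoods (textures) rather than single pixels.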

The first application field of the Fly Algorithm has been stereovision.[2][3][4][5] While classical "image priority" approaches use features matched between the stereo images to build a 3-D model, the Fly Algorithm directly explores the 3-D space and uses image data to evaluate the validity of 3-D hypotheses. A variant called "Dynamic Flies" defines a fly as a 6-tuple (x, y, z, x', y', z') that includes the fly's velocity.[6][7] The velocity components are not explicitly taken into account in the fitness calculation, but they are used when updating the flies' positions and are subject to similar genetic operators (mutation, crossover).
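A minimal sketch of one time step for a Dynamic Fly, assuming a simple Euler position update; the time step and mutation strength are illustrative values, not parameters from the cited work:

```python
import random

def update_dynamic_fly(fly, dt=1.0, sigma=0.0):
    """One time step for a 'Dynamic Fly' (x, y, z, x', y', z'): the position
    is advanced using the velocity components, and an optional Gaussian
    mutation perturbs the position (a stand-in for the genetic operators)."""
    x, y, z, vx, vy, vz = fly
    x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
    if sigma > 0:
        x, y, z = (c + random.gauss(0.0, sigma) for c in (x, y, z))
    return (x, y, z, vx, vy, vz)
```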

The application of Flies to obstacle avoidance in vehicles[8] exploits the fact that the population of flies is a time-compliant, quasi-continuously evolving representation of the scene, and uses it to generate vehicle control signals directly from the flies. The use of the Fly Algorithm is not strictly restricted to stereo images: other sensors (e.g. acoustic proximity sensors) may be added as additional terms to the fitness function being optimised. Odometry information can also be used to speed up the updating of the flies' positions, and conversely the flies' positions can be used to provide localisation and mapping information.[9]

Another application field of the Fly Algorithm is reconstruction for emission tomography in nuclear medicine. The Fly Algorithm has been successfully applied in single-photon emission computed tomography[10] and positron emission tomography.[11][12] Here, each fly is considered a photon emitter, and its fitness is based on the conformity of the simulated illumination of the sensors with the actual pattern observed on the sensors. Within this application, the fitness function has been redefined to use the new concept of 'marginal evaluation': the fitness of one individual is calculated as its (positive or negative) contribution to the quality of the global population, based on the leave-one-out cross-validation principle. A global fitness function evaluates the quality of the population as a whole; only then is the fitness of an individual (a fly) calculated, as the difference between the global fitness values of the population with and without the particular fly whose individual fitness has to be evaluated.[13][14] In [15], the fitness of each fly is considered a 'level of confidence' and is used during the voxelisation process to tweak the fly's individual footprint using implicit modelling (such as metaballs), producing smoother, more accurate results.
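The marginal evaluation can be written directly from its definition. In the sketch below, the error metric is a toy stand-in for the projection-based global fitness of the cited papers:

```python
def marginal_fitness(fly, population, error_metric):
    """Leave-one-out ('marginal') evaluation: the local fitness of a fly is
    the difference between the global fitness (an error to be minimised) of
    the population without the fly and with it. A positive value means the
    population performs better when the fly is included."""
    without = [f for f in population if f is not fly]
    return error_metric(without) - error_metric(population)
```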

More recently, the Fly Algorithm has been used in digital art to generate mosaic-like images or spray paint.[16] Examples of images can be found on YouTube.

## Parisian Evolution

In Parisian Evolution, the population of individuals is considered as a society in which the individuals collaborate toward a common goal. This is implemented using an evolutionary algorithm that includes all the common genetic operators (e.g. mutation, crossover, selection). The main difference lies in the fitness function; two levels of fitness function are used:

- A local fitness function to assess the performance of a given individual (usually used during the selection process).
- A global fitness function to assess the performance of the whole population. Maximising (or minimising, depending on the problem considered) this global fitness is the goal of the population.

In addition, a diversity mechanism is required to avoid individuals gathering in only a few areas of the search space. Another difference is in the extraction of the problem solution once the evolutionary loop terminates. In classical evolutionary approaches, the best individual corresponds to the solution and the rest of the population is discarded. Here, all the individuals (or individuals of a sub-group of the population) are collated to build the problem solution. The way the fitness functions are constructed and the way the solution extraction is made are of course problem-dependent.
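The difference in solution extraction can be summarised with two illustrative functions; the predicate used to retain individuals is problem-dependent:

```python
def extract_solution_classical(population, fitness):
    """Classical EA: the solution is the single best individual;
    the rest of the population is discarded."""
    return max(population, key=fitness)

def extract_solution_parisian(population, contributes):
    """Parisian EA: the solution is built by collating all the individuals
    (or a retained subset, selected by the `contributes` predicate)."""
    return [ind for ind in population if contributes(ind)]
```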

The Fly Algorithm is one example of a Parisian Evolution application.

## Disambiguation

### Parisian approach vs cooperative coevolution

Cooperative coevolution is a broad class of evolutionary algorithms in which a complex problem is solved by decomposing it into subcomponents that are solved independently. The Parisian approach shares many similarities with cooperative coevolutionary algorithms, but it uses a single population, whereas cooperative coevolutionary algorithms may use multiple species. Similar internal evolutionary engines are used in classical evolutionary algorithms, cooperative coevolutionary algorithms and Parisian evolution. The difference between a cooperative coevolutionary algorithm and Parisian evolution lies in the semantics of the population. A cooperative coevolutionary algorithm divides a big problem into sub-problems (groups of individuals) and solves them separately, working toward the overall problem.[17] There is no interaction or breeding between individuals of different sub-populations, only between individuals of the same sub-population. In contrast, a Parisian evolutionary algorithm solves the whole problem with one population: all the individuals cooperate to drive the population toward attractive areas of the search space.

### Fly Algorithm vs particle swarm optimisation

Cooperative coevolution and particle swarm optimisation (PSO) share many similarities. PSO is inspired by the social behaviour of bird flocking and fish schooling.[18][19] It was initially introduced as a tool for realistic animation in computer graphics. It uses complex individuals that interact with each other to build visually realistic collective behaviours by adjusting the individuals' behavioural rules (which may use random generators). In mathematical optimisation, every particle of the swarm follows its own random path biased toward the best particle of the swarm. In the Fly Algorithm, the flies aim at building spatial representations of a scene from actual sensor data; flies do not communicate or explicitly cooperate, and do not use any behavioural model.

Both algorithms are search methods that start with a set of random solutions, which are iteratively corrected toward a global optimum. However, the solution of the optimisation problem in the Fly Algorithm is the population (or a subset of it): the flies implicitly collaborate to build the solution. In PSO, the solution is a single particle, the one with the best fitness. Another main difference between the Fly Algorithm and PSO is that the Fly Algorithm is not based on any behavioural model: it only builds a geometrical representation.
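The contrast can be made concrete with the canonical PSO update rule, which has no counterpart in the Fly Algorithm; the coefficient values below are common illustrative defaults, not values from the cited papers:

```python
import random

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """Canonical one-dimensional PSO update: the particle's velocity is
    pulled toward its own best position (pbest) and the swarm's best
    (gbest). Flies have no such social term: their moves come only from
    genetic operators selected according to the fitness."""
    r1, r2 = random.random(), random.random()
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v
```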

## Example: Tomography reconstruction

*Figure: sinogram $(Y)$ of $f$, which is known.*

*Figure: example of reconstruction of a hot rod phantom using the Fly Algorithm.*

Tomography reconstruction is an inverse problem that is often ill-posed due to missing data and/or noise. The answer to the inverse problem is not unique, and in the case of extreme noise levels it may not even exist. The input data of a reconstruction algorithm may be given as the Radon transform or sinogram $(Y)$ of the data to reconstruct $(f)$. $f$ is unknown; $Y$ is known. The data acquisition in tomography can be modelled as:

$$Y = P[f] + \epsilon$$

where $P$ is the system matrix or projection operator and $\epsilon$ corresponds to some Poisson noise. In this case, the reconstruction corresponds to the inversion of the Radon transform:

$$f = P^{-1}[Y]$$

Note that $P^{-1}$ can account for noise, acquisition geometry, etc. The Fly Algorithm is an example of iterative reconstruction. Iterative methods in tomographic reconstruction are relatively easy to model:

$$\hat{f} = \operatorname{arg\,min}_{\hat{f}} \left\| Y - \hat{Y} \right\|_{2}^{2}$$

where $\hat{f}$ is an estimate of $f$ that minimises an error metric (here the 2-norm, but other error metrics could be used) between $Y$ and $\hat{Y}$, the projections computed from $\hat{f}$. Note that a regularisation term can be introduced to prevent overfitting and to smooth noise whilst preserving edges. Iterative methods can be implemented as follows:

*Figure: iterative correction in tomography reconstruction.*

1. The reconstruction starts from an initial estimate of the image (generally a constant image).
2. Projection data are computed from this image.
3. The estimated projections are compared with the measured projections.
4. Corrections are made to the estimated image.
5. The algorithm iterates until convergence of the estimated and measured projection sets.
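The acquisition model and the iterative correction steps above can be sketched with a toy forward model and a plain gradient-style update. This is a didactic sketch under simplified assumptions (a linear system matrix, a fixed step size), not one of the published reconstruction codes:

```python
import numpy as np

rng = np.random.default_rng(0)

def acquire(P, f):
    """Simulate the data acquisition Y = P[f] + noise, with Poisson noise
    as in emission tomography: the measurement is a Poisson realisation
    of the noise-free projections P @ f."""
    return rng.poisson(P @ f).astype(float)

def iterative_reconstruction(P, Y, n_iterations=200, step=0.1):
    """Iterative correction: start from a constant image, project it,
    compare the estimated projections with the data, and correct the
    estimate from the residual; repeat until convergence."""
    f_hat = np.full(P.shape[1], Y.mean())     # constant initial estimate
    for _ in range(n_iterations):
        Y_hat = P @ f_hat                     # projections of the estimate
        f_hat += step * (P.T @ (Y - Y_hat))   # correction from the residual
    return f_hat
```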


The pseudocode below is a step-by-step description of the Fly Algorithm for tomographic reconstruction. The algorithm follows the steady-state paradigm. For illustrative purposes, advanced genetic operators, such as mitosis and dual mutation,[22][23] are ignored. A JavaScript implementation can be found on Fly4PET.

algorithm fly-algorithm is
input: the number of flies (N),
       the input projection data (preference)

output: the fly population (F),
the projections estimated from F (pestimated)
the 3-D volume corresponding to the voxelisation of F (VF)

postcondition: the difference between pestimated and preference is minimal.

START

1.   // Initialisation
2.   // Set the position of the N flies, i.e. create initial guess
3.   for each fly i in fly population F do
4.       F(i)x ← random(0, 1)
5.       F(i)y ← random(0, 1)
6.       F(i)z ← random(0, 1)
7.       Add F(i)'s projection in pestimated
8.
9.   // Compute the population's performance (i.e. the global fitness)
10.   Gfitness(F) ← Errormetrics(preference, pestimated)
11.
12.   fkill ← Select a random fly of F
13.
14.   Remove fkill's contribution from pestimated
15.
16.   // Compute the population's performance without fkill
17.   Gfitness(F-{fkill}) ← Errormetrics(preference, pestimated)
18.
19.   // Compare the performances, i.e. compute the fly's local fitness
20.   Lfitness(fkill) ← Gfitness(F-{fkill}) - Gfitness(F)
21.
22.   If the local fitness is greater than 0, // Thresholded-selection of a bad fly that can be killed
23.       then go to Step 26.   // fkill is a good fly (the population's performance is better when fkill is included): we should not kill it
24.       else go to Step 28.   // fkill is a bad fly (the population's performance is worse when fkill is included): we can get rid of it
25.
26.   Restore the fly's contribution, then go to Step 12.
27.
28.   Select a genetic operator
29.
30.   If the genetic operator is mutation,
31.       then go to Step 34.
32.       else go to Step 50.
33.
34.   freproduce ← Select a random fly of F
35.
36.   Remove freproduce's contribution from pestimated
37.
38.   // Compute the population's performance without freproduce
39.   Gfitness(F-{freproduce}) ← Errormetrics(preference, pestimated)
40.
41.   // Compare the performances, i.e. compute the fly's local fitness
42.   Lfitness(freproduce) ← Gfitness(F-{freproduce}) - Gfitness(F)
43.
44.   Restore the fly's contribution
45.
46.   If the local fitness is lower than or equal to 0, // Thresholded-selection of a good fly that can reproduce
47.       then go to Step 34.   // freproduce is a bad fly (the population's performance is no better when it is included): we should not allow it to reproduce
48.       else go to Step 53.   // freproduce is a good fly: we can allow it to reproduce
49.
50.   // New blood / Immigration
51.   Replace fkill by a new fly with a random position, go to Step 57.
52.
53.   // Mutation
54.   Copy freproduce into fkill
55.   Slightly and randomly alter fkill's position
56.
57.   Add the new fly's contribution to pestimated
58.
59.   If stop the reconstruction,
60.       then go to Step 63.
61.       else go to Step 10.
62.
63.   // Extract solution
64.   VF ← voxelisation of F
65.
66.   return VF

END
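For concreteness, the steady-state loop above can be condensed into a short Python sketch. Here `project` stands in for computing one fly's contribution to the estimated projections, the error metric is the 2-norm, and the advanced operators are omitted as in the pseudocode; the mutation strength and operator probability are illustrative choices:

```python
import random
import numpy as np

def fly_algorithm(p_reference, project, n_flies=50, n_iterations=3000, seed=0):
    """Steady-state Fly Algorithm with marginal (leave-one-out) fitness.
    `project(fly)` returns that fly's contribution to the estimated
    projections as an array; the global fitness is the 2-norm error."""
    rnd = random.Random(seed)
    new_fly = lambda: (rnd.random(), rnd.random(), rnd.random())
    flies = [new_fly() for _ in range(n_flies)]            # initial guess
    p_estimated = sum((project(f) for f in flies),
                      np.zeros_like(p_reference))
    error = lambda p: float(np.linalg.norm(p_reference - p))

    for _ in range(n_iterations):
        # Select a random fly and compute its marginal (local) fitness.
        i = rnd.randrange(n_flies)
        p_without = p_estimated - project(flies[i])
        if error(p_without) - error(p_estimated) > 0:
            continue                   # good fly: restore it, pick again
        # The selected fly is bad: kill it and create a replacement.
        if rnd.random() < 0.5:
            # Mutation: clone a fly whose marginal fitness is positive.
            j = rnd.randrange(n_flies)
            if error(p_estimated - project(flies[j])) - error(p_estimated) <= 0:
                continue               # bad fly: it should not reproduce
            child = tuple(c + rnd.gauss(0.0, 0.05) for c in flies[j])
        else:
            child = new_fly()          # new blood / immigration
        flies[i] = child
        p_estimated = p_without + project(child)
    return flies, p_estimated
```

On a toy problem where each fly contributes one count to a 1-D histogram, the population redistributes itself until the estimated projections match the reference, after which every fly has a positive marginal fitness and the population freezes.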


## Example: Digital arts

*Figure: evolutionary search.*

*Figure: image reconstructed after optimisation using a set of stripes as the pattern for each tile.*

In this example, an input image is to be approximated by a set of tiles (for example, as in an ancient mosaic). A tile has an orientation (angle θ), three colour components (R, G, B), a size (w, h) and a position (x, y, z). If there are N tiles, there are 9N unknown floating-point numbers to guess. In other words, for 5,000 tiles there are 45,000 numbers to find. Using a classical evolutionary algorithm, where the answer of the optimisation problem is the best individual, the genome of an individual would be made of 45,000 genes. This approach would be extremely costly in terms of complexity and computing time. The same applies to any classical optimisation algorithm. Using the Fly Algorithm, every individual mimics a tile and can be individually evaluated using its local fitness to assess its contribution to the population's performance (the global fitness). Here an individual has 9 genes instead of 9N, and there are N individuals. The problem can be solved as a reconstruction problem as follows:

$$\mathrm{reconstruction} = \operatorname{arg\,min} \sum_{x=1}^{W} \sum_{y=1}^{H} \left( input(x, y) - P[F](x, y) \right)^{2}$$

where $input$ is the input image, $x$ and $y$ are the pixel coordinates along the horizontal and vertical axes respectively, $W$ and $H$ are the image width and height in number of pixels respectively, $F$ is the fly population, and $P$ is a projection operator that creates an image from flies. This projection operator $P$ can take many forms. In her work, Z. Ali Abbood[16] uses OpenGL to generate different effects (e.g. mosaics or spray paint). To speed up the evaluation of the fitness functions, OpenCL is used too. The algorithm starts with a population $F$ that is randomly generated (see Line 3 in the algorithm above). $F$ is then assessed using the global fitness to compute $G_{fitness}(F) = \sum_{x=1}^{W} \sum_{y=1}^{H} \left( input(x, y) - P[F](x, y) \right)^{2}$ (see Line 10). $G_{fitness}$ is an error metric that has to be minimised.
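The tile encoding and the global fitness above can be sketched as follows; the renderer standing in for the OpenGL projection operator $P$ is a placeholder supplied by the caller:

```python
import random
import numpy as np

N_GENES = 9   # x, y, z, angle, R, G, B, w, h: one tile = one fly

def random_tile(rnd):
    """One individual of the Fly Algorithm: a single tile described by
    9 genes, instead of a classical-EA genome carrying all 9N numbers."""
    return [rnd.random() for _ in range(N_GENES)]

def global_fitness(image, population, render):
    """Sum of squared pixel differences between the input image and the
    image rendered from the whole tile population (to be minimised).
    `render` is a stand-in for the projection operator P."""
    return float(np.sum((image - render(population)) ** 2))
```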

## References

1. ^ Collet, Pierre; Louchet, Jean (Oct 2009). "Artificial evolution and the Parisian approach: applications in the processing of signals and images". In Siarry, Patrick. Optimization in Signal and Image Processing. Wiley-ISTE. ISBN 9781848210448.
2. ^ a b c Louchet, Jean (Feb 2000). L’algorithme des mouches : une stratégie d’évolution individuelle appliquée en stéréovision. Reconnaissance des Formes et Intelligence Artificielle (RFIA2000).
3. ^ a b c Louchet, Jean (Sep 2000). Stereo analysis using individual evolution strategy. Proceedings of 15th International Conference on Pattern Recognition, 2000 (ICPR’00). Barcelona, Spain: IEEE. pp. 908–911. doi:10.1109/ICPR.2000.905580. ISBN 0-7695-0750-6.
4. ^ a b Louchet, Jean (Jun 2001). "Using an Individual Evolution Strategy for Stereovision". Genetic Programming and Evolvable Machines. Kluwer Academic Publishers. 2 (2): 101–109. doi:10.1023/A:1011544128842.
5. ^ a b Boumaza, Amine; Louchet, Jean (Apr 2003). "Mobile robot sensor fusion using flies". Lecture Notes in Computer Science. European Conference on Genetic Programming (EuroGP 2003). 2611. Essex, UK: Springer. pp. 357–367. doi:10.1007/3-540-36605-9_33. ISBN 978-3-540-00976-4.
6. ^ a b Louchet, Jean; Guyon, Maud; Lesot, Marie-Jeanne; Boumaza, Amine (Mar 2002). "L'algorithme des mouches dynamiques: guider un robot par évolution artificielle en temps réel". In Lattaud, Claude. Apprentissage Automatique et Evolution Artificielle (PDF) (in French). Hermes Sciences Publications. ISBN 274620360X.
7. ^ a b Louchet, Jean; Guyon, Maud; Lesot, Marie-Jeanne; Boumaza, Amine (Jan 2002). "Dynamic Flies: a new pattern recognition tool applied to stereo sequence processing" (PDF). Pattern Recognition Letters. Elsevier Science B.V. 23 (1–3): 335–345. doi:10.1016/S0167-8655(01)00129-5.
8. ^ a b Boumaza, Amine; Louchet, Jean (Apr 2001). "Dynamic Flies: Using Real-time evolution in Robotics". Lecture Notes in Computer Science. Artificial Evolution in Image Analysis and Signal Processing (EVOIASP2001). 2037. Como, Italy: Springer. pp. 288–297. doi:10.1007/3-540-45365-2_30. ISBN 978-3-540-41920-4.
9. ^ a b Louchet, Jean; Sapin, Emmanuel (2009). "Flies Open a Door to SLAM.". Lecture Notes in Computer Science. Applications of Evolutionary Computation (EvoApplications 2009). 5484. Tübingen, Germany: Springer. pp. 385–394. doi:10.1007/978-3-642-01129-0_43.
10. ^ a b Bousquet, Aurélie; Louchet, Jean-Marie; Rocchisani, Jean (Oct 2007). "Fully Three-Dimensional Tomographic Evolutionary Reconstruction in Nuclear Medicine" (PDF). Lecture Notes in Computer Science. Proceedings of the 8th international conference on Artificial Evolution (EA’07). 4926. Tours, France: Springer, Heidelberg. pp. 231–242. doi:10.1007/978-3-540-79305-2_20. ISBN 978-3-540-79304-5.
11. ^ a b Vidal, Franck P.; Lazaro-Ponthus, Delphine; Legoupil, Samuel; Louchet, Jean; Lutton, Évelyne; Rocchisani, Jean-Marie (Oct 2009). "Artificial evolution for 3D PET reconstruction" (PDF). Lecture Notes in Computer Science. Proceedings of the 9th international conference on Artificial Evolution (EA’09). 5975. Strasbourg, France: Springer, Heidelberg. pp. 37–48. doi:10.1007/978-3-642-14156-0_4. ISBN 978-3-642-14155-3.
12. ^ a b Vidal, Franck P.; Louchet, Jean; Lutton, Évelyne; Rocchisani, Jean-Marie (Oct–Nov 2009). "PET reconstruction using a cooperative coevolution strategy in LOR space". IEEE Nuclear Science Symposium Conference Record (NSS/MIC), 2009. Medical Imaging Conference (MIC). Orlando, Florida: IEEE. pp. 3363–3366. doi:10.1109/NSSMIC.2009.5401758.
13. ^ a b Vidal, Franck P.; Louchet, Jean; Rocchisani, Jean-Marie; Lutton, Évelyne (Apr 2010). "New genetic operators in the Fly Algorithm: application to medical PET image reconstruction" (PDF). Lecture Notes in Computer Science. European Workshop on Evolutionary Computation in Image Analysis and Signal Processing (EvoIASP’10). 6024. Istanbul, Turkey: Springer, Heidelberg. pp. 292–301. doi:10.1007/978-3-642-12239-2_30. ISBN 978-3-642-12238-5.
14. ^ a b Vidal, Franck P.; Lutton, Évelyne; Louchet, Jean; Rocchisani, Jean-Marie (Sep 2010). "Threshold selection, mitosis and dual mutation in cooperative coevolution: application to medical 3D tomography" (PDF). Lecture Notes in Computer Science. International Conference on Parallel Problem Solving From Nature (PPSN'10). 6238. Krakow, Poland: Springer, Heidelberg. pp. 414–423. doi:10.1007/978-3-642-15844-5_42.
15. ^ a b Ali Abbood, Zainab; Lavauzelle, Julien; Lutton, Évelyne; Rocchisani, Jean-Marie; Louchet, Jean; Vidal, Franck P. (2017). "Voxelisation in the 3-D Fly Algorithm for PET" (PDF). Swarm and Evolutionary Computation. Elsevier. ??? (???): ???. doi:10.1016/j.swevo.2017.04.001. ISSN 2210-6502.
16. ^ a b c Ali Abbood, Zainab; Amlal, Othman; Vidal, Franck P. (Apr 2017). "Evolutionary Art Using the Fly Algorithm" (PDF). Lecture Notes in Computer Science. Applications of Evolutionary Computation (EvoApplications 2017). 10199. Amsterdam, The Netherlands: Springer. pp. 455–470. doi:10.1007/978-3-319-55849-3_30.
17. ^ Mesejo, Pablo; Ibanez, Oscar; Fernandez-blanco, Enrique; Cedron, Francisco; Pazos, Alejandro; Porto-pazos, Ana (2015). "Artificial Neuron – Glia Networks Learning Approach Based on Cooperative Coevolution". International Journal of Neural Systems. 25 (4): 1550012. doi:10.1142/S0129065715500124.
18. ^ Kennedy, J; Eberhart, R (1995). Particle swarm optimization. Proceedings of IEEE International Conference on Neural Networks. IEEE. pp. 1942–1948. doi:10.1109/ICNN.1995.488968.
19. ^ Shi, Y; Eberhart, R (1998). A modified particle swarm optimizer. Proceedings of IEEE International Conference on Evolutionary Computation. IEEE. pp. 69–73. doi:10.1109/ICEC.1998.699146.
20. ^ Abbood, Zainab Ali; Vidal, Franck P. (2017). "Basic, Dual, Adaptive, and Directed Mutation Operators in the Fly Algorithm". Lecture Notes in Computer Science. 13th Biennial International Conference on Artificial Evolution (EA-2017). Paris, France. pp. 106–119. ISBN 978-2-9539267-7-4.
21. ^ Abbood, Zainab Ali; Vidal, Franck P. (Oct 2017). "Fly4Arts: Evolutionary Digital Art with the Fly Algorithm". Art and Science. ISTE OpenScience. 17-1 (1): 1–6. doi:10.21494/ISTE.OP.2017.0177.
22. ^ Vidal, Franck P.; Lutton, Évelyne; Louchet, Jean; Rocchisani, Jean-Marie (Sep 2010). "Threshold selection, mitosis and dual mutation in cooperative co-evolution: Application to medical 3D tomography" (PDF). Lecture Notes in Computer Science. Parallel Problem Solving from Nature - PPSN XI. 6238. Kraków, Poland: Springer Berlin / Heidelberg. pp. 414–423. doi:10.1007/978-3-642-15844-5_42. ISBN 978-3-642-15843-8.
23. ^ Ali Abbood, Zainab; Vidal, Franck P. (Oct 2017). "Basic, Dual, Adaptive, and Directed Mutation Operators in the Fly Algorithm". Lecture Notes in Computer Science. 13th Biennial International Conference on Artificial Evolution. Paris, France: Springer-Verlag.