Fast Artificial Neural Network

Original author(s): Steffen Nissen
Initial release: November 2003
Stable release: 2.2.0 / 24 January 2012
Written in: C
Operating system: Cross-platform
Size: ~2 MB
Type: Library
License: LGPL
Website: leenissen.dk/fann/wp/, github.com/libfann/fann

Fast Artificial Neural Network (FANN) is a cross-platform, open-source programming library for developing multilayer feedforward artificial neural networks.

Characteristics

FANN supports cross-platform execution of single- and multilayer networks. It also supports both fixed-point and floating-point arithmetic, and it includes functions that simplify the creation, training, and testing of neural networks. It has bindings for over 20 programming languages, including commonly used languages such as C# and Python.
The FANN website lists multiple graphical user interfaces available for use with the library, such as FANNTool, Agiel Neural Network, Neural View, FannExplorer, sfann, and others. These graphical interfaces make FANN accessible to users who are not familiar with programming or who want a simple out-of-the-box solution.
Training in FANN is carried out through backpropagation. The internal training functions are optimized to reduce training time.
Trained artificial neural networks can be stored as .net files, allowing ANNs to be saved and loaded quickly for later use or further training. This lets the user split training into multiple sessions, which can be useful when dealing with large training datasets or sizable neural networks.

History

FANN was originally written by Steffen Nissen. Its original implementation is described in Nissen's 2003 report Implementation of a Fast Artificial Neural Network Library (FANN),[1] submitted to the Department of Computer Science at the University of Copenhagen (DIKU). In the report, Nissen explains that one of his primary motivations was to develop a neural network library that supported both fixed-point and floating-point arithmetic. He wanted to build an autonomous agent that could learn from experience, with the goal of using it to create a virtual player for Quake III Arena that could learn from gameplay.
Since its original 1.0.0 release, the library's functionality has been expanded by its creator and many contributors to include more practical constructors, additional activation functions, simpler access to parameters, and bindings to multiple programming languages. It has been downloaded 450,000 times since its move to SourceForge in 2003, 29,000 of them in 2016 alone.

The source code is now hosted on GitHub. The project was inactive from November 2015 to May 2018, and some users noted in the issue tracker that the author could no longer be reached. Since 2018, development has been active again, with contributions from several collaborators.[2]

Research

According to Google Scholar, the original FANN report by Steffen Nissen has been cited 337 times. The library has been used in research on image recognition, machine learning, biology, genetics, aerospace engineering, environmental science, and artificial intelligence.
Some notable publications that have cited FANN include:

  • Supervised pattern classification based on optimum-path forest[3]
  • Efficient supervised optimum-path forest classification for large datasets[4]
  • A Multilevel Mixture-of-Experts Framework for Pedestrian Classification[5]
  • A stochastic model updating technique for complex aerospace structures[6]
  • Prediction of Local Structural Stabilities of Proteins from Their Amino Acid Sequences[7]

Language Bindings

While FANN was originally written in C, FANN contributors have created bindings for more than 20 other programming languages.

References

  1. Nissen, Steffen. Implementation of a Fast Artificial Neural Network Library (FANN). Department of Computer Science, University of Copenhagen (DIKU), 2003.
  2. https://github.com/libfann/fann/issues/95
  3. Papa, J. P. Supervised pattern classification based on optimum-path forest. International Journal of Imaging Systems and Technology, 2009.
  4. Papa, J. P. Efficient supervised optimum-path forest classification for large datasets. Pattern Recognition, 2012.
  5. Enzweiler, M. A Multilevel Mixture-of-Experts Framework for Pedestrian Classification. IEEE Transactions on Image Processing, 2011.
  6. Goller, B. A stochastic model updating technique for complex aerospace structures. Finite Elements in Analysis and Design, 2011.
  7. Tartaglia, G. G. Prediction of Local Structural Stabilities of Proteins from Their Amino Acid Sequences. Structure, 2006.