Fast Artificial Neural Network Library

10 years of FANN

The first version of FANN was created and uploaded to SourceForge 10 years ago. It started out as a very simple ANN project, supporting only the most basic training algorithms and very little extra functionality. Back then I had no idea whether anybody but myself would ever use it, but I thought it would be selfish not to share the work with the world.

Since then a lot of functionality has been added to the core FANN project, and FANN has been downloaded more than 300,000 times from SourceForge and an unknown number of times as part of Linux distributions and other packaged systems. Besides this, the supporters of FANN have created bindings for more than 20 different programming languages, and several different UIs have been made available. A special thanks goes out to all the people who have helped extend FANN.

A few critical design decisions were made when creating the library, which helped make it as popular as it is today, and I would like to share some of the thoughts that went into these decisions.

Speed – FANN was designed with speed in mind from the very beginning, as the main reason for creating the library was to use it for image processing on a robot powered by a 206 MHz HP/Compaq iPAQ (which did not even have a floating-point processor). Even today, the core execution function of FANN has not been altered much and is extremely fast, but it could be made even faster if it were updated to support multiple cores and GPU execution.

Simplicity – FANN was designed as a replacement for an existing neural network library, jNeural, which I had used in a few projects. jNeural was written in C++ and was pretty easy to use, but with my decision to implement the library in C, it was very important to me that FANN was even simpler, so that the transition would be smooth and simple. I had my good friend Jesper look through the initial API, and his suggestions helped me clean it up quite a bit before the initial upload.

C – FANN was created in C, as many other cross-platform libraries were back then (and even are today). That decision had much more far-reaching consequences than I realized at the time. On the positive side, it has made it very easy for people to write bindings from other languages, and it has been a deciding factor in how widespread the use of FANN is today. On the negative side, C is a very old language, lacking concepts like dynamic arrays and easy memory management, and this gave me quite a headache, particularly when I implemented cascade correlation.

LGPL – I must admit that I originally didn't put much thought into the license, as I had no idea whether people would actually use the library. I basically just looked at the GPL and LGPL and decided that I would like people to be able to use FANN commercially, so I chose the LGPL. A decision I haven't regretted since.

Thanks to all the people who have been using FANN and providing feedback and extensions throughout the last 10 years. Let's make it another 10 years for FANN. Please share your thoughts in the comment section on where you would like FANN to go in the next 10 years.

Steffen

1 Comment

  1. Bob Stoughton
    October 31, 2014    

    Not sure where the best place to pose a question is, but here goes. I'd like to try either regularization (optimizing a sum of MSE plus a weighted sum of the weights) or Optimal Brain Damage, or both, to simplify a net and avoid overtraining. Are there tools for this?
