Abstract—The Cerebellar Model Articulation Controller (CMAC) neural network is an associative memory that is biologically inspired by the cerebellum found in the brains of animals. The standard CMAC uses the least mean squares (LMS) algorithm to train its weights. Recently, the recursive least squares (RLS) algorithm was proposed as a superior algorithm for training the CMAC online, as it can converge in a single epoch and does not require tuning of a learning rate. However, the RLS algorithm is computationally demanding, as its complexity grows with the square of the number of weights, which can be huge for the CMAC. Here, we present a more efficient RLS algorithm that uses inverse QR decomposition and additionally provides a regularized solution, improving generalization. While the inverse QR decomposition based RLS algorithm reduces computation time significantly, it is still not fast enough for use in CMACs of more than two dimensions. To further improve efficiency, we show that by using kernel methods the computational complexity of the CMAC can be made dependent on the number of unique training data points instead. Additionally, we show how modeling error can be reduced through the use of higher-order basis functions.
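As background for the complexity claim in the abstract, a minimal sketch of one standard RLS weight update is given below (not the paper's inverse-QR or kernel variants; all names are illustrative). The update of the inverse correlation matrix P involves an outer product over all n weights, which is where the O(n²) cost per training sample comes from:

```python
import numpy as np

def rls_update(w, P, x, d, lam=1.0):
    """One standard RLS step for weights w given input vector x and
    desired output d. The P update is O(n^2) in the number of weights n,
    which is why plain RLS is costly for a CMAC with many weights.
    lam is the forgetting factor (1.0 = no forgetting)."""
    Px = P @ x                       # O(n^2)
    k = Px / (lam + x @ Px)          # gain vector
    e = d - w @ x                    # a priori error
    w = w + k * e                    # weight update
    P = (P - np.outer(k, Px)) / lam  # O(n^2) inverse-correlation update
    return w, P
```

For a noiseless linear target, a few hundred such updates drive w to the least-squares solution, illustrating the one-epoch convergence RLS is valued for.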
Index Terms—artificial neural networks, CMAC, kernel methods, recursive least squares
Carl Werner Laufer is a PhD student in the Electrical and Electronic Engineering Department of the University of Auckland, New Zealand (e-mail: email@example.com).
George Coghill is a Senior Lecturer in the Electrical and Electronic Engineering Department of the University of Auckland, New Zealand (e-mail: firstname.lastname@example.org).
Cite: C. Laufer, G. Coghill, "Efficient Recursive Least Squares Methods for the CMAC Neural Network," International Journal of Machine Learning and Computing, vol. 1, no. 1, pp. 20-29, 2011.