IJMLC 2018 Vol.8(4): 354-360 ISSN: 2010-3700
DOI: 10.18178/ijmlc.2018.8.4.711

Kernel Function Tuning for Single-Layer Neural Networks

Petra Vidnerová and Roman Neruda

Abstract—This paper describes a unified learning framework for kernel networks with one hidden layer, including models such as radial basis function networks and regularization networks. The learning procedure consists of meta-parameter tuning wrapping the standard parameter optimization part. Several variants of learning are described and tested on various classification and regression problems. It is shown that meta-learning can improve the performance of models at the price of higher time complexity.
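The wrapper scheme from the abstract, in which an outer loop tunes a kernel meta-parameter while an inner step solves the standard optimization, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes a Gaussian-kernel RBF network whose centers are fixed training samples, whose output weights are found by linear least squares (the inner step), and whose kernel width `gamma` is picked by validation-set grid search (the outer, meta-parameter step).

```python
import numpy as np

def rbf_design(X, centers, gamma):
    # Gaussian kernel matrix between inputs and RBF centers
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_output_weights(X, y, centers, gamma):
    # Inner optimization: output-layer weights via linear least squares
    Phi = rbf_design(X, centers, gamma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def val_mse(X, y, centers, gamma, w):
    return float(((rbf_design(X, centers, gamma) @ w - y) ** 2).mean())

# Toy regression problem: noisy sine
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
X_tr, y_tr, X_val, y_val = X[:150], y[:150], X[150:], y[150:]
centers = X_tr[rng.choice(len(X_tr), 20, replace=False)]

# Outer loop: meta-parameter (kernel width) tuning by grid search
best_err, best_gamma = np.inf, None
for gamma in [0.01, 0.1, 1.0, 10.0]:
    w = fit_output_weights(X_tr, y_tr, centers, gamma)
    err = val_mse(X_val, y_val, centers, gamma, w)
    if err < best_err:
        best_err, best_gamma = err, gamma

print(best_gamma, best_err)
```

The grid search here stands in for the more sophisticated meta-learning variants the paper compares; the wrapping structure (each candidate meta-parameter triggers a full inner training run) is the point, and it also shows where the higher time complexity comes from.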

Index Terms—Radial basis function networks, shallow neural networks, kernel methods, hyper-parameter tuning.

P. Vidnerová is with the Czech Academy of Sciences, Institute of Computer Science, Prague (e-mail: petra@cs.cas.cz).
R. Neruda is with the Czech Academy of Sciences, Institute of Computer Science and Charles University, Prague (e-mail: roman@cs.cas.cz).


Cite: Petra Vidnerová and Roman Neruda, "Kernel Function Tuning for Single-Layer Neural Networks," International Journal of Machine Learning and Computing vol. 8, no. 4, pp. 354-360, 2018.

General Information

  • ISSN: 2010-3700 (Online)
  • Abbreviated Title: Int. J. Mach. Learn. Comput.
  • Frequency: Bimonthly
  • DOI: 10.18178/IJMLC
  • Editor-in-Chief: Dr. Lin Huang
  • Executive Editor: Ms. Cherry L. Chen
  • Abstracting/Indexing: Scopus (since 2017), Inspec (IET), Google Scholar, Crossref, ProQuest, Electronic Journals Library.
  • E-mail: ijmlc@ejournal.net