IJMLC 2020 Vol.10(6): 783-788 ISSN: 2010-3700
DOI: 10.18178/ijmlc.2020.10.6.1006

Thai Character-Word Long Short-Term Memory Network Language Models with Dropout and Batch Normalization

Nuttanit Keskomon and Jaturon Harnsomburana

Abstract—The emergence of the Long Short-Term Memory network (LSTM), a variant of deep neural networks, has proven essential to the improvement of Natural Language Processing, especially language modelling. Many studies have applied LSTM to model well-defined languages and improved performance in terms of accuracy. However, this approach is rarely applied to the Thai language, whose characteristics differ significantly from those of other well-defined languages, particularly English and other Latin-based languages. In this work, we applied LSTM language modelling to predict the next word in a sequence. We designed seven variations of LSTM models and compared their results with a word-level LSTM model. The experiments showed that character-word LSTM can improve the performance of Natural Language Modelling (NLM) on a Thai dataset. In particular, when using a character-word LSTM with a dropout value of 0.75 and batch normalization, the perplexity is up to 21.10% lower than that of the baseline word-level LSTM.
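The abstract describes combining character-level and word-level representations in an LSTM language model regularized with dropout (0.75) and batch normalization. As a rough illustration only, the following PyTorch sketch shows one plausible way to wire such a character-word LSTM; the vocabulary sizes, embedding dimensions, and the concatenation of character and word features are assumptions made for this example, not details taken from the paper.

import torch
import torch.nn as nn

class CharWordLSTMLM(nn.Module):
    # Illustrative character-word LSTM language model (hypothetical dimensions).
    def __init__(self, word_vocab=10000, char_vocab=100, word_emb=256,
                 char_emb=32, char_hidden=64, hidden=512, dropout=0.75):
        super().__init__()
        self.word_embed = nn.Embedding(word_vocab, word_emb)
        self.char_embed = nn.Embedding(char_vocab, char_emb)
        # Character LSTM builds a per-word vector from the word's character sequence.
        self.char_lstm = nn.LSTM(char_emb, char_hidden, batch_first=True)
        # Word-level LSTM consumes the concatenated word + character representation.
        self.word_lstm = nn.LSTM(word_emb + char_hidden, hidden, batch_first=True)
        self.bn = nn.BatchNorm1d(hidden)   # batch normalization over LSTM outputs
        self.drop = nn.Dropout(dropout)    # dropout value of 0.75, as in the abstract
        self.out = nn.Linear(hidden, word_vocab)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, max_word_len)
        b, t, c = char_ids.shape
        w = self.word_embed(word_ids)                                # (b, t, word_emb)
        _, (h_char, _) = self.char_lstm(self.char_embed(char_ids.view(b * t, c)))
        ch = h_char[-1].view(b, t, -1)                               # (b, t, char_hidden)
        h, _ = self.word_lstm(torch.cat([w, ch], dim=-1))            # (b, t, hidden)
        h = self.bn(h.reshape(b * t, -1))                            # normalize each time step
        return self.out(self.drop(h)).view(b, t, -1)                 # next-word logits

# Example: a batch of 8 sequences of 20 words, each word padded to 12 characters.
model = CharWordLSTMLM()
logits = model(torch.randint(0, 10000, (8, 20)), torch.randint(0, 100, (8, 20, 12)))

Training such a model with cross-entropy loss on next-word targets and reporting perplexity (the exponential of the average loss) would mirror the comparison against the word-level baseline described here.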

Index Terms—Deep learning, language modelling, long short-term memory network, Thai language, word prediction.

The authors are with the Computer Engineering Department, King Mongkut’s University of Technology Thonburi, Thung Khru, Bangkok, 10140 Thailand (e-mail: nuttanit.k@mail.kmutt.ac.th, jaturon.harnsomburana@mail.kmutt.ac.th).


Cite: Nuttanit Keskomon and Jaturon Harnsomburana, "Thai Character-Word Long Short-Term Memory Network Language Models with Dropout and Batch Normalization," International Journal of Machine Learning and Computing vol. 10, no. 6, pp. 783-788, 2020.

Copyright © 2020 by the authors. This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).

 

General Information

  • E-ISSN: 2972-368X
  • Abbreviated Title: Int. J. Mach. Learn.
  • Frequency: Quarterly
  • DOI: 10.18178/IJML
  • Editor-in-Chief: Dr. Lin Huang
  • Executive Editor:  Ms. Cherry L. Chen
  • Abstracting/Indexing: Inspec (IET), Google Scholar, Crossref, ProQuest, Electronic Journals Library, CNKI.
  • E-mail: ijml@ejournal.net

