Research Trend on Network Machine Learning

Journal of Advanced Technology Research, Vol. 2, No. 2, pp. 13-20, Dec. 2017
DOI: 10.11111/JATR.2017.2.2.013
Keywords: Machine Learning, Deep Learning, Convolutional Neural Network, Neural Network, Optimizer

As interest in machine learning has grown in recent years, so has interest in deep neural networks. In particular, the appearance of diverse neural network architectures has drawn attention to the algorithms that optimize the network parameters used during training, and new optimization algorithms continue to emerge alongside new architectures. However, because no optimizer is best suited to every neural network, the learning results vary with the input training data and the network parameter values. In this paper, we introduce network parameter optimization algorithms that minimize the value of the loss function. To this end, we compare the accuracy of CNN learning results on MNIST using the Gradient Descent, Momentum, Adagrad, Adadelta, and Adam optimizers, respectively.
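The update rules behind the five optimizers compared in the abstract can be sketched on a toy one-dimensional loss in place of a full CNN. This is a minimal sketch: the quadratic loss f(w) = (w - 3)^2, the hyperparameters, and the step counts below are illustrative assumptions, not values taken from the paper.

```python
import math

def grad(w):
    # Gradient of the toy loss f(w) = (w - 3)^2, minimized at w = 3.
    return 2.0 * (w - 3.0)

def sgd(steps=500, lr=0.1):
    # Vanilla gradient descent: step against the gradient.
    w = 0.0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def momentum(steps=500, lr=0.1, beta=0.9):
    # Accumulate a velocity that smooths successive gradients.
    w, v = 0.0, 0.0
    for _ in range(steps):
        v = beta * v + lr * grad(w)
        w -= v
    return w

def adagrad(steps=500, lr=1.0, eps=1e-8):
    # Scale each step by the root of the accumulated squared gradients.
    w, g2 = 0.0, 0.0
    for _ in range(steps):
        g = grad(w)
        g2 += g * g
        w -= lr * g / (math.sqrt(g2) + eps)
    return w

def adadelta(steps=5000, rho=0.95, eps=1e-6):
    # Replace the global learning rate with a ratio of running RMS averages.
    w, eg2, edx2 = 0.0, 0.0, 0.0
    for _ in range(steps):
        g = grad(w)
        eg2 = rho * eg2 + (1 - rho) * g * g
        dx = -math.sqrt(edx2 + eps) / math.sqrt(eg2 + eps) * g
        edx2 = rho * edx2 + (1 - rho) * dx * dx
        w += dx
    return w

def adam(steps=2000, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Bias-corrected first- and second-moment estimates of the gradient.
    w, m, v = 0.0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        mhat = m / (1 - b1 ** t)
        vhat = v / (1 - b2 ** t)
        w -= lr * mhat / (math.sqrt(vhat) + eps)
    return w
```

In a real CNN these rules apply coordinate-wise to every weight. Adagrad's ever-growing accumulator g2 makes its effective learning rate decay toward zero, which is the problem Adadelta and Adam address by using exponential moving averages instead.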


Cite this article
[IEEE Style]
H. Lim, J. Kim and Y. Han, "Research Trend on Network Machine Learning," Journal of Advanced Technology Research, vol. 2, no. 2, pp. 13-20, 2017. DOI: 10.11111/JATR.2017.2.2.013.

[ACM Style]
Hyun-Kyo Lim, Ju-Bong Kim, and Youn-Hee Han. 2017. Research Trend on Network Machine Learning. Journal of Advanced Technology Research 2, 2 (2017), 13-20. DOI: 10.11111/JATR.2017.2.2.013.