Minimizing Average of Loss Functions Using Gradient Descent and Stochastic Gradient Descent

Authors

  • Md Rajib Arefin, Department of Mathematics, Dhaka University, Dhaka-1000, Bangladesh
  • M Asadujjaman, Department of Mathematics, Dhaka University, Dhaka-1000, Bangladesh

DOI:

https://doi.org/10.3329/dujs.v64i2.54490

Keywords:

Gradient Descent, Stochastic Gradient Descent, Convex Function, Unconstrained Optimization Problems.

Abstract

This paper deals with minimizing the average of loss functions using Gradient Descent (GD) and Stochastic Gradient Descent (SGD). We present these two algorithms for minimizing the average of a large number of smooth convex functions, discuss their complexity analysis, and illustrate the algorithms geometrically. Finally, we compare their performance through numerical experiments.

Dhaka Univ. J. Sci. 64(2): 141-145, 2016 (July)
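The setting described in the abstract is minimizing f(x) = (1/n) Σᵢ fᵢ(x), where each fᵢ is a smooth convex loss: GD steps along the full gradient of the average, while SGD steps along the gradient of a single randomly sampled fᵢ. The sketch below is not the authors' code; it is a minimal illustration on an assumed least-squares instance, with synthetic data, step sizes, and iteration counts chosen only for demonstration.

```python
import numpy as np

# Synthetic instance (an assumption for illustration): n least-squares losses
# f_i(x) = 0.5 * (a_i @ x - b_i)**2, so the objective is their average.
rng = np.random.default_rng(0)
n, d = 1000, 10
A = rng.standard_normal((n, d))                  # row i is a_i
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def full_gradient(x):
    # Gradient of f(x) = (1/n) * sum_i 0.5 * (a_i @ x - b_i)**2
    return A.T @ (A @ x - b) / n

def gradient_descent(x0, step=0.1, iters=200):
    x = x0.copy()
    for _ in range(iters):
        x -= step * full_gradient(x)             # touches all n losses per iteration
    return x

def stochastic_gradient_descent(x0, step0=0.1, iters=5000):
    x = x0.copy()
    for t in range(1, iters + 1):
        i = rng.integers(n)                      # sample one loss f_i uniformly
        grad_i = A[i] * (A[i] @ x - b[i])        # gradient of the sampled loss only
        x -= (step0 / np.sqrt(t)) * grad_i       # diminishing step size
    return x

x0 = np.zeros(d)
for name, x_hat in (("GD", gradient_descent(x0)),
                    ("SGD", stochastic_gradient_descent(x0))):
    print(f"{name}: average loss = {0.5 * np.mean((A @ x_hat - b) ** 2):.6f}")
```

Each GD iteration costs one pass over all n losses, whereas each SGD iteration costs a single sampled loss; this per-iteration cost gap is the trade-off the paper's complexity discussion and numerical experiments compare.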

Published

2016-07-31

How to Cite

Arefin, M. R., & Asadujjaman, M. (2016). Minimizing Average of Loss Functions Using Gradient Descent and Stochastic Gradient Descent. Dhaka University Journal of Science, 64(2), 141–145. https://doi.org/10.3329/dujs.v64i2.54490

Issue

Vol. 64 No. 2 (2016)

Section

Articles