National Primary and Secondary School Science Fair


Enhancement of Online Stochastic Gradient Descent using Backward Queried Images

Category

Taiwan International Science Fair entry

Edition

2022

Field

Computer Science and Information Engineering

Award

Fourth Prize

School

Shanghai American School

Author

Gio Huh

Keywords

catastrophic interference, backward query, online learning

摘要或動機

Stochastic gradient descent (SGD) is one of the preferred online optimization algorithms. However, a major drawback is its tendency to forget previously learned data while optimizing over a data stream, a phenomenon known as catastrophic interference. In this project, we attempt to mitigate this drawback with a new low-cost approach that incorporates backward queried images into SGD during online training. Under this approach, for every new training sample arriving in the data stream, the neural network is additionally optimized on the corresponding backward queried image from the initial dataset. We compared the proposed method with plain SGD over a data stream of 50,000 training cases and 10,000 test cases on two MNIST variants (Fashion and Kuzushiji). The proposed method yields substantial improvements in network performance, classifying both datasets with higher mean, minimum, lower-quartile, median, and upper-quartile accuracy while maintaining a lower standard deviation in performance, suggesting that our proposed algorithm can be a practical alternative to online SGD.
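The training loop described in the abstract can be sketched roughly as follows: each incoming sample triggers an ordinary SGD step, followed by an extra step on a stored "backward queried" image for the same class. This is a minimal sketch only, assuming a linear softmax classifier and using per-class representative images as a stand-in for true backward-queried images; the helper names (`OnlineSGDWithBQ`, `partial_fit`, `set_backward_queries`) are illustrative, not the authors' implementation.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class OnlineSGDWithBQ:
    """Online SGD that pairs each streamed sample with a stored
    backward-queried image of the same class (hypothetical sketch)."""

    def __init__(self, n_features, n_classes, lr=0.1):
        self.W = np.zeros((n_features, n_classes))
        self.lr = lr
        self.bq = {}  # class label -> backward-queried image

    def set_backward_queries(self, bq_images):
        """bq_images: dict mapping class label to a representative image."""
        self.bq = bq_images

    def _step(self, x, y):
        # One cross-entropy gradient step for a single sample.
        p = softmax(x @ self.W)
        grad = np.outer(x, p)
        grad[:, y] -= x
        self.W -= self.lr * grad

    def partial_fit(self, x, y):
        self._step(x, y)           # ordinary online SGD step
        if y in self.bq:           # extra step on the stored image,
            self._step(self.bq[y], y)  # reinforcing the class prototype

    def predict(self, X):
        return softmax(X @ self.W).argmax(axis=-1)
```

The second gradient step is what distinguishes the approach from plain online SGD: by revisiting a fixed image derived from the initial dataset at every step, the model is repeatedly pulled back toward earlier class representations, which is the intended counterweight to catastrophic interference.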

190040.pdf

Adobe Reader (PDF) file