Meta-learner with sparsified backpropagation

Rohan Paithankar, Aayushi Verma, Manish Agnihotri, Sanjay Singh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In today's world, deep learning is a research area with ever-increasing applications. It uses neural networks to drive improvements in speech recognition, computer vision, natural language processing, and several automated systems. Training deep neural networks involves careful selection of appropriate training examples, tuning of hyperparameters, and scheduling of step sizes; finding the right combination of all of these is a tedious and time-consuming task. Recently, a few learning-to-learn models have been proposed that can perform this learning automatically. Both the training time and the accuracy of such models are critically important. meProp is a technique that accelerates deep learning and reduces over-fitting through a sparsified backpropagation method that lowers the computational cost. In this paper, we propose an application of meProp to learning-to-learn models, focusing training on the most significant, consciously chosen parameters. We demonstrate an improvement in the accuracy of the learning-to-learn model with the proposed technique and compare its performance with that of the unmodified learning-to-learn model.
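The core idea behind meProp's sparsified backpropagation is to keep only the k largest-magnitude components of a gradient during the backward pass and zero out the rest, so that only the most significant parameters receive updates. A minimal sketch of that sparsification step is below; the function name and use of NumPy are illustrative assumptions, not code from the paper.

```python
import numpy as np

def meprop_topk(grad, k):
    """Keep only the k largest-magnitude entries of a gradient,
    zeroing the rest (the meProp-style sparsification step)."""
    flat = grad.ravel()
    if k >= flat.size:
        return grad
    # Indices of the k components with the largest |gradient|
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape)

# Example: a dense gradient reduced to its 2 most significant components
g = np.array([0.1, -2.0, 0.05, 3.0, -0.2])
print(meprop_topk(g, 2))  # only -2.0 and 3.0 survive; all else is zeroed
```

In a full training loop this filter would be applied to the gradient of each layer's pre-activation before it propagates further back, which is where the computational savings come from.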

Original language: English
Title of host publication: Proceedings of the 9th International Conference On Cloud Computing, Data Science and Engineering, Confluence 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 315-319
Number of pages: 5
ISBN (Electronic): 9781538659335
DOIs: 10.1109/CONFLUENCE.2019.8776608
Publication status: Published - 01-01-2019
Event: 9th International Conference On Cloud Computing, Data Science and Engineering, Confluence 2019 - Uttar Pradesh, India
Duration: 10-01-2019 - 11-01-2019

Publication series

Name: Proceedings of the 9th International Conference On Cloud Computing, Data Science and Engineering, Confluence 2019

Conference

Conference: 9th International Conference On Cloud Computing, Data Science and Engineering, Confluence 2019
Country: India
City: Uttar Pradesh
Period: 10-01-2019 - 11-01-2019

Fingerprint

  • Backpropagation
  • Speech recognition
  • Computer vision
  • Tuning
  • Scheduling
  • Neural networks
  • Processing
  • Costs
  • Deep learning

All Science Journal Classification (ASJC) codes

  • Computational Theory and Mathematics
  • Computer Networks and Communications
  • Software
  • Information Systems

Cite this

Paithankar, R., Verma, A., Agnihotri, M., & Singh, S. (2019). Meta-learner with sparsified backpropagation. In Proceedings of the 9th International Conference On Cloud Computing, Data Science and Engineering, Confluence 2019 (pp. 315-319). [8776608] (Proceedings of the 9th International Conference On Cloud Computing, Data Science and Engineering, Confluence 2019). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/CONFLUENCE.2019.8776608