Generalization capability of artificial neural network incorporated with pruning method

Siddhaling Urolagin, K. V. Prema, N. V. Subba Reddy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Citations (Scopus)

Abstract

In any real-world application, the performance of an Artificial Neural Network (ANN) depends largely upon its generalization capability. Generalization is the ability of the ANN to handle unseen data. The generalization capability of a network is determined mostly by its system complexity and by how it is trained. Poor generalization is observed when the network is over-trained, or when its system complexity (or degree of freedom) is large relative to the training data. A smaller network that can still fit the data will have good generalization ability. Network parameter pruning is one of the promising methods for reducing the degrees of freedom of a network and hence improving its generalization. In recent years various pruning methods have been developed and found effective in real-world applications. It is then important to estimate the improvement in generalization, and the rate of that improvement, as pruning is incorporated into the network. In this research, a method is developed to evaluate generalization capability and the rate of convergence towards generalization. Using the proposed method, experiments have been conducted to evaluate a Multi-Layer Perceptron neural network with pruning incorporated, applied to handwritten numeral recognition.
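The record does not reproduce the paper's actual pruning algorithm. As a generic, hypothetical illustration of the idea the abstract describes — removing network parameters to reduce degrees of freedom — the sketch below applies simple magnitude-based weight pruning to one weight matrix (the function name, threshold choice, and use of NumPy are assumptions, not the authors' method):

```python
import numpy as np

def magnitude_prune(weights, prune_fraction=0.5):
    """Zero out the smallest-magnitude fraction of weights.

    A hypothetical sketch of parameter pruning: weights whose absolute
    value falls below the chosen quantile are removed (set to zero),
    reducing the network's effective degrees of freedom.
    """
    threshold = np.quantile(np.abs(weights), prune_fraction)
    mask = np.abs(weights) >= threshold  # True for weights that survive
    return weights * mask, mask

# Example: one 8x4 weight matrix of an MLP layer (32 parameters).
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))
W_pruned, mask = magnitude_prune(W, prune_fraction=0.5)

# About half of the 32 parameters are zeroed out; generalization would
# then be compared between the original and the pruned network.
print("surviving parameters:", int(mask.sum()), "of", W.size)
```

In practice such a step is applied per layer during or after training, and the pruned network is retrained and evaluated on held-out data — which is the comparison the paper's proposed evaluation method is concerned with.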

Original language: English
Title of host publication: Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers
Pages: 171-178
Number of pages: 8
DOI: 10.1007/978-3-642-29280-4_19
Publication status: Published - 16-04-2012
Event: International Conference on Advanced Computing, Networking and Security, ADCONS 2011 - Surathkal, India
Duration: 16-12-2011 to 18-12-2011

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 7135 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: International Conference on Advanced Computing, Networking and Security, ADCONS 2011
Country: India
City: Surathkal
Period: 16-12-11 to 18-12-11

Fingerprint

  • Pruning
  • Artificial Neural Network
  • Neural networks
  • Multilayer neural networks
  • Real-world Applications
  • Degree of freedom
  • Numeral
  • Generalization
  • Evaluate
  • Experiments
  • Perceptron
  • Multilayer
  • Rate of Convergence
  • Estimate

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science(all)

Cite this

Urolagin, S., Prema, K. V., & Reddy, N. V. S. (2012). Generalization capability of artificial neural network incorporated with pruning method. In Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers (pp. 171-178). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 7135 LNCS). https://doi.org/10.1007/978-3-642-29280-4_19
Urolagin, Siddhaling ; Prema, K. V. ; Reddy, N. V. Subba. / Generalization capability of artificial neural network incorporated with pruning method. Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers. 2012. pp. 171-178 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).
@inproceedings{5399f0cadaf0490dbecd4f067b4d69c9,
title = "Generalization capability of artificial neural network incorporated with pruning method",
abstract = "In any real-world application, the performance of an Artificial Neural Network (ANN) depends largely upon its generalization capability. Generalization is the ability of the ANN to handle unseen data. The generalization capability of a network is determined mostly by its system complexity and by how it is trained. Poor generalization is observed when the network is over-trained, or when its system complexity (or degree of freedom) is large relative to the training data. A smaller network that can still fit the data will have good generalization ability. Network parameter pruning is one of the promising methods for reducing the degrees of freedom of a network and hence improving its generalization. In recent years various pruning methods have been developed and found effective in real-world applications. It is then important to estimate the improvement in generalization, and the rate of that improvement, as pruning is incorporated into the network. In this research, a method is developed to evaluate generalization capability and the rate of convergence towards generalization. Using the proposed method, experiments have been conducted to evaluate a Multi-Layer Perceptron neural network with pruning incorporated, applied to handwritten numeral recognition.",
author = "Siddhaling Urolagin and Prema, {K. V.} and Reddy, {N. V. Subba}",
year = "2012",
month = "4",
day = "16",
doi = "10.1007/978-3-642-29280-4_19",
language = "English",
isbn = "9783642292798",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
pages = "171--178",
booktitle = "Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers",

}

Urolagin, S, Prema, KV & Reddy, NVS 2012, Generalization capability of artificial neural network incorporated with pruning method. in Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 7135 LNCS, pp. 171-178, International Conference on Advanced Computing, Networking and Security, ADCONS 2011, Surathkal, India, 16-12-11. https://doi.org/10.1007/978-3-642-29280-4_19

Generalization capability of artificial neural network incorporated with pruning method. / Urolagin, Siddhaling; Prema, K. V.; Reddy, N. V. Subba.

Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers. 2012. p. 171-178 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 7135 LNCS).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Generalization capability of artificial neural network incorporated with pruning method

AU - Urolagin, Siddhaling

AU - Prema, K. V.

AU - Reddy, N. V. Subba

PY - 2012/4/16

Y1 - 2012/4/16

N2 - In any real-world application, the performance of an Artificial Neural Network (ANN) depends largely upon its generalization capability. Generalization is the ability of the ANN to handle unseen data. The generalization capability of a network is determined mostly by its system complexity and by how it is trained. Poor generalization is observed when the network is over-trained, or when its system complexity (or degree of freedom) is large relative to the training data. A smaller network that can still fit the data will have good generalization ability. Network parameter pruning is one of the promising methods for reducing the degrees of freedom of a network and hence improving its generalization. In recent years various pruning methods have been developed and found effective in real-world applications. It is then important to estimate the improvement in generalization, and the rate of that improvement, as pruning is incorporated into the network. In this research, a method is developed to evaluate generalization capability and the rate of convergence towards generalization. Using the proposed method, experiments have been conducted to evaluate a Multi-Layer Perceptron neural network with pruning incorporated, applied to handwritten numeral recognition.

AB - In any real-world application, the performance of an Artificial Neural Network (ANN) depends largely upon its generalization capability. Generalization is the ability of the ANN to handle unseen data. The generalization capability of a network is determined mostly by its system complexity and by how it is trained. Poor generalization is observed when the network is over-trained, or when its system complexity (or degree of freedom) is large relative to the training data. A smaller network that can still fit the data will have good generalization ability. Network parameter pruning is one of the promising methods for reducing the degrees of freedom of a network and hence improving its generalization. In recent years various pruning methods have been developed and found effective in real-world applications. It is then important to estimate the improvement in generalization, and the rate of that improvement, as pruning is incorporated into the network. In this research, a method is developed to evaluate generalization capability and the rate of convergence towards generalization. Using the proposed method, experiments have been conducted to evaluate a Multi-Layer Perceptron neural network with pruning incorporated, applied to handwritten numeral recognition.

UR - http://www.scopus.com/inward/record.url?scp=84859625584&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84859625584&partnerID=8YFLogxK

U2 - 10.1007/978-3-642-29280-4_19

DO - 10.1007/978-3-642-29280-4_19

M3 - Conference contribution

SN - 9783642292798

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 171

EP - 178

BT - Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers

ER -

Urolagin S, Prema KV, Reddy NVS. Generalization capability of artificial neural network incorporated with pruning method. In Advanced Computing, Networking and Security - International Conference, ADCONS 2011, Revised Selected Papers. 2012. p. 171-178. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). https://doi.org/10.1007/978-3-642-29280-4_19