
Digital Library of the European Council for Modelling and Simulation

 

Title:

Reducing Overconfidence In Neural Networks By Dynamic Variation of Recognizer Relevance

Authors:

Konstantin B. Bulatov, Dmitry V. Polevoy

Published in:

(2015). ECMS 2015 Proceedings edited by: Valeri M. Mladenov, Grisha Spasov, Petia Georgieva, Galidiya Petrova, European Council for Modeling and Simulation. doi:10.7148/2015


ISBN: 978-0-9932440-0-1

 

29th European Conference on Modelling and Simulation,

Albena (Varna), Bulgaria, May 26th – 29th, 2015

 

Citation format:

Konstantin B. Bulatov, Dmitry V. Polevoy (2015). Reducing Overconfidence In Neural Networks By Dynamic Variation of Recognizer Relevance, ECMS 2015 Proceedings edited by: Valeri M. Mladenov, Petia Georgieva, Grisha Spasov, Galidiya Petrova, European Council for Modeling and Simulation. doi:10.7148/2015-0488

DOI:

http://dx.doi.org/10.7148/2015-0488

Abstract:

Contemporary recognition systems combine various symbol recognition methods with post-processing methods designed to enhance the quality of text recognition. For some recognition problems it may be difficult to create an adequate dataset for training symbol recognizers, so several symbol recognizers are used together to ensure better performance. In this paper the concept of recognizer relevance is introduced as a way of analysing recognizer output. A method based on this concept is described, which allows external information about the input samples to be used to balance the contributions of the recognizer and the post-processing subsystem.
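The sketch below is a minimal, hypothetical illustration of the general idea described in the abstract, not the authors' method: per-class confidence estimates from a symbol recognizer are blended with a prior from the post-processing subsystem, with the recognizer's weight driven by an externally supplied relevance value. All function names and the specific weighting scheme are assumptions made here for illustration.

```python
# Hypothetical sketch of relevance-weighted fusion between a symbol
# recognizer and a post-processing prior. The weighting scheme and all
# names are illustrative assumptions, not the method from the paper.
import numpy as np

def fuse_estimates(recognizer_probs, postproc_probs, relevance):
    """Blend recognizer output with a post-processing prior.

    recognizer_probs : per-class probabilities from the symbol recognizer
    postproc_probs   : per-class prior from the post-processing subsystem
                       (e.g. a character-level language model)
    relevance        : weight in [0, 1] derived from external information
                       about the input sample; higher values trust the
                       recognizer more, lower values rely on post-processing
    """
    recognizer_probs = np.asarray(recognizer_probs, dtype=float)
    postproc_probs = np.asarray(postproc_probs, dtype=float)
    # Dampen the recognizer's contribution when its relevance is low,
    # which softens overconfident recognizer outputs.
    fused = relevance * recognizer_probs + (1.0 - relevance) * postproc_probs
    return fused / fused.sum()

# Example: an overconfident recognizer output is moderated when the
# external relevance estimate for this sample is low.
recognizer = [0.97, 0.02, 0.01]   # overconfident recognizer output
prior = [0.30, 0.50, 0.20]        # post-processing prior
print(fuse_estimates(recognizer, prior, relevance=0.4))
```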

 

Full text: