Collective mutual information maximization to unify passive and positive approaches for improving interpretation and generalization. (June 2017)
- Record Type:
- Journal Article
- Title:
- Collective mutual information maximization to unify passive and positive approaches for improving interpretation and generalization. (June 2017)
- Main Title:
- Collective mutual information maximization to unify passive and positive approaches for improving interpretation and generalization
- Authors:
- Kamimura, Ryotaro
- Abstract:
- The present paper aims to propose a simple method to realize mutual information maximization for better interpretation and generalization. To train neural networks and obtain better performance, neurons should impartially consider as many input patterns as possible. Simultaneously, and especially for ease of interpretation, they should represent characteristics specific to certain input patterns as faithfully as possible. This contradiction can be solved by introducing mutual information between neurons and input patterns. However, because of the complicated computational procedures associated with mutual information maximization, it has been difficult to apply mutual information maximization to actual problems. Though many simplified methods have been developed so far, they have not necessarily been applied with success, in particular to large-scale practical problems. To further aid simplification, here we propose a new computational method to realize mutual information. One of the main characteristics of this new method is the consideration of multiple neural networks when defining mutual information, thereby simplifying the method. In addition, learning is also simplified by using the indirect, independent, and fast learning of the potential method. This method was applied to two well-known data sets: the Australian credit data set and the on-line popularity data set. The experimental results showed that mutual information could be increased via the present method. In addition, mutual information maximization was accompanied by an increase in generalization and interpretation performance, mainly due to the simple internal representations.
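The abstract's core quantity is the mutual information between hidden neurons and input patterns. A minimal illustrative sketch of that quantity is given below, assuming patterns are equiprobable and that normalized non-negative activations stand in for firing probabilities p(j|s); this is a generic reconstruction for illustration, not the paper's exact computational procedure.

```python
import numpy as np

def mutual_information(activations):
    """Mutual information between hidden neurons and input patterns.

    activations: (n_patterns, n_neurons) array of non-negative neuron
    outputs. Conditional firing probabilities p(j|s) are obtained by
    normalizing each row; patterns are assumed equiprobable, p(s) = 1/S.
    (Illustrative sketch only -- not the paper's exact procedure.)
    """
    p_j_given_s = activations / activations.sum(axis=1, keepdims=True)
    p_j = p_j_given_s.mean(axis=0)  # marginal firing probability p(j)
    eps = 1e-12                     # guard against log(0)
    # I = (1/S) * sum_s sum_j p(j|s) * log( p(j|s) / p(j) )
    return (p_j_given_s * np.log((p_j_given_s + eps) / (p_j + eps))).sum(axis=1).mean()

# A uniform response (every neuron fires equally for every pattern)
# gives zero mutual information; a one-hot response (each pattern
# fires a distinct neuron) maximizes it at log(n_neurons).
print(mutual_information(np.ones((4, 4))))  # ≈ 0.0
print(mutual_information(np.eye(4)))        # ≈ 1.386 (= ln 4)
```

This captures the trade-off the abstract describes: maximizing this quantity pushes neurons toward pattern-specific (interpretable) responses while the marginal term keeps all input patterns represented.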
- Is Part Of:
- Neural networks, Volume 90 (2017)
- Journal:
- Neural networks
- Issue:
- Volume 90(2017)
- Issue Display:
- Volume 90 (2017)
- Year:
- 2017
- Volume:
- 90
- Issue:
- 2017
- Issue Sort Value:
- 2017-0090-2017-0000
- Page Start:
- 56
- Page End:
- 71
- Publication Date:
- 2017-06
- Subjects:
- Passive -- Positive -- Potentiality -- Interpretation -- Generalization -- Mutual information
Neural computers -- Periodicals
Neural networks (Computer science) -- Periodicals
Neural networks (Neurobiology) -- Periodicals
Nervous System -- Periodicals
Ordinateurs neuronaux -- Périodiques
Réseaux neuronaux (Informatique) -- Périodiques
Réseaux neuronaux (Neurobiologie) -- Périodiques
Neural computers
Neural networks (Computer science)
Neural networks (Neurobiology)
Periodicals
006.32
- Journal URLs:
- http://www.sciencedirect.com/science/journal/08936080
http://www.elsevier.com/journals
- DOI:
- 10.1016/j.neunet.2017.03.001
- Languages:
- English
- ISSNs:
- 0893-6080
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 6081.280800
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 96.xml