Neural Networks: Tricks of the Trade [E-Book] : Second Edition / edited by Grégoire Montavon, Geneviève B. Orr, Klaus-Robert Müller.
Berlin, Heidelberg : Springer Berlin Heidelberg, 2012
Imprint: Springer
XII, 769 p. 223 illus. digital.
English
ISBN 9783642352898
DOI 10.1007/978-3-642-35289-8
Lecture Notes in Computer Science ; 7700
Table of Contents:
  • Introduction
  • Preface on Speeding Learning
  • 1. Efficient BackProp
  • Preface on Regularization Techniques to Improve Generalization
  • 2. Early Stopping — But When?
  • 3. A Simple Trick for Estimating the Weight Decay Parameter
  • 4. Controlling the Hyperparameter Search in MacKay’s Bayesian Neural Network Framework
  • 5. Adaptive Regularization in Neural Network Modeling
  • 6. Large Ensemble Averaging
  • Preface on Improving Network Models and Algorithmic Tricks
  • 7. Square Unit Augmented, Radially Extended, Multilayer Perceptrons
  • 8. A Dozen Tricks with Multitask Learning
  • 9. Solving the Ill-Conditioning in Neural Network Learning
  • 10. Centering Neural Network Gradient Factors
  • 11. Avoiding Roundoff Error in Backpropagating Derivatives
  • 12. Transformation Invariance in Pattern Recognition – Tangent Distance and Tangent Propagation
  • 13. Combining Neural Networks and Context-Driven Search for On-line, Printed Handwriting Recognition in the Newton
  • 14. Neural Network Classification and Prior Class Probabilities
  • 15. Applying Divide and Conquer to Large Scale Pattern Recognition Tasks
  • Preface on Tricks for Time Series
  • 16. Forecasting the Economy with Neural Nets: A Survey of Challenges and Solutions
  • 17. How to Train Neural Networks
  • Preface on Big Learning in Deep Neural Networks
  • 18. Stochastic Gradient Descent Tricks
  • 19. Practical Recommendations for Gradient-Based Training of Deep Architectures
  • 20. Training Deep and Recurrent Networks with Hessian-Free Optimization
  • 21. Implementing Neural Networks Efficiently
  • Preface on Better Representations: Invariant, Disentangled and Reusable
  • 22. Learning Feature Representations with K-Means
  • 23. Deep Big Multilayer Perceptrons for Digit Recognition
  • 24. A Practical Guide to Training Restricted Boltzmann Machines
  • 25. Deep Boltzmann Machines and the Centering Trick
  • 26. Deep Learning via Semi-supervised Embedding
  • Preface on Identifying Dynamical Systems for Forecasting and Control
  • 27. A Practical Guide to Applying Echo State Networks
  • 28. Forecasting with Recurrent Neural Networks: 12 Tricks
  • 29. Solving Partially Observable Reinforcement Learning Problems with Recurrent Neural Networks
  • 30. 10 Steps and Some Tricks to Set up Neural Reinforcement Controllers.