Date and Venue: Monday, 12.9.2016, 14:00-15:00 in the Hotel Dormero in Hanover.
Neural Simpletrons - Minimalistic Deep Neural Networks for Probabilistic Learning with Few Labels
Jörg Lücke (University of Oldenburg):
Deep learning is intensively studied using supervised and unsupervised learning, and by applying probabilistic, deterministic, and bio-inspired approaches. Comparisons between different approaches, such as generative and discriminative neural networks, are made difficult, however, by differences in the semantics of their graphical descriptions, in their learning methods, in their benchmarking objectives, and in their scalability. In this talk I will discuss novel networks that are derived from generative modeling approaches but can be formulated as neural networks, i.e., they take a form similar to standard discriminative networks such as perceptrons. These novel networks, which we term Neural Simpletrons, are, because of their roots in generative models, especially well suited for applications to data with no or few labels. The weakly labeled setting is also well suited for a quantitative comparison with standard and recent state-of-the-art neural networks. Empirical evaluations on common benchmarks show that for weakly labeled data, Neural Simpletrons improve on all standard deep learning approaches and are competitive with their recent variants. As models for neural information processing, our results suggest neural bottom-up / top-down integration for optimal processing, and they assign important functional roles to synaptic plasticity, synaptic scaling, and intrinsic plasticity.