Deep neural networks are powerful tools for learning sophisticated but fixed mapping rules between inputs and outputs, which limits their application in more complex and dynamic situations in which the mapping rules are not kept the same but change according to different contexts. To lift such limits, we developed an approach involving a learning algorithm, called orthogonal weights modification, with the addition of a context-dependent processing module. We demonstrated that, with orthogonal weights modification to overcome catastrophic forgetting and the context-dependent processing module to learn how to reuse a feature representation and a classifier for different contexts, a single network could acquire numerous context-dependent mapping rules in an online and continual manner, with as few as approximately ten samples to learn each. Our approach should enable highly compact systems to gradually learn myriad regularities of the real world and eventually behave appropriately within it.
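The core idea behind orthogonal weights modification can be sketched as projecting each weight update away from the input directions already used by earlier tasks, so that learning a new mapping does not overwrite old ones. The NumPy sketch below is illustrative only, assuming a single linear layer and a recursive projector update; the function names, hyperparameters, and training loop are our assumptions, not the authors' code.

```python
import numpy as np

# Illustrative sketch of orthogonal weights modification (OWM) for a single
# linear layer y = W x. The projector P keeps weight updates (approximately)
# orthogonal to input directions used by earlier tasks, so those mappings
# survive later training. Names and hyperparameters here are assumptions.

def train_task(W, P, x, target, lr=0.5, steps=100):
    """Gradient descent on (x, target), with each weight update projected by P."""
    for _ in range(steps):
        g = W @ x - target                  # error signal for this sample
        W = W - lr * np.outer(g, x) @ P     # update restricted to the subspace P
    return W

def absorb_input(P, x, alpha=1e-3):
    """Recursively shrink P along the direction of a learned input x."""
    Px = P @ x
    return P - np.outer(Px, Px) / (alpha + x @ Px)

rng = np.random.default_rng(0)
n_in, n_out = 4, 2
W = rng.standard_normal((n_out, n_in)) * 0.1
P = np.eye(n_in)                            # nothing learned yet: full update space

# Task 1: learn x1 -> t1, then fold x1 into the projector.
x1, t1 = np.array([1.0, 0.0, 0.0, 0.0]), np.array([1.0, -1.0])
W = train_task(W, P, x1, t1)
P = absorb_input(P, x1)

# Task 2: learn a new, partially overlapping input without erasing task 1.
x2 = np.array([0.3, 1.0, 0.0, 0.0]); x2 /= np.linalg.norm(x2)
t2 = np.array([0.5, 0.5])
W = train_task(W, P, x2, t2)

print(np.linalg.norm(W @ x1 - t1) < 1e-2)   # task-1 mapping survives task 2
print(np.linalg.norm(W @ x2 - t2) < 1e-2)   # task 2 was still learned
```

Because updates for task 2 are multiplied by a projector that has been shrunk along x1, the task-1 input-output mapping drifts only negligibly, which is the mechanism the abstract credits for overcoming catastrophic forgetting.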