
Supervised Sequence Labelling with Recurrent Neural Networks

Supervised sequence labelling is a vital area of machine learning, encompassing tasks such as speech, handwriting and gesture recognition, protein secondary structure prediction and part-of-speech tagging. Recurrent neural networks are powerful sequence learning tools, robust to input noise and distortion and able to exploit long-range contextual information, that would seem ideally suited to such problems. However, their role in large-scale sequence labelling systems has so far been auxiliary.

The goal of this book is a complete framework for classifying and transcribing sequential data with recurrent neural networks only. Three main innovations are introduced in order to realise this goal. Firstly, the connectionist temporal classification output layer allows the framework to be trained with unsegmented target sequences, such as phoneme-level speech transcriptions; this is in contrast to previous connectionist approaches, which were dependent on error-prone prior segmentation. Secondly, multidimensional recurrent neural networks extend the framework in a natural way to data with more than one spatio-temporal dimension, such as images and videos. Thirdly, the use of hierarchical subsampling makes it feasible to apply the framework to very large or high-resolution sequences, such as raw audio or video. Experimental validation is provided by state-of-the-art results in speech and handwriting recognition.
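
As a rough illustration of the first innovation described above (a sketch, not code from the book), the snippet below trains a bidirectional LSTM with a connectionist temporal classification loss on unsegmented target sequences, using PyTorch's nn.CTCLoss; the layer sizes, sequence lengths and label values are arbitrary assumptions.

import torch
import torch.nn as nn

class BiLSTMCTC(nn.Module):
    def __init__(self, n_features, n_hidden, n_labels):
        super().__init__()
        # Bidirectional LSTM reads the input sequence in both directions.
        self.rnn = nn.LSTM(n_features, n_hidden, bidirectional=True, batch_first=True)
        # Linear layer maps to label scores plus one extra "blank" class used by CTC.
        self.fc = nn.Linear(2 * n_hidden, n_labels + 1)

    def forward(self, x):
        h, _ = self.rnn(x)                      # (batch, time, 2*hidden)
        return self.fc(h).log_softmax(dim=-1)   # log-probabilities over labels + blank

# Toy example: 2 sequences, 50 time steps, 13 input features, 5 label classes.
model = BiLSTMCTC(n_features=13, n_hidden=64, n_labels=5)
x = torch.randn(2, 50, 13)
log_probs = model(x).transpose(0, 1)            # CTCLoss expects (time, batch, classes)

# Unsegmented targets: concatenated label sequences with per-sequence lengths,
# no frame-level alignment between inputs and labels is required.
targets = torch.tensor([1, 2, 3, 2, 4])
input_lengths = torch.tensor([50, 50])
target_lengths = torch.tensor([3, 2])

ctc = nn.CTCLoss(blank=0)                       # class 0 is reserved as the CTC blank
loss = ctc(log_probs, targets, input_lengths, target_lengths)
loss.backward()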


Bibliographic Details
Classification: Electronic Book
Main Author: Graves, Alex (Author)
Corporate Author: SpringerLink (Online service)
Format: Electronic eBook
Language: English
Published: Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint: Springer, 2012.
Edition: 1st ed. 2012.
Series: Studies in Computational Intelligence, 385
Subjects: Computational intelligence; Artificial intelligence
Online Access: Full Text

MARC

LEADER 00000nam a22000005i 4500
001 978-3-642-24797-2
003 DE-He213
005 20220119011920.0
007 cr nn 008mamaa
008 120205s2012 gw | s |||| 0|eng d
020 |a 9783642247972  |9 978-3-642-24797-2 
024 7 |a 10.1007/978-3-642-24797-2  |2 doi 
050 4 |a Q342 
072 7 |a UYQ  |2 bicssc 
072 7 |a TEC009000  |2 bisacsh 
072 7 |a UYQ  |2 thema 
082 0 4 |a 006.3  |2 23 
100 1 |a Graves, Alex.  |e author.  |4 aut  |4 http://id.loc.gov/vocabulary/relators/aut 
245 1 0 |a Supervised Sequence Labelling with Recurrent Neural Networks  |h [electronic resource] /  |c by Alex Graves. 
250 |a 1st ed. 2012. 
264 1 |a Berlin, Heidelberg :  |b Springer Berlin Heidelberg :  |b Imprint: Springer,  |c 2012. 
300 |a XIV, 146 p.  |b online resource. 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
347 |a text file  |b PDF  |2 rda 
490 1 |a Studies in Computational Intelligence,  |x 1860-9503 ;  |v 385 
505 0 |a Introduction -- Supervised Sequence Labelling -- Neural Networks -- Long Short-Term Memory -- A Comparison of Network Architectures -- Hidden Markov Model Hybrids -- Connectionist Temporal Classification -- Multidimensional Networks -- Hierarchical Subsampling Networks. 
520 |a Supervised sequence labelling is a vital area of machine learning, encompassing tasks such as speech, handwriting and gesture recognition, protein secondary structure prediction and part-of-speech tagging. Recurrent neural networks are powerful sequence learning tools-robust to input noise and distortion, able to exploit long-range contextual information-that would seem ideally suited to such problems. However their role in large-scale sequence labelling systems has so far been auxiliary.    The goal of this book is a complete framework for classifying and transcribing sequential data with recurrent neural networks only. Three main innovations are introduced in order to realise this goal. Firstly, the connectionist temporal classification output layer allows the framework to be trained with unsegmented target sequences, such as phoneme-level speech transcriptions; this is in contrast to previous connectionist approaches, which were dependent on error-prone prior segmentation. Secondly, multidimensional recurrent neural networks extend the framework in a natural way to data with more than one spatio-temporal dimension, such as images and videos. Thirdly, the use of hierarchical subsampling makes it feasible to apply the framework to very large or high resolution sequences, such as raw audio or video.   Experimental validation is provided by state-of-the-art results in speech and handwriting recognition. 
650 0 |a Computational intelligence. 
650 0 |a Artificial intelligence. 
650 1 4 |a Computational Intelligence. 
650 2 4 |a Artificial Intelligence. 
710 2 |a SpringerLink (Online service) 
773 0 |t Springer Nature eBook 
776 0 8 |i Printed edition:  |z 9783642432187 
776 0 8 |i Printed edition:  |z 9783642247989 
776 0 8 |i Printed edition:  |z 9783642247965 
830 0 |a Studies in Computational Intelligence,  |x 1860-9503 ;  |v 385 
856 4 0 |u https://doi.uam.elogim.com/10.1007/978-3-642-24797-2  |z Full Text 
912 |a ZDB-2-ENG 
912 |a ZDB-2-SXE 
950 |a Engineering (SpringerNature-11647) 
950 |a Engineering (R0) (SpringerNature-43712)