Transfer learning for natural language processing.
Paul Azunre uses Kaggle kernels to experiment with key NLP architectures such as BERT and GPT, generating text, performing classification, and more as time permits. He also discusses major advances in NLP since his book "Transfer Learning for Natural Language Processing" was published in August 2021.
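The session itself is notebook-driven, and its exact code is not reproduced in this record. As a rough, hedged sketch of the kind of experiment described (assuming the Hugging Face `transformers` library and the public `gpt2` and default sentiment-analysis checkpoints, which are illustrative choices, not confirmed details of the talk), GPT-style generation and BERT-style classification can each be run in a few lines:

```python
from transformers import pipeline

# Text generation with a GPT-style model (checkpoint chosen for illustration only)
generator = pipeline("text-generation", model="gpt2")
prompt = "Transfer learning lets NLP models"
print(generator(prompt, max_new_tokens=30)[0]["generated_text"])

# Text classification with a BERT-family model
# (the default sentiment-analysis checkpoint is a DistilBERT fine-tune)
classifier = pipeline("sentiment-analysis")
print(classifier("Transfer learning makes fine-tuning on small datasets practical."))
```

Both pipelines download pretrained weights on first use, which is the core idea of transfer learning: reuse a model pretrained on large corpora instead of training from scratch.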
Classification: Electronic Book
Format: Electronic Video
Language: English
Published: [Place of publication not identified] : Manning Publications, [2022]
Edition: [First edition].
Online access: Full text (requires prior registration with an institutional email)
Similar Items
- Deep learning for natural language processing : applications of deep neural networks to machine learning tasks /
  Published: (2017)
- Natural language annotation for machine learning /
  by: Pustejovsky, J. (James), et al.
  Published: (2013)
- Train Word embeddings from scratch with Nessvec and PyTorch.
  Published: (2022)
- Use PyNNDescent and `nessvec` to index high dimensional vectors (word embeddings).
  Published: (2022)
- Natural language processing recipes : unlocking text data with machine learning and deep learning using Python /
  by: Kulkarni, Akshay
  Published: (2021)