Working with transformer-based embeddings for text similarity matching.
Embeddings from transformer models such as BERT can be used as representations of sentences. In this session, Matteus Tanha works with these embeddings to match similar sentences or paragraphs by exploring a few different distance metrics. The focus is on the application of transformer models but he...
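The session's general approach (sentence embeddings from a transformer such as BERT, compared with different distance metrics) can be sketched roughly as below. This is a minimal illustration, not the instructor's actual code; the model name, mean pooling, and the choice of cosine similarity and Euclidean distance are assumptions for the example.

```python
# Sketch: BERT sentence embeddings compared with two distance metrics.
# Assumes the transformers, torch, and scikit-learn packages are installed.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.metrics.pairwise import cosine_similarity, euclidean_distances

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentences = [
    "Transformer embeddings can represent sentence meaning.",
    "BERT vectors can be used to match similar paragraphs.",
    "The weather was sunny all weekend.",
]

# Tokenize, run the model, and mean-pool token embeddings into one vector per sentence.
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
mask = inputs["attention_mask"].unsqueeze(-1)           # ignore padding positions
summed = (outputs.last_hidden_state * mask).sum(dim=1)
embeddings = (summed / mask.sum(dim=1)).numpy()

# Compare the sentences under two different metrics.
print("Cosine similarity:\n", cosine_similarity(embeddings))
print("Euclidean distance:\n", euclidean_distances(embeddings))
```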
Classification: | Electronic Book
---|---
Format: | Electronic Video
Language: | English
Published: | [Place of publication not identified] : Manning Publications, 2021.
Edition: | [First edition].
Subjects: |
Online access: | Full text (requires prior registration with an institutional email address)
Similar Items

- Mastering transformers : build SOTA models from scratch with advanced natural language processing techniques /
  by: Yıldırım, Savaş, et al.
  Published: (2021)
- Natural language processing /
  Published: (2021)
- Natural language processing from scratch /
  Published: (2019)
- Adding large files for NL datasets.
  Published: (2022)
- Train Word embeddings from scratch with Nessvec and PyTorch.
  Published: (2022)