Packaging Machine Learning Models with Docker


Bibliographic Details
Main Authors: Deza, Alfredo (Author), Gift, Noah (Author)
Corporate Author: Safari, an O'Reilly Media Company (Contributor)
Format: Video
Language: English
Published: [Place of publication not identified] : Pragmatic AI Solutions, 2021
Edition: 1st edition.
Online Access: Full text (requires prior registration with an institutional email address)
Description
Summary: One of the important aspects of MLOps (Machine Learning Operations, or operationalizing machine learning) is packaging ML models. How exactly do you package an ML model? In this video I show you exactly what that means and walk through the process of packaging an ONNX model taken from the ONNX Model Zoo. I end up with a Docker container that can be shared, exposing an API that is ready to consume and perform live predictions for sentiment analysis. Topics include:
* The concepts behind packaging machine learning models
* Creating a sentiment analysis API tool with Flask
* Defining dependencies and a Dockerfile for packaging
* Creating a container with an ONNX model that can be deployed anywhere with an HTTP API

A few resources that are helpful if you are trying to get started with SBOMs, generating them, and using them to capture vulnerabilities:
* The RoBERTa ONNX model
* Schema labeling concepts for Docker containers
* The Practical MLOps code repository, full of examples
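The step "defining dependencies and a Dockerfile for packaging" could be sketched roughly as below. The file names (`app.py`, `requirements.txt`), the model file name, and the base image are assumptions for illustration, not the exact contents shown in the video:

```dockerfile
# Minimal sketch: package a Flask + ONNX inference app into a container.
FROM python:3.9-slim
WORKDIR /app

# requirements.txt would pin flask, onnxruntime, and any tokenizer deps.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# app.py holds the Flask API; the .onnx model ships inside the image
# so the container is self-contained and deployable anywhere.
COPY app.py model.onnx ./

EXPOSE 5000
CMD ["python", "app.py"]
```

Building with `docker build -t sentiment-api .` and running with `docker run -p 5000:5000 sentiment-api` would expose the HTTP prediction API described in the summary.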
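The summary describes an API that performs live predictions for sentiment analysis from an ONNX model. As a hedged illustration of the post-processing such an API typically performs (the label set, function names, and example logits below are assumptions for this sketch, not details taken from the video), the raw logits returned by an ONNX runtime session can be turned into a sentiment label like this:

```python
import math

# Illustrative label set for a RoBERTa-style sentiment classifier;
# the actual model's labels may differ.
LABELS = ["negative", "positive"]

def softmax(logits):
    """Convert raw model logits into probabilities that sum to 1."""
    # Subtract the max logit for numerical stability before exponentiating.
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits):
    """Return the most likely sentiment label and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]

# In the real service these logits would come from an onnxruntime
# InferenceSession run inside the Flask request handler.
label, prob = predict_label([-1.2, 2.3])
```

The Flask route would simply parse the request body, tokenize the text, run the ONNX session, and return `predict_label(...)` as JSON.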
Notes: Online resource; title from title screen (viewed May 27, 2021).
Physical Description: 1 online resource (1 video file, circa 37 min.)