How We Built an ML Inference Platform with Knative - Dan Sun, Bloomberg LP & Animesh Singh, IBM
Building Machine Learning Inference Through Knative Serverless...- Shivay Lamba & Rishit Dagli
Knative Serverless for AI/ML Applications | Ian Lawson
Exploring ML Model Serving with KServe (with fun drawings) - Alexa Nicole Griffith, Bloomberg
What is vLLM? Efficient AI Inference for Large Language Models
Use Knative When You Can, and Kubernetes When You Must - David Hadas & Michael Maximilien, IBM
Machine Learning Model Serving and Pipeline Using Knative - Animesh Singh & Tommy Li, IBM
Evolving Deep Learning Platform with Knative - Ti Zhou, Baidu
Kubernetes Explained in 6 Minutes | k8s Architecture
Open-source Chassis.ml - Deploy Model to KServe
Serverless Machine Learning Model Inference on Kubernetes with KServe by Stavros Kontopoulos
Creating event-driven workflows using Knative & Direktiv
Metacontroller with Knative Functions - Kubernetes Controllers Made Simple - Lance Ball, Red Hat
Parallel inferencing with KServe Ray integration
Lightning Talk: What is the Knative Asynchronous Component? - Angelo Danducci II & Michael Maximilien
Connecting the World to Knative with Kamelets - Roland Huß, Red Hat
Introducing KFServing: Serverless Model Serving on Kubernetes - Ellis Bigelow & Dan Sun
Knative goes beyond serverless | Alexandre Roman
How to Create a Custom Serving Runtime in KServe ModelMesh to S... Rafael Vasquez & Christian Kadner
Polina Okuneva | Causal Inference. Advanced Modeling Methods