The most flexible way to serve AI/ML models in production: build model inference services, LLM APIs, inference graphs/pipelines, compound AI systems, multi-modal applications, RAG as a Service, and more!
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.11...v1.2.12
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.10...v1.2.11
`bentoml get` output in YAML format by @frostming in https://github.com/bentoml/BentoML/pull/4638
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.9...v1.2.10
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.8...v1.2.9
`__init__` construction by @aarnphm in https://github.com/bentoml/BentoML/pull/4576
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.7...v1.2.8
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.6...v1.2.7
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.5...v1.2.6
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.4...v1.2.5
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.3...v1.2.4
Full Changelog: https://github.com/bentoml/BentoML/compare/v1.2.2...v1.2.3