Someone: How's everything going in production?
Engineers: 😅
Let's start this blog with the basics, since some of you might not know what MLOps is, right?
MLOps, or Machine Learning Operations, is a set of practices that streamline the process of developing, deploying, and maintaining machine learning (ML) models.
In short, MLOps is an extension of DevOps.
In this blog, I'll be covering the top 5 MLOps tools, all of them open source, that can help you build on top of the products you already use!
So, let's start. 3... 2... 1...
1️⃣ KitOps
KitOps is an innovative open-source project designed to enhance collaboration among data scientists, application developers, and SREs working on integrating or managing self-hosted AI/ML models.
But why KitOps?
There is no standard, versioned packaging system for AI/ML projects.
KitOps was built to solve this. The goal of KitOps is to be a library of versioned packages for your AI project, stored in an enterprise registry you already use.
Here's why Kit's ModelKits are the better solution:
- Combine models, datasets, code, and all the context teams need to integrate, test, or deploy:
  - Training code
  - Model code
  - Serialized model
  - Training, validation, and other datasets
  - Metadata
- Let teams reuse their existing container registries by packaging everything as an OCI-compliant artifact.
- Support unpacking only a piece of the model package to your local machine (saving time and space).
- Remove tampering risks by using an immutable package.
- Reduce risks by including the provenance of the model and datasets.
Use `kit pack` to package up your Jupyter notebook, serialized model, and datasets (based on a Kitfile). Then `kit push` it to any OCI-compliant registry, even a private one.
Most people won't need everything, so just `kit unpack` from the remote registry to get just the model, only the datasets, or just the notebook. Or, if you need everything, then a `kit pull` will grab everything.
What's Inside KitOps?
ModelKit
At the heart of KitOps is the ModelKit, an OCI-compliant packaging format that enables the seamless sharing of all necessary artifacts involved in the AI/ML model lifecycle.
Kitfile
Complementing the ModelKit is the Kitfile, a YAML-based configuration file that simplifies the sharing of model, dataset, and code configurations.
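To make that concrete, here is a minimal sketch of what a Kitfile might look like. The project name, paths, and versions are purely illustrative, so check the Kitfile reference linked below for the exact schema.

```yaml
manifestVersion: "1.0"

package:
  name: churn-predictor              # hypothetical project name
  version: 1.0.0
  description: Customer churn prediction model

model:
  name: churn-model
  path: ./models/churn_model.pkl     # serialized model to package
  framework: scikit-learn

datasets:
  - name: training-data
    path: ./data/train.csv

code:
  - path: ./notebooks/train.ipynb    # training notebook
```

With a Kitfile like this in place, the `kit pack`, `kit push`, and `kit unpack` commands mentioned above operate on the resulting ModelKit.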
Kit CLI
Bringing everything together is the Kit Command Line Interface (CLI). The Kit CLI is a powerful tool that enables users to create, manage, run, and deploy ModelKits using Kitfiles.
**Some Useful Links**

| Topic | Link |
| --- | --- |
| Getting started with KitOps | https://kitops.ml/docs/get-started.html |
| How to make your own Kitfile | https://kitops.ml/docs/next-steps.html |
| How KitOps is used | https://kitops.ml/docs/use-cases.html |
| How KitOps is different | https://kitops.ml/docs/versus.html |
| KitOps ModelKit | https://kitops.ml/docs/modelkit/intro.html |
| KitOps Kitfile | https://kitops.ml/docs/kitfile/kf-overview.html |
| KitOps CLI reference | https://kitops.ml/docs/cli/cli-reference.html |
Don't forget to join the official KitOps Discord channel: https://discord.gg/EtmEN5gyV9
2️⃣ Kubeflow
Kubeflow is an open-source platform for machine learning and MLOps on Kubernetes introduced by Google. The different stages in a typical machine learning lifecycle are represented with different software components in Kubeflow, including model development, model training, model serving, and automated machine learning.
- Kubeflow is compatible with cloud services (AWS, GCP, Azure) and self-hosted setups.
- It allows machine learning engineers to integrate all kinds of AI frameworks for training, fine-tuning, scheduling, and deploying models (a minimal pipeline sketch follows after this list).
- It provides a centralized dashboard for monitoring and managing pipelines, editing code in Jupyter Notebooks, experiment tracking, a model registry, and artifact storage.
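Pipelines in Kubeflow are typically defined with the Kubeflow Pipelines (kfp) Python SDK. Here is a minimal sketch, assuming the kfp v2 SDK is installed; the component body is a placeholder rather than real training code.

```python
from kfp import dsl, compiler

@dsl.component
def train(learning_rate: float) -> float:
    # Placeholder "training" step; a real component would fit a model here.
    accuracy = 0.90 + learning_rate  # dummy metric
    return accuracy

@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline(learning_rate: float = 0.01):
    train(learning_rate=learning_rate)

if __name__ == "__main__":
    # Produces a pipeline definition you can upload through the Kubeflow UI.
    compiler.Compiler().compile(training_pipeline, "pipeline.yaml")
```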
**Some Useful Links**

| Topic | Link |
| --- | --- |
| Introduction to Kubeflow | https://www.kubeflow.org/docs/started/introduction/ |
| Kubeflow Architecture | https://www.kubeflow.org/docs/started/architecture/ |
| Installing Kubeflow | https://www.kubeflow.org/docs/started/installing-kubeflow/ |
| Kubeflow Concepts | https://www.kubeflow.org/docs/concepts/ |
| Kubeflow Components | https://www.kubeflow.org/docs/components/ |
| Kubeflow External Add-ons | https://www.kubeflow.org/docs/components/ |
3️⃣ MLflow
MLflow is an open-source platform, purpose-built to assist machine learning practitioners and teams in handling the complexities of the machine learning process. MLflow focuses on the full lifecycle for machine learning projects, ensuring that each phase is manageable, traceable, and reproducible.
It is generally used for experiment tracking and logging. However, with time, it has become an end-to-end MLOps tool for all kinds of machine learning models, including LLMs (Large Language Models).
You can manage the entire machine learning ecosystem using the CLI, Python, R, Java, or the REST API.

MLflow has six core components:
- Tracking: version and store parameters, code, metrics, and output files, with interactive visualizations for metrics and parameters (see the sketch after this list).
- Projects: package data science source code for reusability and reproducibility.
- Models: store machine learning models and their metadata in a standard format that downstream tools can consume. It also provides model serving and deployment options.
- Model Registry: a centralized model store for managing the lifecycle of MLflow Models. It provides versioning, model lineage, model aliasing, model tagging, and annotations.
- Recipes (Pipelines): machine learning pipelines that let you quickly train high-quality models and deploy them to production.
- LLMs: support for LLM evaluation, prompt engineering, tracking, and deployment.
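As an illustration of the Tracking component, here is a minimal sketch that logs a run with the Python API; the experiment name and the parameter and metric values are made up.

```python
import mlflow

# Runs are grouped under an experiment; it is created if it doesn't exist.
mlflow.set_experiment("churn-baseline")

with mlflow.start_run(run_name="logistic-regression"):
    mlflow.log_param("learning_rate", 0.01)   # hyperparameters
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("accuracy", 0.93)       # evaluation results
    mlflow.log_metric("f1", 0.88)
```

Running `mlflow ui` locally then lets you browse and compare the logged runs in the browser.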
**Some Useful Links**

| Topic | Link |
| --- | --- |
| MLflow Overview | https://mlflow.org/docs/latest/index.html |
| Getting Started with MLflow | https://mlflow.org/docs/latest/tracking.html |
| MLflow Tracing | https://mlflow.org/docs/latest/llms/tracing/index.html |
| MLflow Models | https://mlflow.org/docs/latest/models.html |
| MLflow Tracking | https://mlflow.org/docs/latest/tracking.html |
| MLflow Model Registry | https://mlflow.org/docs/latest/models.html |
| MLflow Recipes | https://mlflow.org/docs/latest/recipes.html |
| MLflow Projects | https://mlflow.org/docs/latest/recipes.html |
4️⃣ Metaflow
Metaflow is a human-friendly Python library that makes it straightforward to develop, deploy, and operate various kinds of data-intensive applications, in particular those involving data science, ML, and AI.

Metaflow was originally developed at Netflix to boost the productivity of data scientists working on a wide variety of projects, from classical statistics to state-of-the-art deep learning. It has since been open-sourced so that everyone can benefit from it.
It provides a unified API for data management, versioning, orchestration, model training and deployment, and compute, and it is compatible with the major cloud providers and machine learning frameworks.
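To give a feel for the API, here is a minimal sketch of a Metaflow flow; the step bodies are placeholders rather than real training code.

```python
from metaflow import FlowSpec, step

class TrainingFlow(FlowSpec):
    """A tiny linear flow: start -> train -> end."""

    @step
    def start(self):
        # Instance attributes become versioned artifacts automatically.
        self.learning_rate = 0.01
        self.next(self.train)

    @step
    def train(self):
        # Stand-in for a real training step.
        self.accuracy = 0.93
        self.next(self.end)

    @step
    def end(self):
        print(f"accuracy={self.accuracy}")

if __name__ == "__main__":
    TrainingFlow()
```

Save it as, say, training_flow.py and run it with `python training_flow.py run`; Metaflow records every run and its artifacts for you.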
**Some Useful Links**

| Topic | Link |
| --- | --- |
| Metaflow Python Docs | https://docs.metaflow.org/ |
| Metaflow R Docs | https://docs.metaflow.org/v/r |
| Metaflow Admin Docs | https://docs.outerbounds.com/engineering/welcome/ |
5️⃣ MLRun
MLRun is an open-source AI orchestration framework for managing ML and generative AI applications across their lifecycle. It automates data preparation, model tuning, customization, validation and optimization of ML models, LLMs, and live AI applications over elastic resources. MLRun enables the rapid deployment of scalable real-time serving and application pipelines while providing built-in observability and flexible deployment options, supporting multi-cloud, hybrid, and on-prem environments.
Core Components:
- Project Management: a centralized hub that manages various project assets such as data, functions, jobs, workflows, secrets, and more.
- Data and Artifacts: connect various data sources, manage metadata, catalog, and version the artifacts.
- Feature Store: store, prepare, catalog, and serve model features for training and deployment.
- Batch Runs and Workflows: run one or more functions and collect, track, and compare all their results and artifacts (see the sketch after this list).
- Real-Time Serving Pipeline: fast deployment of scalable data and machine learning pipelines.
- Real-time monitoring: monitors data, models, resources, and production components.
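Here is a rough sketch of how a project and a batch run look in the Python SDK. It assumes an MLRun service is reachable (or local mode is used), and trainer.py with its train() handler are hypothetical names.

```python
import mlrun

# Create (or load) a project that will track functions, runs, and artifacts.
project = mlrun.get_or_create_project("demo", context="./")

# Register a local Python file as an MLRun "job" function.
project.set_function("trainer.py", name="trainer",
                     kind="job", image="mlrun/mlrun", handler="train")

# Execute it; parameters, results, and artifacts are tracked automatically.
run = project.run_function("trainer", params={"learning_rate": 0.01}, local=True)
print(run.outputs)
```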
**Some Useful Links**

| Topic | Link |
| --- | --- |
| MLRun Architecture | https://docs.mlrun.org/en/stable/architecture.html |
| MLRun Tutorials and Examples | https://docs.mlrun.org/en/stable/tutorials/index.html |
| MLRun Installation and Setup Guide | https://docs.mlrun.org/en/stable/install.html |
| MLRun GenAI Development Workflow | https://docs.mlrun.org/en/stable/genai/genai-flow.html |
| MLRun MLOps Development Workflow | https://docs.mlrun.org/en/stable/mlops-dev-flow.html |
Moving to the end...
Each of these projects shares some similarities with the others, yet each has its own strengths, and every product's needs are different, so pick the tool that fits yours.
If you're an open-source enthusiast with an interest or background in MLOps/DevOps, you can contribute to these awesome repos.
And as I always say, thanks for reading this far.
You're awesome! Have a good day!