
Welcome to FlowyML 🌊

The Enterprise-Grade ML Pipeline Framework for Humans


FlowyML is designed for the modern MLOps team that values speed, reproducibility, and simplicity. It pairs the ease of a plain Python script with the scaling power of an enterprise platform.

πŸ’‘ The FlowyML Promise: Write your code once locally. Scale it to Vertex AI or SageMaker in production by changing a single environment variable. No refactoring, no infrastructure code in your business logic.


πŸ’Ž The FlowyML Philosophy

Why do teams choose FlowyML over traditional orchestrators?

  1. Pure Python, Zero DSLs: If you can write a Python function, you can write a FlowyML pipeline. No complex YAML structures or rigid DSLs to learn.
  2. Infrastructure as a Detail: Treat your cloud infrastructure as a configuration choice, not a coding requirement.
  3. Asset-First Lineage: We don't just track files; we track Assets. Models, Datasets, and Metrics are first-class citizens with automatic lineage and metadata.
  4. Developer Happiness: Intelligent caching, local debugging, and a beautiful UI make iteration loops significantly faster.

🌟 Next-Gen Execution Engine

🧠 Type-Based Artifact Routing

New in 1.8.0. Define WHAT an artifact is, and let FlowyML handle WHERE it goes.

@step
def train_model(...) -> Model:
    # Automatically saved to GCS/S3 and registered
    # to your Model Registry (Vertex AI, SageMaker, etc.)
    return Model(obj, name="classifier", version="1.0.0")

🌍 Multi-Stack Context

Switch between Local, Staging, and Production environments instantly. Your code remains clean while FlowyML handles the infrastructure heavy lifting.

# Locally: uses local disk and orchestrator
python pipeline.py

# Production: uses Vertex AI, GCS, and Model Registry
FLOWYML_STACK=gcp-prod python pipeline.py
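Conceptually, this kind of environment-variable switching maps a stack name to a bundle of infrastructure choices. The sketch below illustrates the idea in plain Python; the stack names mirror the docs above, but the mapping and function names are hypothetical, not FlowyML's actual implementation.

```python
import os

# Hypothetical stack registry: each name bundles infrastructure choices.
# "local" and "gcp-prod" mirror the examples above; the contents are illustrative.
STACKS = {
    "local": {"orchestrator": "local", "storage": "local-disk"},
    "gcp-prod": {"orchestrator": "vertex-ai", "storage": "gcs"},
}

def resolve_stack() -> dict:
    """Pick a stack config from FLOWYML_STACK, defaulting to local."""
    name = os.environ.get("FLOWYML_STACK", "local")
    return STACKS[name]
```

Because the pipeline code never reads this configuration directly, the same script runs unchanged against either stack.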

⚑ Intelligent Caching & Observability

Never re-run the same computation twice. Our smart caching system (code hash + input hash) saves time and money. Monitor everything in real-time with our premium dark-mode UI.
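The "code hash + input hash" idea can be sketched in a few lines of plain Python. This is an illustration of the concept, not FlowyML's internal code; here a hash of the function's bytecode stands in for a source-code hash so the example stays self-contained.

```python
import hashlib
import pickle

def cache_key(fn, *args, **kwargs) -> str:
    """Illustrative cache key combining a code hash with an input hash."""
    # Hash the step's compiled bytecode (a stand-in for hashing its source).
    code_hash = hashlib.sha256(fn.__code__.co_code).hexdigest()
    # Hash the serialized inputs; sort kwargs so ordering doesn't matter.
    input_hash = hashlib.sha256(
        pickle.dumps((args, sorted(kwargs.items())))
    ).hexdigest()
    return hashlib.sha256(f"{code_hash}:{input_hash}".encode()).hexdigest()

def train(data, lr=0.01):
    return sum(data) * lr
```

If neither the step's code nor its inputs change, the key is identical and the cached result can be reused; change either one and the key changes, forcing a re-run.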


πŸ”Œ The Universal Plugin Ecosystem

FlowyML features a powerful native plugin system that allows you to integrate with ANY ML tool without adding heavy framework dependencies to your core project.

  • Orchestrators --- Vertex AI, SageMaker, Kubernetes, Ray, Airflow.

  • Storage --- GCS, S3, Azure Blob, Local FS.

  • Trackers --- MLflow, Weights & Biases, Neptune, TensorBoard.

  • Registries & Deployers --- Vertex AI, SageMaker, MLflow, Kubernetes Endpoints.
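A plugin system like this typically works by having every backend implement a small shared contract. The sketch below shows what a storage plugin interface could look like; the class and method names are hypothetical illustrations, not FlowyML's real API.

```python
from abc import ABC, abstractmethod

class StoragePlugin(ABC):
    """Hypothetical contract every storage backend would implement."""

    @abstractmethod
    def save(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def load(self, key: str) -> bytes: ...

class InMemoryStorage(StoragePlugin):
    """Toy backend: a GCS or S3 plugin would implement the same two methods."""

    def __init__(self):
        self._store: dict[str, bytes] = {}

    def save(self, key: str, data: bytes) -> None:
        self._store[key] = data

    def load(self, key: str) -> bytes:
        return self._store[key]
```

Because the core only depends on the abstract contract, heavy cloud SDKs stay inside their plugin packages rather than in your project's dependencies.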


⚑️ Quick Start in 30 Seconds

This is a complete, multi-step ML pipeline with auto-injected context and typed outputs.

from flowyml import Pipeline, step, context, Model

@step(outputs=["dataset"])
def load_data():
    return [1, 2, 3, 4, 5]

@step(inputs=["dataset"], outputs=["model"])
def train_model(dataset, learning_rate: float = 0.01) -> Model:
    # 'learning_rate' is automatically injected from context!
    print(f"Training on {len(dataset)} items with lr={learning_rate}")
    return Model(data="weights", name="mnist_model", version="1.0.0")

# Configure and Run
ctx = context(learning_rate=0.05)
pipeline = Pipeline("quickstart", context=ctx)
pipeline.add_step(load_data).add_step(train_model)

pipeline.run()

πŸ—ΊοΈ Master the Platform

  • πŸš€ Getting Started --- Build your first pipeline in 5 minutes. Learn the basics of Steps and Pipelines.

  • πŸ“– Core Concepts --- Deep dive into the heart of FlowyML: Pipelines, Steps, Context, and Asset Lineage.

  • ⚑ Advanced Features --- Master Caching, Parallelism, Conditional Execution, and Step Grouping.

  • πŸ“ˆ User Guide --- Versioning, scheduling, model leaderboards, and UI metrics.

  • πŸ”Œ Plugins & Stacks --- Cloud integrations, model registries, type-based routing, and stack management.

  • πŸ§ͺ Practical Examples --- Browse working code for pipelines, UI integration, and cloud deployments.

  • πŸ›  API Reference --- Full technical documentation for classes, functions, and decorators.


πŸ—οΈ Practical Examples

Explore real-world implementations in the examples/ directory.


Questions? Open an issue on GitHub or join our community of MLOps enthusiasts.