I've used Metaflow for the past 4 years or so on different ML teams. It's really great!
Straightforward for data/ML scientists to pick up, familiar Python class API for defining DAGs, and it simplifies scaling out parallel jobs on AWS Batch (or k8s). The UI is pretty nice. Been happy to see the active development on it too.
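For the curious, here's a stripped-down sketch of what that class API looks like (flow, step, and artifact names are made up, and the scoring is a stand-in; the foreach fan-out mirrors how we shard those protein jobs):

    from metaflow import FlowSpec, step

    class ScoreDesignsFlow(FlowSpec):

        @step
        def start(self):
            # artifacts assigned to self.* are versioned and persisted
            self.designs = ["seq_a", "seq_b", "seq_c"]
            # fan out: one parallel task per design
            self.next(self.score, foreach="designs")

        @step
        def score(self):
            self.design = self.input
            self.result = len(self.design)  # stand-in for a real model call
            self.next(self.join)

        @step
        def join(self, inputs):
            # collect results from the parallel branches
            self.results = [i.result for i in inputs]
            self.next(self.end)

        @step
        def end(self):
            print(self.results)

    if __name__ == "__main__":
        ScoreDesignsFlow()

You run it locally with "python score_designs.py run", and the same file scales out once you point the steps at Batch or k8s.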
Currently using it at our small biotech startup to run thousands of protein engineering computations (including models like RFDiffusion, ProteinMPNN, boltz, AlphaFold, ESM, etc.).
Data-engineering-focused DAG tools like Airflow are awkward for these kinds of ML computations, where we don't need the complexity of schedules, etc. Metaflow, imho, is also a step up from orchestration tools that were born out of bioinformatics groups, like Snakemake or Nextflow.
Have been looking for an orchestrator for AI workflows, including agentic workflows, and this seemed to be the most promising (open source, free, can self-host, and supports dynamic workflows).
But have not seen anyone talk about it in that context. What do people use for AI workflow orchestration (aside from langchain)?
I've been curious about this project for a while...
If you squint a bit it's sort of like an Airflow that can run on AWS Step Functions.
Step Functions sort of gives you fully serverless orchestration, which feels like a thing that should exist. But the process for authoring them is very cumbersome; they are crying out for a nice language-level library, e.g. for Python, something that creates steps via decorator syntax.
And it looks like Metaflow basically provides that (as well as for other backends).
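e.g., if I'm reading the docs right, the Step Functions deployment is basically a one-liner (the flow filename here is my own placeholder):

    # compile the flow into an AWS Step Functions state machine
    python myflow.py step-functions create

    # trigger a run that executes entirely on AWS, no local process needed
    python myflow.py step-functions trigger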
The main thing holding me back is the lack of ecosystem. A big chunk of what I want to run on an orchestrator is things like dbt and dlt jobs, both of which have strong integrations for both Airflow and Dagster, whereas Metaflow feels like it's not really on the radar, not widely used.
Possibly I've got the wrong end of the stick a bit, because Metaflow also provides an Airflow backend, which makes me wonder: in that case, why bother with Metaflow at all?
A while ago I saw a promising Clojure project, stepwise [0], which sounds pretty close to what you're describing. It not only lets you define steps in code, but also implements cool stuff like the ability to write conditions, error statuses, and resources in much-less-verbose EDN instead of JSON. It also supports code reloading and offloading large payloads to S3.
Metaflow was started to address the needs of ML/AI projects whereas Airflow and Dagster started in data engineering.
Consequently, a major part of Metaflow focuses on facilitating easy and efficient access to (large-scale) compute, including dependency management, and on local experimentation, which is out of scope for Airflow and Dagster.
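To make that concrete, here's roughly what the compute and dependency story looks like in code (the flow name, versions, and resource numbers below are placeholders):

    from metaflow import FlowSpec, conda_base, resources, step

    # each step runs in an isolated, reproducible environment
    @conda_base(python="3.11", libraries={"pandas": "2.2.0"})
    class TrainFlow(FlowSpec):

        # resource requests are honored by the Batch/k8s backends
        @resources(memory=32000, cpu=8)
        @step
        def start(self):
            import pandas as pd
            self.df = pd.DataFrame({"x": [1, 2, 3]})
            self.next(self.end)

        @step
        def end(self):
            pass

    if __name__ == "__main__":
        TrainFlow()

The same flow runs unchanged on a laptop for local experimentation and on the cloud backend.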
Metaflow has basic support for dbt, and companies increasingly use it to power data engineering as AI eats the world, but if you just need an orchestrator for ETL pipelines, Dagster is a great choice.
If you are curious to hear how companies navigate the Airflow vs. Metaflow question, see e.g. this recent talk by Flexport: https://youtu.be/e92eXfvaxU0
I went to the GitHub page. The description of the service seems redundant with what cloud providers offer today. I looked at the documentation and it lacks concrete examples of implementation flows.
Seems like something new to learn, an added layer on top of existing workflows, with no obvious benefit.
All the cloud providers have some hosted / custom version of an AI/ML deployment and training system. Good enough to use, janky enough to probably not meet all your needs if you're serious.
As a fun historical sidebar, and an illustration that there are no new names in tech these days: Metaflow was also the name of the company that first introduced out-of-order speculative execution to CISC architectures using micro-ops. [1]
A big deal is that they get packaged automatically for remote execution. And you can attach them on the command line without touching code, which makes it easy to build pipelines with pluggable functionality - think e.g. switching an LLM provider on the fly.
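e.g., something like this (flow name and option values here are illustrative):

    # same flow, zero code changes: add retries and run steps on AWS Batch
    python llm_flow.py run --with retry --with batch

    # decorator options can be set from the command line too
    python llm_flow.py run --with retry:times=4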
Metaflow tracks all artifacts and lets you build dashboards with them, so there's no need to use MLflow per se. There are Metaflow integrations in Weights & Biases, CometML, etc., if you want pretty off-the-shelf dashboards.
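e.g., reading artifacts back with the client API (flow and artifact names are hypothetical):

    from metaflow import Flow

    # grab the latest successful run and any artifact assigned to self.* in it
    run = Flow("TrainFlow").latest_successful_run
    print(run.data.results)

That's often enough for a quick notebook dashboard without standing anything else up.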
Just a satisfied customer of Metaflow here. thx