
ʎzy

ʎzy is a platform for hybrid execution of ML workflows that transparently integrates local and remote runtimes, with the following properties:

  • Python-native SDK
  • Automatic env (pip/conda) sync
  • K8s-native runtime
  • On-demand resource allocation
  • Env-independent results storage

Quick start

ʎzy allows running any Python function on a cluster by annotating it with the @op decorator:

# Assumed imports for this example; exact ʎzy module paths may differ between SDK versions
from catboost import CatBoostClassifier
from sklearn.utils import Bunch
from lzy.api.v1 import GpuType, Lzy, op

@op(gpu_count=1, gpu_type=GpuType.V100.name)
def train(data_set: Bunch) -> CatBoostClassifier:
    # Train a CatBoost classifier on the GPU provisioned for this op
    cb_model = CatBoostClassifier(iterations=1000, task_type="GPU", devices='0:1', train_dir='/tmp/catboost')
    cb_model.fit(data_set.data, data_set.target, verbose=True)
    return cb_model


# local Python function call
model = train(data_set)

# remote call on a cluster
lzy = Lzy()
with lzy.workflow("training"):
    model = train(data_set)
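
For an end-to-end picture, here is one way the data_set used above can be prepared and the trained model applied. The sklearn breast-cancer dataset is an illustrative assumption; any Bunch-like object with data and target fields would work the same way:

from sklearn.datasets import load_breast_cancer

# Any sklearn Bunch with .data and .target matches the train() signature above;
# load_breast_cancer is used purely for illustration.
data_set = load_breast_cancer()

lzy = Lzy()
with lzy.workflow("training"):
    model = train(data_set)  # runs remotely; the result is available locally after the block

# The returned CatBoostClassifier is an ordinary local object
print(model.predict(data_set.data[:5]))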

Please read the tutorial for details.

Runtime

Check out our key concepts and architecture intro.

Community

Join our chat on Telegram!

Development

Development guide.

Deployment

Deployment guide.