Pinned Repositories
ai-on-z-101
AI on IBM Z GitHub launch point (https://ibm.github.io/ai-on-z-101)
ai-on-z-tensorflow-zcx
Demonstrating a z/OS program calling TensorFlow Serving
bento-onnxmlir-model-test
Tests to drive onnx-mlir support in BentoML
murmurhash
💥 Cython bindings for MurmurHash2
onnxruntime
ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
opt_example
simdjson
Parsing gigabytes of JSON per second
thinc-bigendian-ops
Make Thinc pipelines portable to big-endian platforms.
BentoML
The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
thinc
🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
andrewsi-z's Repositories
andrewsi-z/ai-on-z-101
AI on IBM Z GitHub launch point (https://ibm.github.io/ai-on-z-101)
andrewsi-z/ai-on-z-tensorflow-zcx
Demonstrating a z/OS program calling TensorFlow Serving
andrewsi-z/bento-onnxmlir-model-test
Tests to drive onnx-mlir support in BentoML
andrewsi-z/murmurhash
💥 Cython bindings for MurmurHash2
andrewsi-z/onnxruntime
ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
andrewsi-z/opt_example
andrewsi-z/simdjson
Parsing gigabytes of JSON per second
andrewsi-z/thinc-bigendian-ops
Make Thinc pipelines portable to big-endian platforms.