midjourney
New research lab. Exploring new mediums of thought. Expanding the imaginative powers of the human species.
Pinned Repositories
docs
(deprecated) Source for Midjourney's official wiki
einops
Deep learning operations reinvented (for pytorch, tensorflow, jax and others)
flash-attention
Fast and memory-efficient exact attention
flash-attention-jax
Implementation of Flash Attention in JAX
flax
Flax is a neural network library for JAX that is designed for flexibility.
hf-transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
jax
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more (a minimal sketch follows this list)
nanobind
nanobind: tiny and efficient C++/Python bindings
TransformerEngine
A library for accelerating Transformer models on NVIDIA GPUs, including 8-bit floating point (FP8) precision on Hopper and Ada GPUs, for better performance and lower memory utilization in both training and inference.
transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
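The jax entry above names three composable transformations (differentiate, vectorize, JIT). The sketch below is not taken from any of these repositories; it is a minimal illustration of those transformations on a toy squared-error loss, with all function and variable names chosen for the example.

```python
# Minimal, self-contained sketch of JAX's composable transformations
# (grad, vmap, jit) applied to a toy linear-model loss. Names are illustrative.
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Mean squared error of a linear model x @ w against targets y.
    return jnp.mean((x @ w - y) ** 2)

grad_loss = jax.grad(loss)                          # differentiate w.r.t. the first argument (w)
per_example = jax.vmap(loss, in_axes=(None, 0, 0))  # vectorize over a leading batch axis
fast_grad = jax.jit(grad_loss)                      # JIT-compile to GPU/TPU via XLA

w = jnp.zeros(3)
x = jnp.ones((8, 3))
y = jnp.ones(8)
print(fast_grad(w, x, y).shape)    # (3,) -- gradient has the same shape as w
print(per_example(w, x, y).shape)  # (8,) -- one loss value per batch element
```

Because the transformations are ordinary functions that return functions, they compose freely, e.g. jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0, 0))) is also valid.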
midjourney's Repositories
midjourney/docs
(deprecated) Source for Midjourney's official wiki
midjourney/transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
midjourney/jax
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
midjourney/flax
Flax is a neural network library for JAX that is designed for flexibility.
midjourney/flash-attention-jax
Implementation of Flash Attention in JAX
midjourney/einops
Deep learning operations reinvented (for pytorch, tensorflow, jax and others)
midjourney/flash-attention
Fast and memory-efficient exact attention
midjourney/hf-transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
midjourney/TransformerEngine
A library for accelerating Transformer models on NVIDIA GPUs, including 8-bit floating point (FP8) precision on Hopper and Ada GPUs, for better performance and lower memory utilization in both training and inference.
midjourney/nanobind
nanobind: tiny and efficient C++/Python bindings
midjourney/equinox
Elegant easy-to-use neural networks in JAX. https://docs.kidger.site/equinox/
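For the equinox entry just above, here is a minimal usage sketch, not taken from this fork, showing how an Equinox model is built and then called like an ordinary function; the layer sizes and names are illustrative.

```python
# Minimal sketch of defining and calling a small Equinox model in JAX.
# Sizes and names are illustrative; see https://docs.kidger.site/equinox/ for the full API.
import jax
import jax.numpy as jnp
import equinox as eqx

key = jax.random.PRNGKey(0)
model = eqx.nn.MLP(in_size=4, out_size=2, width_size=16, depth=2, key=key)

x = jnp.ones(4)   # a single (unbatched) input
y = model(x)      # Equinox modules are callable like plain functions
print(y.shape)    # (2,)
```

Equinox models are registered as JAX PyTrees, so they compose directly with the jax.grad, jax.vmap, and jax.jit transformations sketched earlier.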