Pinned Repositories
Enzyme
High-performance automatic differentiation of LLVM and MLIR.
Enzyme.jl
Julia bindings for the Enzyme automatic differentiator
llvm-project
The LLVM Project is a collection of modular and reusable compiler and toolchain technologies.
Polygeist
C/C++ frontend for MLIR. Also features polyhedral optimizations, parallel optimizations, and more!
Enzyme-GPU-Tests
This repo contains the benchmarks for Enzyme on GPUs
Enzyme-Pytorch
Polygeist-Script
Tapir-Clang
Clang frontend for the Tapir compiler; includes a frontend implementation of the Cilk language
Tapir-LLVM
Tapir extension to LLVM for optimizing parallel programs
Tapir-Meta
wsmoses's Repositories
wsmoses/Enzyme-GPU-Tests
This repo contains the benchmarks for Enzyme on GPUs
wsmoses/binaries
To distribute binaries that don't otherwise have a home
wsmoses/Checkpointing.jl
Checkpointing for Automatic Differentiation
wsmoses/CUDA.jl
CUDA programming in Julia.
wsmoses/DynamicPPL.jl
Implementation of a domain-specific language (DSL) for dynamic probabilistic programming
wsmoses/FFTW.jl
Julia bindings to the FFTW library for fast Fourier transforms
wsmoses/flax
Flax is a neural network library for JAX that is designed for flexibility.
wsmoses/Flux.jl
Relax! Flux is the ML library that doesn't make you tensor
wsmoses/jax-md
Differentiable, Hardware Accelerated, Molecular Dynamics
wsmoses/jraph
A Graph Neural Network Library in Jax
wsmoses/julia
The Julia Language: A fresh approach to technical computing.
wsmoses/JuMP.jl
Modeling language for Mathematical Optimization (linear, mixed-integer, conic, semidefinite, nonlinear)
wsmoses/keras
Deep Learning for humans
wsmoses/keras-benchmarks
wsmoses/LinearSolve.jl
LinearSolve.jl: High-Performance Unified Interface for Linear Solvers in Julia. Easily switch between factorization and Krylov methods, add preconditioners, and more, all in one interface.
wsmoses/llvm
Fork of the LLVM Compiler Infrastructure
wsmoses/llvm-project
The LLVM Project is a collection of modular and reusable compiler and toolchain technologies.
wsmoses/Lux.jl
Explicitly Parameterized Neural Networks in Julia
wsmoses/maxtext
A simple, performant and scalable Jax LLM!
wsmoses/neuralgcm
Hybrid ML + physics model of the Earth's atmosphere
wsmoses/NonlinearSolve.jl
High-performance and differentiation-enabled nonlinear solvers (Newton methods) and bracketed rootfinding (bisection, regula falsi), with sparsity and Newton-Krylov support.
wsmoses/optax
Optax is a gradient processing and optimization library for JAX.
wsmoses/OptimizationBase.jl
The base package for Optimization.jl, containing the structs and basic functions for it.
wsmoses/orbax
Orbax provides common checkpointing and persistence utilities for JAX users
wsmoses/OrdinaryDiffEq.jl
High performance ordinary differential equation (ODE) and differential-algebraic equation (DAE) solvers, including neural ordinary differential equations (neural ODEs) and scientific machine learning (SciML)
wsmoses/SparseArrays.jl
SparseArrays.jl is a Julia stdlib
wsmoses/StanBlocks.jl
wsmoses/SymbolicRegression.jl
Distributed High-Performance Symbolic Regression in Julia
wsmoses/tensorflow
An Open Source Machine Learning Framework for Everyone
wsmoses/tf-keras
The TensorFlow-specific implementation of the Keras API, which was the default Keras from 2019 to 2023.