Pinned Repositories
composable_kernel
Composable Kernel: Performance Portable Programming Model for Machine Learning Tensor Operators
hcc
HCC is an open-source, optimizing C++ compiler for heterogeneous compute, currently targeting the ROCm GPU computing platform
hip
HIP: C++ Heterogeneous-Compute Interface for Portability
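To give a feel for the HIP programming model, here is a minimal vector-add sketch (the kernel name and sizes are illustrative, not taken from the repository; error checking is omitted for brevity). The kernel syntax and runtime calls mirror CUDA, which is what makes HIP code portable across AMD and NVIDIA GPUs:

    #include <hip/hip_runtime.h>
    #include <cstdio>
    #include <vector>

    // Element-wise add; the kernel body is identical to its CUDA counterpart.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

        float *da, *db, *dc;
        hipMalloc(&da, n * sizeof(float));
        hipMalloc(&db, n * sizeof(float));
        hipMalloc(&dc, n * sizeof(float));
        hipMemcpy(da, ha.data(), n * sizeof(float), hipMemcpyHostToDevice);
        hipMemcpy(db, hb.data(), n * sizeof(float), hipMemcpyHostToDevice);

        // Triple-chevron launches work on both the AMD and NVIDIA backends.
        vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);

        hipMemcpy(hc.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
        printf("c[0] = %.1f\n", hc[0]);   // expect 3.0

        hipFree(da); hipFree(db); hipFree(dc);
        return 0;
    }

Compiled with hipcc, the same source targets AMD GPUs through ROCm or NVIDIA GPUs through the CUDA backend.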
HIPIFY
HIPIFY: Convert CUDA to Portable C++ Code
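As a rough sketch of what hipify-perl or hipify-clang does to a CUDA source file (the kernel and variable names here are hypothetical): CUDA runtime calls, headers, and types are rewritten one-for-one onto their HIP equivalents, while kernel syntax and built-ins pass through unchanged. Each line below shows the HIP output, with the original CUDA spelling in the trailing comment:

    #include <hip/hip_runtime.h>                    // was: #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void scale(float* x, float s, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // built-ins unchanged
        if (i < n) x[i] *= s;
    }

    int main() {
        const int n = 256;
        float* d_x;
        hipMalloc(&d_x, n * sizeof(float));         // was: cudaMalloc
        hipMemset(d_x, 0, n * sizeof(float));       // was: cudaMemset
        scale<<<1, n>>>(d_x, 2.0f, n);              // launch syntax unchanged
        hipDeviceSynchronize();                     // was: cudaDeviceSynchronize
        hipFree(d_x);                               // was: cudaFree
        printf("done\n");
        return 0;
    }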
MIOpen
AMD's Machine Intelligence Library
rocBLAS
Next-generation BLAS implementation for the ROCm platform
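A minimal sketch of calling rocBLAS, assuming the current <rocblas/rocblas.h> header location (older releases shipped it as <rocblas.h>); status codes and error checking are omitted for brevity. The library follows the familiar handle-based BLAS pattern, shown here with single-precision AXPY:

    #include <hip/hip_runtime.h>
    #include <rocblas/rocblas.h>   // older releases: <rocblas.h>
    #include <cstdio>
    #include <vector>

    int main() {
        rocblas_handle handle;
        rocblas_create_handle(&handle);

        const int n = 4;
        std::vector<float> hx(n, 2.0f), hy(n, 1.0f);
        float alpha = 3.0f;

        float *dx, *dy;
        hipMalloc(&dx, n * sizeof(float));
        hipMalloc(&dy, n * sizeof(float));
        hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);
        hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

        // y = alpha * x + y, computed on the GPU.
        rocblas_saxpy(handle, n, &alpha, dx, 1, dy, 1);

        hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
        printf("y[0] = %.1f\n", hy[0]);   // expect 7.0

        hipFree(dx); hipFree(dy);
        rocblas_destroy_handle(handle);
        return 0;
    }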
ROCK-Kernel-Driver
AMDGPU driver with KFD used by the ROCm project. Also contains the current Linux kernel source that matches this base driver
ROCm
AMD ROCm™ Software - GitHub Home
ROCm-docker
Dockerfiles for the various software layers defined in the ROCm software platform
tensorflow-upstream
TensorFlow ROCm port
AMD ROCm™ Software's Repositories
ROCm/HIP-Examples
Examples for HIP
ROCm/OpenFOAM_HMM
Refactoring OpenFOAM with OpenMP target offloading and HMM (heterogeneous memory management) to offload work onto GPUs
ROCm/pyrsmi
Python bindings for rocm-smi-lib
ROCm/pytorch-micro-benchmarking
ROCm/hipify_torch
ROCm/libhipcxx
The C++ Standard Library for your entire system.
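A minimal sketch of what a heterogeneous standard library enables, assuming libhipcxx follows the libcu++ layout it derives from (headers under <hip/std/...>, namespace hip::std): the same atomic type works in host and device code.

    #include <hip/hip_runtime.h>
    #include <hip/std/atomic>   // assumption: libhipcxx mirrors libcu++'s header layout
    #include <cstdio>
    #include <new>

    // Every GPU thread bumps a heterogeneous atomic counter.
    __global__ void count(hip::std::atomic<int>* counter) {
        counter->fetch_add(1, hip::std::memory_order_relaxed);
    }

    int main() {
        hip::std::atomic<int>* counter;
        hipMallocManaged((void**)&counter, sizeof(*counter));   // host- and device-visible
        new (counter) hip::std::atomic<int>(0);

        count<<<4, 64>>>(counter);
        hipDeviceSynchronize();

        printf("%d\n", counter->load());   // the host reads the same atomic: expect 256
        hipFree(counter);
        return 0;
    }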
ROCm/hipDF
hipDF - GPU DataFrame Library
ROCm/Paddle
PArallel Distributed Deep LEarning
ROCm/rtg_tracer
ROCm/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
ROCm/hip-testsuite
ROCm/hipCOMP-core
hipCOMP is a library for fast lossless compression/decompression on the GPU. This repository contains the algorithms.
ROCm/rocHPL-MxP
ROCm/benchmarks
Benchmark code
ROCm/CTranslate2
Fast inference engine for Transformer models
ROCm/hip-mpi-testsuite
ROCm/IRFuzzer
AMD fork of IRFuzzer
ROCm/migraphx-benchmark
ROCm/rocm-recipes
Recipes for ROCm
ROCm/torchtitan
ROCm/tritonserver-onnxruntime
ROCm/ByteMLPerf
AI accelerator benchmark that evaluates AI accelerators from a practical production perspective, including the ease of use and versatility of software and hardware.
ROCm/tritonserver
ROCm/text-generation-inference
Large Language Model Text Generation Inference
ROCm/triton-kernels
ROCm/tritoninferenceserver-vllm
ROCm/tritonserver-core
ROCm/tritonserver-pytorch
ROCm/tritonserver-third_party
ROCm/tritonserver_backend