Pinned Repositories
AMD-RPP, CNTK-1, cub, cuda-convnet, cunn, cutorch, darknet, DeepCL, HipThrustSamples, MCW_CPPAMP_TORCH (descriptions appear in the full repository list below)
NEELMCW's Repositories
NEELMCW/MCW_CPPAMP_TORCH
C++ AMP backend for Torch7 NN framework
NEELMCW/HipThrustSamples
Experimenting with hipThrustPort
NEELMCW/AMD-RPP
Radeon Performance Primitives Library
NEELMCW/CNTK-1
Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit
NEELMCW/cub
CUB is a flexible library of cooperative threadblock primitives and other utilities for CUDA kernel programming.
NEELMCW/cuda-convnet
Automatically exported from code.google.com/p/cuda-convnet
NEELMCW/cunn
NEELMCW/cutorch
A CUDA backend for Torch7
NEELMCW/darknet
Windows and Linux version of Darknet YOLO v3 & v2 neural networks for object detection
NEELMCW/DeepCL
OpenCL library to train deep convolutional neural networks
NEELMCW/deepstream-plugins
Samples for TensorRT/DeepStream for Tesla & Jetson
NEELMCW/Eigen
NEELMCW/hcRNG
NEELMCW/HIP
HIP: Convert CUDA to Portable C++ Code (a usage sketch follows after this list)
NEELMCW/hipBLAS
ROCm BLAS marshalling library
NEELMCW/hipdnn-benchmarks
NEELMCW/hipdnn_test_suite
NEELMCW/hipeigen
NEELMCW/hipSPARSE
ROCm SPARSE marshalling library
NEELMCW/Linear_regression
NEELMCW/maxas
Automatically exported from code.google.com/p/maxas
NEELMCW/MIOpen
AMD's Machine Intelligence Library
NEELMCW/MobileNet-YOLO
A Caffe implementation of the MobileNet-YOLO detection network
NEELMCW/moderngpu
Patterns and behaviors for GPU computing
NEELMCW/opencv
Open Source Computer Vision Library
NEELMCW/server
The Triton Inference Server provides an optimized cloud and edge inferencing solution.
NEELMCW/tensorflow
Computation using data flow graphs for scalable machine learning
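
The HIP entry above describes converting CUDA code to portable C++. The sketch below is not taken from any of the repositories listed; it is a minimal illustration, assuming only the standard HIP runtime API, of what that model looks like: a familiar CUDA-style kernel plus hipMalloc/hipMemcpy/hipLaunchKernelGGL calls in place of the CUDA runtime calls and <<<grid, block>>> launch syntax.

    #include <hip/hip_runtime.h>
    #include <cstdio>
    #include <vector>

    // Same kernel source as a CUDA __global__ function; HIP reuses the syntax.
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

        float *dx = nullptr, *dy = nullptr;
        hipMalloc(reinterpret_cast<void**>(&dx), n * sizeof(float));
        hipMalloc(reinterpret_cast<void**>(&dy), n * sizeof(float));
        hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);
        hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

        // hipLaunchKernelGGL replaces the CUDA <<<grid, block>>> launch syntax.
        hipLaunchKernelGGL(saxpy, dim3((n + 255) / 256), dim3(256), 0, 0,
                           n, 2.0f, dx, dy);
        hipDeviceSynchronize();

        hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
        std::printf("y[0] = %f\n", hy[0]);  // expected: 2 * 1 + 2 = 4

        hipFree(dx);
        hipFree(dy);
        return 0;
    }

Built with hipcc, the same source targets ROCm directly; on an NVIDIA platform the HIP runtime calls map onto their CUDA equivalents, which is what makes the "portable C++" description concrete.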