Pinned Repositories
CoNLI_hallucination
CoNLI: a plug-and-play framework for ungrounded hallucination detection and reduction
DoLa
Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"
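The core idea behind DoLa is to score next-token candidates by the difference between the log-probabilities of the model's final ("mature") layer and an earlier ("premature") layer, which sharpens factual predictions. A minimal NumPy sketch of that contrast, assuming raw logits from two layers are available (the actual repo additionally selects the premature layer dynamically and applies an adaptive plausibility constraint; `dola_contrast` is an illustrative name, not the repo's API):

```python
import numpy as np

def log_softmax(x):
    # Numerically stable log-softmax over the last axis.
    x = x - x.max(axis=-1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=-1, keepdims=True))

def dola_contrast(final_logits, premature_logits):
    # Contrast the mature layer against the premature layer:
    # tokens whose probability grows across layers get boosted.
    return log_softmax(final_logits) - log_softmax(premature_logits)

# Toy vocabulary of 3 tokens: token 1 gains probability between the
# premature and final layers, so the contrast ranks it highest.
final = np.array([2.0, 1.0, 0.0])
premature = np.array([2.0, 0.0, 0.0])
scores = dola_contrast(final, premature)
```

Greedy decoding would then pick `scores.argmax()` rather than `final.argmax()`, preferring tokens whose evidence emerges in later layers.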
factor
Code and data for the FACTOR paper
flash-attention
Fast and memory-efficient exact attention
HuggingFace-Download-Accelerator
Download at high speed from a mirror site using Hugging Face's official download tool.
ic-mlc-MiniCPM
mlc-MiniCPM
MiniCPM on Android platform.
nixtla
TimeGPT-1: a production-ready, pretrained time-series foundation model for forecasting and anomaly detection. A generative pretrained transformer for time series, trained on over 100B data points, it accurately forecasts across domains such as retail, electricity, finance, and IoT with just a few lines of code 🚀.
qllm-eval
Code Repository of Evaluating Quantized Large Language Models
Time-LLM
[ICLR 2024] Official implementation of "🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
xkpengl's Repositories
xkpengl/nixtla
TimeGPT-1: a production-ready, pretrained time-series foundation model for forecasting and anomaly detection. A generative pretrained transformer for time series, trained on over 100B data points, it accurately forecasts across domains such as retail, electricity, finance, and IoT with just a few lines of code 🚀.
xkpengl/timesfm
TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model developed by Google Research for time-series forecasting.
xkpengl/Time-LLM
[ICLR 2024] Official implementation of "🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
xkpengl/ic-mlc-MiniCPM
xkpengl/mlc-MiniCPM
MiniCPM on Android platform.
xkpengl/HuggingFace-Download-Accelerator
Download at high speed from a mirror site using Hugging Face's official download tool.
xkpengl/flash-attention
Fast and memory-efficient exact attention
xkpengl/qllm-eval
Code Repository of Evaluating Quantized Large Language Models
xkpengl/DoLa
Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"
xkpengl/factor
Code and data for the FACTOR paper
xkpengl/CoNLI_hallucination
CoNLI: a plug-and-play framework for ungrounded hallucination detection and reduction