kardolus/chatgpt-cli
ChatGPT CLI is a versatile tool for interacting with LLMs through OpenAI and Azure, as well as with models from Perplexity AI and Llama. It supports prompts and history tracking for seamless, context-aware interactions, and its extensive configuration options let both users and developers build a customized GPT experience.
Go · MIT license
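The issues below reference piped input, an `--interactive` flag, and a YAML configuration file. A minimal usage sketch under those assumptions is shown here; the binary name `chatgpt` and the `OPENAI_API_KEY` environment variable are assumptions not confirmed by this listing, so check the project README for the exact names.

```sh
# Assumed API-key variable; the CLI reads an OpenAI key from the environment or its config file.
export OPENAI_API_KEY="sk-..."

# One-shot prompt passed as an argument (assumed binary name: chatgpt).
chatgpt "Explain the difference between goroutines and OS threads"

# Piped input (issue #69 discusses how an empty pipe should be handled).
git diff | chatgpt "Write a commit message for this diff"

# Context-aware interactive session (the --interactive flag is referenced in issue #62).
chatgpt --interactive
```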
Issues
- Feature request: Role files (#99, opened by benjamingorman, 3 comments)
- Add GPT o1 support (#96, opened by circlesmiler, 3 comments)
Add a "make coverage"
#93 opened by kardolus - 1
Packaging should be domain based
#90 opened by kardolus - 4
Incorrect Version in Release Binary
#91 opened by Xevion - 1
Add a timestamp to the stored history
#88 opened by kardolus - 1
Add the seed parameter
#72 opened by kardolus - 3
- Add a `--show-history` flag to print the history (#70, opened by kardolus, 23 comments)
- Multi-line interactive mode? (#77, opened by drone1, 1 comment)
- Add a linter (#80, opened by kardolus, 6 comments)
- Fall back to the default config when OPENAI_CONFIG_HOME is set and the YAML configuration is missing (#75, opened by silgon, 4 comments)
- Add a `--new-thread` flag (#79, opened by kardolus, 1 comment)
- Add support for prompt files (`--prompt`) (#71, opened by kardolus, 13 comments)
- `--set-<value>` should have a `--value` counterpart for session-based options (#36, opened by nopeless, 0 comments)
- Allow the input to be empty in the case of a pipe (#69, opened by kardolus, 3 comments)
- Feature request: Add a `clear` command to clear the screen in interactive mode (#67, opened by zinwang, 2 comments)
- Read `config.yaml` from XDG standard paths (#30, opened by erhhung, 3 comments)
- [Feature request] Format response header (#66, opened by SkyaTura, 3 comments)
- Enable FreeBSD builds (#65, opened by depau, 9 comments)
- Any interest in supporting Claude? (#56, opened by justengland, 2 comments)
- File attachment (#54, opened by igormedo, 5 comments)
- Count tokens outside of `--interactive` mode (#62, opened by stephen010x, 3 comments)
- Option to remove line-by-line animation? (#61, opened by stephen010x, 1 comment)
- Can you add a proxy? (#60, opened by IzyI, 3 comments)
- New line in interactive mode (#59, opened by mandulaj, 9 comments)
- `listModule` interface design suggestions (#55, opened by zzerding, 4 comments)
- 32-bit support (#51, opened by Nehc, 3 comments)
- Error messages should not go to stdout (#45, opened by gzqx, 1 comment)
- 403 not supported (#44, opened by gearonixx, 2 comments)
- Ability to use ChatGPT session token? (#38, opened by jonny-wg2, 4 comments)
- Feature: multi-instance mode (#32, opened by tennox, 3 comments)
- Feature request: OPENAI_API_ENDPOINT or an equivalent CLI parameter to enable FOSS local/self-hosted OpenAI-compatible API servers like Ollama (#29, opened by PieBru, 11 comments)
- Regarding autocompletion (#26, opened by benbenbang, 4 comments)
- OPENAI KEY (#24, opened by TimeToCrack, 6 comments)
- How to use with Azure? (#18, opened by AlaShibanAtKlo)