Since they introduced Cloudflare on December 12, I have been trying to reverse engineer their new method, starting by bypassing their Cloudflare page. As soon as I was done, they introduced Google reCAPTCHA on login, which made things a bit more complicated: so many things can go wrong in a 12-step process.
I finally did it. It's all done. I was unsure about publishing the code, though, because if they could patch the first approach, they can definitely patch this new one. And I'd like to tell you: THEY ARE WATCHING.
So I found a way to let you access their API without the overhead of installing and running code. We have an API now: you just make a GET request to the endpoint and it returns the ChatGPT response.
Behind the scenes:
- I solve the captcha for you
- I bypass Cloudflare
- I manage accounts
- I rotate Cloudflare keys / access tokens
- I manage code dependencies, keep the code up to date, and fix it whenever they change theirs
While you:
- Just make a GET request to an endpoint
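For illustration only, a call to such an endpoint could look like the sketch below. The URL, query parameter, and response handling are placeholders (assumptions), since the real endpoint details are only shared with waitlist members:

```python
import requests

# Hypothetical endpoint and parameter name -- the real values are shared
# with waitlist members, not documented here.
API_URL = "https://your-hosted-endpoint.example.com/ask"

resp = requests.get(API_URL, params={"prompt": "Hello there!"}, timeout=120)
resp.raise_for_status()

# Assuming the endpoint returns the ChatGPT reply in the response body
print(resp.text)
```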
I plan on allowing the same level of customization as PyChatGPT. I'm running all of this on AWS, out of my own pocket, at a total cost of nearly $80 a month. While we're on a waitlist-based model, I'll cover all the costs myself.
Go ahead and write “I want access” in #waitlist (the first **2-8** people will be selected for the initial tryout).
Read More - How OpenAI filters requests made by bots/scrapers
⭐️ Like this repo? Please star it and consider donating to keep it maintained.
💡 If OpenAI changes their API, I will fix it as soon as possible, so watch the repo if you want to be notified.
- Save Conversations to a file
- Resume conversations even after closing the program
- Proxy Support
- Automatically log in without involving a browser
- Automatically grab the access token
- Get around the login captcha (subsequent logins would otherwise prompt you to solve one)
- Save the access token to a file, so you don't have to log in again
- Automatically refresh the access token when it expires
- Colorize the output with colorama, because why not?
- Smart Conversation Tracking
Integrated into Huggingface Spaces 🤗 using Gradio. Try out the Web Demo
Chatting
Creating a token
You: Hi there, My name is Rawa
Chat GPT: Hello Rawa, nice to meet you. Is there something you would like to talk about or ask me? I'm here to help with any questions you may have.
You: great, now say my name like Heisenberg
Chat GPT: Sure, Rawa like Heisenberg. Is there anything else you would like to talk about? I'm here to help with any questions you may have.
You: Sorry I meant like the episode of Breaking Bad where Walter White says Heisenberg
Chat GPT: Ah, I see. In that case, you could try saying it like this: "My name is Rawa, like Heisenberg." This is a reference to the character Walter White from the TV show Breaking Bad, who often used the pseudonym "Heisenberg" when conducting illegal activities. The character was known for his cool and calculated demeanor, so saying your name like Heisenberg in this context would mean saying it with confidence and authority.
pip install chatgptpy --upgrade
[NEW] Pass an `Options()` object to the `Chat()` constructor to customize the session
[NEW] You can now save your conversations to a file
from pychatgpt import Chat, Options
options = Options()
# [New] Pass Moderation. https://github.com/rawandahmad698/PyChatGPT/discussions/103
# options.pass_moderation = False
# [New] Enable, Disable logs
options.log = True
# Track conversation
options.track = True
# Use a proxy
options.proxies = 'http://localhost:8080'
# Optionally, you can pass a file path to save the conversation
# They're created if they don't exist
# options.chat_log = "chat_log.txt"
# options.id_log = "id_log.txt"
# Create a Chat object
chat = Chat(email="email", password="password", options=options)
answer = chat.ask("How are you?")
print(answer)
[NEW] Resume a conversation
from pychatgpt import Chat
# Create a Chat object
chat = Chat(email="email", password="password",
            conversation_id="Parent Conversation ID",
            previous_convo_id="Previous Conversation ID")
answer, parent_conversation_id, conversation_id = chat.ask("How are you?")
print(answer)
# Or change the conversation id later
answer, _, _ = chat.ask("How are you?",
                        previous_convo_id="Parent Conversation ID",
                        conversation_id="Previous Conversation ID")
print(answer)
Start a CLI Session
from pychatgpt import Chat
chat = Chat(email="email", password="password")
chat.cli_chat()
Ask a one time question
from pychatgpt import Chat
# Initializing the chat class will automatically log you in, check access_tokens
chat = Chat(email="email", password="password")
answer, parent_conversation_id, conversation_id = chat.ask("Hello!")
import time
from pychatgpt import OpenAI
# Manually set the token
OpenAI.Auth(email_address="email", password="password").save_access_token(access_token="", expiry=time.time() + 3600)
# Get the token, expiry
access_token, expiry = OpenAI.get_access_token()
# Check if the token is valid
is_expired = OpenAI.token_expired() # Returns True or False
Change Log
- Fixes an issue when reading from id_log.txt
- Introduces a new `pass_moderation` parameter to the `Options()` class, defaults to `False`
- Adds proxies to moderation.
- If `pass_moderation` is `True`, the moderation call is invoked on another thread, so it doesn't block the main thread (a sketch of this pattern follows the change log)
- Makes a request to the moderation endpoint first, since otherwise a crippled version of the response is returned
- New option to turn off logs.
- Better Error handling.
- Enhanced conversation tracking
- `ask()` now returns a tuple of `answer, previous_convo, convo_id`
- Better docs
- Pull requests/minor fixes
- Fixes for part 8 of token authentication
- A new `Options()` class to set the options for the chat session
- Save the conversation to a file
- Resume the conversation even after closing the program
- The ChatGPT API frequently switches from `action=next` to `action=variant`. This library now uses `action=variant` instead of `action=next` to get the next response from the API.
- Sometimes when the server is overloaded, the API returns a `502 Bad Gateway` error.
- Added error handling if the `auth.json` file is not found or is corrupt
- Initial Release via PyPi
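As referenced in the change log above, the `pass_moderation` request is made off the main thread. Here is a minimal sketch of that pattern, assuming a hypothetical `hit_moderation_endpoint` helper and an assumed endpoint URL; it is not the library's actual internal code:

```python
import threading

import requests


def hit_moderation_endpoint(prompt: str, access_token: str) -> None:
    # Hypothetical helper: fire the moderation request and ignore the result.
    # The URL and payload are assumptions, not PyChatGPT's actual internals.
    requests.post(
        "https://chat.openai.com/backend-api/moderations",
        headers={"Authorization": f"Bearer {access_token}"},
        json={"input": prompt},
        timeout=30,
    )


def moderate_in_background(prompt: str, access_token: str) -> None:
    # Run the moderation call on a daemon thread so the main thread can send
    # the actual conversation request without waiting for it.
    threading.Thread(
        target=hit_moderation_endpoint,
        args=(prompt, access_token),
        daemon=True,
    ).start()
```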
If the token creation process is failing:
- Try using a proxy (I recommend always using one)
- Don't try to log in too quickly; wait at least 10 minutes if you're being rate limited
- If you're still having issues, try using a VPN; over a VPN the script should work fine
I'm planning to add a few more features, such as:
- A Python module that can be imported and used in other projects
- A way to save the conversation
- Better error handling
- Multi-user chatting
I had been looking for a way to interact with the new ChatGPT API, but most of the sources here on GitHub require you to have a Chromium instance running in the background, or to grab the access token manually with the web inspector.
No more. I have been able to reverse engineer the API and use a TLS client to mimic a real user, allowing the script to log in without setting off Auth0's bot detection.
Basically, the script logs in on your behalf using a TLS client, then grabs the access token. It's pretty fast.
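As a rough sketch of the TLS-client idea (not this project's actual login code), FlorianREGAZ's `tls_client` package can present a browser-like TLS fingerprint; the client identifier, URL, and headers below are assumptions:

```python
import tls_client

# Create a session whose TLS fingerprint resembles a real Chrome build, so
# Auth0's bot detection sees something closer to a genuine browser.
session = tls_client.Session(client_identifier="chrome_105")

# One illustrative request; the real flow chains several requests together.
response = session.get(
    "https://chat.openai.com/auth/login",
    headers={"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
)
print(response.status_code)
```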
First, I'd like to tell you that just making HTTP requests is not going to be enough; Auth0 is smart. Each step is guarded by a `state` token, which is a JWT used to prevent CSRF attacks and to keep bots from logging in.
If you look at the `auth.py` file, there are over nine functions, each responsible for a different task, and they all work together to create a token for you. `allow_redirects` played a huge role in this, as it allowed the script to navigate through the login process.
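To make the role of `allow_redirects` concrete, here is a hedged sketch of the general technique: stop automatic redirects, read the `Location` header, and pull the `state` value out of the redirect URL. The URL and parameters are illustrative, not the exact ones `auth.py` uses:

```python
from urllib.parse import parse_qs, urlparse

import requests

# Request the authorize URL without following redirects, then extract the
# `state` value Auth0 embeds in the redirect target. (Illustrative only.)
resp = requests.get(
    "https://auth0.openai.com/authorize",
    params={"client_id": "<client-id>", "redirect_uri": "<callback-url>"},
    allow_redirects=False,
)

location = resp.headers.get("Location", "")
state = parse_qs(urlparse(location).query).get("state", [""])[0]
print("state token:", state)
```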
I work at MeshMonitors.io; we make amazing tools (check it out, yo!). I decided not to spend too much time on this, but here we are.
No one has been able to do this, and I wanted to see if I could.
- OpenAI for creating the ChatGPT API
- FlorianREGAZ for the TLS Client