How to Build a ChatGPT Clone in Python

In this article, we will walk through the steps involved in building a chat application and answering bot in Python using the ChatGPT API and Gradio.

Developing a chat application in Python gives you more control and flexibility than the ChatGPT website. You can customize and extend the chat application as per your needs, and it also helps you integrate with your existing systems and other APIs.

Steps to build ChatGPT Clone in Python
What is Gradio?

Gradio is a Python library that makes it easy to create customizable user interfaces for predictive and generative models, allowing you to quickly build interactive applications without needing extensive front-end development experience.

See below some of the benefits of using Gradio.

  1. Quick Mock-up: By using Gradio, you can quickly iterate and experiment with different model configurations and user interfaces without writing extensive code.
  2. No Front-End Development Skills Required: You don't need to be an expert in front-end development to create interactive applications with Gradio. It solves the complexities of building user interfaces, allowing you to focus on the functionality of your models.
  3. Multi-Input and Multi-Output Support: Gradio supports models with multiple inputs and outputs, making it flexible for a wide range of AI applications.
  4. Live Feedback: Gradio provides real-time feedback, allowing users to see the results and interact with the model easily.
  5. Sharing and Deployment: Gradio makes it simple to share and deploy your web apps.
How to get ChatGPT API

To get started, the first and most important step is to sign up on the OpenAI platform. You can easily sign up using your existing Google or Microsoft account. Once you're signed up, you will need to generate a secret API key to use the API. It will look something like below. Make sure to copy your API key and keep it for future reference.


After completing the sign-up process, you will receive a free $5 grant to test the ChatGPT API. This grant will expire after 3 months. Once the grant is exhausted, you will be charged $0.0015 per 1,000 tokens for GPT-3.5. Tokens are pieces of words; as a rule of thumb, 1,000 tokens correspond to roughly 750 English words. It's important to keep your API key confidential and not share it with others, as you will be responsible for any charges incurred by its usage.

For GPT-4 models with an 8,000-token context length (e.g. gpt-4), the pricing is higher than that of the GPT-3.5 model. Pricing of this model is as follows:

  • $0.03/1000 prompt tokens
  • $0.06/1000 sampled tokens

For GPT-4 models with a higher context length of 32,000 (e.g., gpt-4-32k), the pricing is double that of models with an 8K context length.

  • $0.06/1000 prompt tokens
  • $0.12/1000 sampled tokens
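Based on the rates listed above, you can estimate the cost of an API call with a few lines of Python. The helper below is purely illustrative (it is not part of the OpenAI library), and the prices are the ones quoted in this article:

```python
# Rough API cost estimator (USD per 1,000 tokens)
# "prompt" = input tokens, "sampled" = output tokens
PRICES = {
    "gpt-3.5-turbo": {"prompt": 0.0015, "sampled": 0.0015},
    "gpt-4":         {"prompt": 0.03,   "sampled": 0.06},
    "gpt-4-32k":     {"prompt": 0.06,   "sampled": 0.12},
}

def estimate_cost(model, prompt_tokens, sampled_tokens):
    p = PRICES[model]
    return (prompt_tokens * p["prompt"] + sampled_tokens * p["sampled"]) / 1000

# e.g. a gpt-4 call with 500 prompt tokens and 500 sampled tokens
print(round(estimate_cost("gpt-4", 500, 500), 4))  # → 0.045
```

This makes it easy to compare models: the same 1,000-token exchange costs about 30x more on gpt-4 than on gpt-3.5-turbo.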
Demo: ChatGPT Clone

Please see the GIF image below, which shows how ChatGPT looks and works.

ChatGPT Clone
Install the required packages

Make sure to install these three Python packages: gradio, openai and kivy. The kivy package lets you copy the ChatGPT output to the clipboard at the click of a button.

pip install gradio openai kivy

Python code : ChatGPT Clone

import gradio as gr
import openai
from kivy.core.clipboard import Clipboard

prompt = "Send a message"

def chat(prompt, apiKey, model):
    error_message = ""
    try:
        response = openai.ChatCompletion.create(
            model = model,
            api_key = apiKey,
            messages = [{'role': 'user', 'content': prompt}],
            temperature = 0.7
        )
    except Exception as e:
        error_message = str(e)

    if error_message:
        return "An error occurred: {}".format(error_message)
    else:
        return response['choices'][0]['message']['content']

def chatGPT(userMsg, history, modelType, apiKey):
    history = history or []
    comb = list(sum(history, ()))
    prompt = ' '.join(comb)
    output = chat(prompt, apiKey, modelType)
    history.append((userMsg, output))
    return history, history

def lastReply(history):
    # Copy the most recent ChatGPT reply to the clipboard
    if history is None:
        result = ""
    else:
        result = history[-1][1]
        Clipboard.copy(result)
    return result

with gr.Blocks(theme=gr.themes.Monochrome(), css="pre {background: #f6f6f6} #submit {background-color: #fcf5ef; color: #c88f58;} #stop, #clear, #copy {max-width: 165px;} #myrow {justify-content: center;}") as demo:
    gr.Markdown("""<center><h1>🚀 ChatGPT</h1></center>""")
    with gr.Row():
        with gr.Column(scale=0.5):
            modelType = gr.Dropdown(choices=["gpt-3.5-turbo", "gpt-4"], value="gpt-3.5-turbo", label="Model", info="Select your model type" )
        with gr.Column(scale=0.5, min_width=0):            
            apiKey = gr.Textbox(label="API Key", info="Enter API Key", lines=1, placeholder="sk-xxxxxxxxxxx")
    chatbot = gr.Chatbot().style(height=250)
    state = gr.State()
    with gr.Row():
        with gr.Column(scale=0.85):
            msg = gr.Textbox(show_label=False, placeholder=prompt).style(container=False)
        with gr.Column(scale=0.15, min_width=0):
            submit = gr.Button("Submit", elem_id="submit")
    with gr.Row(elem_id="myrow"):
        stop = gr.Button("🛑 Stop", elem_id="stop")        
        clear = gr.Button("🗑️ Clear History", elem_id="clear")
        copy = gr.Button("📋 Copy last reply", elem_id="copy")

    clear.click(lambda: (None, None, None), None, outputs=[chatbot, state, msg], queue=False)
    submit_event = submit.click(chatGPT, inputs=[msg, state, modelType, apiKey], outputs=[chatbot, state])
    submit2_event = msg.submit(chatGPT, inputs=[msg, state, modelType, apiKey], outputs=[chatbot, state])
    stop.click(None, None, None, cancels=[submit_event, submit2_event])
    copy.click(lastReply, inputs=[state], outputs=None)

demo.queue().launch(inbrowser=True, debug=True)

Features of ChatGPT Clone

The key features of ChatGPT Clone are as follows:

  • Copy last reply: Users can easily copy the previous response generated by ChatGPT, making it convenient for referencing or sharing.
  • Clear History: It offers the option to clear the conversation history, enabling users to start fresh.
  • Ability to stop a running request: ChatGPT Clone allows users to halt processing if needed, which is useful when a request runs for a long time without returning anything.
  • Easy switching between model types: ChatGPT Clone allows you to switch between GPT-3.5 and GPT-4.

How to enable ChatGPT's conversation memory

By default, ChatGPT API does not retain the memory of previous conversations. Each API request is treated as a separate chat session, so when ChatGPT responds to your current query, it does not recall any information from your previous questions.

This can be a drawback if you want to ask follow-up questions that build on ChatGPT's previous responses. Retaining context also helps you design better prompts. To make ChatGPT remember prior conversations, you need to provide the full conversation history every time you interact with it. Refer to the Python code below to see how it works.

import openai
import os
os.environ['OPENAI_API_KEY'] = "sk-xxxxxxxxxxxxxxxxxxxxxxx"
openai.api_key = os.getenv("OPENAI_API_KEY")

chatHistory = []
def chat(prompt, modelName="gpt-3.5-turbo", temperature=0.7, top_p=1):
    params = {
        "model": modelName,
        "temperature": temperature,
        "top_p": top_p
    }

    # Append the user's message to the running conversation history
    chatHistory.append({"role": "user", "content": prompt})

    # Send the full history so the model can use prior context
    response = openai.ChatCompletion.create(
        messages = chatHistory,
        **params
    )

    answer = response["choices"][0]["message"]["content"].strip()
    chatHistory.append({"role": "assistant", "content": answer})

    return answer

# An initial question (illustrative) so the follow-ups below have context
chat("What is 5 plus 5?")

chat("square of it")

chat("add 3 to it")
Recall prior conversations in ChatGPT API
About Author:
Deepanshu Bhalla

Deepanshu founded ListenData with a simple objective - Make analytics easy to understand and follow. He has over 10 years of experience in data science. During his tenure, he worked with global clients in various domains like Banking, Insurance, Private Equity, Telecom and HR.
