ChatGPT Pricing: How Much Does ChatGPT Cost?

ChatGPT has lots of different plans, models and services. But how much do they actually cost? We take a closer look.

ChatGPT’s pricing plans start with a free plan. There’s also a ChatGPT “Plus” plan for users who want the premium experience, which retails at $20 per month and includes access to OpenAI’s most advanced large language model, GPT-4.

ChatGPT also has products and services geared towards businesses, but these are priced by the token, rather than per user, per month, as most other software is.

So, how much does ChatGPT cost, what’s on offer, and how does it compare to the pricing plans offered by other popular chatbots, such as Anthropic’s Claude? Read on to find out more in this ChatGPT pricing guide.

ChatGPT Pricing and Terms Explained

Before we dive into precisely how much ChatGPT costs, we’ll quickly go through how ChatGPT prices its products and services. ChatGPT’s pricing structure changes depending on what you’re using the chatbot for and the specific services/models you’re using.

If you already know how ChatGPT’s pricing works and you’d prefer to get stuck into ChatGPT’s costs, simply skip this section and continue reading from our rundown of the free version.


ChatGPT’s consumer offerings

ChatGPT has a free and premium version geared towards consumer use. These versions of ChatGPT can be used to assist individuals at work with a wide range of tasks, but they can’t be fine-tuned in the same way ChatGPT’s business-focused offerings can, and you can’t edit the training data either. Overall, they’re less customizable.

ChatGPT Enterprise (API)

ChatGPT Enterprise customers don’t pay per account, or even per user, for access to the API. Instead, they pay for “tokens”. When Large Language Models (LLMs) process text, it’s broken down into small units, called “tokens” – which might be entire words, but are typically bits of words.

Enterprise users pay for both input tokens (the information they provide) and output tokens (the responses that ChatGPT generates).

If you’ve been exploring what ChatGPT can do for your business, you’ve probably heard or seen the phrase “context window” (e.g. “GPT-3.5 Turbo has a 16K context window”). The “16K” – or 16,000 – in that sentence refers to the number of tokens in that window.

A context window is the maximum amount of text a language model can consider at any one time, including both the prompt and the response it generates. If a conversation takes the chatbot past its context window limit, it will start to forget the earliest parts of that conversation.
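To make tokens a little more concrete, here’s a minimal sketch using tiktoken, OpenAI’s open-source tokenizer library. The prompt and the 16K limit check are purely illustrative:

```python
# Minimal illustration of tokenization, using OpenAI's open-source tiktoken
# library (pip install tiktoken). The prompt here is just an example.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

prompt = "Summarise our Q3 sales figures in three bullet points."
tokens = enc.encode(prompt)

print(len(tokens))         # how many input tokens this prompt uses
print(enc.decode(tokens))  # decoding the tokens returns the original text

# A "16K context window" means the prompt, any conversation history and the
# model's reply must all fit inside roughly 16,000 of these tokens combined.
CONTEXT_WINDOW = 16_000
print(len(tokens) <= CONTEXT_WINDOW)
```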

Image generation

Along with ChatGPT, OpenAI also has an image generator called DALL-E. Currently, there’s DALL-E 2, DALL-E 3 standard, and DALL-E 3 HD. Since October 19, 2023, DALL-E 3 has been available to all ChatGPT Plus and Enterprise customers.

When using DALL-E, rather than paying for tokens, you’ll instead pay per image generated. The price per image differs depending on the version you’re using, because the more recent versions produce more accurate, higher-quality images.

ChatGPT Free Version

ChatGPT has a free version, which launched on November 30, 2022, and, as the name suggests, you don’t have to pay anything for it. To use it, all you have to do is sign up for an OpenAI account with an email address and a phone number.

ChatGPT’s free version is powered by GPT-3.5 Turbo, the latest of the GPT-3.5 class of LLMs. While this is a very advanced LLM, it isn’t OpenAI’s most advanced effort: in a GPT-3.5 vs GPT-4 head-to-head, GPT-4 comes out on top every time, even if it isn’t as speedy.

Along with access to the chatbot, OpenAI account holders also have access to a free version of DALL-E (DALL-E 2), which can generate images, although once more, this isn’t the most advanced version of the software currently available.

ChatGPT Plus Pricing

ChatGPT Plus, which was introduced in February 2023, currently retails at $20 per month. Plus users can use GPT-4, which is more powerful but takes longer to answer, as well as GPT-3.5, which is quicker but not quite as capable.

ChatGPT Plus lets you chat using images and voice recordings, and you’ll be able to create images without leaving the interface (in the free version, you’ll have to switch to DALL-E 2). You can also use DALL-E 3, the most advanced version of the image generator, separately if you like.

Another big difference between the free and paid versions is that ChatGPT Plus lets you build your own custom chatbots, called GPTs. This feature is not available in the free version of ChatGPT. You can also use other people’s GPTs, although the GPT Store, which was scheduled to open this month, has been postponed to 2024.

ChatGPT Enterprise (API) Pricing

ChatGPT’s Enterprise API models and services are priced per 1,000 tokens. According to OpenAI, 1,000 tokens are roughly equivalent to 750 words.
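As a rough illustration of how per-token billing adds up, here’s a simple cost estimator. The rates passed in below are placeholder figures, not OpenAI’s actual prices, so always check the live pricing tables:

```python
# Rough per-request cost estimator for token-based pricing.
# The example rates below are placeholders, not OpenAI's real prices.
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price_per_1k: float, output_price_per_1k: float) -> float:
    """Return the approximate dollar cost of one API request."""
    return (input_tokens / 1000) * input_price_per_1k + \
           (output_tokens / 1000) * output_price_per_1k

# A ~750-word prompt is roughly 1,000 tokens by OpenAI's rule of thumb.
print(estimate_cost(1_000, 500,
                    input_price_per_1k=0.01,    # placeholder rate
                    output_price_per_1k=0.03))  # placeholder rate
```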

GPT-4 Turbo pricing

GPT-4 Turbo is OpenAI’s most powerful LLM and has a 128K context window, which is among the largest in the industry. At the moment, the per-token price is cheaper than it is for GPT-4.

GPT-4 Turbo pricing

GPT-4 pricing

GPT-4 is very similar to GPT-4 Turbo: both are very powerful, but crucially, GPT-4’s training data cuts off in September 2021, like GPT-3.5’s.

GPT-4 pricing

GPT-3.5 Turbo pricing

“GPT-3.5 Turbo,” OpenAI says, “is the flagship model of this family, supports a 16K context window and is optimized for dialog.” Along with the Turbo version, there’s an instruct model which supports a smaller context window of 4,000 tokens. As you can see, for both inputs and outputs, it’s a little cheaper than GPT-4 Turbo and GPT-4:

GPT-3.5 Turbo pricing
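Whichever tier you’re paying for, the rate you’re billed at is determined by the model name you pass in the API request. Below is a minimal sketch using the OpenAI Python SDK; the prompt is just an example, and model names do change over time:

```python
# Minimal chat completion request with the OpenAI Python SDK (pip install openai).
# The per-token price depends entirely on the "model" string you pass in.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # swap in a GPT-4 or GPT-4 Turbo model name to pay those rates
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a two-line product description for a desk lamp."},
    ],
)

print(response.choices[0].message.content)
# The usage object shows exactly how many tokens you'll be billed for.
print(response.usage.prompt_tokens, response.usage.completion_tokens)
```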

GPT base model pricing

The GPT base models offered by OpenAI are for businesses on a budget – they’re a lot more basic than the likes of GPT-3.5 and GPT-4 and, as OpenAI admits, they’re just not as good at instruction following. However, they’re more than capable of performing a narrow range of tasks if fine-tuned in the right way.

GPT base model pricing

Fine-tuning model pricing

Businesses can create custom versions of OpenAI’s GPT models by supplying their own training data. The fine-tuned model learns from this data and can give much more useful responses than if it were relying solely on the general dataset used to train all GPT LLMs.

Fine-tuning is billed in parts: an initial training cost based on the number of tokens in your training data, plus the input and output tokens used in requests to the fine-tuned model. As you can see, you can fine-tune GPT-3.5 Turbo or, if you can’t afford it right now, a more basic language model.

Fine-tuning ChatGPT pricing
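For illustration, here’s a minimal sketch of the fine-tuning flow using the OpenAI Python SDK. The file name and training data are hypothetical; OpenAI’s fine-tuning guide describes the exact JSONL format required:

```python
# Hypothetical sketch of the fine-tuning flow with the OpenAI Python SDK.
from openai import OpenAI

client = OpenAI()

# 1. Upload a JSONL file of example conversations (one JSON object per line,
#    each containing a "messages" list in the chat format).
training_file = client.files.create(
    file=open("training_examples.jsonl", "rb"),  # hypothetical file name
    purpose="fine-tune",
)

# 2. Start a fine-tuning job on GPT-3.5 Turbo (or a cheaper base model).
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)
print(job.id, job.status)

# 3. Once the job completes, requests to the resulting model are billed at the
#    fine-tuned per-token rates rather than the standard ones.
```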

Assistants API pricing

OpenAI’s Assistants API was announced at the company’s first DevDay, which took place in November 2023. It essentially lets developers build their own customized chatbots for businesses, which can live inside existing applications. It’s currently in beta, and OpenAI says it’s working to make it more functional. If you’re using the beta, OpenAI will only charge you for inputs at the moment:

ChatGPT Assistants API pricing
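To give a sense of what the Assistants API actually involves, here’s a minimal sketch using the OpenAI Python SDK. The assistant’s name and instructions are hypothetical examples:

```python
# Hypothetical sketch of the (beta) Assistants API via the OpenAI Python SDK.
from openai import OpenAI

client = OpenAI()

# Create an assistant that can live inside your own application.
assistant = client.beta.assistants.create(
    name="Support helper",                                    # hypothetical
    instructions="Answer customer questions about returns.",  # hypothetical
    model="gpt-3.5-turbo",
)

# Each conversation is a thread; you add messages to it, then start a run
# to have the assistant respond.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="How long do I have to return an order?",
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
print(run.status)  # poll until the run finishes, then read the thread's messages
```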

DALL-E Pricing (Images)

As we’ve covered, OpenAI will charge you per image to use DALL-E, and the prices vary depending on both the model you’re using and the resolution of the images you’re generating. Here’s the pricing table on OpenAI’s website:

DALL-E pricing structure offered by OpenAI
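For context, this is roughly what a per-image request looks like via the Images API. The prompt is only an example, and it’s the size and quality parameters that move you between the pricing tiers shown above:

```python
# Minimal sketch of per-image billing with the Images API (OpenAI Python SDK).
from openai import OpenAI

client = OpenAI()

result = client.images.generate(
    model="dall-e-3",
    prompt="A watercolour painting of a lighthouse at dawn",  # example prompt
    size="1024x1024",    # larger sizes cost more per image
    quality="standard",  # "hd" switches to the pricier DALL-E 3 HD tier
    n=1,                 # you pay per image generated
)

print(result.data[0].url)  # a temporary URL to the generated image
```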

OpenAI Whisper Pricing (Audio)

OpenAI also provides an audio model called “Whisper”, which transcribes speech, converting audio recordings into plain text. Here’s the current pricing as listed on OpenAI’s website:

OpenAI Whisper pricing
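For illustration, here’s a minimal sketch of a Whisper transcription request with the OpenAI Python SDK. The audio file name is hypothetical, and billing is based on the length of the recording:

```python
# Minimal sketch of speech-to-text with Whisper (billed per minute of audio).
from openai import OpenAI

client = OpenAI()

with open("earnings_call.mp3", "rb") as audio_file:  # hypothetical file
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

print(transcript.text)  # the transcribed text of the recording
```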

Does ChatGPT Offer Discounts?

The short answer is “no”. While some software programs like monday.com offer discounts for startups and non-profits, OpenAI is currently not offering any discounts for any businesses. So, if you want access to the Plus or Enterprise capabilities, you’ll have to pay full price.

In business terms, this is quite understandable. The demand for ChatGPT is so high that OpenAI isn’t exactly crying out for new sign-ups, and besides, a free version is already available. The chatbot reportedly costs around $700,000 a day to run – so Sam Altman and Co. need every cent they can get.

ChatGPT Pricing vs Competitors

When ChatGPT launched in November 2022, it was virtually one of a kind. Now, there are several competitors on the market, and a couple of them outperform ChatGPT when it comes to some tasks while also keeping pace in other areas.

ChatGPT vs Claude

Claude 2.1 is the most recent LLM powering ChatGPT competitor Claude, a chatbot created by Anthropic. Anthropic is a startup like OpenAI and is financially backed by Amazon and Google, the latter of which reportedly owns a 10% stake in the company.

Claude is a highly capable language model and powers Jasper. Claude 2 has a 100K context window, and Claude 2.1 extends that to 200K, meaning it can handle incredibly large inputs.

Like ChatGPT, there’s a free version of Claude, as well as a $20 per month premium version called Claude Pro. Claude also provides per-token pricing for businesses that want to use Claude 2 or 2.1, as well as a lightweight version called Claude Instant. As you can see from the pricing information below, Claude’s prices are set per million tokens, rather than per thousand:

Claude pricing for December 2023

For more information on Anthropic’s pride and joy, check out our guide to Claude.

ChatGPT vs Bard

Google’s ChatGPT competitor Bard was rushed to launch in February 2023 – around the same time that OpenAI launched ChatGPT Plus. Back then, it was powered by the LaMDA large language model, which a Google engineer once claimed was sentient.

More recently, it’s been shifted to PaLM 2, a more powerful model, which Google says is faster and more efficient than LaMDA. Soon, it’ll be making the switch to Gemini, which its creator, Google DeepMind, says is more powerful and intelligent than GPT-4. For now, however, GPT-4 Turbo is widely considered to be the most powerful language model available.

Despite its reputation for the odd mistake, Bard’s big advantage is that it’s completely free. All you need is a Gmail account (personal or business), which doesn’t cost anything to create, and you’ll have access to its full functionality. To see how the two chatbots stack up against each other in a series of tests, check out our Bard vs ChatGPT comparison.


Written by:
Aaron Drapkin is a Lead Writer at Tech.co. He has been researching and writing about technology, politics, and society in print and online publications since graduating with a Philosophy degree from the University of Bristol five years ago. As a writer, Aaron takes a special interest in VPNs, cybersecurity, and project management software. He has been quoted in the Daily Mirror, Daily Express, The Daily Mail, Computer Weekly, Cybernews, and the Silicon Republic speaking on various privacy and cybersecurity issues, and has articles published in Wired, Vice, Metro, ProPrivacy, The Week, and Politics.co.uk covering a wide range of topics.