GPT-4 Turbo vs GPT-4: What Is OpenAI’s ChatGPT Turbo?

OpenAI just announced a brand new language model, GPT-4 Turbo, as well as other key updates. Here's what you should know.

OpenAI held its annual DevDay conference and used it as an opportunity to announce a raft of changes to ChatGPT and other products, including wholesale price reductions for developers and a brand new language model for the chatbot called GPT-4 Turbo. Here’s what it is, and the key GPT-4 Turbo vs GPT-4 differences you should know about.

GPT-4 Turbo is a more advanced version of GPT-4 with a much larger context window. OpenAI has also launched an API you can use to build assistants, as well as a way to make custom versions of ChatGPT.

However, the changes aren’t currently available to all ChatGPT users. For each of the changes and announcements below, we’ve provided information on which account holders can access them.

What Is GPT-4 Turbo?

GPT-4 Turbo is the latest language model to be released by ChatGPT owner OpenAI. It’s more powerful than the previous two language models that were used to power ChatGPT, GPT-4 and GPT-3.5.

ChatGPT has famously struggled to give accurate answers about events that happened after its training data cutoff, which was initially September 2021 and was later extended to January 2022.

However, GPT-4 Turbo has knowledge of events up until April 2023. In the wake of Elon Musk’s xAI launching a chatbot that boasts access to real-time information, this is a key update in the budding Grok vs ChatGPT rivalry.

 


GPT-4 Turbo can accept images as inputs, as well as text-to-speech prompts. However, the drop-down menu that ChatGPT Plus users have been using to switch between other OpenAI tools, such as DALL-E 3, is being retired. Now, ChatGPT will work out what sort of output you need based on your prompts.

GPT-4 Turbo also has an enlarged 128K context window, which means it can take prompts equivalent to around 300 pages of text. In short, GPT-4 Turbo vs GPT-4 is a straightforward win for the newer model, but there’s much more to it than that.
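
For developers, sending an image alongside text through the API looks roughly like the sketch below. It assumes the v1 openai Python SDK; the model name gpt-4-vision-preview and the image URL are illustrative assumptions rather than details from OpenAI’s announcement, so check the current model list before relying on them.

```python
# Minimal sketch: an image-plus-text request with the OpenAI Python SDK (v1).
# The model name "gpt-4-vision-preview" and the image URL are assumptions for
# illustration only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-vision-preview",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this image?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
    max_tokens=300,
)
print(response.choices[0].message.content)
```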

Who can access GPT-4 Turbo?

OpenAI says that “GPT-4 Turbo is available for all paying developers to try by passing gpt-4-1106-preview in the API”, and revealed that the company plans to release “the stable production-ready model in the coming weeks.”

This means that the language model is only available as a preview right now. If the pattern of its prior releases continues, ChatGPT Plus and Enterprise customers will be the first to gain full access.
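
If you are one of those paying developers, trying the preview is essentially a one-line change in an ordinary Chat Completions request. Here’s a minimal sketch using the openai Python SDK (v1); only the model name comes from OpenAI’s announcement, and the prompts are invented for illustration.

```python
# Minimal sketch: calling the GPT-4 Turbo preview via the Chat Completions
# endpoint. The system and user prompts are placeholder examples.
from openai import OpenAI

client = OpenAI()  # uses the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # model name quoted from OpenAI's announcement
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarise the key GPT-4 Turbo changes in two sentences."},
    ],
)
print(response.choices[0].message.content)
```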

GPT-4 Turbo vs GPT-4 vs GPT-3.5 Turbo: How ChatGPT’s Models Compare

There are a number of key differences between OpenAI’s models. GPT-4 Turbo is a significant upgrade on its sister model GPT-4 – which itself differs considerably from GPT-3.5, the language model that powered ChatGPT when it first launched back in November 2022.

Along with the release of GPT-4 Turbo, OpenAI has also released a new version of GPT-3.5, called GPT-3.5 Turbo, which has a 16K context window by default and exhibits improved instruction following.

Here are the key differences between GPT-3.5 Turbo, GPT-4 and GPT-4 Turbo:

GPT-3.5 Turbo
Creator/Owner: OpenAI
Trained on data up until: January 2022
Accessible to: All ChatGPT users
Prompt inputs: Text
Context window: 16,385 tokens (GPT-3.5 Turbo 1106); 4,096 tokens (GPT-3.5 Turbo)

GPT-4
Creator/Owner: OpenAI
Trained on data up until: April 2023
Accessible to: ChatGPT Plus users
Prompt inputs: Text; images
Context window: 8,192 tokens (GPT-4); 32,000 tokens (GPT-4-32K)

GPT-4 Turbo
Creator/Owner: OpenAI
Trained on data up until: April 2023
Accessible to: Paying developers (preview)
Prompt inputs: Text; images (stable release); text-to-speech
Context window: 128,000 tokens

What are Custom GPTs?

OpenAI is now rolling out a new product called “GPTs”, which it describes as “custom versions of ChatGPT that you can create for a specific purpose”. OpenAI envisages people building them for tasks at home and in the workplace, and then sharing these creations with others.

At the DevDay conference, OpenAI employees built their own chatbot agents – and it looks like the sort of thing that any knowledge worker could do. No coding knowledge is required.

OpenAI says you could create a custom GPT that conducts data analysis or even crawls the web for information. “Many power users maintain a list of carefully crafted prompts and instruction sets, manually copying them into ChatGPT,” the company said in a recent blog post. “GPTs now do all of that for you.”

OpenAI plans to launch a GPT store within the month. When it does, developers will have a brand-new way to make money with ChatGPT: OpenAI says creators of the most popular GPTs will be able to earn revenue through the store.

Who can access ChatGPT’s custom GPTs?

OpenAI says that you can start building GPTs today – but a post on OpenAI’s help portal confirms that the feature is only available to ChatGPT Plus and Enterprise customers.

If you’re a free user and you try to access one of the company’s example GPTs (Canva and Zapier versions have already been built), you’ll be informed that you’ll have to wait a little longer for access. Paying customers can access these example versions.

What Is OpenAI’s Assistants API?

OpenAI’s new Assistants API is built on the same technology as the new custom GPTs, with the goal of “helping people build agent-like experiences within their own applications”.

Use case examples given by OpenAI include a data analysis app, an assistant that helps with coding, and an AI-powered vacation planner.

You can augment your assistant with information and data from your organization, although OpenAI reiterates that the data you input into the models will not be used to train them and that developers can delete the data whenever they choose.
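
To illustrate the flow OpenAI describes – create an assistant once, then talk to it through threads and runs – here’s a minimal sketch using the beta endpoints of the openai Python SDK. The assistant’s name, instructions and example question are placeholders, not anything taken from OpenAI’s documentation.

```python
# Minimal sketch of the Assistants API beta: create an assistant, open a
# thread, add a user message, run the assistant, then read the reply.
import time
from openai import OpenAI

client = OpenAI()

# 1. Define the assistant (model name as per OpenAI's preview announcement).
assistant = client.beta.assistants.create(
    name="Data analysis helper",             # placeholder name
    instructions="Answer questions about the data the user provides.",
    model="gpt-4-1106-preview",
    tools=[{"type": "code_interpreter"}],
)

# 2. A thread holds one conversation with an end user.
thread = client.beta.threads.create()

# 3. Add the user's question to the thread.
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Which product line grew fastest last quarter?",  # example question
)

# 4. Run the assistant on the thread and poll until it finishes.
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

# 5. Read the assistant's latest reply from the thread.
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)
```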

Who can access the Assistants API?

You can now access the Assistants API beta by logging into the OpenAI platform with the same credentials you use for ChatGPT and experimenting in the Assistants playground. Although the playground itself requires no coding, you’ll need a basic level of technical knowledge to use this tool effectively.


ChatGPT’s Reduced Pricing Model

OpenAI has also announced that it will be reducing token prices, “passing on savings to developers” in the process.

Tokens – the basic units that large language models process – are now a lot cheaper across several GPT models. OpenAI describes tokens as pieces of words; input tokens are the pieces of words that make up prompts, whereas output tokens make up responses.
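
If you want a feel for how many tokens a prompt will use before you send it, OpenAI’s open-source tiktoken library will count them locally. A minimal sketch, assuming the cl100k_base encoding shared by the GPT-3.5 and GPT-4 families; the API’s own usage figures remain the authoritative count.

```python
# Minimal sketch: counting tokens locally with OpenAI's tiktoken library.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-3.5/GPT-4
prompt = "Tokens are pieces of words: this sentence becomes a handful of them."
tokens = encoding.encode(prompt)
print(len(tokens), "tokens")
```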

GPT-4 Turbo input tokens are now three times cheaper than GPT-4 input tokens, costing $0.01 per 1,000 tokens, while output tokens cost $0.03 per 1,000 tokens – half of what they cost for GPT-4.

Input tokens for the new GPT-3.5 Turbo are also 3x cheaper than they were for the previous 16K-context version of GPT-3.5, at $0.001 per 1,000 tokens, while output tokens are half price, costing $0.002 per 1,000 tokens.

Developers using the 4K context window version of GPT-3.5 Turbo will see input token prices cut by 33%, to $0.001 per 1,000 tokens. Note that the figures in the previous paragraph refer exclusively to the new 16K version of GPT-3.5 Turbo.
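
To put those rates in context, here’s a back-of-envelope estimate of what a single request might cost at the per-1,000-token prices quoted above. The token counts are invented purely for illustration.

```python
# Back-of-envelope cost estimate using the per-1,000-token rates quoted above.
RATES = {
    "gpt-4-turbo":   {"input": 0.01,  "output": 0.03},   # $ per 1K tokens
    "gpt-3.5-turbo": {"input": 0.001, "output": 0.002},  # 16K context version
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in dollars for one request."""
    r = RATES[model]
    return input_tokens / 1000 * r["input"] + output_tokens / 1000 * r["output"]

# Example: a 2,000-token prompt with a 500-token reply.
print(f"GPT-4 Turbo:   ${estimate_cost('gpt-4-turbo', 2000, 500):.3f}")    # $0.035
print(f"GPT-3.5 Turbo: ${estimate_cost('gpt-3.5-turbo', 2000, 500):.3f}")  # $0.003
```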

Using ChatGPT at Work

Workers across the globe are finding new, inventive ways to use ChatGPT every day. However, using such a powerful tool to cut down on the time you’re spending on tasks comes with a variety of different considerations.

For one, most business leaders believe that staff should be asking permission before using AI tools like ChatGPT at work. If you’re planning on using AI for any task, make sure to be transparent about it with your manager/head of department in order to avoid confusion and mistakes.

This is particularly important if you’re using it to generate anything that you’ll be sharing with clients or customers. As you may be aware, ChatGPT and other AI tools like Bard have a tendency to “hallucinate” – so proofreading and fact-checking the content they produce for you is essential, not optional.

It’s also important to be transparent about your usage because what ChatGPT does with your data depends on which product you’re using, and there are ways to opt out of it being used for training purposes. However, if you haven’t turned your chat history off, OpenAI has the right, via its privacy policy, to use your data in this way.

Your workplace’s guidelines on the type of task you can ask ChatGPT to help you with may be linked to the sort of data they’re happy with you sharing with it, so it’s always good to check.


Written by:
Aaron Drapkin is Tech.co's Content Manager. He has been researching and writing about technology, politics, and society in print and online publications since graduating with a Philosophy degree from the University of Bristol six years ago. Aaron's focus areas include VPNs, cybersecurity, AI and project management software. He has been quoted in the Daily Mirror, Daily Express, The Daily Mail, Computer Weekly, Cybernews, Lifewire, HR News and the Silicon Republic speaking on various privacy and cybersecurity issues, and has articles published in Wired, Vice, Metro, ProPrivacy, The Week, and Politics.co.uk covering a wide range of topics.