GPT-4 Turbo vs GPT-4: What Is OpenAI’s New ChatGPT Turbo?

OpenAI held its annual DevDay conference and used it as an opportunity to announce a raft of changes to ChatGPT and other products, including wholesale price reductions for developers and a brand new language model for the chatbot called GPT-4 Turbo. Here’s what it is and the key GPT-4 Turbo vs GPT-4 differences you should know about.

GPT-4 Turbo is a more advanced version of GPT-4 with a much larger context window, and it has knowledge of world events up to April 2023. OpenAI has also launched an API you can use to build assistants, as well as a way to make custom versions of ChatGPT.

However, the changes aren’t currently available to all ChatGPT users. For each of the below changes/announcements, we’ve provided information on which account holders can access the different language models.

What Is GPT-4 Turbo?

GPT-4 Turbo is the latest language model to be released by ChatGPT owner OpenAI. It’s more powerful than the previous two language models that were used to power ChatGPT, GPT-4 and GPT-3.5.

ChatGPT has famously struggled to give accurate answers about events that happened after its training data cutoff, which until this point was September 2021.

However, OpenAI’s GPT-4 Turbo chatbot has knowledge of events up until April 2023. In the wake of Elon Musk’s xAI launching a chatbot boasting access to real-time information, this is a key update in the budding Grok vs ChatGPT rivalry.


GPT-4 Turbo can accept images as inputs, handles text-to-speech prompts, and integrates with DALL·E 3. It also has an enlarged 128K context window, which means it can take prompts equivalent to around 300 pages of text. In short, GPT-4 Turbo vs GPT-4 is a straightforward win for the newer model, but there’s more to it than that.
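To illustrate what image inputs look like in practice, here’s a minimal sketch of sending a picture alongside a text prompt through OpenAI’s chat completions endpoint. It assumes the openai Python SDK (v1.x), an OPENAI_API_KEY environment variable, and the vision-capable preview model name gpt-4-vision-preview; the exact model names and message format may change as the preview evolves.

```python
# Minimal sketch: sending an image alongside text to a GPT-4 Turbo preview model.
# Assumes the openai Python SDK (v1.x) and OPENAI_API_KEY in the environment;
# the vision-capable model name "gpt-4-vision-preview" is an assumption here.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-vision-preview",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this image?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
    max_tokens=300,
)

print(response.choices[0].message.content)
```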

Who can access GPT-4 Turbo?

OpenAI says that “GPT-4 Turbo is available for all paying developers to try by passing gpt-4-1106-preview in the API”, and revealed that the company plans to release “the stable production-ready model in the coming weeks.”
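For illustration, here’s what passing that model name through the API might look like – a minimal sketch assuming the openai Python SDK (v1.x) and an API key set in your environment; beyond the model name itself, nothing here comes from OpenAI’s announcement.

```python
# Minimal sketch: calling the GPT-4 Turbo preview by passing its model name.
# Assumes the openai Python SDK (v1.x) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # the preview model name quoted by OpenAI
    messages=[{"role": "user", "content": "When does your training data end?"}],
)
print(response.choices[0].message.content)
```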

GPT-4 Turbo vs GPT-4 vs GPT-3.5 Turbo: How ChatGPT’s Models Compare

There are a number of key differences between OpenAI’s models. GPT-4 Turbo is a significant upgrade on its sister model GPT-4 – which itself differs quite greatly from GPT-3.5, the language model that powered ChatGPT when it was first launched back in November 2022.

Along with the release of GPT-4 Turbo, OpenAI has also released a new version of GPT-3.5, called GPT-3.5 Turbo, which has a 16K context window by default and exhibits improved instruction following. The company has confirmed that all applications using the old GPT-3.5 Turbo model will be updated on December 11, 2023.

Here are the key differences between GPT-3.5 Turbo, GPT-4 and GPT-4 Turbo: GPT-4 Turbo has the largest context window (128K), the most recent knowledge cutoff (April 2023) and cheaper tokens than GPT-4; GPT-4 is more capable than GPT-3.5 Turbo but has a September 2021 cutoff and higher token prices; and the new GPT-3.5 Turbo has a 16K context window by default and is the cheapest of the three.

What are Custom GPTs?

OpenAI is now rolling out a new product called “GPTs”, which it describes as “custom versions of ChatGPT that you can create for a specific purpose”. OpenAI envisages people building them for tasks at home and in the workplace, and then sharing these creations with others.

At the DevDay conference, OpenAI employees built their own chatbot agents – and it looks like the sort of thing that any knowledge worker could do. No coding knowledge is required.

OpenAI says you could create a custom GPT that conducts data analysis or even crawls the web for information. “Many power users maintain a list of carefully crafted prompts and instruction sets, manually copying them into ChatGPT,” the company said in a recent blog post. “GPTs now do all of that for you.”

OpenAI also plans to launch a GPT Store within the month. When it arrives, developers will have a brand-new way to make money with ChatGPT, because OpenAI says it will let those who create the most popular GPTs earn money through the store.

Who can access ChatGPT’s custom GPTs?

OpenAI says that you can start building GPTs today – but a post on OpenAI’s help portal confirms that the feature is only available to ChatGPT Plus and Enterprise customers.

If you’re a free user and you try to access one of the company’s example GPTs (Canva and Zapier versions have already been built), you’ll be informed that you’ll have to wait a little longer for access. Paying customers can access these example versions now.

What Is OpenAI’s Assistants API?

OpenAI’s new Assistants API is built on the same technology as the new custom GPTs, with the goal of “helping people build agent-like experiences within their own applications”.

Use case examples given by OpenAI include a data analysis app, an assistant that helps with coding, and an AI-powered vacation planner.

You can augment your assistant with information and data from your organization, although OpenAI reiterates that the data you input into the models will not be used to train them and that developers can delete the data whenever they choose.
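As an illustration, here’s a minimal sketch of creating and running an assistant through the beta Assistants API, assuming the openai Python SDK (v1.x); the assistant’s name, instructions and prompt below are purely illustrative, and the beta endpoints may change before a stable release.

```python
# Minimal sketch of the Assistants API beta, assuming the openai Python SDK (v1.x)
# and OPENAI_API_KEY in the environment. Names and prompts are illustrative only.
import time
from openai import OpenAI

client = OpenAI()

# Create an assistant with the code interpreter tool enabled
assistant = client.beta.assistants.create(
    name="Data Analysis Helper",
    instructions="You analyse numbers and explain the results in plain English.",
    tools=[{"type": "code_interpreter"}],
    model="gpt-4-1106-preview",
)

# Conversations happen in threads: create one, add a user message, then run it
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="What is the average of 3, 7 and 20?",
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)

# Poll until the run finishes, then print the assistant's latest reply
while run.status not in ("completed", "failed", "cancelled", "expired"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)
```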

Who can access the Assistants API?

You can now access the Assistants API beta by logging in with the same credentials you use to access ChatGPT. Although it requires no coding, you’ll need a basic level of technical knowledge to use this tool effectively.

ChatGPT assistant playground

ChatGPT’s Reduced Pricing Model

OpenAI has also announced that it will be reducing token prices, “passing on savings to developers” in the process.

Tokens – the basic units that large language models process – are now going to be a lot cheaper on several GPT models. OpenAI describes tokens as pieces of words; input tokens are the pieces of words that make up prompts, whereas output tokens make up responses.

GPT-4 Turbo input tokens are now three times cheaper than GPT-4 input tokens, costing just $0.01 per 1,000 tokens, while output tokens cost $0.03 per 1,000 tokens – half what they cost for GPT-4.

GPT-3.5 Turbo input tokens are also 3x cheaper than they were for the previous 16K context window version of GPT-3.5, at $0.001 per 1,000 tokens, while output tokens are half price, at $0.002 per 1,000 tokens.

Developers using the 4K context window version of GPT-3.5 Turbo will see their input token prices reduced by 33% (to $0.001 per 1,000 tokens). These prices refer exclusively to the new 16K version of GPT-3.5 Turbo.
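To put those per-1,000-token rates in context, here’s a quick back-of-the-envelope comparison; the token counts in the example are illustrative only, and the prices simply restate the figures above.

```python
# Quick cost comparison at the per-1,000-token rates quoted above.
PRICES = {  # USD per 1,000 tokens: (input, output)
    "gpt-4":         (0.03, 0.06),
    "gpt-4-turbo":   (0.01, 0.03),
    "gpt-3.5-turbo": (0.001, 0.002),
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the cost in USD of a single request for the given model."""
    in_rate, out_rate = PRICES[model]
    return (input_tokens / 1000) * in_rate + (output_tokens / 1000) * out_rate

# Example: a 2,000-token prompt that produces a 500-token reply
for model in PRICES:
    print(f"{model}: ${request_cost(model, 2000, 500):.4f}")
```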

Using ChatGPT at Work

Workers across the globe are finding new, inventive ways to use ChatGPT every day. However, using such a powerful tool to cut down on the time you’re spending on tasks comes with a variety of different considerations.

For one, most business leaders believe that staff should be asking permission before using AI tools like ChatGPT at work. If you’re planning on using AI for any task, make sure to be transparent about it with your manager or head of department in order to avoid confusion and mistakes.

This is particularly important if you’re using it to generate anything that you’ll be sharing with clients or customers. As you may be aware, ChatGPT and other AI tools like Bard have a tendency to “hallucinate” – so proofreading and fact-checking the content they produce for you is essential, not optional.

It’s also important to be transparent about your usage because what ChatGPT does with your data depends on which product you’re using, and there are ways to opt out of it being used for training purposes. Also, your workplace’s guidelines on the type of task you can call on ChatGPT to help you with may be linked to the sort of data they’re happy with you sharing with it.

